That's certainly something they're gonna want to fix. I hope DF and GN pick up on this; seems like free views, and I'd love to hear what they've got to say on the matter.
Edit: Also wondering if it's the app itself, or if the performance hit disappears when you disable the overlay. I only skimmed the article to see which games are affected and how badly, so my bad if that's mentioned.
Edit 2:
HUB's Tim tested it and found that it's the overlay, or rather the game filter portion of the overlay, causing the performance hit. You can disable that part of the overlay in the app's settings, or disable the overlay altogether.
He also found that this feature wasn’t impacting performance on GeForce Experience, so it’s very likely a bug that’s gonna be fixed.
To clarify: actively using game filters can have an impact on either app, but right now they cause a performance hit even when they're not actively being used, just by the functionality being enabled, which is a bug.
The only outlier where just having the app installed hit performance was the Harry Potter game.
I don’t think you’re understanding. The testing they did was presumably fine and the performance hit is probably unacceptable. But mentioning but not testing the scenarios of
Here’s the quote, for people allergic to reading the update in the article.
Update: Nvidia sent us a statement: “We are aware of a reported performance issue related to Game Filters and are actively looking into it. You can turn off Game Filters from the NVIDIA App Settings > Features > Overlay > Game Filters and Photo Mode, and then relaunch your game.”
We have tested this and confirmed that disabling the Game Filters and Photo Mode does indeed work. The problem appears to stem from the filters causing a performance loss, even when they’re not being actively used. (With GeForce Experience, if you didn’t have any game filters enabled, it didn’t affect performance.) So, if you’re only after the video capture features or game optimizations offered by the Nvidia App, you can get ‘normal’ performance by disabling the filters and photo modes.
So, TomsHW did indeed test this (or at least claims to have), and found that it's the filters and photo mode causing the performance hit.
Still a pretty stupid problem to have, considering the old filters did not cause this problem, but at least there’s a workaround.
… I'm curious if this new settings app even exists for Linux, or has been tested on it.
It's not CEF itself that causes most of the impact, it's the content web devs make it load and process. And web devs generally not being very good at optimization is just a sad reality.
Web devs aren't ignorant of optimization, but the kinds of interfaces used on the web are very different from those on desktop. Cross-platform technologies can work, but anything built on top of web engines is going to be a little dogshit on native platforms.
Web tech was designed around the asynchronous and comparatively slow nature of the network. Now, those same layout and rendering engines are being shoehorned into an environment where the “server” is your local disk, so they're suddenly doing a bunch of work that was intended to be done iteratively.
Same goes the other way, of course. Software designed for “native first” experiences, like Flutter, isn't as popular in web dev because it works on that same, but reversed, assumption of a local disk being your source.
It would be like wondering why physical game disks aren’t popular on PC - it’s a fundamentally different technology for fundamentally different expectations and needs.
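To make the web-engine-on-desktop direction concrete, here's a minimal Electron-style sketch (purely illustrative; the Nvidia App reportedly sits on CEF rather than Electron, and none of this is its actual code). Even a trivial app like this boots a full Chromium instance just to render a local HTML file:

```typescript
// Hypothetical minimal desktop app built on a web engine (Electron).
// Nothing here is from the Nvidia App; it just shows how little app code
// it takes to drag an entire browser engine along for the ride.
import { app, BrowserWindow } from 'electron';

app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 800, height: 600 });
  // The "server" is the local disk, but the full network/layout/render
  // stack designed for slow, asynchronous page loads still spins up.
  win.loadFile('index.html');
});
```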
but anything built on top of web engines is going to be a little dogshit on native platforms.
Hard disagree on “little”.
Software designed for “native first” experiences, like Flutter, isn't as popular in web dev because it works on that same, but reversed, assumption of a local disk being your source.
Popularity shouldn't be dictated by what web devs prefer. As long as they build for desktop, I won't pardon excessive resource usage. And I'm not talking about Flutter; better performance-oriented frameworks exist, see Sciter.
I had initially lowered it a bit at some point. I didn't realize it was Steam recording for a while and spent a day or two trying driver updates and various other things. Next time I have a chance, I'll try a significant decrease just for testing.
I know people complain about Nvidia and Linux but one of the best parts of my experience with it was never having to deal with GFE. Just a bunch of project managers trying to make themselves useful by shovelling needless slop into your GPU driver.
I used to only use this for game recording, but it's had a glitch where games record with a red tint ever since I upgraded my monitor. Thankfully, every single gaming helper app seems to feature recording now, so I just switched to another one.
Sounds like something adjusted a setting in the Nvidia Control Panel and the monitor is balancing that out with a low red value.
Maybe worth taking a look.
That's interesting and all, but I still don't see a reason to upgrade my PS5 to a Pro, and frankly it wouldn't even be that appealing for the price as a new player either.
Are there like any games that will really make use of the new hardware? Other than perhaps upgraded framerates and better 4K support. The average console player probably isn’t going to care that much, not for the giant price increase over minimal gains.
I feel like all games on this generation will still be limited to the base PS5 anyway, can’t imagine hardware matters much until the next generation consoles.
I know, but I was being impatient before. Ragnarok is already on PC and I kinda forgot about it. I’ll look into it once a sale hits, but even then it’ll be a debate with myself over the psn stuff.
People who don’t have a gaming PC but still want to game would be the next target audience in line, since they wouldn’t have another machine to play third-party games on anyway, so the exclusive would just be a bonus on top.
But I don't think they're interested in paying so much extra for features they don't even care about. Perhaps a smooth high framerate in casual shooters would be something they'd care for, but that can easily be achieved on a base PS5 at 60+ FPS. I don't think they're the ones who care about true 4K, 120 Hz/FPS, or slightly better textures.
The only thing I can think of that people are hyped up for is GTA6. I fear that Rockstar might sell out to Sony and deliver a shitty 30 FPS-locked, low-resolution, low-texture version of the game on older PS5 models on purpose, just to “push the hardware” of the newest model. But then again, they also couldn't even be arsed to unlock the framerate for RDR2 on PS5, not even after so many years.
I've already decided: I'm not buying GTA 6. And GTA 6 was the whole reason I bought the PS5 to begin with.
Over the past year I've seen how Rockstar is making moves to turn GTA 6 into even more of a pay-to-win multiplayer experience, and less of a $60 single-player experience. All of this at a $60-or-more price point to start with, I'm sure.
If you want to be pay-to-win, you can't also charge a AAA price point for the base game. I personally don't play pay-to-win games, but when you charge for the base game it goes from being a sketchy game mechanic to being an outright scam.
You know what I'm playing right now on PS5? Transport Fever 2. Fuck off, Rockstar. You've lost me, a lifelong customer since the first GTA on PS1.
I haven’t read much into GTA 6 so far, only seen the trailer basically.
I do hope the singleplayer will still be as good as previous games, although I definitely would expect them to try and cash in on online even more.
T2 and Rockstar definitely fucked up with GTA 5 too. Originally there were supposed to be singleplayer expansions, which they of course dropped once online got so popular. And then they even proceeded to ban mods that brought multiplayer-only cars into singleplayer, a fucking disgusting move.
I’ll wait and see how the singleplayer is. I never bought GTA5 for its multiplayer, it only got less appealing the more they added to it too. The only part that interested me much later on were the RP servers, it genuinely looked fun on some of the moderated ones, so maybe Rockstar will try to get into that, but if online is just a carbon copy of GTA5 I won’t even bother.
I think you can expect about the same as with the PS4 Pro. Maybe this time it will finally be a smooth, actual 4K (ok, technically UHD) gaming experience. But that's kinda what we said last time too, so I don't know.
Developers would still have to optimize their games to get the most out of the hardware, unless we're talking about a game that was already performing suboptimally, where throwing raw power at it hides the surface-level problems so it looks smoother.
I would love to see all this horsepower being used to actually make the games better by design, like pathfinding and NPC behaviour. The last big breakthrough we had was raytracing, which proved that it wasn’t photorealism that makes it look better, but accurate lighting and shadows. For the consoles it was using an SSD for almost instant loading times.
But I digress. I’m not upgrading my PS5 either, but I can see the value for power users that play competitively or something.
There is a PS5 version with no disc drive. You can even currently buy a disc drive for it, buy the disc version's side plates, remove your discless side plates, connect the disc drive, and then attach the disc version's side plates… and you have the disc version of the PS5 that they would have sold you in stores.
I still refuse to believe teraflops aren't a fake term used to fluff up tech announcements and make shit sound more powerful than it is, because that's a fucking stupid name that nobody should use.
That’s like saying clock rate and core count are fake terms. Sure, by themselves they might not mean much, but they’re part of a system that directly benefits from them being high.
The issue with the teraflops metric is that it's (almost linearly) inversely proportional to the bit-length of the data, meaning that teraflops@8-bit is about 2× teraflops@16-bit. So giving a teraflops figure without specifying the bit-length it comes from is almost useless. Although you could make the argument that 8-bit is too low for modern games and 64-bit is too high a performance trade-off for the accuracy gain, so you can assume the teraflops figures from a gaming company are based on 16-bit/32-bit performance.
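As a rough back-of-the-envelope sketch of that scaling (assuming the ALUs chew through a fixed number of bits per cycle and the lower precision is actually supported at a packed/double rate, which is a simplification): FLOPS(b) ≈ units × clock × (W / b), where W is the datapath width in bits and b is the operand width, so FLOPS(8-bit) / FLOPS(16-bit) ≈ 16 / 8 = 2. Which is exactly why a bare “X TFLOPS” number tells you little until the vendor says which precision it was measured at.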
Oh, the 16 GB is for devs/games and the 2 GB is exclusively for the system. I was wondering how they were able to get by with only 2 gigs of RAM and 16 gigs of VRAM originally lmao.