I always find it fascinating when they have to make a trailer for an incredibly violent video game but still have it be approved for general audiences. It has to be edited to perfection, down to the frame, to show violence without any of the violence. 😅
You’re getting downvoted, but this will turn out to be correct. DLSS frame generation looks dubious enough on dedicated hardware; doing this on shader cores means it will be competing with the 3D rendering, so it will need to be extremely lightweight to actually offer any advantage.
I wouldn’t say compete, as the whole concept of frame generation is that it generates more frames when GPU resources are idle or underused because another part of the chain is holding the GPU back from generating more frames. It’s sort of like how I view hyperthreads on a CPU. They aren’t a full core, but they’re a thread that gets utilized when there are points in a CPU calculation that leave a resource unused (e.g. if a core is using the AVX2 unit to do some math, a hyperthread can use the ALU that might not be in use to do something else, because it’s free).
It would only compete if the time it takes to generate one additional frame is longer than the time the GPU sits idle due to some bottleneck in the chain.
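As a rough illustration of that break-even condition (a minimal sketch with made-up frame times, not tied to any real driver or API), the reasoning looks something like this:

```python
# Hypothetical numbers purely for illustration: a CPU-bound game where the GPU
# finishes its render work early each frame and then sits idle.
frame_budget_ms = 16.7   # target 60 fps presentation interval
render_time_ms = 10.0    # GPU time spent on the real frame
gpu_idle_ms = frame_budget_ms - render_time_ms   # ~6.7 ms of spare GPU time

interp_cost_ms = 3.0     # assumed cost of generating one extra frame on shader cores

if interp_cost_ms < gpu_idle_ms:
    print("Frame generation fits in the idle window; it's effectively free.")
else:
    print("Frame generation now competes with rendering and delays real frames.")
```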
You guys are talking about this as if it’s some new super expensive tech. It’s not. The massively cost-reduced chips they throw inside TVs do a pretty damn good job these days (albeit still laggy), and there is software you can run on your computer that does compute-based motion interpolation, and it works just fine even on super old GPUs with terrible compute.
Yeah, it does, and that’s something TV tech has to derive itself; TVs have to figure that stuff out on their own. It’s actually less complicated in a fun kind of way. But please do continue to explain how it’s more compute heavy.
Also, just to be very clear, TV tech also incorporates motion vectors into the interpolation; that’s the whole point. It just has to compute them from frame comparisons. Games have that information encoded into various g-buffers, so it’s already available.
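Very roughly, and only as a sketch (OpenCV’s Farneback optical flow standing in for whatever a TV’s interpolation chip actually does; `motion_vector_gbuffer` is a hypothetical per-pixel motion buffer you’d get from the engine), the difference between the two paths is just where the vectors come from:

```python
import numpy as np
import cv2

def estimated_motion(prev_gray: np.ndarray, next_gray: np.ndarray) -> np.ndarray:
    """TV-style path: derive motion vectors by comparing two frames."""
    return cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

def interpolate_midpoint(prev_frame: np.ndarray, flow: np.ndarray) -> np.ndarray:
    """Warp the previous frame halfway along the motion vectors (very naive)."""
    h, w = flow.shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0] * 0.5).astype(np.float32)
    map_y = (grid_y + flow[..., 1] * 0.5).astype(np.float32)
    return cv2.remap(prev_frame, map_x, map_y, cv2.INTER_LINEAR)

# TV-style: motion has to be estimated first (frame_a, frame_b are consecutive
# grayscale frames as uint8 arrays).
# mid = interpolate_midpoint(frame_a, estimated_motion(frame_a, frame_b))

# Game-style: the engine already rendered per-pixel motion vectors into a
# g-buffer, so the expensive estimation step disappears.
# mid = interpolate_midpoint(frame_a, motion_vector_gbuffer)
```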
People made the same claim about DLSS 3. But the artifacts in those generated frames are barely perceptible and certainly less noticeable than frame stutter. As long as FSR 3 works half-decently, it should be fine.
And the fact that it works on older GPUs, including those from Nvidia, really shows that Nvidia was just blocking the feature in order to sell more 4000-series GPUs.
You aren't going to use these features on extremely old GPUs anyways. Most newer GPUs will have spare shader compute capacity that can be used for this purpose.
Also, all performance is based on compromise. It is often better to render at a lower resolution with all of the rendering features turned on, then use upscaling & frame generation to get back to the same resolution and FPS, than it is to render natively at the intended resolution and FPS. This is often a better use of existing resources even if you don't have extra power to spare.
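As a back-of-the-envelope illustration of that tradeoff (all numbers are made up, and the pixel-count cost model is a deliberate simplification):

```python
# Made-up numbers: assume shading cost scales roughly with pixel count,
# plus a fixed per-frame cost for upscaling and frame generation.
native_pixels = 3840 * 2160            # 4K target
render_pixels = 2560 * 1440            # internal render resolution with upscaling

native_frame_ms = 25.0                 # assumed cost at native 4K
internal_frame_ms = native_frame_ms * render_pixels / native_pixels  # ~11.1 ms
upscale_ms = 1.5                       # assumed upscaler cost
framegen_ms = 2.0                      # assumed cost per generated frame

# With frame generation on, two presented frames come out of one rendered frame.
upscaled_pair_ms = internal_frame_ms + upscale_ms + framegen_ms
native_pair_ms = 2 * native_frame_ms

print(f"native: {native_pair_ms:.1f} ms per 2 frames vs "
      f"upscaled + FG: {upscaled_pair_ms:.1f} ms per 2 frames")
```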
Because I think the post assumes the GPU is always using all of its resources during computation, when it isn’t. There’s a reason benchmarks can make a GPU hotter than a game can, and not all games pin GPU utilization at 100%. If a GPU is not pinned at 100%, there is a bottleneck in the presentation chain somewhere (which means unused resources on the GPU).
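If you want to see this on your own machine, a rough sketch like the one below (using NVIDIA’s NVML bindings via the `pynvml` package, so NVIDIA-only, and treating the utilization counter as a coarse proxy) shows whether a game actually pins the GPU:

```python
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

# Sample utilization once a second while a game is running; sustained readings
# well under 100% mean something else in the chain (CPU, vsync, an engine cap)
# is the bottleneck and shader capacity is sitting unused.
for _ in range(10):
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)
    print(f"GPU utilization: {util.gpu}%")
    time.sleep(1)

pynvml.nvmlShutdown()
```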
I still think it’s a matter of waiting for the results to show up later. AMD’s RDNA3 does have an AI engine on it, and the gains it might see in FSR 3 could differ, in the same way XeSS differs with its branching logic. Too early to tell, given that all the tests in the preview suite were run on RDNA3 and that it doesn’t officially launch until two weeks from now.
Frame generation is limited to 40-series GPUs because Nvidia’s solution is dependent on their latest hardware. The improvements to DLSS itself and the new raytracing stuff work on 20/30-series GPUs. That said, FSR 3 is fantastic news; competition benefits us all, and I’d love to see it compete with DLSS itself on Nvidia GPUs.
They’re advertising that this has support for 120fps, ultrawide resolutions, and RTX. I am feeling cautiously optimistic that From Software actually nailed the PC support this time.
Now if only they could backport those features to Elden Ring!
Early word from reviewers and people who participated in some of the early access events is that the game is crazy well optimized. I think it was Fighting Cowboy who said all of his gameplay on PC was at 4K 120fps. He also said it runs at around 50fps with high settings on the Steam Deck.
I wish… Elden Ring actually has support for these features via mods, but you can’t enable them while playing online because it counts as tampering with files. There are mods for enabling widescreen and unlocked FPS that have been around for ages and work perfectly.
This could totally get me to finally get going and build a new gaming rig.
I wish the team luck. I’m sure it’s not an easy task, looking at other, similar projects like Black Mesa, but the demo is stunning. Would love to see what Ravenholm looks like…
Don’t worry, even with a beefy config it doesn’t run very well (talking a 10th-gen i9, 32 GB DDR4, and an RTX 3080), so I doubt even Half-Life 2 will run decently.
Ok? It doesn’t have RTX though, so of course it’s gonna run fine. It’s Source engine, after all.
They were saying it would run badly if BM had RTX like Portal RTX, because Portal RTX is already very demanding.
The cool thing about pathtracing is it doesn’t really matter how complex a scene is. The bad thing about pathtracing is it doesn’t really matter how complex a scene is.
If your card can’t run remixed HL2, it also won’t be able to do HL1
It really looks like Microsoft made the worst call with their Series S compatibility mandate. Now games come out so late that as an Xbox owner, you’re automatically a Patient Gamer, without the upsides. That is, if a port is released at all.
These days you can play games like Death Stranding more than half a year earlier on your iPhone.
This has nothing to do with the Series S. Helldivers is a fully Sony-owned IP. I suspect Xbox asked for this as a show of goodwill in exchange for Halo.
I don’t think that’s true; the game ran very well on Series X at release. I know because I sank like 90 hours into it in the first week. It ran like shit on all last-gen-class hardware, including older PCs; people with the then-new 30-series Nvidia graphics cards were reporting that the game ran fine. I think CDPR just did not optimize it at all.
Yeah, you said it. Sony did not make them release the game for the PS4, yet they did, and it had lots of issues. Hence they did not optimize well for the previous gen.
PS4 and Series S are basically the same hardware situation, like PS5/Series X; M$ mandated that it be on all consoles, so I don’t blame CDPR for just saying screw it and showing that this is why, at some point, you shouldn’t keep supporting old hardware.
Sure, it kinda ruined CDPR’s image short term, but MS did drop that mandate shortly after.
Why, yes, I’m broke, but I doubt Stellar Blade has anything to do with it. It’s mostly because of the shifting economic and political landscape of today.
I love ONI and the base building but I sometimes definitely get annoyed when, through no fault of my own, a dupe gets confused and pisses in the water supply.
So making it a Terraria-like? Holy shit, I am gonna log another hundred-plus hours.