Ugh, games of this era are gonna age like milk with this forced upscaling and blurry TAA smear shite.
More compression and upscaling… How about just better graphics? How about making a console that can do path tracing, something you can already get going with a fairly cheap PC setup?
All these years and these consoles still run 720p/30fps like the PS3, but it’s OK with some people because it’s using AI to be dishonest instead of just straight-up lying like back in the good old days with the fish AI.
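For what it’s worth, the back-of-the-envelope maths behind the “it’s really ~720p” claim looks roughly like this. The per-axis scale factors below are assumptions based on commonly cited upscaler quality modes, not specs for any particular game or console:

```python
# Rough arithmetic for internal render resolution vs. output resolution.
# Scale factors are assumptions based on commonly cited upscaler quality
# modes; actual factors vary per game and per upscaler.

OUTPUT = (3840, 2160)  # "4K" output target

PER_AXIS_SCALE = {
    "quality":     1 / 1.5,  # ~1440p internal at a 4K output
    "balanced":    1 / 1.7,
    "performance": 1 / 2.0,  # ~1080p internal
    "ultra_perf":  1 / 3.0,  # ~720p internal
}

for mode, s in PER_AXIS_SCALE.items():
    w, h = round(OUTPUT[0] * s), round(OUTPUT[1] * s)
    print(f"{mode:>12}: {w}x{h} internal "
          f"({100 * s * s:.0f}% of the output pixels actually rendered)")
```

At the “ultra performance” end you’re rendering roughly one pixel in nine and inventing the rest, which is where the “720p like the PS3” jab comes from.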
Forced upscaling and blurry TAA are compensating for the fact that they can’t push graphics much further on the hardware we have. Hardware progress has stagnated, and on top of that graphics are hitting diminishing returns: each improvement demands more power to deliver a less noticeable difference.
But it doesn’t mean these games won’t look great when you disable the fakeness and run them with brute-force GPU power 10 years from now.
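For anyone wondering what the “blurry TAA smear” actually is mechanically, here’s a minimal sketch of the core idea, assuming a plain exponential history blend (real engines add motion-vector reprojection, neighbourhood clamping, etc., but the low blend weight is the part that smears):

```python
def taa_resolve(history_color, current_color, blend_weight=0.1):
    """Blend the reprojected history pixel with the freshly rendered pixel.

    blend_weight is the fraction of the NEW frame kept; ~0.05-0.1 is typical,
    so 90-95% of what you see is accumulated history, not the current frame.
    """
    return tuple(
        (1.0 - blend_weight) * h + blend_weight * c
        for h, c in zip(history_color, current_color)
    )

# A static edge converges towards the true colour over many frames...
pixel = (0.0, 0.0, 0.0)
for _ in range(30):
    pixel = taa_resolve(pixel, (1.0, 1.0, 1.0))
print(pixel)  # ~ (0.96, 0.96, 0.96): still not fully converged after 30 frames

# ...but under motion the history belongs to the wrong surface, so the output
# lags and ghosts, which is what reads as smear.
```

Once you see that most of every pixel is recycled from previous frames, the “just brute force it at native res in 10 years” take makes a lot of sense.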
I honestly think the graphics we can achieve now are fine; the real improvements should come from better animation and actually good art direction.
I’m no expert on the matter, but I know this YT channel argues that the technology is already available. The thing is, big players like the Unreal Engine devs make sub-optimal decisions when implementing these new features, leaving a lot of games blurry and/or maladjusted simply because nobody knew better. Of course, art direction will always matter for a game’s graphics, but when the vast majority of tools make things look bad by default, it makes sense that people assume a better result just isn’t available yet.
That’s the guy who’s asking for a million dollars to “fix” Unreal Engine 5 despite having zero programming experience, and who sends out DMCA strikes against any videos that call him out on it, lol
I think the primary reason for the GPU stagnation has been the AI / GPU compute bubble over the past 5 years.
So much on-die space has been diverted away from raw rasterisation power towards CUDA that it has artificially held back GPU progress.
When we do see the current AI bubble burst (and it does feel like we’re fast approaching that point, due to all the recent incestuous business dealings), hopefully we can see some innovation return to the sector.