Thanks for taking the time to put this together. I dunno if you are aware but there’s also a steam deck sub here and they’d probably be happy if this was posted there too.
Ironically, Zelda: A Link to the Past ran at 60fps, while Ocarina of Time ran at 20fps.
Those are probably the same framerates as in the Horizon pictures below lol.
Now, Ocarina of Time had to run at 20fps because it had one of the biggest draw distances of any N64 game at the time. This was so the player could see to the other end of Hyrule Field, or other large spaces. They had to sacrifice framerate, but for the time it was totally worth the sacrifice.
Modern games sacrifice performance for an improvement so tiny that most people would not be able to tell unless they are sitting 2 feet from a large 4k screen.
Had to, as in “they didn’t have enough experience to optimize the games”. Same for Super Mario 64. Some programmers decompiled the code and made it run like a dream on original hardware.
The programming knowledge did not exist at the time. It's not that they lacked experience; it was impossible for them to have that knowledge because it simply didn't exist yet. You can't really count that against them.
Kaze optimizing Mario 64 is amazing, but it would have been impossible for Nintendo to have programmed the game like that, because Kaze is able to use programming techniques and knowledge that literally did not exist when the N64 was new. It's like saying the NASA engineers who designed the Atlas LV-3B launch vehicle were bad engineers or incapable of good rocket design just because of what NASA engineers could design today with knowledge that did not exist in the 50s.
One of the reasons I skipped the other consoles but got a GameCube was because all the first party stuff was buttery smooth. Meanwhile trying to play shit like MechAssault on Xbox was painful.
I never had trouble with MechAssault, because the fun far outweighed infrequent performance drops.
I am a big proponent of 60fps minimum, but I make an exception for consoles from the 5th and 6th generations. The amount of technical leap and improvement, both in graphics technology and in gameplay innovation, far outweighs any performance dips as a cost of that improvement. The 7th generation is on a game-by-game basis, and personally the 8th generation (Xbox One, Switch, and PS4) is where it became completely unacceptable to run even a single frame below 60fps. There is no reason that target could not have been met by then, let alone now. The Switch was especially disappointing here, since Nintendo built what was basically a 2015 mid-range smartphone and then tried to make games for it as if it were a real game console, with performance massively suffering as a result. 11fps, docked, in Breath of the Wild's Korok Forest or in Age of Calamity (anywhere in the game, take your pick) is totally unacceptable, even if it only happened once rather than consistently.
I’m usually tolerant of frame drops, especially when they make hard games easier (like on the N64), but I agree it has gotten much worse on recent consoles. Looking at you, Control on PS4 (seems like it should just have been a PS5 game with all the frame drops; even just unpausing freezes the game for multiple seconds).
Has anyone ever really noticed how samey everything looks right now? It’s a bit hard to explain, because it’s not the aesthetics of any kind of art style used, but the tech employed and how it’s employed. Remember how a lot of early 3D in film just looked like it was plastic? It’s like that, but with a wider variety of materials than plastic. Yet every modern game kinda looks like it’s made using toys.
Like, 20 years from now I think it would be possible to look at any given game that is contemporary right now and be able to tell by how it looks when it was made. The way PS1-era games have a certain quality to them that marks when they were made, or how games of the early 2000s are denoted by their use of browns and grays.
Oh yeah this isn’t a complaint, because I think it looks good. It’s just I notice it, and it probably is from almost everything being made on UE5 these days. However, I think MGSV was one of the first games to have this particular look to it, and that’s on its own in-house engine (FOX Engine). It could just be how the lighting and shadowing are done. Those two things are getting so close to photorealism that it’s the texturing and modeling work that puts things (usually human characters) into the uncanny valley. A scene of a forest can look so real… And then you put a person walking through it and the illusion is lost. lol
It’s everyone using UE-based mocap tools that cause the hyperrealistic-yet-puffy faces, is what I suspect he’s talking about, along with the same photogrammetry tools/libraries.
Honestly the biggest thing missing in general lighting is usually rough specular reflections and small scale global illumination, which are very hard to do consistently without raytracing (or huge light bakes)
Activision has a good technique for baking static light maps with rough specular reflections. It's fairly efficient; however, it's still a lot of data. Their recent games have apparently been in the 100-200 GB range, and I'm sure light bakes make up a good portion of that. It's also not dynamic, of course.
So, what I’m saying is, raytracing will help with this, hardware will advance, and everyone will get more realistic looking games hopefully.
Games look samey because game studios don't have ideas anymore. They just try to sell 20 hours of playtime that is essentially empty. It's literally just a bunch of materials and "common techniques" squashed into a sellable product. In the early days of gaming, people had ideas before they had the techniques to implement them. Nowadays we have the techniques and think the ideas are unimportant. It's uninspired and uninspiring. That's why.
I’m still cross (not really tho) that when I reached out to you all last year, you never wanted me to interview you!!!
RetroDECK has been my one true emulation love for so long now, and I've adored all your recent updates. You should all be so proud! I went into a little more detail about RetroDECK for the users in my latest news post here on Lemmy too!
This is true of literally any technology. There are so many things that can be improved in the early stages that progress seems very fast. Over time, the industry finds most of the optimal ways of doing things and starts hitting diminishing returns on research & development.
The only way to break out of this cycle is to discover a paradigm shift that changes the overall structure of the industry and forces a rethinking of existing solutions.
The automobile is a very mature technology and is thus a great example of these trends. Cars have achieved optimal design and slowed to incremental progress multiple times, only to have the cycle broken by paradigm shifts. The most recent one is electrification.
Path tracing is a paradigm shift, a completely different way of rendering a scene from how it's normally done; it's just a slow and expensive one (it has existed for many years but only recently started to become feasible in real time thanks to advancing GPU hardware).
Yes, usually the improvement is minimal. That is because games are designed around rasterization and have path tracing as an afterthought. The quality of path tracing still isn’t great because a bunch of tricks are currently needed to make it run faster.
You could say the same about EVs, actually: they've existed since like the 1920s but are only now becoming useful for actual driving because of advancing battery technology.
Yea, it’s doing that. RT is getting cheaper, and PT is not really used outside of things like cyberpunk “rt overdrive” which are basically just for show.
Except it’s being forced on us and we have to buy more and more powerful GPUs just to handle the minimums. And the new stuff isn’t stable anyways. So we get the ability to see the peach fuzz on a character’s face if we have a water-cooled $5,000 spaceship. But the guy rocking solid GPU tech from 2 years ago has to deal with stuttering and crashes.
This is insane, and we shouldn’t be buying into this.
It’s not really about detail, it’s about basic lighting especially in dynamic situations
(Sometimes it is used to provide more detail in shadows I guess, but that is also usually a pretty big visual improvement)
I think there’s currently a single popular game where rt is required? And I honestly doubt a card old enough to not support ray tracing would be fast enough for any alternate minimum setting it would have had instead. Maybe the people with 1080 ti-s are missing out, but there’s not that many of them honestly. I haven’t played that game and don’t know all that much about it, it might be a pointless requirement for all I know.
Nowadays budget cards support rt, even integrated gpus do (at probably unusable levels of speed, but still)
I don’t think every game needs rt or that rt should be required, but it’s currently the only way to get the best graphics, and it has the potential to completely change what is possible with the visual style of games in the future.
Edit: also the vast majority of new solid gpus started supporting rt 6 years ago, with the 20 series from nvidia
That’s my point though, the minimums are jacked up well beyond where they need to be in order to cram new tech in and get 1 percent better graphics even without RT. There’s not been any significant upgrade to graphics in the last 5 years, but try playing a 2025 AAA with a 2020 graphics card. It might work, but it’s certainly not supported and some games are actually locking out old GPUs.
Often the lighting systems used require some minimum amount of processing power, and to create a lower graphics setting you would need a whole separate lighting technique
If you think about it, gaming GPUs have been in a state of crisis for over half a decade. First came the shortages because everybody was using them to mine crypto, then the COVID chip shortages hit, and now AI is killing off cheaper GPUs. As a result, many people are stuck with older hardware, Steam Decks, and consoles and haven't upgraded their systems, and those highly flammable $1000+ GPUs aren't going to lead to everyone upgrading their PCs. So games are using older GPUs as the target.
Horizon Zero Dawn was a stunning game; I did pretty much the same.
I’m kinda annoyed bc my 2 BFFs JUST got PlayStations like for Xmas. I’ve been on PS4+PS5 for a long while now and played both Horizons for free. I really wanted to tell them to give Zero Dawn a whirl just to show what the PS5 could do with it… but for full price? Eh… I’ll leave that up to them.
The point isn’t about cross generation games. It’s about graphics not actually getting better anymore unless you turn your computer into a space heater rated for Antarctica.
I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn.
Really? I’ve played both on PS5 and didn’t notice any real difference in performance or graphics. I did notice that the PC Version of Forbidden West has vastly higher minimum requirements though. Which is the opposite of performance gains.
Who the fuck cares if leaves are actually falling off or spawning in above your screen to fall?
And BG3 has notoriously low minimums, it is the exception, not the standard.
If you want to see every dimple on the ass of a horse then that’s fine, build your expensive computer and leave the rest of us alone. Modern Next Gen Graphics aren’t adding anything to a game.
What those leaps do result in, however, is major performance gains.
Which many devs will make sure you never feel, by "optimizing" the game only for the most bleeding-edge hardware.
Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.
See, if the games were made with a performance first mindset, that’d be possible already. Not to dunk on performance gains, but there’s a saying that every time hardware gets faster, programmers make their code slower. I mean, you can totally play emulated SNES games with minimal impact compared to leaving the computer idling.
Saying “diminishing returns” is like saying that fire burns you when you touch it.
Unless chip fabrication can figure out a way to "stack" transistors on top of one another, effectively making 3D chips, they'll continue to be "flat" sheets that can only increase core count horizontally. Single-core frequency gains largely stalled in the mid-2000s; from then on it's been about adding more cores. Even the gains from an RTX 5090 vs an RTX 4090 aren't that big. Now compare that with the gains going from a GTX 980 to a GTX 1080.
Well I play a lot of Street Fighter and I think I’ve perfected a real winner of a control method; but it’d also be good for Minecraft so I can try and fuck a creeper
Technically the original source code was adapted to the SNES, even including some (most?) glitches, so I'd say it's more like a port or remaster than a remake, even though the graphics and audio were remade.