To be faaaaaaaaaaaiiiiir, a lot of that was tied up in the switch from overhead isometric view to first-person view.
Fallout 1/2 didn’t focus on graphics; they were in many ways point-and-click adventures. There were a lot of things you had to hover over for “flavor text,” and every once in a while there’d be something only four pixels wide that you needed to notice.
So the gameplay actively eschewed graphics in favor of things like flavor text and reading.
Further, the switch to first person broke the SPECIAL system, because how do you even manage a gun skill in a first-person shooter without it feeling absurd? It made sense in isometric, even if it was often frustrating to miss an enemy when you had a 79% chance to shoot them in the balls. Put that in first person, where you mag dump into someone standing right in front of you and half your shots miss, and it feels a lot less realistic, and can quickly become frustrating in a faster-paced first-person-shooter environment. The SPECIAL system feels absolutely slapped on as an afterthought in Fallout 3.
Also, the writing in Fallout 3 was that shitty Bethesda writing, just subpar compared to the prior two installments. Especially the fucking stupid-ass end of the game.
I’d say a lot of those complaints were driven more by the perspective switch than anything else.
Valheim was one of the best-selling games and is still a huge success. Indies are getting better and more popular, to the point that even big companies like Nexon are indiewashing their studios and pretending that Dave the Diver is an indie game with pixel art instead of the work of one of the biggest publishers there is. In my experience, most gamers nowadays are people who grew up on Minecraft, Terraria, or, probably more likely today, Roblox.
So basically no, I don't think so. Maybe big studios want you to believe that, and it might be true for a casual FIFA or CoD gamer, but for anyone else there are more options than ever. The supply of good, smaller, simpler games is just overwhelming; the days are too short to even keep track of them anymore.
I didn’t actually know about Dave the Diver being from a big publisher until just now. I felt that game was kinda under-developed for how hyped it was, and now I’m even more disappointed.
It only has like 6 major areas, and the levels don’t have that much variety. Plus the side content is fairly unpolished. I enjoyed it for the first 60ish percent but was kinda forcing myself to finish it by the end.
It used to be that people argued it’s worth getting the new game console because of “better graphics.” The console wars haven’t gone anywhere; they’ve just expanded.
In any case, in regards to just installing a game and playing it, no, not really. When I was playing games in college in 2012 it was still a time when you would open a game and go to the settings menu to adjust settings.
Sometimes it was just turning off motion blur, but there were always settings to change to try to reach a stable 60FPS.
Nothing changed, it just expanded. Now instead of 60FPS it’s a variable 60-240FPS. Instead of just 720p-1080p resolution, it’s 1080p minimum (unless it’s a portable), variable up to 4K. Instead of “maxing out,” we now have raytracing, which pushes software further than our hardware is capable of.
These aren’t bad things; they’re just now 1) marketed a bit, and 2) better known in the social sphere. There isn’t anything stopping you from opening up the game and playing right away, and there’s nothing stopping other people from wondering about frame timings and other technical details.
Sure, focusing on little things like that can take away from the wider experience, but people pursue things for different reasons. When I got Cyberpunk 2077, I knew there were issues under the hood, but my experience with the game at launch was also pretty much perfect because I was focused on different things. I personally don’t think a dip here and there is worth fretting over, but for some people it ruins the game. Other people just like knowing that they’re taking full advantage of their hardware, hence figuring out the utilization of their components.
There’s one last aspect not mentioned: architectures. 10 years ago games would just boot up and run… but what about games from 10 years before then? Most players not on consoles had to do weird CPU timing shenanigans to boot up a game from (now 20) years ago. We’re in the same boat now with emulation, and while emulation is faring better, X360/PS3-generation games that had PC ports are starting to have issues on modern Windows. Even just 5 or 6 years ago, games like Sleeping Dogs wouldn’t play nice on modern PCs, so there’s a whole extra aspect of tinkering on PC that hasn’t even been touched on.
All this to say, we are in the same boat we’ve always been in. The only difference is that social media now has more knowledge about these aspects of gaming so it’s being focused on more.
The one thing I do agree with, though, is that this is all part of software development. Making users need better hardware, intentional or not, is pretty crazy. The fact that consoles themselves now have Quality vs. Performance modes is also crazy. But I will never say no to more options. I actually think it’s wrong that console versions of games are often missing settings adjustments when the PC counterpart has full control. I understand when it’s to keep performance at an acceptable level, but it can be annoying.
I don’t really relate, as I typically linger two or more years behind cutting-edge games and tech, so by the time I get something my hardware can easily run it and I can actually just install the game and play.
That, and all the good games float to the top of the pile in that time, so I rarely end up spending money on something I don’t enjoy.
I feel it’s a bit like any hobby. You’ve got casual film enjoyers, and then those who refuse to watch unless it’s a Blu-ray on their 4K Dolby Vision TV with 1000 nits of OLED brightness. There are some who just enjoy listening to music on their AirPod knockoffs by streaming on YouTube Music, and then there are those who buy $500 headphones with a high-quality gold-plated aux cable and a custom DAC and use some obscure format to really enjoy music. There are some who enjoy team sports, and then there are those who know the personal routine of each player, the wetness of the grass, or the year the ball was manufactured and its impact on the throw.
Stalker 2 had bugs at launch, yet it easily sold 1 million copies. Black Myth: Wukong uses frame gen to achieve 60 fps on PS5 (otherwise it locks to 45 fps), yet it has broken all records. Elden Ring is still a stuttery mess on PC and barely hits 60 fps on consoles, even the $700 one, yet it’s beloved.
These people aren’t the ones talking about resolutions and frame rates on X; they’re just playing the damn game, in the millions.
Just like millions use subpar TV settings, stream music, or don’t have much clue about team sports, but still have a great time.
> Elden Ring is still a stuttery mess on PC and barely hits 60 fps on consoles, even the $700 one, yet it’s beloved.
I have an old ass PC and a PS5 with the game on both and they run smooth as shit unless you’re using raytracing, which literally doesn’t even change the visuals in the game; it just makes it slower.
Stalker 2 is a busted mess. The performance issues have mostly been fixed after 3 patches, but the game itself shits itself once you get to a certain story mission. Literally nothing works beyond that point: the A-Life system does not work, scripted events are all jacked up, every now and then my secondary weapon gets replaced with a random weapon I didn’t even have in my inventory (IDK if anyone else is getting this), sound effects don’t play properly, the HUD completely disappears, and so many more things that make me glad I’m only playing through Game Pass and didn’t actually buy the game. There’s a good game under the mess, somewhere. But they should have just bitten the bullet and delayed it another month or two instead of releasing what they did for the holidays.
The reason they sold so many copies, though, is pre-ordering. People bought them before they ever saw the game in action. And games like Stalker 2 are the reason you shouldn’t pre-order, because the chances of getting burned by busted-ass shit like this are increasingly common. Again, because people pre-order the fuck out of games.
I definitely don’t see a fixation on performance lol
The reliance on AI upscaling and frame generation, while the entire game takes up half or all of your SSD, shows that optimization is an afterthought. These solutions make everything look pretty and smooth at the cost of how the game actually feels to play (input lag up the fucking ass that makes the game feel way worse). Couple that with the myriad of performance issues the majority of AAA games have at launch.
The focus is entirely on making something visually good looking that will sell millions in pre-orders alone.
I see all the graphics technologies as an extra bonus to the gaming experience. They might make the game experience slightly better, but they alone don’t make for the best experience.
I’ve tried various team-based shooters over time, like Dirty Bomb, Overwatch, and Paladins, and even now I try newer stuff, like Marvel Rivals. However, every time I get bored of them, I hop onto TF2, and no other team-based shooter gives me the same satisfying gameplay loop.
Regardless of graphics, a well-made skill-based game can keep players invested for a long time. I mean, just look at AoE2: it’s the chess of strategy games, and a ton of people still play it to this day.
Yeah, I’m sick of it as well. Having to guess whether my rig will play something at a framerate that won’t make me sick, because a dev studio chose pretty graphics (that aren’t really much better than AAA games from 10 years ago) over good optimization.
Most of the games I play are relatively undemanding for this reason. That and because indie games don’t have as much monetization.
There are a lot of phenomenal indie games. There are also still a couple of really good AAA games, but AAA isn’t what it used to mean. In fact, I’d be wary of AAA by default unless reviews state that the game is actually good. Ubisoft even tried to establish an “AAAA quality” game with Skull and Bones, or whatever it’s called, and it was a total flop.
The real quality these days lies in indie games or (mostly) independent gaming studios. I think it’s kind of safe at this point to just assume by default that Bethesda, Microsoft, EA, Activision Blizzard, and so on simply cannot produce actually good games anymore. There may be some exceptions, but again, wait for independent reviews; unless it’s been independently verified, don’t trust them to produce a good game.
Another problem is the sheer mass of games flooding the market, which means that true gems aren’t found so easily. But they exist. There’s no shortage of great games; you just have to look harder, and look in the right places.
The rabbit hole of looking harder and being amazed by what exists will probably never end if people keep looking (till they become proficient enough to be the ones making those things).