Look, don't take it personally, but I disagree as hard as humanly possible.
Claiming that realism "makes every game look the same" is a shocking statement, and I don't think you mean it like it sounds. That's like saying every movie looks the same because they all rely on photographing people as a core technique.
If anything, I don't know what "realism" is supposed to mean. What's more realistic: Yakuza, because it does these harsh, photo-based textures meant to highlight every pore, or, say, a Pixar movie where everything is built on an insanely accurate, path-traced light transport simulation?
At any rate, the idea that taking photorealism as a target means you give up on aesthetics or artistic intent is baffling. That's not even a little bit how it works.
On the other point, I think you're blending technical limitations with intent in ways that are a bit fallacious. SotC is stylized, for sure, in that... well, there are kaijus running around and you sometimes get teleported by black tendrils back to your sleeping beauty girlfriend.
But is it aiming at photorealism? Hell yeah. That approach to faking dynamic range, the deliberately blown-out exteriors when seen from interiors, the way the sky gets treated, the outright visible air adding spacing and scale when you look at the colossi from a distance, the desaturated take on natural spaces... That game is meant to look like it was shot by a camera all the way. They worked SO hard to make a PS2 look like it has aperture and grain and a piece of celluloid capturing light. Harder than the newer remake, arguably.
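(For the curious, here's a toy sketch in Python/numpy of the kind of "shot on a camera" tricks I mean: aerial perspective, blown-out exposure, desaturation, grain. To be clear, none of this is Team Ico's actual pipeline; the frame/depth inputs and every parameter value are made-up placeholders for illustration.)

```python
import numpy as np

# Toy illustration only, not any real game's pipeline. Assumes a linear RGB
# frame and a per-pixel depth buffer as float arrays; all parameter values
# are made-up placeholders.
def filmic_ps2_look(rgb, depth, sky=(0.72, 0.78, 0.82), exposure=1.6,
                    fog_density=0.015, desaturate=0.35, grain=0.02, seed=0):
    rng = np.random.default_rng(seed)
    sky = np.asarray(sky)

    # "Visible air": blend distant pixels toward the sky colour, which is
    # what sells the scale of something huge seen from far away.
    fog = 1.0 - np.exp(-fog_density * depth)          # (H, W), in [0, 1)
    out = rgb * (1.0 - fog[..., None]) + sky * fog[..., None]

    # Fake limited dynamic range: push exposure and let highlights clip,
    # the blown-out-exterior-seen-from-shade look.
    out = np.clip(out * exposure, 0.0, 1.0)

    # Desaturate toward luma for the muted natural-spaces palette.
    luma = (out * [0.299, 0.587, 0.114]).sum(axis=-1, keepdims=True)
    out = out * (1.0 - desaturate) + luma * desaturate

    # A dusting of film grain on top.
    return np.clip(out + rng.normal(0.0, grain, out.shape), 0.0, 1.0)
```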
Some of that applies to GoW, too, except they are trying to make things look like Jason and the Argonauts more than Saving Private Ryan. But still, the references are filmic.
I guess we're back to the problem of establishing what people mean by "realism", and how little sense it makes as a category. In what world does Cyberpunk look similar to Indiana Jones or Wukong? It just has no real meaning as a statement.
Hell, no. 120 fps wasn't even a thing. That flash-in-the-pan moment was when 1080p60 was the PC standard and 720p30 the console standard, and the way the hardware worked you could hit max settings on a decent PC every time. It lasted like three or four years and it was wonderful.
By the time we started going above the NTSC spec on displays, the race was lost. The 20 series came out, people started getting uppity about framerate while playing some 20-year-old game, and it all went to crap on the PC front.
As for AA, I don't think you remember FXAA well, or at least not relative to what we have now. ML upscaling is so much sharper than any tech we had a couple of gens ago, short of MSAA (and frankly, even MSAA). The problems that have become familiar in many UE5 games aren't intrinsic to the tech; they have a lot to do with what the engine does out of the box and just how out of spec some of the feature work is.
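To make the FXAA point concrete, here's a deliberately naive sketch in Python/numpy of that style of post-process AA (not the real FXAA shader; the threshold and the blur are my own placeholders). It only ever sees the finished frame's luma, so anything that reads as contrast gets blended, texture detail included, which is exactly the vaseline effect.

```python
import numpy as np

# Deliberately naive stand-in for FXAA-style post-process AA, not the real
# algorithm. Input: an RGB frame as a float array in [0, 1].
def fake_fxaa(img, contrast_threshold=0.1):
    # Perceptual luma, the only information this kind of filter has to go on.
    luma = img[..., 0] * 0.299 + img[..., 1] * 0.587 + img[..., 2] * 0.114

    # Neighbour lumas (np.roll wraps at the borders; good enough for a toy).
    up, down = np.roll(luma, 1, axis=0), np.roll(luma, -1, axis=0)
    left, right = np.roll(luma, 1, axis=1), np.roll(luma, -1, axis=1)

    # "Edge" = any pixel whose local luma contrast exceeds the threshold.
    contrast = (np.maximum.reduce([luma, up, down, left, right]) -
                np.minimum.reduce([luma, up, down, left, right]))
    edge = contrast > contrast_threshold

    # Blend flagged pixels with their neighbourhood. Geometric jaggies soften,
    # but so does any high-frequency texture that trips the same threshold;
    # the filter can't tell the difference, hence the smearing.
    blurred = (img +
               np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0) +
               np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1)) / 5.0
    out = img.copy()
    out[edge] = blurred[edge]
    return out
```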
I feel like people have gotten stuck on some memes (no motion blur! DLSS bad! TAA bad!) that are mostly nostalgic for how sharp 1080p used to look compared to garbage-tier sub-720p, sub-30 fps console games. It's getting to the point where I have so many major gripes with a lot of modern games, but I feel it's become one of those conversations you can't have in public because it gets derailed immediately.
In any case, I think we can at least agree that it's been an awkward couple of generations of PC hardware and software, for whatever reason, and that GPUs, engines and displays need to get realigned in a way where people can just fire up games and expect them to look and run as designed.
I agree that it's a meme comparison anyway. I just found it pertinent to call out that remasters have been around for a long time.
I don't know that I agree on the rest. I don't think I'm aware of a lazy game developer. That's a pretty rare breed. TAA isn't a bad thing (how quickly we forget the era when FXAA vaseline smearing was considered valid antialiasing for 720p games) and sue me, but I do like good visuals.
I do believe we are in a very weird quagmire of a transitional period, where we're using what is effectively now a VFX suite to make games that aren't meant to run in real time on most of the hardware being used to run them, and that are simultaneously too expensive, too big and aimed at waaay too many hardware configs. It's a mess out there and it'll continue to be a mess, because the days of a 1080 Ti being a "set to Ultra and forget it" deal were officially over the moment we decided we were going to sell people 4K monitors running at 240Hz and also games built for real-time ray tracing.
It's not the only time GPUs and software have been in a weird place with each other (hey, remember when every GPU had its own incompatible graphics API? I do), but it's up there.
Well, yeah, but again, that's not new, and it's something every game has to do, better or worse.
I'm aging myself here, but if you must know, the time that stands out most to me in the "graphics over gameplay" debate is actually... the 8-bit micros, weirdly.
There was a time when people mostly just looked at how much of the screen a character filled, or whether the backgrounds scrolled, and bought on that alone, while a subset of the userbase and press was pleading with them to pay at least some consideration to whether the game... you know, could be played at all.
But hey, I'll split the difference. Instead of SMB 1, which was a launch game and literally wasn't running on the same hardware (because mappers), we can do Mario 3.
Or, hear me out, let's not do a remaster at all for current gen leaps. Here's a PS4 vs PS5 sequel one.
It doesn't work as well, though, since leaning on the absolutely ridiculous shift from 2D to 3D, which has happened once and only once in all of gaming history, is a bit of a cheat anyway.
Oh, and for the record, and I can't believe I'm saying this only now, LttP looks a LOT better than OoT. Not even close.
Nothing could have gone harder for visuals than those games did, so by default that's what was happening. They were pushing visuals as hard as they would go with the tech they had.
The big change isn't that they balanced visuals and gameplay. If anything the big change is that visuals were capped by performance rather than budget (well, short of offline CG cutscenes and VO, I suppose).
If anything, they were pushing visuals harder than we do now. There is no way you'd see a pixel-art deck-building game on GOTY lists in 2005; it was all AAA as far as the eye could see. We pay less attention to technological escalation now, by some margin.
Absolutely they went for realism. That was the peak of graphics tech in 2005, are you kidding me? I gawked at the fur in Shadow of the Colossus; GTA was insane for detail and size for an open world at the time; Resi 4 was one of the best-looking games that gen, and when the 360 came out later that year it absolutely was the "last gen still looked good" game people pointed at.
I only went for that year because I wanted the round number, but before that Silent Hill 2 came out in 2001 and that was such a ridiculous step up in lighting tech I didn't believe it was real time when the first screenshots came out. It still looks great, it still plays... well, like Silent Hill, and it's still a fantastic game I can get back into, even with the modern remake in place.
This isn't a zero sum game. You don't trade gameplay or artistry for rendering features or photorealism. Those happen in parallel.
Really? Cause I don't know, I can play Shadow of the Colossus, Resident Evil 4, Metal Gear Solid 3, Ninja Gaiden Black, God of War, Burnout Revenge and GTA San Andreas just fine.
And yes, those are all 20 years ago. You are now dead and I made it happen.
As a side note, man, 2005 was a YEAR in gaming. That list gives 1998 a run for its money.
Not everybody around me. Nobody around me has ever mentioned a touchpad outside of threads about the touchpad. It's not a thing.
Everybody on Steam seems to be playing controller games on sticks, though. At least from the data Steam shares. Which matches reviews at the time (and later, when people had to pay attention to them on the Deck), the way games on Deck are put together by devs, the low sales of the Controller, the changes to the Vive controller, the lack of other hardware manufacturers doing dual touchpads and pretty much every other piece of info at scale we have beyond anecdote.
Man, online chatter sucks and does bad things to people. I think I'm done with this. Have fun with the dual pads Valve bestowed upon you. I don't need you to change your mind about their popularity, but man, there's going to be a Smithers moment for you at some point on something else and it sure would be good if you thought back to this.
Hah. Man, you were fuming about that one for a while, huh?
I said at the very tippy top of this thread that
"I know some people swear by them, I just don't think they're worth the space they take up as a pointer device"
and later
"People who like these do tend to be loud and proud about it, so they stand out more"
It's no surprise that there are people swearing by them loudly and proudly. In fact, there are more people doing that than the opposite, because most people just... you know, ignore the whole thing altogether and haven't thought about the Steam Controller in a decade.
The reason I was pulling quotes for you is that you denied that the touchpad reception on the OG Controller was mixed and that Valve was presenting them as a stick substitute, and that denial was demonstrably incorrect.
It means official full controller support with the default config. There are few games that provide official controller support over Steam Input in the first place, even fewer that have any touchpad custom inputs by default, and I'm not even sure if there are any that are Steam Verified. At a glance it's... what, just Rimworld again? Maybe some first-party stuff left over from the Steam Machines fiasco? Sims is only Playable. Civ VII, which you called out earlier, I suspect incorrectly, has official all-stick support, what with having launched on consoles day and date. I haven't checked it because I haven't bought it yet, so if I'm wrong let me know. Civ VI doesn't have default controller support, and it's only rated Playable as well. In fact, if you have a list of Verified games with touchpad default support, I'd love to see it. I'm genuinely curious.
Look, you get to live in this very specific alternate reality where the only difference is people love dual touchpads as a main input system. That's fine, you're not hurting anybody. I get hung up on it because blatant misrepresentations on social media are fairly upsetting these days and because I'm still not over having had to use the dumb touchpads on the Vive for a couple of years back there.
But man, is it exhausting to watch it act as a proxy of some much more important crap in real time.
Hm. That is an interesting read; I don't know if I see it. For fast iGPUs it's been all AMD for a while. Nvidia has been threatening to build a faster one, but it seems they may be targeting integrated, fully branded devices for AI instead of gaming or general use.
Intel has started competing there, but so far it's not been a popular pick with handheld manufacturers.
My understanding is that this generation has more powerful parts, but they're expensive to implement and may not be as good at low wattages; I guess we'll have to wait a while to know for sure. Either way, I don't see a reason why there would be downward pressure on prices. Less upward pressure than Nvidia just throwing a number at the 5080 and 5090 presumably selected from a bingo card, for sure, but still not necessarily down in price-to-performance.
Not most users, not even as Valve intended (on the Deck, at least).
They literally reserved the green "Verified" badge for games with full controller support, and those are the only ones eligible for the "Great on Deck" tab. Mouse-and-keyboard games get the yellow "Playable" tag instead, plus a warning on boot.
See, that's the sad part about actually looking things up. It takes time, people get to nitpick it to death and then some guys will just... you know, say stuff.