Obligatory dril tweet about how you never “Gotta hand it to them”
But yeah. It is infuriating how often people just spew nonsense about “it is Unreal so it runs like shit”, just like it was back when it was “it is Unity so it runs like shit” and so forth. I very much do not think people need to “code (their) own engine” to understand things but… it would be a good idea to do some basic research on the topic.
I can’t speak for blands 4 specifically. But with a lot of these games? People don’t realize that the game is very much being optimized with upscaling and even framegen in mind. You can say you hate that if you want to. But these studios aren’t targeting 120 FPS at 4k. They are targeting 120 FPS at 2k or even 60 FPS at 2k. In large part because… there still isn’t a lot of value in actually targeting 4k on any console or even most PCs. So it gives them more or less a “single” target. And some techniques don’t scale anywhere near as well (also 2k to 4k is not linear).
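To put a rough number on the “not linear” bit (assuming “2k” here means 1440p, which is how most people use it), here is a quick back-of-envelope sketch:

```python
# Back-of-envelope pixel counts, assuming render cost scales roughly with pixel count.
resolutions = {
    "1080p": (1920, 1080),
    "1440p ('2k')": (2560, 1440),
    "2160p (4k)": (3840, 2160),
}

base_w, base_h = resolutions["1440p ('2k')"]
base_pixels = base_w * base_h

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base_pixels:.2f}x the 1440p load)")
```

That works out to 4k being 2.25x the pixels of 1440p, which is why a game tuned for 60 FPS at 1440p doesn’t just “lose a little” when run natively at 4k.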
It is just that people grew up in the golden age where everything was targeting a PS4 and kind of has been for almost a decade at this point. So the idea of futzing with your graphics settings and checking the different AA (now upscaling) models is kind of anathema to a lot of users. And consoles STILL try to hide all of that.
You can expect whatever you want. That isn’t what is being sold. That’s why so much of the nVidia bullshit insists that reviewers and partners report upscaling and framegen as the “normal” results (and why outlets like Gamers Nexus are on their shitlist).
And the “norm for over 20 years” was that games targeted a specific range of hardware “power”. That is why so much work was put into engines to support higher res models for the first person view (where there is only ever one per scene) and lower res models for all the other characters. And scaling detail based on distance and so forth. But it also meant that we didn’t have the ridiculously high res hero models that might only exist for cutscenes, because no computer of the era would be able to run that performantly in real time.
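For anyone who never poked at this stuff, a minimal sketch of what that distance/first-person detail scaling looks like in practice (the thresholds and mesh names are made up purely for illustration, not pulled from any particular engine):

```python
# Minimal sketch of distance-based LOD (level of detail) selection.
# Thresholds and mesh names are hypothetical, purely for illustration.
def pick_model_variant(distance_to_camera: float, is_first_person_view: bool) -> str:
    """Choose which model variant to draw for a character this frame."""
    if is_first_person_view:
        # Only one of these exists per scene, so it can afford far more polygons.
        return "viewmodel_high_res"
    if distance_to_camera < 10.0:
        return "lod0_full_detail"
    if distance_to_camera < 40.0:
        return "lod1_reduced"
    return "lod2_low_poly"

# e.g. a distant NPC gets the cheap mesh:
print(pick_model_variant(75.0, is_first_person_view=False))  # lod2_low_poly
```

The whole point was that the expensive assets only ever existed where the budget allowed, which is exactly the kind of scalability work being described here.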
Personally? I would LOVE if more effort were put in for scalability (which Unreal is ironically really good at because it targets film). But that gets back into what the target audience is and if it is worth putting the effort in for “Mega Ultra Super Epic” settings when nobody is going to use them. And… considering that one of the biggest issues is people refuse to put more research in than “I set it to High”… yeah.
But also… that isn’t NOT what we have now. We have the games that are designed to look good at 4k, asset-wise. And, in a couple years (maybe longer, maybe never) we will have the oomph to actually run those natively at 4k/120 and it will be BEAUTIFUL (and have fewer weird fur textures appearing out of nowhere or weird cape physics when the model barfs). Which isn’t TOO different than building a new PC, playing Dwarf Fortress because it is funny, and then booting up a Total War and losing your mind for a bit.
That’s what was being sold for over three decades. If I give 1000 dollars for a graphics card, I can run anything at max settings with no compromises. Nvidia and videogame devs enshittified this shit so much that you now require stuff that should be reserved for LOW END graphics cards just to run the game properly. Fuck that.
Hey buddy. I got a 2070 that I’ll give you a real bargain on. Only 900 USD plus s&h. It’ll play Crysis at max settings with absolutely no compromises. Hit me up.
But, in all seriousness: This is why (minimally biased) reviews are a thing. Outlets like Gamers Nexus or Hardware Unboxed do a spectacular job of actually benchmarking against common games of the current era (and a few oldies) and comparing them against a large database of other cards. Don’t just assume that you can give Jensen a grand and get what you want. Make informed decisions.
Also I am not actually sure if 900 is ACTUALLY a steal for a 2070 at this point and didn’t care enough to check for my joke. Similarly, I want to say a grand is closer to a 4070 than a 4090 at this point? Shit is REAL fucked, which is all the more reason you should actually look at what you are buying rather than “30 years ago if I gave someone a thousand bucks they would suck me off AND let me play every game I could ever want at a full 640*480 and 30 FPS!”
Also I want to say Crysis is actually more CPU bound these days since it was the height of the Pentiums and single core performance is nowhere near the priority it used to be. But too lazy to check.
I can’t decipher whatever word salad you’re trying to say here and I don’t care. There is no excuse for a top of the line GPU from LAST YEAR to not get at least 80 FPS on High. Most gamers are poor and have stuff like 3060s, 4060s, and RX 6600s. I’m not a rich fucker who can burn 2000 dollars every year on a 600 watt GPU to get a pitiful 50 FPS.
Just like how an NDA won’t protect a person or entity from legal repercussions for committing crimes (being “leaked” doesn’t mean it’ll hold in court), these highly illegal and predatory EULAs shouldn’t hold up either. There’s a reason why this bullshit only happens in America; elsewhere this EULA wouldn’t work unless it were modified to remove this blatant shit of a thing. It’s a shame we’re the few dumb enough to allow our own downfall while parading our aggressors. 🫠🫠🫠
This looks like a very solid remaster where they also fixed some glaring issues with the original, like the outrageous leveling (and therefore, to some degree, the scaling) system. I am not a huge Bethesda fan but they did many things right with this. I hope that this remaster will be available on GOG at some point as well.
I’d laugh if this weren’t so sad. Sounds like Sony is doing what they can to can any developer under their umbrella. Although that’s more of a Microsoft thing. Sad to see this happen so frequently.
Fun fact, The Suicide Squad (2021) was a box office flop, whereas Suicide Squad (2016), the only Academy Award-winning DCEU film, was a box office smash hit!