Obligatory dril tweet about how you never “Gotta hand it to them”
But yeah. It is infuriating how often people just spew nonsense like “it is Unreal so it runs like shit”, just like it was “It is Unity so it runs like shit” before that, and so forth. I very much do not think people need to “code (their) own engine” to understand things but… it would be a good idea to do some basic research on the topic.
I can’t speak for blands 4 specifically. But with a lot of these games? People don’t realize that the game is very much being optimized with upscaling and even framegen in mind. You can say you hate that if you want to. But these studios aren’t targeting 120 FPS at 4k. They are targeting 120 FPS at 2k, or even just 60 FPS at 2k. In large part because… there still isn’t a lot of value in actually targeting 4k on any console or even most PCs. So it gives them more or less a “single” target. And some techniques don’t scale anywhere near as well (also the jump from 2k to 4k is not linear).
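To put a number on that “2k to 4k is not linear” point, here is a quick back-of-the-envelope pixel count (using the common 16:9 resolutions, with “2k” used loosely for 1440p the way the thread does):

```python
# Pixel counts for common 16:9 resolutions, to show why "just render at 4k"
# is a much bigger jump than the names suggest.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels ({count / pixels['1080p']:.2f}x 1080p)")
# 4k pushes 2.25x the pixels of 1440p and a full 4x the pixels of 1080p.
```

So 4k means rasterizing and shading well over twice as many pixels as 1440p before any of the techniques that scale worse than linearly (shadows, post-processing, ray-traced effects) even enter the picture.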
It is just that people grew up in the golden age where everything was targeting a PS4 and kind of has been for almost a decade at this point. So the idea of futzing with your graphics settings and checking the different AA (now upscaling) models is kind of anathema to a lot of users. And consoles STILL try to hide all of that.
You can expect whatever you want. That isn’t what is being sold. That’s why so much of the nVidia bullshit insists that reviewers and partners report upscaling and framegen as the “normal” results (and why outlets like Gamers Nexus are on their shitlist).
And the “norm for over 20 years” was that games targeted a specific range of hardware “power”. That is why so much work was put into engines to support higher res models for the first person view (of which there would only be one per scene) alongside the lower res models every other character used. And scaling detail based on distance and so forth. But it also meant that we didn’t have the ridiculously high res hero models that might only exist for cutscenes, because no computer of the era would be able to run that performantly in real time.
Personally? I would LOVE if more effort were put in for scalability (which Unreal is ironically really good at because it targets film). But that gets back into what the target audience is and if it is worth putting the effort in for “Mega Ultra Super Epic” settings when nobody is going to use them. And… considering that one of the biggest issues is people refuse to put more research in than “I set it to High”… yeah.
But also… that isn’t NOT what we have now. We have the games that are designed to look good at 4k asset wise. And, in a couple years (maybe longer, maybe never) we will have the oomph to actually run those natively at 4k/120 and it will be BEAUTIFUL (and have less weird fur textures appearing out of nowhere or weird cape physics when the model barfs). Which isn’t TOO different than building a new PC, playing Dwarf Fortress because it is funny, and then booting up a Total War and losing your mind for a bit.
That’s what was being sold for over three decades. If I give 1000 dollars for a graphics card, I can run anything at max settings with no compromises. Nvidia and videogame devs enshittified this shit so much that you now require stuff that should be reserved for LOW END graphics cards to run the game properly. Fuck that.
Hey buddy. I got a 2070 that I’ll give you a real bargain on. Only 900 USD plus s&h. It’ll play Crysis at max settings with absolutely no compromises. Hit me up.
But, in all seriousness: This is why (minimally biased) reviews are a thing. Outlets like Gamers Nexus or Hardware Unboxed do a spectacular job of actually benchmarking against common games of the current era (and a few oldies) and comparing them against a large database of other cards. Don’t just assume that you can give Jensen a grand and get what you want. Make informed decisions.
Also I am not actually sure if 900 is ACTUALLY a steal for a 2070 at this point and didn’t care enough to check for my joke. Similarly, I want to say a grand is closer to a 4070 than a 4090 at this point? Shit is REAL fucked, which is even more reason why you should actually look at what you are buying rather than “30 years ago if I gave someone a thousand bucks they would suck me off AND let me play every game I could ever want at a full 640*480 and 30 FPS!”
Also I want to say Crysis is actually more CPU bound these days since it was the height of the Pentiums and single core performance is nowhere near the priority it used to be. But too lazy to check.
I can’t decipher whatever word salad you’re trying to say here and I don’t care. There is no excuse for a top of the line GPU from LAST YEAR to not get at least 80 FPS on High. Most gamers are poor and have stuff like 3060s, 4060s, and RX 6600s. I’m not a rich fucker who can burn 2000 dollars every year on a 600 watt GPU to get a pitiful 50 FPS.
I’m not sure if they can anymore. Civ 7 broke me on how it shoe-horned in systems to make money that ultimately broke what was a tried and tested formula.
I remember when 3 came out, I still had a PS4. The game was literally unplayable, like literally crashing every 2 minutes. Couldn’t get past the first 10 minutes of the game. I’m not surprised that even on PS5 the new one is crap. It’s like they gave up after part 2.
I don’t understand how Sony would allow a game on their platform that doesn’t actually run. Like surely they require devs to provide some kind of advance copy for them to review?
I’ve read reports that people can’t get more than 30fps on low settings on 4000 series cards. I’m definitely not one to expect sweet 120fps on ultra on launch day, but a 4000 card not even getting low settings? They failed. Hard.
I mean, Cuphead is one of the more visually impressive games… ever. It has an art style and it works it to the fullest extent. And Silksong is also quite gorgeous.
This comes up every other year or so it seems. People think 2D means “low effort” and get angry that so many fighting games moved on to 3D. But the reality is that actually making sprites is a VERY labor intensive process that often requires a deep understanding of the entire rendering pipeline (including the hardware it is displayed on). At this point, “most” online people are aware that many of the NES/SNES sprites were specifically made with CRT “blurring” in mind but it goes way beyond that. So that is why franchises like Street Fighter just have “generic” 3d models.
And ArcSys more or less made their entire model (wait for it) simple-ish 3d models with cel shading and very specific lighting systems to appear 2D even though they aren’t. Which is why stuff like the super attacks always look so impressive as you do the zoom and spin around.
I can’t speak for blands 4 since 3 (and the pre-sequel…) were so aggressively obnoxious that I just replay 2 every 3 or 4 years. But just look at this thread where you have weirdos saying that UE5 games look like they are from 2016. People are deeply stupid and games need to pop and sizzle to stand out. And while I don’t know (or care) if blands 4 succeeded… just look at ArcSys for how you can use modern engines to make cel-shaded 3d models look AMAZING.
It is very specifically for actually 2D games, but check out some of Cobra Code’s videos on youtube. They have put a LOT of work into how to use UE5 to make sprite based 2D games look GOOD and it is actually fascinating. And that is still sidestepping the initial sprite work for the most part.
And as another example of why sprites are actually a ridiculous amount of work to get right. Mina The Hollower (?) is the latest game from the Shovel Knight devs. And… most of us who tried the demo on a high resolution display felt REALLY weird because of the way the sprites and animations looked upscaled (I saw a breakdown of why. I did not understand it). The devs are putting in the work to fix that ahead of launch but it really speaks to the kinds of problems that come up when you go from testing on a Steam Deck or in a debug window to stretched across a 1440p display at 120-ish Hz.
I’m definitely not one to expect sweet 120fps on ultra on launch day
I fucking am. I didn’t pay $1700 for a graphics card to have potato graphics. I’m glad there’s been plenty of bad press on this, not that I would have bought it this close to launch anyway.
No, I disagree with that. It’s always been perfectly normal to have ultra graphics a bit out of reach so that the game will look great on future graphics cards, and 120fps is a ridiculously high number that should never be expected even with top of the line graphics cards for a brand new release. (Assuming 2 or 4k)
However, a 1 generation out of date graphics card should be able to easily play most things on High settings at a decent framerate (aiming for 60) on 4k settings, which Borderlands failed at horribly. Medium to low settings on a 4000 series card sounds like a gutpunch to me.
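On the 60-vs-120 FPS expectations above, the frame-time arithmetic is worth spelling out, because "double the FPS" actually means halving the time budget for the entire frame:

```python
# Frame-time budgets at common FPS targets. The jump from 60 to 120 FPS
# halves the time available for ALL per-frame work (game logic, draw
# submission, GPU rendering), which is why it is such a big ask.
for fps in (30, 60, 120):
    budget_ms = 1000 / fps
    print(f"{fps} FPS -> {budget_ms:.2f} ms per frame")
# 30 FPS -> 33.33 ms per frame
# 60 FPS -> 16.67 ms per frame
# 120 FPS -> 8.33 ms per frame
```

At 120 FPS a single 2 ms spike that would be invisible inside a 16.7 ms budget blows the whole frame, so a stable 120 is far harder to ship than "60 but faster" makes it sound.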
Have to partly disagree. The loading screens, whilst usually brief, were very annoying.
Elite Dangerous, No Man’s Sky or Space Engineers accomplish similar tasks without needing those, or having to limit the accessible planet area.
I understand that those were limitations of the engine, but it could just be speculation.
In short, there was a legal dispute between Pitchford and a former counsel for Gearbox. As part of a pattern of suit-countersuit, the former employee alleged that Pitchford had left a USB stick at a local restaurant which contained proprietary company info as well as underage pornography. Pitchford confirmed all of the above, with the notable exception of the “underage” part. Given nothing came of it, and he was remarkably candid about what type of porn was actually on the USB, I’m inclined to believe him.
insider-gaming.com