John Carmack: We couldn’t figure out how to perfect virtual texture streaming. I’d walk backwards and turn and the world would reload textures. It just comes down to a problem with the implementation.
Randy: The technique is fine, you just need to play differently.
I see a lot of folks trying to blame this on Unreal, but that makes no sense in light of other Unreal games running smoothly at comparable visual fidelity, and Gearbox having worked with Unreal for literally forever.
This is all on Gearbox, and their CEO/devs throwing gas on the fire via Twitter.
It’s honestly insane. There is clearly internal dysfunction at Gearbox, yet their CEO and leads are allowed to damage their brand to their hearts’ content with… no repercussions? WTF is Embracer (their parent) even doing to miss that?
UE5 by default uses a lot of flashy tech that is supposed to improve performance, but a lot of it only does so in scenarios that are already extremely unoptimized. Using more traditional methods tends to achieve the same fidelity at a fraction of the performance cost. But there’s no time for optimization, and these fancy options “just work”, so there ya go.
The end result is a poorly running blurry mess of a game, but at least it’s out on schedule I guess.
I looked up some videos from YouTube sleuths on why so many UE5 games suck. For any studio previously using UE3 or 4, they had to relearn/recreate nearly their entire workflow. UE5 very much changed damn near everything. On top of that, 5 has all this tech that everyone assumes is a miracle that works in every scenario, when in reality it’s still software with very real limitations and best-use cases that studios ignore. Larger studios “should” be able to trial-and-error their way through while burning $ to figure it out, but usually management doesn’t give them enough time. Smaller studios can’t afford many months of downtime learning to re-adapt everything. It’s just so damn complex that very few have had the time and $ to trial-and-error their way to its limitations and learn to work within them.
It SHOULD get better and better as time goes on, though. The tech pieces in 5 keep getting improvements, and theoretically people should eventually start to adapt to it correctly, and the knowledge should spread as devs move to different studios for new work.
What are you on about, there have always been crappy game releases. There’s a reason “can it run Crysis” became a meme. That game is a lot older than 10 years and it was an unoptimized mess when it was released.
To be fair, Crytek said it wasn’t unoptimized, but graphically overtuned on purpose, so that even years after release new hardware could finally play the game to its full potential and keep it a relevant graphical benchmark. It was intentional that on launch only a fraction of gaming PCs would come even close to playing it on max settings at high fps.
Whether that was a stupid idea in hindsight is another matter 👀.
I heard nothing in CryEngine 2 was multithreaded because they bet on processors getting better single-core performance instead of more cores (more cores being what actually happened). Not sure about the GPU load though.
I feel like there have always been buggy releases. But I do feel they have gotten more frequent and have become the actual norm, with people being impressed when AAA releases don’t have deal-breaking bugs at launch.
Yeah, in the 15 minutes it takes to see if changing the setting caused any performance issues, I can easily just boot up Maze Mice and get through roughly 2 rounds with zero complications whatsoever. No need to change any settings from default or wait absurd lengths of time just to play a game without stuttering and other performance issues.
Also, your game is piss poorly optimized if you can’t get shader compiling working properly without tanking your experience in game.
I’ve played a handful of games that precompile shaders at boot up without it taking 15 minutes, and they try to hide at least some of it behind the splash screens and such. This is absurd. If pre-compilation or caching is needed, just fucking do it.
On top of what you said: any company with the funds of Gearbox has no excuse for not being able to make it happen during runtime without tanking FPS.
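(For anyone curious what “compile during runtime without tanking FPS” can mean in practice, here’s a minimal sketch: the expensive compile runs on a worker thread while the main loop keeps drawing with an already-compiled fallback, and the result is swapped in when ready. `compile_fancy_shader()` and `draw_with()` are hypothetical stand-ins, not engine or Gearbox code.)

```cpp
#include <chrono>
#include <future>
#include <iostream>
#include <string>
#include <thread>

using Shader = std::string;  // stand-in for a real compiled pipeline object

// Hypothetical expensive compile step, kept off the main thread.
Shader compile_fancy_shader() {
    std::this_thread::sleep_for(std::chrono::seconds(1));
    return "fancy";
}

void draw_with(const Shader& s) { std::cout << "frame drawn with " << s << " shader\n"; }

int main() {
    Shader fallback = "simple";  // already compiled: uglier, but never stalls a frame
    auto pending = std::async(std::launch::async, compile_fancy_shader);
    Shader current = fallback;

    for (int frame = 0; frame < 120; ++frame) {  // main loop never blocks on the compile
        if (current == fallback &&
            pending.wait_for(std::chrono::seconds(0)) == std::future_status::ready) {
            current = pending.get();             // swap in the real shader once it's done
        }
        draw_with(current);
        std::this_thread::sleep_for(std::chrono::milliseconds(16));  // ~60 fps tick
    }
}
```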
When you launch a game, and… your drivers have changed, or there’s been a substantial update to the game… it just tells you it’s compiling shaders before properly launching the game.
15 full minutes is pretty terrible though.
I think my worst ever is around 5 to 10, and that is when I am intentionally fucking about with mods and different versions of Proton and changing up Proton/Wine prefixes with new attempts at finding a working Windows component/requirement for some nonsense that by all rights should not work at all, lol.
I don’t understand exactly why, but they’re not storing the compiled shaders, so you constantly have to redo it. It also shouldn’t be taking that long anyway. It takes the new Battlefield maybe 30 seconds to do this, so something weird is going on in the background.
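(For reference, “storing the computed shaders” can be as simple as a disk-backed cache keyed by a shader hash: a cache hit skips the expensive compile entirely. This is a minimal sketch, and `compile_shader()` is a hypothetical stand-in for the real driver/engine compile step, not anything UE5 or Gearbox actually does.)

```cpp
#include <filesystem>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

namespace fs = std::filesystem;

// Hypothetical stand-in for the expensive compile step (driver/engine side).
std::vector<char> compile_shader(const std::string& source) {
    return {source.begin(), source.end()};  // pretend this took seconds
}

// Returns the compiled blob, compiling only if it isn't already cached on disk.
std::vector<char> get_or_compile(const fs::path& cache_dir,
                                 const std::string& key,   // e.g. hash of source + driver
                                 const std::string& source) {
    fs::create_directories(cache_dir);
    fs::path cached = cache_dir / (key + ".bin");

    if (fs::exists(cached)) {                               // cache hit: no recompile
        std::ifstream in(cached, std::ios::binary);
        return {std::istreambuf_iterator<char>(in), std::istreambuf_iterator<char>()};
    }

    std::vector<char> blob = compile_shader(source);        // cache miss: compile once
    std::ofstream out(cached, std::ios::binary);
    out.write(blob.data(), static_cast<std::streamsize>(blob.size()));
    return blob;                                            // next boot hits the cache
}

int main() {
    get_or_compile("shader_cache", "abc123", "fake shader source");
}
```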
Here, pick any combination of these 3 options; these have been the explanations for basically every ass-tier AAA game in the past couple of years:
UE5 is very flashy, ‘developer friendly’ garbage that explodes in complexity when you try to do any serious modification/customization of the engine or render pipeline
None of these AAA devs that are supposed to be experts in UE5 actually are
Management is beyond incompetent and tells devs to do things that are actively bad/harmful/destructive/broken.
I really liked the first 2 games in the borderlands series, but TPS and 3 are just… Not great. I really want to try the Tiny Tina game, but I’ve heard that it as well is not great.
Found with a single search. Short answer: they can; long answer: it’s more complicated. In any case, runtime compilation should never be a thing on console.
Here’s a dumb tech question: this happens for so many games, shader compilation holding up the process. But after an initial compile, it seems like this is written to a file and doesn’t happen on every boot. So can they not simply include pre-compiled shaders?
They run differently based on the hardware you have. They can and might precompile for consoles, but there’s no way for them to know what everyone’s different PC setup is until it’s installed.
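(A minimal sketch of why a single precompiled blob can’t ship on PC: any sane cache key includes the GPU and driver version, which only a console has fixed and known in advance. The names and version strings below are made-up examples.)

```cpp
#include <functional>
#include <iostream>
#include <string>

// A compiled blob is only valid for one GPU + driver combination, so the
// cache key has to include both; on a console those are fixed ahead of time,
// on PC they aren't known until the game is installed and run.
std::string cache_key(const std::string& gpu,
                      const std::string& driver_version,
                      const std::string& shader_hash) {
    return std::to_string(std::hash<std::string>{}(
        gpu + "|" + driver_version + "|" + shader_hash));
}

int main() {
    // Same shader, two different (made-up) machines -> different keys -> recompile.
    std::cout << cache_key("GPU-A", "driver 1.2.3", "shader-abc") << "\n";
    std::cout << cache_key("GPU-B", "driver 4.5.6", "shader-abc") << "\n";
}
```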
Fun fact: Steam (at least on Linux) shares caches between users with the same hardware.
Easier to happen on the Deck since it’s the same hardware for all, but even on my desktop PC I’ve seen it downloading and uploading shaders often.
If all the shaders are compiling in the background and it’s stutter free (minus traversal stutters, I guess) after that, I actually find that reasonable. If I can get rid of stutters by idling in the game for 15m while doing something else, then sure.
But I have a hunch that it’s still not a smooth ride after.
At least this is the most reasonable thing I’ve heard from GB since release lol
My experience with it has been solid, but I do run high end hardware that is muscling past a lot of stuff.
I think as usual there is some confusion between compilation stutters and the game just being very heavy for the way it looks (which it is). People online seem to be scattershot about it.
And then there's the people talking about it who don't care but like to be mad online, which is also a thing.
And then there's the weird dev that keeps mouthing off for no reason in ways that can't possibly help.
Lots of things on this one.
Still I don't think you're expected to idle for fifteen minutes. That's the point of the background compilation. You can still play more or less fine. Particularly on first boot, the first fifteen minutes of this should be a bunch of cutscenes anyway, and those are locked at 30 fps (which I don't like at all, and so many games do that now for some reason).
Yeah sure but why didn’t they put a “Shader Compilation” loading screen then?
Many games have one that tells you what’s happening and gives you an option to skip, which is better than having to find out via a tweet…