Like many UE games over the years, the devs didn’t properly tune Unreal itself for their use case, and there were already several ini tweaks up on the Nexus to remedy this on launch day.
Went from an average of 27 fps in exterior cells to a solid 60 on an unsupported GPU, just by using one of these ini tweaks.
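For anyone curious, the kind of thing those mods drop into Engine.ini looks roughly like this. I’m going from memory, so treat the exact cvars and values as an approximation and check the actual mod page; these are generic UE5 scalability settings, not anything Oblivion-specific:

```ini
[SystemSettings]
; Larger texture streaming pool so the streamer stops thrashing in exterior cells
r.Streaming.PoolSize=2048
; Fall back from Lumen to cheaper GI and screen-space reflections
r.DynamicGlobalIlluminationMethod=0
r.ReflectionMethod=2
; Virtual shadow maps are expensive on older GPUs
r.Shadow.Virtual.Enable=0
```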
This is such a common problem with games on any iteration of Unreal Engine, and has been for over two decades. Since it comes up so often, I wonder if the documentation just sucks.
Yeah, I wonder if that’s perhaps the result of basically stapling the old game engine onto UE5 in order to preserve the core gameplay. Back when Oblivion was first released, multicore CPUs were incredibly rare, so it’s likely the engine was not built to take advantage of them. But ever since then, most of the improvements in CPUs have come in the form of adding more cores rather than increasing clock speed, and it’s by no means trivial to convert single-threaded code into multi-threaded. Most likely it would require a complete rewrite, which they’d probably want to avoid in order not to introduce more bugs.
But of course, it could also just be UE5’s fault, since even a single core on a modern CPU should not be slower than a 2006 model.
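Either way, here’s a toy C++ sketch (nothing to do with Oblivion’s or UE5’s actual code) of why retrofitting threads onto a single-threaded loop is painful: every piece of state the old loop touched freely now needs explicit synchronization, or the results become nondeterministic.

```cpp
// Toy example: shared world state that a single-threaded engine owned outright
// now needs a lock (or a redesign) before multiple threads can update it.
#include <mutex>
#include <thread>
#include <vector>

struct WorldState {
    std::mutex mtx;        // the single-threaded version never needed this
    long actorUpdates = 0;

    void tickActor() {
        std::lock_guard<std::mutex> lock(mtx); // every shared write must now be guarded
        ++actorUpdates;
    }
};

int main() {
    WorldState world;
    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i)
        workers.emplace_back([&world] {
            for (int t = 0; t < 100000; ++t) world.tickActor();
        });
    for (auto& w : workers) w.join();
    // Remove the lock and this count (and anything depending on it) becomes a data race.
    return world.actorUpdates == 400000 ? 0 : 1;
}
```

And that’s one counter; a real game loop has thousands of interdependent pieces of state, which is why it usually ends up being a rewrite rather than a patch.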
For day one, performance is actually fine. I have much bigger gripes than fps dips in the open zones. Like levelling, ffs. I have 100 Strength, Willpower, and Blade, but I’m doing less damage to mobs now than I was at the beginning of the game. Or levelled loot drops and quests.
The key to Oblivion is to pick tag skills that you won’t use. If your build is a stealth archer, pick Block, Blunt, and Restoration. You only level when your tagged skills level, so your Archery, Illusion, and Sneak will be 100 while your character is still sub level 10, so you’ll basically be a god.
Sort of. In the new levelling system, minor skills also contribute to your character level, just to a lesser degree. IIRC it’s something like 10 major skill-ups or 20 minor skill-ups (or some combination thereof) to gain a character level.
The level system doesn’t work that way anymore. It no longer matters which skills you levelled to get there; every time you gain a character level you get 12 points (called “virtues”) to spread across any attributes. Luck, however, costs 4 virtues per point, while the others are 1:1 and you can add up to 5 to each at a time.
I can level up entirely through Agility-linked skills but then put my stat points into Strength and Intelligence instead of Agility.
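If it helps, here’s a rough sketch of the virtue budget as described above (the numbers are from memory, so double-check in game): 12 virtues per level-up, +1 Luck costs 4, everything else is 1:1 and capped at +5.

```cpp
// Hypothetical sketch of the remaster's virtue rules as I understand them;
// the constants below are from memory, not datamined values.
#include <array>
#include <iostream>

constexpr int kVirtuesPerLevel = 12;
constexpr int kLuckCostPerPoint = 4;
constexpr int kMaxPerAttribute = 5;

bool isValidSpend(int luckPlus, const std::array<int, 7>& otherPlus) {
    if (luckPlus < 0 || luckPlus > kMaxPerAttribute) return false;
    int spent = luckPlus * kLuckCostPerPoint;     // Luck is 4 virtues per point
    for (int inc : otherPlus) {
        if (inc < 0 || inc > kMaxPerAttribute) return false;
        spent += inc;                             // other attributes are 1:1
    }
    return spent <= kVirtuesPerLevel;
}

int main() {
    // Level off Agility-linked skills, then dump the points into Strength/Intelligence:
    std::cout << isValidSpend(0, {5, 5, 2, 0, 0, 0, 0}) << "\n"; // +5 STR, +5 INT, +2 END: valid
    std::cout << isValidSpend(3, {5, 0, 0, 0, 0, 0, 0}) << "\n"; // 3*4 + 5 = 17 virtues: invalid
}
```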
The real issue is the level scaling on enemies, which is still the worst of any Elder Scrolls game because they didn’t change anything about it from the OG. So once you’re level 50, everything has the best weapons and armor on it.
If somebody didn’t realize it was almost certainly going to run poorly the second it was revealed to use UE5, I wouldn’t even know what to say to them.
Fortnite, Wukong, Tekken 8, Layers of Fear, Firmament, Everspace 2, Dark and Darker, Abiotic Factor, STALKER 2, Jusant, Frostpunk 2, Satisfactory, Expedition 33, Inzoi, Immortals of Aveum, Starship Troopers: Extermination, Ninja Gaiden 2 Black, Lords of the Fallen, Robocop, Myst (UE5 remake), Riven (UE5 remake), Palworld, Remnant 2, Hellblade 2, Subnautica 2… and the list keeps growing.
When a big studio skips QA and releases a broken game, it’s not the engine’s fault, it’s the studio’s fault. As long as consumers tolerate broken games that can maybe be fixed later (if we’re lucky), companies will keep releasing broken, unfinished, unpolished, untested games. Blaming UE5 is like blaming an author’s word processor for a poorly written novel.
Idk, I think the only one on that list that I’ve played and that ran well enough to call okay was Tekken, and I’m assuming that’s mostly because it’s a fighting game.
No, I’m saying that it being a fighting game means it’s much easier to optimize, because you have a fixed camera angle and few characters on screen.
It’s because there’s a lot more optimization you can do on a fighting game vs a big open world; it just doesn’t have to render as much, comparatively.
Fighting games would run well on a fucking smart fridge. They’re by far the least performance-hungry game genre. There is no live loading of assets, the background scenery is 100% static, and there are usually just two characters on screen at any given moment. It would take actual effort to fuck up the performance of something so simple.
Well, that’s a pretty shit list. You have games on there that aren’t using UE5 (Layers of Fear 2), that are known to have poor performance (STALKER 2), that just released into early access (Inzoi), and that haven’t even released into early access (Subnautica 2).
I’d throw half the list out the window, and probably more, because the other half is mostly games I don’t know well enough to evaluate their performance.
I didn’t know the original game was remade. I assumed you meant Layers of Fear 2, because the original Layers of Fear wasn’t even on Unreal Engine and Layers of Fear 2 is on UE4. Nothing I said was explicitly wrong; it was wrong in context only because you weren’t precise with what you were saying.
And how nice of you to pick out the one thing I was wrong on while completely ignoring all the other examples. For instance, how the fuck can you put Subnautica 2 on that list when it’s not even in early access?
I’m not blaming UE5, but I’m capable of pattern recognition. There’s a pattern of developers not fixing UE5 issues and releasing games with them still present. The fault lies with both game developers and UE developers.
UE is the most popular game engine, so it’s used on the most projects and has the most visibility. No matter which engine you build a game with, there are many factors to keep in mind for performance, compatibility, and stability; the engine doesn’t handle that for you.
One problem is that big studios build games for consoles first, since it’s easiest to build for predictable systems. PC then gets ignored, minimally tested, and patched up after the fact. Another is “Crysis syndrome”, where developers push for the best graphics they can manage, performance, compatibility, and stability be damned; if it certifies for the target consoles, that is all that matters. There is also the factor of people being unreasonable about their hardware’s capabilities, expecting that everything should always run maxed out forever… and developers providing options that push the cutting edge of modern (or worse, hypothetical future) hardware compounds the problem. But none of these things has anything to do with the engine; they’re about what developers themselves build on top of it.
A lot of the responses to me so far have been “that’s stupid because” and then everything after “because” is related to individual game development, NOT the engine. There is nothing wrong with UE, but there are lots of things wrong with game/software development in general that really should be addressed.
I’m having relatively good performance on an RX 6600 on Linux, but after a while there’s some sort of GPU memory leak (would be my guess) where fps halves until the game is restarted.