Elden Ring really got this right: it plays the cool cutscene once, and every attempt after goes straight into the fight. Could’ve done with closer sites/shrines in a few of the fights, though.
The game is rendered at a lower resolution, which saves a lot of resources.
Then dedicated AI cores, or even special AI scaler chips, are used to upscale the image back to the requested resolution.
I get that much. Or at least, I get that’s the intention.
This is a fixed cost and can be done with little power, since the components are purpose-built for this task.
This is the part I struggle to believe/understand. I’m roughly aware of how resource intensive upscaling is on locally hosted models. The tech/resources needed to do that to 4k+ in real time (120+ fps) seems at least as expensive as just rendering it that way in the first place, if not more so. Are these “scaler chips” really that much more advanced/efficient?
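For what it’s worth, the back-of-envelope pixel math behind the explanation can be sketched like this (the resolutions are assumed typical values, not from any specific game, and real costs depend heavily on the engine and the upscaler):

```python
# Rough pixel arithmetic for upscaled rendering (illustrative only; the
# resolutions below are assumed typical values, not from any specific game).

def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)   # pixels the game would shade natively
internal  = pixels(1920, 1080)   # a common internal render resolution

# Shading cost scales roughly with the number of rendered pixels, so an
# internal 1080p render does about 1/4 the shading work of native 4K.
print(native_4k / internal)      # 4.0
```

The key point is that the expensive, variable part (shading geometry, lighting, effects) shrinks with the pixel ratio, while the upscale pass is a roughly fixed workload per output frame, running on silicon specialized for exactly that kind of matrix math rather than on the general-purpose shader cores.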
Further questions aside, I appreciate the explanation. Thanks!
Eh, EA can certainly be a problem, but it’s also an incredibly useful resource for devs operating in good faith, opening up the field to talent that would otherwise be priced out of making a game at all. Personally, I’m okay ignoring money grabs if it means the barrier to entry for resource-starved talent is lowered.
The dev seems to have a good publisher that’s on their side, which is nice to see. I find it bizarre that this rebuttal comes in response to the CEO of Hinterland Studio, the devs of The Long Dark, which was in early access itself for ages. Dunno if they think they’re above it all now, but you’d think they would at least be sympathetic to devs facing that kind of shit. Probably just a CEO saying CEO shit. Hopefully the Manor Lords dev doesn’t let it get to them much, or at all.