This is one case where I almost posted a summary from another source that linked to it, but there seems to be a (loose?) norm here around posting the original sources.
That only works when hardware companies have access to the game before release to test, and it doesn't sound like that was the case with Starfield. Intel, to their credit, got a patch out before most people even had access.
Many developers probably aren't even testing on Arc because it's new, and that only exacerbates the problem. If they don't test, they can't alert Intel to the issue.
My point is that these aren't standards like most people think of them. They're often poorly documented, have lots of ambiguity, and many times contain bugs/mistakes themselves. For all we know, Intel fully supports DX12, and these are all just weird edge cases.
I don't really fault Intel for this any more than, say, Bethesda. Both AMD and NVidia still ship a number of game-specific fixes for crashes, glitches, and performance issues in their drivers, for new and old games alike. They've been doing this for decades now and still have to release driver updates to fix game-specific issues.
The fact that the API is a "standard" doesn't mean everything is clear-cut. Developers often use these APIs in unexpected ways that were never accounted for, or even mentioned, in the documentation.
A lot of the article is focused on how games journalism has adapted to meet the current business environment (read: advertising). Gaming is certainly not alone in that. Newspapers were hit a long time ago, and we've seen the same issues there too.
I'm curious -- what value do most people get from games journalism? Would people really miss it if pcgamer, kotaku, or eurogamer just disappeared?
I'd really love to see a detailed balance sheet for some of these orgs to see what the actual operating costs are and how much is going to exec salaries.
There was a recent game announcement that was AMD sponsored that has both (I think it was the Avatar game?). I think it's very likely many of these games are time or budget constrained, so when they're given money from AMD they implement that first, and add DLSS only if they have time or existing code.
This feels like the old console money that Sony & Microsoft would give, where developers focused some extra optimization or early engine design around one platform because of the extra funding. If I recall correctly, Sony gave a bunch of money to cross-platform games.
I get his general frustration with F2P and making bank on microtransactions, but I think the Larian story somewhat contradicts that, even though the road to BG3 was long and difficult. They've slowly been refining their work since the 90s, and you can see this reflected in the reviews their games got. Sure, BG3 at that scale was still a risk, but it's built on so much knowledge accumulated from the Divinity series that at least some of that risk seems mitigated.
Oh it definitely is. I mean, we've had many re-releases of the 2600, including multiple Flashback consoles.
AFAIK, only one was moddable to use old cartridges, but none shipped with that capability. Many came with 20+ games, though, and were significantly cheaper.