Arguably the issue isn't so much in the gaming landscape as the history behind the game.
His previous "game" was a crypto/NFT scam where they sold land (as NFTs) to people who were speculating they could make IRL money from renting out this "land" to plebs.
It was a complete failure. I played it for an hour on a free server (or something similar). The gameplay was complete shit: tedious and grindy (like mobile titles), a repetitive and uninspired experience with ugly graphics, and the multiplayer elements didn't really make much sense.
All the NFT buyers got destroyed while Molyneux collected tens of millions.
Allegedly that money was used to "invest" in Masters of Albion.
But the real kicker is that Masters of Albion seems to be based on that NFT scam game engine (which IMO is beyond saving).
I wouldn't be surprised if this is a managed exit from his previous crypto/NFT scam "game".
The only time I ever tried loot boxes was with TF2 and Dota2 back in the early to mid 2010s.
I very quickly realized that this wasn't what I was looking for in gaming. These days I mostly play indie games where monetisation is not an issue. I even gave up on Paradox because I am not okay with their DLC approach. I don't mind paying for DLC, but one only has to look at their release of Cities: Skylines 2 to see that they've really become the "EA of Europe".
If they have the resources (I am assuming Cloudpunk performed significantly above their initial projections), I have no issues with them taking longer (as long as they aren't stuck in development hell).
I would argue that this actually makes his opinion more relevant, as executive management is more likely to think like Ybarra than like someone who regrets the decline of Blizzard.
Playstation 2 supported Linux and it ran MIPS, which was the architecture used in SGI Indigo systems, perhaps most famously seen in Jurassic Park's "it's a UNIX system" scene.
It seems that even Playstation One has an (early) public Linux port.
I guess it depends on how you define PC. But older consoles also used CPUs that were found in desktops (and laptops), although they weren't PCs in the strict sense.
Sega Mega Drive (and other consoles) used the Motorola 68K that was also used in the Macintosh and Amiga.
Game Gear and Sega Master System used the Zilog Z80 which was also used in the ZX Spectrum and other computers.
N64, PSOne, PS2 used MIPS CPUs which were often used in high-end computer systems (SGI).
Consoles did often have custom GPUs though.
That being said, these days both Playstation and Xbox are literally locked-down, custom-form-factor PCs with AMD Ryzen CPUs and Radeon GPUs.
The combat was honestly subpar (especially the guns), but the quest design, character design, conversations, and skill/clan system were super well done (I would argue these are the critical elements of the gameplay).