A lot of PCs can't do a lot of games. That is precisely the point.
If you look at the Steam hardware survey at any given point in time, the mass-market GPUs are typically mid-range parts two to three generations old. And even the most popular of those are small fractions of a very fragmented market.
The average PC is an old-ass laptop used by a broke-ass student. Presumably that's still a factor in why Counter-Strike, of all things, is Steam's biggest game. It was certainly a factor in why WoW and The Sims were persistent PC hits despite looking well below what contemporary PC hardware could do.
The arrival of competent console ports in the Xbox 360 era revolutionized that. Suddenly there was a standard PC controller with parity to the mainstream consoles, and a close-enough architecture running games on reliably stable hardware. Suddenly you didn't need to target PC games solely at the lowest-common-denominator PC; the lowest common denominator was a console that sat somewhat above average compared to low-end PCs.
In that scenario you can just let people with high-end hardware crank up resolution, framerate and other easily scalable options, while allowing for some downward scaling as well. And if that cuts off some integrated graphics on old laptops... well, consoles will more than make up the slack.
Sure, there are PC exclusives: because they rely on PC-specific controls, or are doing some tech-demo-y stuff, or are tiny indies with no money for ports or licensing fees, or are made in a region where consoles aren't popular, supported or commercially viable.
But the mainstream segment of gaming we're discussing here? Consoles made the PC into a competitive, platform-agnostic gaming machine.
Nobody was complaining about the Switch CPU. It was a pretty solid choice for the time. It somewhat outperformed the Xbox 360, which is really all it needed to do to support last-gen ports. Like I said, the big CPU-specific annoyance from a dev perspective was the low thread count, which made cramming previous-gen multithreaded code into a fraction of the threads a bit of a mess.
The point of a console CPU is to run games, not to deliver raw compute. The Switch had what it needed for the scope of games it was running. On a handheld you also want power efficiency, which it had. In fact, the Switch didn't overclock the CPU when docked, just the GPU, because it didn't need to. And we now know it had some headroom to run faster: jailbroken Switches can be reliably clocked up a fair amount. Nintendo locked it that low because they found it was the right balance of power consumption and speed for the rest of the components.
Memory bandwidth ended up being much more of a bottleneck. For a lot of the games you wanted to make on a Switch, the CPU was not the limit you were bumping into; the memory and the GPU were more likely to slow you down before CPU cycles did.
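To make the thread-count point a bit more concrete, here's a toy sketch (mine, not anything from a real engine) of why a per-frame workload tuned for an eight-core console gets uncomfortable on a quad core with one core reserved for the OS: the same bag of jobs just takes more waves to drain.

```python
# Toy illustration only: a "frame" is a fixed bag of jobs, and the worker pool
# is sized to available cores minus one reserved for the system software.
import time
from concurrent.futures import ThreadPoolExecutor

OS_RESERVED_CORES = 1  # Switch-style: one core kept for the OS

def fake_job(job_id: int) -> None:
    """Stand-in for one chunk of per-frame work (animation, culling, etc.)."""
    time.sleep(0.002)  # pretend each job costs ~2 ms

def run_frame(jobs: int, cores: int) -> float:
    workers = max(1, cores - OS_RESERVED_CORES)
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(fake_job, range(jobs)))  # drain the whole job list
    return (time.perf_counter() - start) * 1000  # frame cost in ms

if __name__ == "__main__":
    frame_jobs = 24  # a workload sized with ~7 worker threads in mind (PS4/XB1-ish)
    for cores in (8, 4):  # eight-core last-gen console vs. a Switch-like quad core
        print(f"{cores} cores -> {run_frame(frame_jobs, cores):.1f} ms for {frame_jobs} jobs")
```

Same work, roughly double the frame cost on the smaller pool; that's the squeeze, independent of how fast each individual core is.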
And theoretically you can install Windows on a Steam Deck. Declining to explicitly block something doesn't mean you're not building your business model around the default use case.
For the record, Nintendo games can be legally run on an emulator, much as Nintendo may protest. It's a pain in the ass to do so without technically breaking any rules, but it sure isn't impossible, and the act of running the software elsewhere isn't illegal.
Best we can tell, this is an embedded Ampere GPU with some ARM CPU. The Switch had a slightly weird but very functional CPU for its time: a quad-core part with one core reserved for the OS, which was a bit odd in a landscape where every other console could do eight threads, but the individual cores held up reasonably well by comparison.
It's kinda weird to frame it as a genre thing, though. I mean, Civ VII not only has a Switch 2 port, it has a Switch 1 port, too. CPU usage in gaming is a... weird and complicated thing. Unless you're a systems engineer working on the specific hardware, I wouldn't make too many assumptions about how these things go.
I mean, the PC market has grown, don't get me wrong. Consoles used to be the only thing that mattered, and that's no longer the case. You can't afford to ignore PCs anymore.
But consoles still drive a majority of revenue for a majority of games, to my knowledge. And the Switch is a huge market by itself.
More importantly, PC gamers should be extremely invested in console gaming continuing to exist. Console gaming is a big reason PC gaming is viable. Consoles provide a static hardware target that can be used as the default, which then becomes the baseline for PC ports. With no PS5, the only games that make sense to build for PCs are ones targeting integrated graphics and lowest-common-denominator CPUs. That's why PC games in the 2000s used to look like World of Warcraft even though PCs could do Crysis.
Consoles also standardized a lot of controller, networking and other services for games. You don't want a PC-only gaming market.
They're NOT cheaper. There is exactly one cheaper PC handheld, and it's the base model of the LCD variant of the Deck.
And the reason for that is that Valve went out of its way to sign a console-maker-style, large-scale deal with AMD. And even then, that model of the Deck has a much worse screen, a worse CPU and GPU, and presumably much cheaper controls (it does ship with twice as much storage, though).
They are, as the article says, competitive in price and specs, and I'm sure some next-gen iterations of PC handhelds will clearly outperform the Switch 2 pretty soon, let alone by the end of its life. Right now I'd say the Switch 2 has a bit of an edge, with dedicated ports selectively cherry-picking visual features instead of having to run full-fat PC ports meant for current-gen GPUs at thumbnail resolutions in potato mode.
Nintendo got to the Switch via the Wii U and through the realization that they could package similar hardware with affordable off-the-shelf parts and still drive a TV output that was competitive with their "one-gen-old-with-a-gimmick" model for home consoles.
It was NOT a handheld with AAA games; it was a home console you could take with you. That is how they got to a point where all the journalists, reviewers and users who spent the Vita's lifetime wondering who wanted to play Uncharted on a portable were over the moon with a handheld Zelda instead.
So yeah, turns out the read the article has is actually far closer to what happened than yours, I'm sorry to say.
That was less a Nintendo thing than a retailer thing. Retailers didn't take kindly to being undercut, and first parties kept more of the revenue on digital anyway, so there was literally no incentive anywhere to make digital cheaper.
But let's be clear: everybody involved except the retailer made a lot less on a physical copy in that scenario. The real thing that changed is that Nintendo isn't afraid of losing shelf space anymore.
And while key-in-cart means retailers still keep a cut, storage costs on Switch cartridges are HUGE, so there's still an incentive to get users to subsidize storage.
Physical games weren't cheaper at MSRP, but retailers were known to put them on sale or lower their price permanently more frequently than Nintendo's eShop.
Not true. Switch licensing works the exact same way as PlayStation licensing. All accounts on your "main" console can play the same digital game, even offline. Your account can play its digital games on any console as long as it's logged in and connected to the Internet.
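If it helps, my read of the rule boils down to something like this. A rough paraphrase of the published policy, not actual console code; the names are made up for illustration:

```python
# Paraphrase of the Switch/PlayStation-style digital licensing rules described above.
from dataclasses import dataclass

@dataclass
class Console:
    is_primary_for_owner: bool  # is this the purchasing account's "main" console?

@dataclass
class Session:
    account_owns_game: bool     # is the logged-in account the one that bought the game?
    online: bool                # does the console currently have an internet connection?

def can_play_digital(console: Console, session: Session) -> bool:
    # On the owner's primary console, every account can play, even offline.
    if console.is_primary_for_owner:
        return True
    # Anywhere else, the purchasing account itself has to be logged in
    # and able to phone home for a license check.
    return session.account_owns_game and session.online

# Examples matching the comment above:
print(can_play_digital(Console(True), Session(False, False)))  # other account, offline, primary console -> True
print(can_play_digital(Console(False), Session(True, True)))   # owner, online, someone else's console -> True
print(can_play_digital(Console(False), Session(True, False)))  # owner, offline, someone else's console -> False
```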
It has tons of emotes (or things that can double as emotes) and multiplayer. In a world where making game characters expressive was not a thing, much less at the player's command, they felt like puppets.
Hah. As a kid I used to just hang out or make up stories in LucasArts games, like Monkey Island and especially Maniac Mansion. I know I wasn't alone, because there were multiple contemporary games built around that idea, including from LucasArts, even before The Sims came out. ToeJam & Earl 2: Panic on Funkotron was also a good, weird roleplaying avenue.
And I did engage in some amount of "let's make my house in this map editor" back when games came with map editors. We all did, I think.
Oh, and some games I'd play just to listen to the music. It's hard to argue this was unintended, though, given how many games had sound test modes. I remember I'd fire up Panzer Dragoon just to gawk at the intro, which I realize seems silly if you look at it now.
You can enter a text prompt and they spit out a texture based on it, which sure seems to just be a good old image generation model. They do generate meshes from images, which probably has some ML involved, although it's harder to tell how much is just good old photogrammetry, and they do face and body animation from video. I think that's all part of the Unreal Engine 5 MetaHuman package, which I'm pretty sure does use some machine learning. Oh, and I'm pretty sure a bunch of the writing and character AI was machine-generated, be it in real time or baked offline.
Part of the problem is that people aren't super clear on what "AI" is supposed to mean, so it's hard to know what they're supposed to be angry about. The texture generation thing at least is clearly in the GenAI danger zone.
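For reference, "type a prompt, get a texture" is the kind of thing you can reproduce with any off-the-shelf image generation model, which is why it reads as plain GenAI. A rough sketch of the general shape using the open-source diffusers library; this is a guess at how such a feature tends to be built, not whatever pipeline the game actually uses:

```python
# Sketch of "text prompt in, texture out" with an off-the-shelf diffusion model.
# Not the game's actual pipeline; just the generic pattern.
# pip install diffusers transformers torch
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # any text-to-image checkpoint works here
    torch_dtype=torch.float16,
).to("cuda")

prompt = "seamless worn red brick wall texture, tileable, diffuse map"
image = pipe(prompt, num_inference_steps=30).images[0]  # returns a PIL image
image.save("brick_albedo.png")  # hand the result to the material system as an albedo map
```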