They kind of have been for a while, at least architecture-wise, but with the new Xbox ROG handheld you can access your Steam library or install a different OS like Bazzite altogether. You can't play your Xbox games on it though, which probably means the days of Xbox as a console are over. But Xbox players must've known they wouldn't be able to transfer their game library to a new device anyway.
Not all Xbox players, clearly. With Microsoft doubling down on not having any physical media containing the full retail game on disc from the Xbox One onwards, every one of those accounts is hosed unless they have an archive of all the downloaded files needed.
My advice to them - mod your consoles (360 and below) if you have them, rip your discs, and get ready for when Xbox services (for console hardware) are sunset.
To be honest, once that starts happening I won't be worried about my digital library. I'll be so pissed that I'll just stop spending any money on games ever again. I'll be back to sailing the high seas.
I’ve always been okay with keeping gaming libraries digital - but I think the larger console population might be okay with that too if we could disconnect digital games from account-based ownership - the kind where a company can go “Oh, whoops, we lost the license to this fart sound effect. We’re going to have to remove this game from your library.”
That’s called releasing your games on GOG, modding in missing assets, or easy piracy, all PC-centric features.
There is no way in hell any console platform will allow that level of untethered ownership ever again. Nintendo was the last one, and it's gone with the Switch 2.
Have been since the 8th generation, unless you count Nintendo, whose 8th gen (Wii U) was a PowerPC Architecture (Mac), and 9th gen onward is a glorified Android phone with a dedicated GPU.
I guess it depends on how you define PC. But older consoles also used CPUs that were found in desktops (and laptops), although they weren't PCs in the strict sense.
Sega Mega Drive (and other consoles) used the Motorola 68K that was also used on Macintosh and Amiga.
Game Gear and Sega Master System used the Zilog Z80 which was also used in the ZX Spectrum and other computers.
N64, PSOne, PS2 used MIPS CPUs which were often used in high-end computer systems (SGI).
Consoles did often have custom GPUs though.
That being said, these days both Playstation and Xbox are literally locked down, custom form factor, AMD Ryzen CPU + Radeon GPU PCs.
Playstation 2 supported Linux and it ran MIPS, which was the architecture used in SGI Indigo systems, perhaps most famously seen in Jurassic Park's "it's a UNIX system" scene.
It seems that even Playstation One has an (early) public Linux port.
I mean… yes… but you wouldn't be able to run Steam or install games designed for the PC platform on the hardware, which is what the 8th-generation consoles (at least a hacked PS4; the Xbox One is unhacked) are able to do. I suppose even with that in mind, the original Xbox and PS2 would count because Doom, but still.
Yeah I define PC less by the hardware but whether I can install whatever OS I want and whatever programs I want without restrictions, which consoles don’t let you do. And consoles these days are way more powerful than PCs from decades ago yet still crippled when it comes to expected PC functions.
Apple announced the transition in 2005 and began using Intel chips in 2006, but still, PowerPC is best known as the chips that powered Macs for a little over a decade.
I’m not the person you responded to — I actually did not know that the Wii U used PowerPC. I did know that the Xbox 360 did and have made that argument.
It's a bit egregious to call it a Mac (though I do, mostly in jest), but that is the connection.
Of note, the PowerPC chips were made by IBM (and Motorola according to the article I linked — I did not know that before). So, a former Apple competitor. And now (since mid-2020) Apple competes with Intel, which they switched to from PowerPC. So, bit of a tangent at this point, but these rivalries we have as users are partnerships that come and go in the business world.
I’m confused by your first sentence - the last machines they made that used PPC were in 2005. To me it reads like you’re correcting me but saying exactly the same thing..?
The fact that Macs stopped using the architecture twenty years ago makes it a bit of an odd connection, I would argue. As you say, the 360 used the architecture far more recently, and over 84 million of those were sold. It's not like it was some obscure device.
The main reason I used the comparison is because no PC analog outside of Apple’s space (Unless you count Linux on PowerPC?) used the architecture. x86 has a strong association with Windows, PC gaming, and “PCs” as a whole, while PowerPC’s most iconic use in the personal computing space was in consoles and in Apple’s lineup. Because of that, I chose to mention the PowerPC Mac line.
But the Switch and beyond use ARM, the architecture Macs have used for the last five years?
It just seemed an odd thing to mention given how long it's been since Macs used PPC. I know they used to, but I'm old enough to have used 68000 Macs too, so of course I remember that time.
When I think of portable ARM devices, my mind immediately snaps to cell phones and the Android ecosystem (which is what the Switch was compared to and even successfully hacked to run Android on).
Which is fair enough and totally reasonable - it was purely in the context of that comment it seemed odd. You had a device that actually uses the architecture that Macs use and one that used an architecture that they don’t but… yeah. It’s not important, it just made me chuckle.
Just for clarification: the Wii, Wii U, and 360 used PowerPC-based CPUs, and the PS3 used the Cell Broadband Engine, which is a PowerPC derivative. The original PowerPC was made by the AIM alliance, which stands for Apple, IBM, Motorola. Apple and Motorola had a long history of collaboration, as all Apple machines had used Motorola processors up to that point.
They kinda always were, tbh. Just with some kind of unique limitation specific to each console that prevented them from being used for any purpose other than playing a specific brand of video game.
I believe if a new generation of consoles is released, they're just going to be glorified streaming boxes. Sony has pretty much hit a plateau as far as graphics go; there's not much you can improve upon now. And sure, people are cancelling and complaining about Game Pass, but Microsoft might just go the route of "we'll tell you what you want, and if you want this exclusive you'll have no choice but to subscribe".
So you'll pay for this streaming box, you'll get maybe a couple months of the subscription for free, then you'll pay $30 to $50 a month to get access to their catalog. Sony and Nintendo will follow suit. If you want to actually "own" anything you'll play on PC, and even then that's truly up for debate. Like, do you actually own the content you buy on Steam?
Steam's DRM is easy to crack (that's why so many Steam game repacks circulate), but there's a degree of expected stability thanks to the platform's two-decade track record. Additionally, Valve's status as a private company with no shareholder pressure makes it less prone to future turbulence than MS, Sony, and Nintendo.
You don't need to get only new games. Do what a ton of people do: wait till they're on sale. There is literally no hurry; wait till a game reaches the price point you think it's worth. Then you get the best possible version of the game, both in terms of patches and, most of the time, with DLC included, and the modding community has had time to make stuff (if that's relevant).
they just need to keep the same graphics, improve optimization, and stop relying on DLSS and frame gen.
that shit works well, but I'd be really impressed to just play a game that looks great, throw on ray tracing, and still not need DLSS to get above 100 fps.
It's because of how the tech works: it uses data from the previous frame to render the next, which leads to ghosting. It's not as bad with DLSS 4, but it's still there.
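A toy 1-D sketch of why that previous-frame blending ghosts (pure illustration with made-up numbers; real DLSS also uses motion vectors and far more machinery):

```python
# Toy temporal accumulation: each output pixel is a blend of the current
# frame and the previous output. History carried over from earlier frames
# leaves a fading trail ("ghost") behind anything that moves.
def accumulate(prev_out, cur_frame, alpha=0.7):
    # alpha weights the new frame; (1 - alpha) carries over history
    return [alpha * c + (1 - alpha) * p for c, p in zip(cur_frame, prev_out)]

# A bright pixel moves from index 0 to index 1 between frames.
frame1 = [1.0, 0.0, 0.0]
frame2 = [0.0, 1.0, 0.0]

out = accumulate(frame1, frame1)  # first frame: history is the frame itself
out = accumulate(out, frame2)     # second frame after the pixel moved
print(out)  # index 0 still holds residual brightness: the ghost
```

Raising `alpha` trades less ghosting for less temporal smoothing, which is roughly the tuning knob these techniques fight over.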
So many folks online seem to be upset that so few games are exclusive to the latest consoles. They want "next-gen" games but fail to realize that the product they want wouldn't have a large enough market to cover the development costs.
Nintendo still makes plenty of exclusives. They’re not yet obsessed with live service games. Sony and Microsoft have wasted insane amounts of time developing trash that will never see the light of day.
Nice idea to make Xbox and PC one platform, but given how badly the ROG Xbox handheld was implemented (since you can’t play Xbox games on this “Xbox”), good luck with that.
The problem isn't that; since the handheld is just running Windows, it can only run games with a PC release. If a game was never ported to PC, there's currently no way to play it. They really thought the whole release through real well.
Honestly I'm surprised they're still thinking about that. Their last major release was RDR2 in 2018, seven years ago now. In that time, not only have handhelds and other PC form factors exploded, but PC gaming as a whole has too, thanks to COVID. It's more popular than it's ever been. If they go console-only, they'd better be getting a massive kickback from Sony now that Xbox isn't competing as much; otherwise they're missing out on tons of sales.
Edit: Actually, writing that out, I bet the PC port was always planned a year behind, but GTA 6 was planned years ago, before the boom. I wonder if we'll see RDR3's PC version come out at the same time.
Yes. Kinda. Some games from the X1 era aren’t available on PC, like Halo 5. X360 games that are backwards compatible but weren’t actually released for PC, like Red Dead Redemption, won’t be available either.
The only case where I could buy the idea that "the console makes the exclusives possible" is when costs are so high that the investors decide a $60 price tag isn't enough.
That can be alleviated with DLC, or live service bullshit; or it can become an incentive to buy a particular console.
Then, when someone is braindead and doesn’t want a big epic award winning adventure, they’ll use that same console to play Fortnite. Thus, God of War helps sell VBucks or whatever.
It's a weird analysis, but even though the end of console exclusives is seen as a pro-consumer move, I also think it was just a way for managers to boost one quarter's revenue, and it wasn't really good for the console ecosystem as a whole, especially considering how that ecosystem would fund future exclusive epics.
I'm guessing speed. To me, one of the best improvements over the PS4 was the load speed of everything. It felt like going back to the SNES days, when walking into the next room was instant, unlike on the PS1 for the same games.
Right? Consoles used to exist because specialized equipment would perform better than general consumer electronics. That hasn’t really been true for a long time now.
That's not exactly accurate; it ignores that the hardware in consoles is mostly static. The consistency of and limitations on the hardware are undoubtedly an important part of that equation.
Quest is cheap and good hardware, but its software layer is dystopian hell. Obviously, I mean, it’s meta.
I love the cheap access to decent PC and standalone VR on my Quest 3; I absolutely hate this OS and its constant corporate spam. I know, this is kinda why it's so cheap. Doesn't mean I have to like it.
Xbox consoles have been mostly sold out in Finland. Some retailers still have some stock left, mostly Series S, but it seems that Microsoft has exited Finland.
Yeah, for me it’s not even just the creative freedom, but an actual fuzzy feeling that me and the devs are having fun together. Open-source games also hold a special place in my heart for that reason, no matter how scrungy they are.
Yeah, I might be showing my age, but my interpretation of “a better game” was right away “a more fun game”, which got followed up with the thought: Did it make them more fun?
I feel like we had fun figured out pretty well in the last century already. And in many ways, the higher specs are used to add realism and storytelling, which I know many people enjoy in their own way, but they’re often at odds with fun, or at least sit between the fun parts of a game.
Like, man, I watched a video of the newest Pokémon game and they played for more than an hour before the tutorial + plot exposition was over. Practically no fun occurred in that first hour.
Just imagine putting coins into an arcade cabinet and the first hour is an utter waste of time. You’d ask for your money back.
I would love to buy a game at a reasonable price that I actually have a chance of finishing in a weekend, or maybe one marathon session. A game with a great story and good gameplay that isn’t drawn out over 30 hours.
As for graphics, I’m quite happy if they drop all the shiny new bullshit that you have to watch a Digital Foundry video on just to even know it exists in the game, and rather focus on a good art style.
And it goes without saying: pay the people who make the games more, and the mega-corp CEOs less.
Shinobi: Art of Vengeance. It's a 2D brawler with stages that end in a boss, but they have metroidvania elements to add replay value. It's about 10 hours start to finish while doing a lot of the backtracking stuff. Simple story, but incredible combat and good level design (only one level I thought was a bit meh); the art and animations are fantastic as well. Very much a new take on the old games, and it has a demo to try as well.
The “paid more to work less” part is not tenable. The games that fit that bill that you’re thinking of represent less than 1% of their peers. They are outliers, not a sustainable industry; the exception, not the rule. For every Silksong there are maybe 100 that make just enough to make ends meet, and 1000 duds that will never pay for themselves that you’ve never heard of.
What you’re saying is you want fewer steady incomes and more lottery winners. Sure, that’d be nice, but it’s not a sustainable strategy.
Ex. Wildgate launched recently. They deliberately opted to sell the game for a flat $30 rather than going F2P/P2W. As a result, they regularly get reviewed negatively by people saying “dead game, greedy devs won’t lower the price to compete with F2P games” and “the cosmetics you unlock by playing look better than the ones you can buy” (yes, there are people unironically posting those as negative reviews).
So at least understand why the most common strategy is often exploitative, and why it’s actually not a simple solution that a bunch of armchair experts have figured out in a comments section.
Moore’s Law was originally formulated as the cost per integrated component being cut in half every x months. The value of x was tweaked over the decades, but settled at 24.
That version of the law is completely dead. Density is still going up, but you pay more for it. You’re not going to build a console anymore for the same cost while increasing performance.
High-end PCs can still go up, but only by spending more money. That's why the only substantial performance gains over the last few GPU generations have come with big jumps in cost.
Important to note, the current chip fabrication process of 5nm is very close to limits imposed by the laws of physics. Unless a wholly different chip making process is invented that can go even smaller, we might be looking at the actual limit of the tech.
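To make the original cost-per-component framing above concrete, here's a tiny sketch of the halving curve (the starting cost is an arbitrary made-up number, not real fab data):

```python
# Original Moore's Law framing: cost per integrated component halves
# every x months, with x eventually settling at 24.
def cost_per_component(initial_cost: float, months: float,
                       halving_months: float = 24.0) -> float:
    """Projected cost per component after `months` at the classic cadence."""
    return initial_cost * 0.5 ** (months / halving_months)

# Ten years (120 months) = five halvings, so cost falls by a factor of 32.
print(cost_per_component(1.0, months=120))  # prints 0.03125
```

The point of the "that version of the law is dead" comment is that this curve has flattened: density still improves, but cost per component no longer follows the halving schedule.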