Some games already use P2P, or provide servers for the community to run, so only the private servers would need replicating. Even in that case, I’d argue that having “some” common API would make it easier than chasing around everyone’s different implementations.
The nice thing about Steam is that it’s “too big to clamp down on”:
People used to 🏴☠️ on the high seas, for many reasons.
Steam came up as a “single point of sale”, at the same time as Netflix was doing the same for movies and series.
Over time, companies tried to carve out chunks of the pie, restoring some of the original fragmentation…
…but while Netflix has been torn to shreds, a shadow of its former glory, Steam is still the main “single point” for games…
…with a “single point” DRM
Steam’s DRM only keeps working because games keep shipping updates with ever-changing DRM versions. The moment Steam tried to act against its customers and they decided to leave, every Steam game copy in existence at that moment would get cracked all at once.
Maybe EA, MS, Nintendo, Sony, etc. don’t see that as a great thing… and that’s why they’ve been setting up their own stores… but I think it’s AWESOME! 😁
Well, this has been a blast from the past. Haven’t set up all the drivers or an internet connection yet, but with the turbo button it’s been the fastest Win98 install I’ve ever done 😆
Because traditionally there were few Linux devices.
Android 15 is going to change that: it comes with a virtual machine API and a Linux Terminal running Debian for ChromeOS compatibility.
Soon, the most popular consumer OS in the world will be Linux:
3.3 billion: Android / Linux
2.2 billion: Apple iOS/macOS *NIX
1.6 billion: Windows
400 million: Windows 11 + WSL 2.0
250 million: gaming consoles
"millions": SteamOS Linux
Wine might still make sense to keep things standardized for some time, and as a compatibility layer for older games, but native Linux games will also work on the Linux solutions for Android, Apple, and Windows.
It might be fine for non-interactive stuff where you can get all the frames in advance, like cutscenes. For anything interactive though, it just increases latency while adding imprecise partial frames.
Texture packs or not, IMHO the key point is that they’re optional, not a requirement for the game to be playable. Games that depend on photorealism are bound to end up in deep trouble.
The objective part is in whether it matches what the creator intended.
Sometimes they intended crisp contours, like with ClearType; sometimes they intended the blur to add extra colors; sometimes they designed pixel-perfect art that ended up looking blurry on a CRT; very rarely, they used vector graphics or 3D that can be rendered at better quality by just throwing some extra resolution at it.
Many artists of the time pushed this tech to its limits; “objectively better” means emulating that.
Have you checked the examples…? I feel like we’re going in circles. There are cases where the CRT looks objectively better; supporting examples and a technical explanation have been provided… it’s up to you to look at them or not.
If you wish to discuss some of the examples, or the tech, I’m open to that. Otherwise I’ll leave it here. ✌️
All pixels are a “blur” of R, G, and B subpixels. Their arrangement is what makes a picture look either as designed, or messed up.
For rendering text, modern OSs still let you pick whichever subpixel arrangement the screen uses to make it look crisper. Can’t do the same with old games that use baked-in sprites for everything.
It gets even worse when the game uses high-brightness pixels surrounded by low-brightness ones, because it expects the bright ones to spill over in some very specific way.
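That spill-over effect can be sketched in a few lines. This is a toy model (the spill fraction and single-row format are made-up for illustration, not how any real CRT shader works): each bright pixel leaks a fraction of its brightness onto its neighbors, which is roughly the bloom that artists counted on and that a sharp LCD doesn’t reproduce.

```python
# Toy sketch of CRT-style brightness spill. The 0.25 "spill" fraction
# is an arbitrary illustrative value, not a measured CRT property.
def bloom_1d(row, spill=0.25):
    """Each pixel leaks a fraction of its brightness onto both neighbors."""
    out = list(row)
    for i, v in enumerate(row):
        leak = v * spill
        if i > 0:
            out[i - 1] += leak
        if i + 1 < len(row):
            out[i + 1] += leak
    return [min(v, 1.0) for v in out]  # clamp to the displayable range

# One bright pixel between dark ones: a sharp LCD keeps the neighbors
# at 0.0, while the simulated CRT bleeds light into them.
print(bloom_1d([0.0, 1.0, 0.0]))  # → [0.25, 1.0, 0.25]
```

On a real screen the spill happens in 2D and per color channel, but the point is the same: the artwork was authored assuming that bleed, so rendering it “perfectly sharp” changes the picture.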