Until the early-to-mid 2000s, virtually all home console and PC games were designed for CRT displays. I’m not sure where you got the idea that the type of display used by 99% of gamers on these systems somehow didn’t influence the art design and technology of games.
I’m currently toying around with ares (the only fully cycle-accurate SNES emulator), and it has a lovely selection of CRT shaders (which are also available for other emulators). Try out crt-maximus-royale (or its half-res-mode variant). At least to me, the latter looks perfect, with just the right amount of blur, distortion, bloom and scanlines, and it comes with lovely details like the bezel reflecting the image in real time and speaker grilles filling the rest of the screen.
Someone uploaded a gallery of various games to Reddit that shows just how versatile this shader is:
Now I’m curious what your criteria are. Do none of the shaders shown in the video appeal to you? To me at least, they look remarkably close to several types of old CRT TVs that I remember.
Thanks. I shall avoid the motion blur variant as best I can, because that’s one of several aspects of this device I do not remember fondly.
I borrowed a friend’s Game Boy for an afternoon when I was a kid and I was so disappointed by it (primarily the screen, but also poor ergonomics and the limited nature of its games) that I lost nearly all interest in gaming for a year.
I had never even heard of this game, but reading the description on Wikipedia, it sounds absolutely fascinating. Thanks for the recommendation, I’ll definitely give it a go!
I think this tracks. Last time I checked, it performed eerily similarly at 1080p to a GTX 1080 at 1440p (same settings otherwise), at least in games that don’t need more than 4 GB of VRAM, such as Assassin’s Creed Origins.
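For a rough sense of why that comparison is telling, here’s a quick back-of-the-envelope pixel-count calculation (my own illustration; it only looks at raw resolution, and actual scaling varies per game and settings):

```python
# Raw pixel counts for the two resolutions in the comparison above.
# Purely illustrative: real-world performance scaling depends on the game,
# settings, and how much of the frame cost is resolution-independent.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
pixels_1440p = 2560 * 1440   # 3,686,400 pixels per frame

print(f"1440p renders {pixels_1440p / pixels_1080p:.2f}x the pixels of 1080p")  # ~1.78x
```

So matching a GTX 1080’s 1440p frame rates while only rendering at 1080p corresponds to very roughly a 1.8x gap in pixel throughput, give or take whatever work doesn’t scale with resolution.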
I’ve done this with other games (to the Deck, and to and from other devices), and it’s not something you need to worry about. Streaming a game no longer has a noteworthy impact on the frame rate in the age of GPUs with built-in encoding circuitry. Provided you have a half-decent home network, it’s hard to distinguish from playing directly.
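To put a rough number on “half-decent home network”: the bitrates below are my own ballpark assumptions for a typical 1080p60 hardware-encoded stream and ordinary home links, not measured values.

```python
# Back-of-the-envelope headroom check for in-home game streaming.
# The stream bitrate is an assumed ballpark for 1080p60 H.264/HEVC;
# real streaming clients let you raise or lower the target.
stream_mbps = 20       # assumed encoder target bitrate
wifi_mbps = 300        # plausible real-world throughput on decent Wi-Fi 5/6
ethernet_mbps = 940    # usable throughput on wired gigabit Ethernet

print(f"Wi-Fi headroom:    {wifi_mbps / stream_mbps:.0f}x the stream bitrate")
print(f"Ethernet headroom: {ethernet_mbps / stream_mbps:.0f}x the stream bitrate")
```

With that much headroom, bandwidth isn’t the bottleneck; what you notice, if anything, is the added encode/network/decode latency, and hardware encoders typically keep the encode step down to a few milliseconds.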