“In a sense, Nintendo is the victim of its own strategic foresight. With the Switch, it was the first to spot that the narrowing gap in processing power between mobile and at-home devices had enabled a unification of handheld and home gaming experiences.”
I was out after this. This is patently wrong. Crucially, Nintendo capitalised on the failure of the Vita using the exact same strategy, but with one key difference: third-party memory cards.
The PS Vita had the power to play previous-gen games in a compact format and MUCH better connectivity than the Switch. It failed on the stupid proprietary memory cards. Nintendo did not. That's pretty much it. Sony had the AAA handheld market with the PSP and blew it. I'd be very surprised if something like this wasn't uttered by an MBA type in Sony's corporate structure:
“If we divide our playerbase between handheld and dedicated living room console too much it will damage our business”.
So instead of capitalising on a massive library of games that could easily have been ported to a handheld format (the PS4 had 1.4 TFLOPS; we'd surpassed that on mobile before the PS5 launched), Sony decided to double down on AAA and subsequently on live-service games, and here we are…
If anyone can create a handheld AAA console, it's a team led by Mark Cerny with the support of AMD. To this day I don't know how we ended up with the PS Portal instead…
So here we are: Sony had carved out a niche (AAA and fidelity) distinct from Nintendo's handheld success, and just decided to sit on their hands with it. There was exactly zero foresight from Nintendo; they knew from the beginning that the living room was lost to either MS or Sony.
Nintendo got to the Switch via the Wii U and through the realization that they could package similar hardware with affordable off-the-shelf parts and still drive a TV output that was competitive with their "one-gen-old-with-a-gimmick" model for home consoles.
It was NOT a handheld with AAA games, it was a home console you could take with you. That is how they got to a point where all the journalists, reviewers and users that spent the Vita's lifetime wondering who wanted to play Uncharted on a portable were over the moon with a handheld Zelda instead.
So yeah, turns out the read the article has is actually far closer to what happened than yours, I'm sorry to say.
Yes, that's why they took an ARM-based Tegra (like the Vita with the PowerVR from Imagination Technologies), unlike the in-house Wii U tech… Why look at evidence when we can ignore it and just BS to defend my fav plastic box maker…
Also, the Wii U is basically PSP Remote Play in one package, six years later…
C'mon man, do Nintendo fanboys really have to ape Apple fanboys on everything? Next thing you're going to tell me how Palworld should be sued into the ground…
They took the Tegra because it was sitting in some Nvidia warehouse and they could get it for cheap, or at least get it manufactured for cheap. At least that's what the grapevine says about how that came together. It does fit Nintendo's MO of repurposing older, affordable parts in new ways.
I always get a kick of being called a Nintendo fanboy. For one thing, I don't fanboy. Kids fanboy, and I haven't been one of those in ages. I don't root for operating systems or hardware. I don't even root for sports teams.
For another, back when I was a kid I was a Sega kid. My first Nintendo console was a Gamecube. I was an adult at that point. As a teenager I had a Saturn. I stand by that choice to this day. Better game library than the Dreamcast. Fight me.
But that doesn't change what happened. The Wii U bombed extremely hard, but there was certainly something to the idea of flipping screens. The Switch is ultimately a tweaked Nvidia Shield and little else. The R&D around it clearly went into seamlessly switching the output from handheld to TV and the controllers from attached to detached. And you know what? They killed it on that front. People don't give enough thought to how insane it is that the Switch not only seamlessly changes outputs when docked, but it also overclocks its GPU in real time and switches video modes to flip resolution, typically in less time than it takes the display to detect the new input and show it onscreen.
It's extremely well tuned, too. If you hear devs talk about it, in most cases it takes very little tuning to match docked and handheld performance because the automatic overclock is designed to match the resolution scale.
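Back-of-the-envelope, using the widely reported clock figures (treat the exact numbers as approximate, and the render targets as the typical 720p handheld / 1080p docked case):

```python
# Rough sanity check of "the docked overclock matches the resolution scale",
# using commonly reported Switch GPU clocks: ~307.2 MHz handheld, ~768 MHz docked.
handheld_pixels = 1280 * 720      # typical handheld render target
docked_pixels = 1920 * 1080       # typical docked render target

pixel_ratio = docked_pixels / handheld_pixels   # 2.25x more pixels to fill
clock_ratio = 768 / 307.2                       # 2.5x more GPU clock available

print(f"pixels: {pixel_ratio:.2f}x, clock: {clock_ratio:.2f}x")
# pixels: 2.25x, clock: 2.50x
```

The docked clock bump slightly exceeds the pixel increase, which is roughly why a port that holds its framerate in handheld mode tends to hold it at the higher docked resolution without much extra tuning.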
The Switch didn't succeed (and the Wii U didn't fail) at random. As similar as some of the concepts at play are, the devil is in the details. Nintendo sucks at many things, but they got this right. Competitors stepping into this hybrid handheld space ignore those details at their peril, and that includes the Switch 2.
At least that’s what the grapevine says about how that came together.
This is when I stopped reading, because this is demonstrably false. The T214 scraps the Cortex-A53 cores and is semi-custom hardware. That also ignores the obvious deal to cheapen the Tegras, which was basically handing NVIDIA the Chinese market on a silver platter, a market Nintendo really didn't cater to at all…
AMD had nothing low-power/long-battery to offer but Jaguar at the time, so Nintendo had to deal with one of the most hated companies in order to get a competitive mobile chip, rather than doing it in-house with licensed off-the-shelf ARM chips like before. They took a page from Sony and went with a custom-GPU-based solution, but lacking a solid hardware department (AMD did a lot of the heavy lifting over the years) they just went with NVIDIA, because there was almost no other game in town at that price (see the Chinese market above; no one else was trying to get into streaming for the Chinese market and needed a strong game library).
That’s it
Edit: regarding output switching… You must be using an Apple phone and have never heard of MHL… Jesus… It's like with Apple fans: shit exists for a decade, but they honestly think it was Apple that came up with it. M8, and let's not start with the Joy-Cons; they're pretty shit, prone to failure, and the design is so garbage that even Nintendo spent R&D to not use that trash sliding mechanism again…
I would recommend continuing to read, then. Or re-reading. None of the detail you provided contradicts what I said at any point.
In fact, the ultimate takeaway is exactly the same. Feel free to substitute all that detail at the point where you "stopped reading" and keep going from there. It's as good a response as you're going to get from me.
Although, since you're going to be anal about the historical detail, it's incorrect that Nintendo "didn't cater at all" to the Chinese market; they had a presence there through the iQue brand all the way up to the 3DS, and these days they ship the Switch there directly through Tencent. I wasn't in the room to know what the deal with Nvidia was. I have to assume the Shield ports were both low-hanging fruit and some part of it, but I seriously doubt not competing with them there was a fundamental part of the deal, considering that it took them like two years after the Switch launch, and just one after they stopped running their own operation, to partner up with Tencent. You'd think "handing the Chinese market on a silver platter" would include some noncompete clause to prevent that scenario.
In any event, we seem to agree that Nvidia was the most affordable partner that could meet the spec without making the hardware themselves. So... yeah, like I said, feel free to get to the actual point if you want to carry on from there.
The Vita had far more problems than just memory cards. You came very close to identifying what the real problem was: Sony couldn't sustain supporting two separate platforms at once. And conversely, Nintendo unifying onto a single platform was what saved the Switch.
There’s a lot here, and yes, the total addressable market for the Steam Deck is currently less than either Switch will sell in a single quarter, but the video game market is a very different thing now than it was in early 2017. The Switch was the only game in town; now it’s not. Live service games make up a significant amount of what the average consumer wants, and those customers largely play on PC for all sorts of reasons. The Switch 2 is no longer priced cheaply enough that it’s an easy purchase for your child to play with, abuse, and possibly break. The console market in general is in the most visible decline it’s ever been in, also for all sorts of reasons, and those handhelds from Sony and, at least, Microsoft are likely to just be handheld PCs as well.
Development on blockbuster system sellers has slowed way down, which comes hand in hand with there just not being as many of them, which makes buying yet another walled garden ecosystem less appealing. This walled garden has Pokemon and Mario Kart, so Nintendo’s not about to go bankrupt, but if we smash cut to 8 years from now and the Switch 2 sold more units than the Switch 1, I’d have to ask how on earth that happened, because it’s looking like just about an impossible outcome from where we stand now.
Also, there’s this quote:
But, although Microsoft has now been making Xbox consoles for over 20 years, it has consistently struggled to use that experience to make PC gaming more seamless, despite repeated attempts
Look, I’m no Microsoft fanboy. Windows 10 was an abomination that got me to switch to Linux, and Windows 11 is somehow even worse. The combination of Teams and Windows 11 has made my experience at work significantly worse than in years prior. However, credit where credit is due: Microsoft standardized controller inputs and glyphs in PC games about 20 years ago and created an incentive for it to be the same game that was made on consoles. It married more complex PC gaming designs with intuitive console gaming designs, and we no longer got bespoke “PC versions” and “console versions” of the same title that were actually dramatically different games. PC gaming today is better because of efforts taken from Microsoft, and that’s to say nothing of what other software solutions like DirectX have done before that.
Still, the reason a Microsoft handheld might succeed is that it does what the Steam Deck does without the limitations of incompatibility with kernel-level anti-cheat or bleeding-edge software features like ray tracing (EDIT: also, Game Pass, the thing Microsoft is surely going to hammer home most). Personally, I don't see a path for a Sony handheld to compete.
live service games make up a significant amount of what the average consumer wants, and those customers largely play on PC for all sorts of reasons
You are leaving out the elephant in the room: smartphones.
So, so, so many people game on smartphones. It’s technically the majority of the “gaming” market, especially live service games. A large segment of the population doesn’t even use PCs and does the majority of their computer stuff on smartphones or tablets, and that fraction seems to be getting bigger. Point being the future of the Windows PC market is no guarantee.
I don't think the people gaming on smartphones are the same demographic that a Switch 2 or a handheld PC would compete for. It's not a lot of data, but take a look at how poorly Apple's initiative for AAA games on iPhone has been going. There are more problems with that market than just the library. The PC market has been slowly and steadily growing for decades while the console market has shrunk.
I am vastly oversimplifying a lot, but… Perhaps mobile gaming, on aggregate, is too shitty for its own good? It really looks that way whenever I sample the popular ones.
I suspect it’s more that the time people can and do spend playing phone games has just about zero overlap with PC games. You play phone games while on the bus or on the toilet, you play PC games while at home behind your desk.
I think a huge reason so many people with a Steam Deck also have a Switch is that the Switch had a 5 year head start. Hades did really well on Switch, but I can’t imagine anyone choosing that version of the game if they had a Steam Deck, and the same applies to Doom, The Witcher 3, etc. I have a Switch and a Steam Deck, but I haven’t used one of those machines in years.
Really wild to go from this vibe at the end of the seventh generation of consoles to the one we’re at now. For me, and many other people that like high quality gaming experiences, mobile games have completely vanished.
They're NOT cheaper. There is exactly one cheaper PC handheld, and it's the base model of the LCD variant of the Deck.
And the reason for that is that Valve went out of its way to sign a console maker-style large scale deal with AMD. And even then, that model of the Deck has a much worse screen, worse CPU and GPU and presumably much cheaper controls (it does ship with twice as much storage, though).
They are, as the article says, competitive in price and specs, and I'm sure some next-gen iterations of PC handhelds will outperform the Switch 2 very clearly pretty soon, let alone by the end of its life. Right now I'd say the Switch 2 has a little bit of an edge, with dedicated ports selectively cherry picking visual features, instead of having to run full fat PC ports meant for current-gen GPUs at thumbnail resolutions in potato mode.
We don’t really know this. It is possible that the CPU will be trash. Nintendo’s devices don’t really support genres that require CPU power (4X, tycoon, city-builder, RTS, MMO etc.).
While we don’t have detailed info on the Switch 2 CPU, the original Switch CPU was three generations behind at the time of the console’s release.
Best we can tell this is an embedded Ampere GPU with some ARM CPU. The Switch had a slightly weird but very functional CPU for its time. It was a quad core thing with one core reserved for the OS, which was a bit weird in a landscape where every other console could do eight threads, but the cores were clocked pretty fast by comparison.
It's kinda weird to visualize it as a genre thing, though. I mean, Civ VII not only has a Switch 2 port, it has a Switch 1 port, too. CPU usage in gaming is a... weird and complicated thing. Unless one is a systems engineer working on the specific hardware I wouldn't make too many assumptions about how these things go.
If you primarily play CPU bound strategy games, you can very much make conclusive statements about CPU performance. For example, Cities in Motion 1 (from the studio that created Cities: Skylines), released in 2010, can bring a modern CPU to its knees if you use modded maps, free look and say a 1440p monitor (the graphics don’t actually matter). Even a simple looking game like The Final Earth 2 can bring your FPS to a crawl due to CPU bottlenecks (even modern CPUs) in the late game with large maps. I will note that The Final Earth 2 has an Android version, but that doesn’t mean the game (which I’ve played on Android) isn’t fundamentally limited by CPU performance.
It very much is a genre thing. Can you show me a game like Transport Fever 2 on the Switch? Cities: Skylines?
The OG Switch CPU was completely outdated when released and provides extremely poor performance.
The Switch was released in 2017. Its CPU, the Cortex-A57, was released in 2012. It was three generations behind the Cortex-A75, which was released in 2017.
The Switch CPU had very poor performance for 2017; it was three generations behind the then-current ARM Cortex releases.
It is very likely the CPU in the Switch 2 will also be subpar by modern standards.
I.e., you don't know that the Steam Deck has a worse CPU, and considering Nintendo's history with CPUs, it is not impossible for the Switch 2 CPU to be noticeably worse than the Steam Deck's.
Nobody was complaining about the Switch CPU. It was a pretty solid choice for the time. It outperformed the Xbox 360 somewhat, which is really all it needed to do to support last-gen ports. Like I said, the big annoyance that was specifically CPU-related from a dev perspective was the low thread count, which made cramming previous-gen multithreaded stuff into a fraction of the threads a bit of a mess.
The point of a console CPU is to run games, it's not raw compute. The Switch had what it needed for the scope of games it was running. On a handheld you also want it to be power efficient, which it was. In fact, the Switch didn't overclock the CPU on docked, just the GPU. Because it didn't need it. And we now know it did have some headroom to run faster, jailbroken Switches can be reliably clocked up a fair amount. Nintendo locked it that low because they found it was the right balance of power consumption and speed to support the rest of the components.
Memory bandwidth ended up being much more of a bottleneck on it. For a lot of the games you wanted to make on a Switch the CPU was not the limit you were bumping into. The memory and the GPU were more likely to be slowing you down before CPU cycles did.
The Switch CPU performs extremely poorly as far as gaming is concerned. Case in point, you cited Cities: Skylines, a quick web search suggests performance is terrible on the Switch and it seems to have been abandoned shortly after release.
While I don’t doubt the Switch 2 CPU will be sufficient for games released by Nintendo, from a broader gaming perspective (gaming is not only Nintendo), it is likely the Switch 2 CPU will also be subpar and will perform worse than the Steam Deck (which is a handheld and its CPU is also subject to efficiency requirements). Whether Nintendo users know/care/don’t care about this is irrelevant. We are talking about objective facts.
What "standards" are you comparing it to? The Switch 1 was behind home consoles, but that's not really a fair comparison. There was nothing similar on the market to appropriately compare it to, no "standard".
Five years later the Steam Deck outperformed the Switch, because of course hardware from five years later would. But the gap between the 2017 Switch and 2022 Deck is not so vast that you can definitively claim in advance to know that the 2025 Switch 2 definitely has to be worse. You don't know that and can't go claiming it as fact.
All we know so far is that the Switch 2 does beat the Deck in at least one major attribute: it has a 1080p120 screen, in contrast to the Deck's 800p60. And it is not unlikely to expect the rest of the hardware to reflect that.
OP claimed the Steam Deck’s CPU was definitely worse than the Switch 2 (this was an explicit, categorical statement).
Considering the Switch's history (the Cortex-A57 used in the OG Switch being three generations behind in 2017), it's not unreasonable to speculate that the Switch 2 CPU is likely to be extremely weak from a gaming perspective (I never brought up compute or synthetic benchmarks).
Exactly what hardware at a similarly competitive price point and form factor are you comparing it to when you say it's behind?
The Switch 1 didn't use the very best top of the line parts that money could buy, but if that's what you're fixating on then you're missing the fact that neither did the Steam Deck. The Switch made compromises to hit a $300 price point in 2017, and the Deck made compromises to hit a $400 price point in 2022.
Portable devices using ARM CPU cores, even ones for ~$350, like the Xiaomi F1 released in 2018. It came with a new Snapdragon 845 SoC that included an Adreno 630 GPU.
It didn’t have the form factor of the Switch, I will give you that. My point is that the Switch had a very weak CPU when compared to similar devices even in the same price band for its time.
So it's not a similar device. Comparing to phones is rather misleading, given that phones do not have active cooling and wouldn't actually be able to run the kinds of games the Switch hardware could without catching on fire in the process. They aren't gaming hardware.
It’s a portable gaming device. It is in the same market.
You can play complex strategy games that require strong CPUs like Project Highrise, The Final Earth 2, Mega Mall Story 2 on mobile.
You won’t be able to run The Final Earth 2 even with the standard mobile population limit on a Switch because it uses an ancient CPU and it’s a quad core.
Don’t limit yourself by Nintendo PR and marketing. The gaming world (portable or otherwise) is not limited to Nintendo.
I think the trailer and Steam page makes it pretty clear that this isn’t just aimed at furries. Not that furries won’t jump on it - we will, but it’s not just for furries.
Yeah, 100%. It’s just that usually when something like this gets a large-ish negative reaction it’s because people associate it with furries or some other “taboo” fetish/lifestyle.
I also find that furry stuff gets way more hate than it deserves (which is none hate). I say boo to that! So long as stuff is consensual and nobody gets hurt (who doesn’t want to, lookin’ at you BDSM), then let people enjoy things.
Side note: It’s hilarious how for years people were cheering on Captain Kirk for banging green alien chicks, but cat ears and a tail is a no-go. Cross-species stuff is cool so long as they’re from another planet? What if it was planet Yiff? On the topic of aliens, do we even know if Superman has a human-like penis? Maybe Kryptonians bust onto egg clutches, who’s to say?
Anywho, people are silly and really like policing other people’s likes.
Honestly, as someone close to the game, it’s more of “we can’t make the casuals and competitive players happy while catering to collectors all at the same time” and Gavin is using Pauper to test something that has ramifications for the rest of the game.
But here’s the catch, Pauper has been incredible this whole time! These bans/unbans are dope! I think this will work and it does set the precedent for the idiot business people that you can manage a format independent of design teams and stockholders.
First time for a “trial unban” where they go back to the ban list if they don’t positively impact the format? Yes, this is the first time an official “trial” has occurred in the competitive history of the game. The only time they have come close to this was the original ban of High Tide when paper and online ban lists merged.
I’m not talking about unbans in general and neither is the article.
I concur that unbans are usually a reaction to power creep. However, Modern has always been a mismanaged format since its inception. The premise of banning the top decks so that Modern was different from "Old Extended" (because Extended at the time became a "double-long Standard" instead of a rotating Type 2 format) damaged the genesis of the format, which inevitably led to Grixis Twin's dominance. I do think the format back then could've benefitted from Fae/Sculptor/Thopter-Sword/Affinity being legal and providing variance.
I've almost completely stopped keeping up with M:TG. I used to be into it, but it just feels like it's completely jumped the shark these days.
So I sold out of MTG around COVID, right as they started committing to "Universes Beyond". But all signs point to their stupid shit being profitable for the time being. And the new UB stuff now going through Standard will help their new-player onboarding a bit.
Lord of the Rings was massively successful, and I’ve been seeing even more buzz for Final Fantasy than there was LotR. Universes Beyond is certainly here to stay.
Exactly. But you can't just keep farming UB forever. You need to onboard those players into the ecosystem and keep them hooked. Time will tell if these people only cared about their specific media or if enough of them latch on to MTG the game.
I enjoyed the early mythos of M:tG, but I already started losing interest when they went all Marvel with their whole Gatewatch thing.
Even though individual sets (like LOTR) have been well made and successful, the whole Universes Beyond thing just further dilutes the identity of Magic too much for me. I’m sick of endless exploitation of existing IPs from all over the entertainment business. I understand why they do it from a financial perspective, but it doesn’t appeal to me at all.
Add to that the endless garbage of Secret Lair drops like goddamn Spider-Man and SpongeBob and I think MTG just isn’t for me anymore.
They've unbanned cards in the past. Most recently (not counting this ban/unban announcement), Mox Opal and Splinter Twin in Modern. SFM and Mind Sculptor were also unbanned years ago. Unbanning cards is not unheard of.
I can’t tell if this is just Wizards of the Coast panicking and flailing because they are out of good ideas, or if they are actually carefully analyzing and re-evaluating older cards because the balance and synergy of the current cards allow for the use of these older cards without being game breaking.
Hearthstone was doing this about a year ago when I quit. It was actually great for the game and really shook things up in the Wild format where you could play any set of cards. But Blizzard shit the bed on that one like usual, oh well.
Maybe that’s part of the problem? HDR implementation on my Samsung sets is garbage, I have to disable it to watch anything. Too bad too, because the picture is gorgeous without it.
Smart TVs having absolutely horrible default settings and filters that ruin any viewing experience has little to do with HDR, because the TV isn't even processing HDR images most of the time. That stuff is already mixed, and there's not much any device can do to give you back details in the darks and brights. It's a much different story when you're actually processing real color information, like in a video game. HDR should absolutely help you see in the dark here.
I WISH it was the default settings. I went through every calibration and firmware update I could find. Even the model specific calibrations on rtings.com. Nothing made a difference.
It appears to just be a flaw in Samsung’s implementation. After going through all the Samsung forum information, the only suggestion that’s guaranteed to work is “turn it off”.
I got a Samsung monitor last year too (it was the cheapest HDR option and I kept seeing Reddit praise them) and it has such a terrible HDR experience. When HDR is on, either dark colors are light grayish, brights are too dark, darks are crushed, everything's too bright, or colors are oversaturated. I've tried every combination of adjusting brightness/gamma from the screen and/or from KDE, but couldn't figure out a simple way to turn down the brightness at night without some sort of image issue popping up. So recently I gave up and turned HDR off. I still can't use the KDE brightness slider without fucking up the image, but at least the monitor's brightness slider works now.
Also, if there are very few bright areas on the screen it further decreases its overall screen brightness, which also affects color saturation, because of course it does.
Also also, I just discovered FreeSync and VRR are two different toggles in two different menus for some fucking reason, and if you only enable FreeSync like I did, you get a flickering screen.
I really wish there was a ‘no smart image fuckery’ toggle in the settings.
I didn’t really understand the benefit of HDR until I got a monitor that actually supports it.
And I don’t mean simply can process the 10-bit color values, I mean has a peak brightness of at least 1000 nits.
That’s how they trick you. They make cheap monitors that can process the HDR signal and so have an “HDR” mode, and your computer will output an HDR signal, but at best it’s not really different from the non-HDR mode because the monitor can’t physically produce a high dynamic range image.
If you actually want to see an HDR difference, you need to get something like a 1000-nit OLED monitor (note that “LED” often just refers to an LCD monitor with an LED backlight). Something like one of these: www.displayninja.com/best-oled-monitor/
These aren’t cheap. I don’t think I’ve seen one for less than maybe $700. That’s how much it costs unfortunately. I wouldn’t trust a monitor that claims to be HDR for $300.
When you display an HDR signal on a non-HDR display, there are basically two ways to go about it: either you scale the peak brightness to fit within the display’s capabilities (resulting in a dark image like in OP’s example), or you let the peak brightness max out at the screen’s maximum (kinda “more correct” but may result in parts of the image looking “washed out”).
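As a minimal sketch of those two extremes (just illustrative Python; the 1000-nit content peak and 300-nit panel are assumed placeholder numbers, not any particular display's actual pipeline):

```python
def scale_to_panel(nits: float, content_peak: float = 1000.0, panel_peak: float = 300.0) -> float:
    """Option 1: scale everything so the content's peak fits the panel.
    Highlights keep their relative detail, but mid-tones and shadows get
    dragged down with them, hence the overall dark-looking image."""
    return nits * (panel_peak / content_peak)

def clip_to_panel(nits: float, panel_peak: float = 300.0) -> float:
    """Option 2: let highlights saturate at the panel's maximum.
    Average brightness is preserved, but everything above the panel's peak
    collapses into the same value, which reads as washed out."""
    return min(nits, panel_peak)
```

Real displays usually land somewhere in between with a tone-mapping curve, but those are the two ends of the trade-off.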
See my "set 2" links above: an (at the time) $3,200 8K television. "If you want the brightest image possible, use the default Dynamic Mode settings with Local Dimming set to 'High', as we were able to get 1666 nits in the 10% peak window test."
Nope, it does have a wide color gamut and high-ish brightness; I wouldn't have bought it unless reviews said it was OK. But it does some fuckery to the image that I can only imagine is meant to make non-HDR content pop on Windows, and it ends up messing up the image coming from KDE. I can set it up to look alright in either a light or a dark environment, but the problem is I can't quickly switch between them without fiddling with all the settings again.
Compared to my Cooler Master, a grayscale gradient on it has a much sharper transition from crushed bright to gray, but then gets darker much more slowly as well, to the point where unless a color is black it appears darker on the CM despite that one having an IPS screen. Said gray also shows up as huge and very noticeable red, green, and blue bands on it, again unlike the CM, which also has banding, but at least the tones of gray are similar.
Also unrelated, but I just noticed while testing the monitors that KDE's max SDR brightness slider seems to have changed again. HDR content gets darker over the last 200 nits while SDR gets brighter. Does anyone know anything about that? I don't think that's how it's supposed to work.
3 months edit: I might've been wrong about this. At the time I had both monitors connected to the motherboard (AMD iGPU) since the Nvidia driver had washed-out colors. Since the Cooler Master worked, I assumed the AMD drivers were fine. But a while back I ended up plugging both into the Nvidia GPU and discovered that not only were the Nvidia drivers fixed, but with it the Samsung didn't have the weird brightness issue either.
Edit edit: Even though the brightness is more manageable, it's still fucked. I've calibrated it with KDE's new screen calibration tool, and according to it the brightness tops out at 250 nits. However, it is advertised and benchmarked to go up to 600, I've measured 800-ish using my phone sensor, and it looks much brighter than an SDR 200-nit monitor. Which makes me think that even though it is receiving an HDR signal, it doesn't trust the signal to actually be HDR and maps the SDR range to its full range instead, causing all kinds of image issues when the signal is actually HDR.
And just to make sure it's not a Linux issue, I've tried it with Windows 10 too. With the AMD GPU, HDR immediately disables itself if you enable it, and with the Nvidia GPU, if you enable HDR, all screens, including ones not connected to it, turn off and don't work until you unplug the monitor and reboot. The Cooler Master just works.
Yeesh, sounds like your monitor's color output is badly calibrated :/. Fixing that requires an OS-level calibration tool. I've only ever done this on macOS, so I'm not sure where it is on Windows or Linux.
Also, in general I wouldn't use the non-HDR-to-HDR conversion features. Most of them aren't very good. Also, a lot of Linux distros don't have HDR support (at least the one I'm using doesn't).
It's one of those things where it only looks good when, in the case of a video game, the GAME's implementation of it is good AND your console/PC's implementation is good AND your TV/monitor's implementation is good. But unless you've got semi-deep pockets, at least one of those probably isn't good, and so the whole thing is a wash.