Maybe that’s part of the problem? HDR implementation on my Samsung sets is garbage, I have to disable it to watch anything. Too bad too, because the picture is gorgeous without it.
Smart TVs having absolutely horrible default settings and filters that ruin any viewing experience has little to do with HDR, because the TV isn't even processing HDR images most of the time. That stuff is already mixed, and there's not much any device can do to give you back the detail in the darks and brights. It's a much different story when you're actually processing real color information, like in a video game. HDR should absolutely help you see in the dark there.
I WISH it were the default settings. I went through every calibration and firmware update I could find. Even the model-specific calibrations on rtings.com. Nothing made a difference.
It appears to just be a flaw in Samsung’s implementation. After going through all the Samsung forum information, the only suggestion that’s guaranteed to work is “turn it off”.
I got a Samsung monitor last year too (it was the cheapest HDR option and I kept seeing Reddit praise them) and it has such a terrible HDR experience. When HDR is on, either dark colors are light grayish, brights are too dark, darks are crushed, everything's too bright, or colors are oversaturated. I've tried every combination of adjusting brightness/gamma from the screen and/or from KDE but couldn't figure out a simple way to turn down the brightness at night without some sort of image issue popping up. So recently I gave up and turned HDR off. Still can't use the KDE brightness slider without fucking up the image, but at least the monitor's brightness slider works now.
Also, if there are very few bright areas on the screen it further decreases its overall screen brightness, which also affects color saturation, because of course it does.
Also also, I just discovered FreeSync and VRR are two different toggles in two different menus for some fucking reason, and if you only enable FreeSync like I did, you get a flickering screen.
I really wish there was a ‘no smart image fuckery’ toggle in the settings.
I didn’t really understand the benefit of HDR until I got a monitor that actually supports it.
And I don't mean one that can simply process the 10-bit color values, I mean one that has a peak brightness of at least 1000 nits.
That’s how they trick you. They make cheap monitors that can process the HDR signal and so have an “HDR” mode, and your computer will output an HDR signal, but at best it’s not really different from the non-HDR mode because the monitor can’t physically produce a high dynamic range image.
If you actually want to see an HDR difference, you need to get something like a 1000-nit OLED monitor (note that “LED” often just refers to an LCD monitor with an LED backlight). Something like one of these: www.displayninja.com/best-oled-monitor/
These aren’t cheap. I don’t think I’ve seen one for less than maybe $700. That’s how much it costs unfortunately. I wouldn’t trust a monitor that claims to be HDR for $300.
When you display an HDR signal on a non-HDR display, there are basically two ways to go about it: either you scale the peak brightness to fit within the display’s capabilities (resulting in a dark image like in OP’s example), or you let the peak brightness max out at the screen’s maximum (kinda “more correct” but may result in parts of the image looking “washed out”).
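To make that concrete, here's a toy sketch (my own made-up numbers, not any real tone-mapping curve) of what those two options do if you treat pixel brightness as a plain value in nits:

```python
# Two naive ways to fit HDR content, with pixel luminance in nits,
# onto a display that only reaches 300 nits. Toy numbers, not a real
# tone-mapping operator.

HDR_PEAK = 1000      # assumed peak brightness of the content, in nits
DISPLAY_PEAK = 300   # what the panel can actually output, in nits

def scale_to_display(nits: float) -> float:
    """Option 1: scale everything so the content's peak fits the panel.
    Highlights keep their relative shape, but midtones and shadows get
    dragged down with them, so the whole image looks dark."""
    return nits * (DISPLAY_PEAK / HDR_PEAK)

def clip_to_display(nits: float) -> float:
    """Option 2: show everything up to the panel's limit and clip the rest.
    Midtones stay at their intended brightness, but every highlight above
    the limit flattens to the same value, so bright areas look washed out."""
    return min(nits, DISPLAY_PEAK)

for sample in (5, 100, 250, 600, 1000):  # shadow, midtone, highlight values
    print(sample, "->", scale_to_display(sample), "vs", clip_to_display(sample))
```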
See my “set 2” links above. An (at the time) $3,200 8K television: “If you want the brightest image possible, use the default Dynamic Mode settings with Local Dimming set to ‘High’, as we were able to get 1666 nits in the 10% peak window test.”
Nope, it does have a wide color gamut and high-ish brightness; I wouldn't have bought it unless reviews said it was ok. But it does some fuckery to the image that I can only imagine is meant to make non-HDR content pop on Windows, and it ends up messing up the image coming from KDE. I can set it up to look alright in either a light or a dark environment, but the problem is I can't quickly switch between them without fiddling with all the settings again.
Compared to my Cooler Master, a grayscale gradient on it has a much sharper transition from crushed bright to gray, but then gets darker much more slowly as well, to the point where unless a color is black it appears darker on the CM, despite that one having an IPS screen. Said gray also shows up as huge and very noticeable red, green, and blue bands on it, again unlike the CM, which also has banding, but at least its tones of gray are similar.
Also unrelated, but I just noticed while testing the monitors that KDE's max SDR brightness slider seems to have changed behavior again. HDR content gets darker over the last 200 nits while SDR gets brighter. Does anyone know anything about that? I don't think that's how it's supposed to work.
3 months edit: I might've been wrong about this. At the time I had both monitors connected to the motherboard (AMD iGPU) since the Nvidia driver had washed-out colors. Since the Cooler Master worked, I assumed the AMD drivers were fine. But a while back I ended up plugging both into the Nvidia GPU and discovered that not only were the Nvidia drivers fixed, but with that the Samsung didn't have the weird brightness issue either.
Edit edit: Even though the brightness is more manageable, it's still fucked. I've calibrated it with KDE's new screen calibration tool, and according to it the brightness tops out at 250 nits. However, it is advertised and benchmarked to go up to 600, I've measured 800-ish using my phone sensor, and it looks much brighter than a 200-nit SDR monitor. Which makes me think that even though it is receiving an HDR signal, it doesn't trust the signal to actually be HDR and maps the SDR range to its full range instead, causing all kinds of image issues when the signal actually is HDR.
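If that guess is right (and it is just a guess about this panel, not anything from its documentation), the mismatch would look roughly like this toy comparison: the same 10-bit code value means a very different brightness depending on whether the panel decodes it as PQ HDR or as SDR gamma stretched over its full range:

```python
# Toy comparison (an assumption for illustration, not how this monitor is
# documented to behave): what a 10-bit code value "means" if the panel
# decodes it as real HDR (SMPTE ST 2084 / PQ) vs. treating it as SDR
# gamma 2.2 stretched across a ~600-nit panel.

def pq_to_nits(signal: float) -> float:
    """PQ EOTF: normalized signal (0..1) -> absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_stretch_to_nits(signal: float, panel_peak: float = 600.0) -> float:
    """What you get if the panel ignores PQ and just applies gamma 2.2
    across its entire brightness range."""
    return panel_peak * signal ** 2.2

for code in (256, 512, 768, 1023):  # a few 10-bit code values
    s = code / 1023
    print(code, round(pq_to_nits(s), 1), "nits as PQ,",
          round(sdr_stretch_to_nits(s), 1), "nits as stretched SDR")
```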
And just to make sure it's not a Linux issue, I've tried it with Windows 10 too. With the AMD GPU, HDR immediately disables itself if you enable it, and with the Nvidia GPU, if you enable HDR, all screens, including ones not connected to it, turn off and don't work until you unplug the monitor and reboot. The Cooler Master just works.
Yeesh, sounds like your monitor's color output is badly calibrated :/. Fixing that requires an OS-level calibration tool. I've only ever done this on macOS, so I'm not sure where it is on Windows or Linux.
Also, in general I wouldn't use the non-HDR-to-HDR conversion features. Most of them aren't very good. Also, a lot of Linux distros don't have HDR support (at least the one I'm using doesn't).
It's one of those things where it only looks good when, in the case of a video game, the GAME's implementation of it is good AND your console/PC's implementation is good AND your TV/monitor's implementation is good. But unless you've got semi-deep pockets, at least one of those probably isn't good, and so the whole thing is a wash.
The game looks fun, but I personally can not get into a world that switches between fantasy and sci-fi so much. The mix just isn’t my thing. Magic and tech don’t mix well in my head.
I don’t think the game wanted to paint an “unbridgeable gap” here, as the author says. The way Mio and Zoe get more into each other’s stories is exactly the testament to the way this gap can be closed through a unique shared experience, and to the way one genre can enrich the other.
I'm playing Split Fiction with my girlfriend, and she is a fantasy fangirl while I am very sci-fi, so the characters land just perfectly. And I can't help but notice that, as Mio and Zoe get more open-minded and try to look into the root of how those two preferences formed, my girlfriend and I also get more passionate about each other's interests.
And that’s one of the most powerful things about the game. It helps to deconstruct our notions and perceptions about both genres, and become more open to each other’s vision.
I guess I'm the odd one out then. I'm a huge sci-fi fan. Ender's Game still stands as my favorite book after all these years. But I'm not too crazy about fantasy. I've bounced off of books, shows, and movies that my friends and family loved. They just seemed to be mediocre stories with fantasy paint on them, and people who like wizards were able to gloss over the holes.
It’s not unheard of for people to not be interested in the other genre. But those people are outnumbered by consumers who just want the new thing.
There's nothing wrong with that. I tend to lean more toward sci-fi myself. But the premise argues that it and fantasy are somehow different, when they're not. The game criticizes generative AI, which is valid, but it doesn't question why the two genres have to be at odds when it obviously has a blend of both.
Tell that to everything I like. Star Trek is fantasy-adjacent, Star Wars too. All of Chinese fantasy (especially the movies) is technically fantasy but has so much stuff that works like sci-fi, just using magic engines and shit. Low magic matches epic sci-fi. It's literally just “is the cool thing driven by science or by magic”; it's set dressing, it can be swapped. Harry Potter could be the same story but with a sci-fi setting. idk what I'm even saying, I'm rambling at this point.
Yeah, stuff like Venture Bros, Tom Strong, etc. with their science heroes, set next to fantasy stuff, shows how it's pretty much the same thing, just with different devices. Science fiction explanations are about as realistic as fantasy explanations for how things function; it's typically all BS and not real science either way. I love hard magic where they explain it like it's science and there's a deep logic to the world and how everything works. Like Brandon Sanderson stuff.
People might have a preference, sure, but that’s not what’s happening in Split Fiction; the game makes it seem like sci-fi writers think fantasy isn’t a form of legitimate artistic expression, and vice versa. It’s hard to imagine any fan of either genre today being that hardline about the other.
Check out Some Desperate Glory by Emily Tesh.
Ender’s Game was my favorite book for many years but I can’t recommend Card’s books any more.
Luckily, Some Desperate Glory ticks all the boxes and then some.
I have long-standing issues with Fares and the narrative shortcuts and tropes he uses.
But yeah. I watched the first hour or so on a stream and it was pure nonsense. It shows a complete lack of understanding not just of what SFF even is (there is a reason we just call it “Science Fiction and Fantasy”) but of what writing even is. One character can never shut the fuck up about how “I am going to get published. Were you published? PUBLISHED,” because apparently this dystopic future, where machine learning steals ideas from people and combines them into the best stories ever told, doesn't have ebooks.
I got on a Whitest Kids You Know kick recently and they were talking about when they jacked off one of the guys on stage in a massage parlor skit. And I think it was Trevor (RIP) who couldn’t stop laughing about how they were dumbass kids who had no idea how ANYTHING worked and that was the basis for so many of their skits.
And yeah. That is definitely the Rosetta Stone to Fares et al.'s writing. Whether they are talking about undercover cops or writing or what it means to be a child of divorce.
See, I feel like sci-fi and fantasy differ enough that they should be separate genres. I think combining them is to the detriment of both. I can't tell you the number of times I've given up looking for something new to watch because I click Sci-Fi and every listing is Lord of the Rings.
Which gets to the crux of it. Unless you are reading REALLY hard sci-fi, most of the tech boils down to “a wizard did it”.
Like, the OT of Star Wars is 100% a fantasy series and it was only the EU (and later the PT) that tried to explain the tech and make it more sci-fi. Similarly, a lot of the litrpg writers think Sanderson is a softy and go ridiculously hard on explaining their magic systems in greater detail than actual textbooks.
And, at the end of the day, they are all just different shades of speculative fiction that primarily use magic/tech/magitech as a plot device to explore the impact on society of whatever metaphor the big bad is.
Author’s website, which, after playing a little text game, takes you to a better place to buy it from than Bezos’s fetid swamp - curiousvideogamemachines.com
I am so excited for this game! I went into the original Citizen Sleeper without any prior knowledge of it and ended up finishing it in two (pretty long) sessions.
I don’t really see the point in comparing them. They’re different devices for different markets.
The Switch is for people who want to play first party Nintendo titles. That’s really the only reason for its existence. Without Nintendo’s first party lineup, the Switch would be just another Arm based handheld, and a fairly unremarkable one at that.
The Switch is all about exclusivity; the Steam Deck is the exact opposite. Not only is the Steam client, and the massive library of games it gives gamers access to, available on scores of x86 devices and hardware configurations, but the Steam Deck operating system will soon be available pre-installed on multiple third-party devices, and it will be available for anyone to download and install on any device they want.
They’re not just different devices, they’re vastly different company philosophies.
What a horseshit article. 90% of the “comparisons” are “We don't know yet” or “It's up in the air on the Switch”. The only concrete thing I saw is that it has 2 USB-C ports.
I personally don't get why they went with Nvidia again; they make cheap mobile processors and I doubt this one will be any different. Imo they should have gone with a low-end AMD APU based on Ryzen 9000.