Unfortunately, that’s the anti-scalper countermeasure. Crippling their crypto-mining potential didn’t impact scalping very much, so they raised prices with the RTX 40 series. The RTX 40s were much easier to find than the RTX 30s were, so here we are with the RTX 50s. They’re already at the edge of what people will pay, which makes them less attractive to scalpers. We’ll probably see an initial wave of scalped 5090s for $3,500-$4,000, then it will drop off after a few months and the market will mostly have un-scalped ones with fancy coolers for $2,200-$2,500 from Zotac, MSI, Gigabyte, etc.
The existence of scalpers means demand exceeds supply. Pricing them this high is a countermeasure against scalpers… in that Nvidia wants to make the money the scalpers would have made.
No, it’s a direct result of observing the market during those periods and seeing lemmings beating down doors to pay $600-$1,000 over MSRP. They realized the market is stupid and will bear the extra cost.
Nvidia is just doing what every monopoly does, and AMD is just playing into it like they did against Intel on CPUs. They’ll keep competing on price-to-performance for a few years, then release something that puts them back on top (or at least near it).
I just don’t get why they didn’t think of quickly porting the Xbox interface over to desktop Windows. Should have been an easy fix to make handheld gaming on Windows more appealing.
It’s a lot more than that. SteamOS isn’t just Steam Big Picture Mode. There’s some special sauce in there to keep the game window captured so you never lose focus of it. If you just set a Windows machine to boot into Steam Big Picture on startup, you’ll find plenty of times you have to break out a keyboard and Alt+Tab for reasons SteamOS never encounters. And given everything else Windows has introduced over the past decade, that’s the least of its problems now.
Nvidia claims the 5070 will give 4090 performance. That’s a huge generational uplift if it’s true. Of course, we’ll have to wait for independent benchmarks to confirm it.
The best ray tracing I’ve seen is in older games retrofitted with it, like Quake II or Minecraft.
They’ve already said it’s all down to DLSS 4. The 5070 needs the new 4x FG to match the 4090, though I don’t know whether the 4090 figure has the “old” 2x FG enabled; probably not.
The performance-improvement claims are a bit shady, as they compare the old FG technique, which creates only one generated frame for every legit frame, with the next-gen FG, which can generate up to three.
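To put numbers on it, here’s a minimal sketch of how the multipliers work; the framerates are made up for illustration, not Nvidia’s figures:

```python
# Hypothetical illustration only; the fps numbers are made up.
# "2x FG" shows 1 generated frame per rendered frame;
# "4x FG" shows 3 generated frames per rendered frame.

def displayed_fps(rendered_fps: float, fg_multiplier: int) -> float:
    """Displayed framerate for a given native framerate and FG multiplier."""
    return rendered_fps * fg_multiplier

print(displayed_fps(30, 4))  # e.g. a card rendering 30 fps -> 120 fps displayed
print(displayed_fps(60, 2))  # a card rendering 60 fps with old 2x FG -> 120 fps
```

Both bars read “120 FPS” on a chart, even though one card is rendering twice as many real frames as the other.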
All the Nvidia performance plots I’ve seen mention this in the fine print at the bottom; it’s supposedly what makes the comparison look so favorable to the 5000-series GPUs.
On the site with the performance graphs, Far Cry and Plague Tale should be more representative if you want to ignore FG. That’s still only two games, with first-party benchmarks, so wait for third-party numbers anyway.
Eh, I’m pretty happy with the upscaling. I did several tests, and upscaling won out for me personally as a happy middle ground: rendering Hunt: Showdown at 4K with upscaling, versus running at 2K with great FPS and no upscaling, or at 4K with no upscaling but bad FPS.
Legitimate upscaling is fine, but this DLSS/FSR ML upscaling is dogshit and introduces so many artifacts. It has become a crutch for developers, so they don’t build their games properly anymore. Hardware is so strong, and yet games perform worse than they did 10 years ago.
I mean, it’s FSR upscaling I’m referring to. I did several comparisons and found that upscaling from 2K to 4K using FSR looked significantly better than just running at 2K.
Hunt has other ghosting issues, but they’re related to CryEngine’s fake ray-tracing technology (unrelated to Nvidia/AMD hardware ray tracing), and they happen without any upscaling applied.
From personal experience, I’d say the end result with framegen is hit or miss. In some cases you get a much smoother framerate without any noticeable downsides, and in others your frame times are all over the place and the game looks choppy. For example, I couldn’t play CP2077 with framegen at all. I had more frames, but it actually felt like I had fewer. With Ark: Survival Ascended, I’m not seeing any downside, and it basically doubled my framerate.
Upscaling, I’m generally sold on. If you try to upscale from 1080p to 4K, it’s usually pretty obvious, but you can render at 80% of the resolution and upscale the last 20% and get a pretty big framerate bump while getting better visuals than rendering at 100% with reduced settings.
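The arithmetic backs this up: render scale is applied per axis, so the shaded pixel count falls off quadratically. A quick sketch (assuming the frame cost is dominated by per-pixel work, which is a simplification):

```python
# Render scale applies per axis, so an "80%" scale shades
# only 0.8 * 0.8 = 64% of the native pixels.
# Assumes a purely pixel-bound workload (a simplification).

def relative_pixel_cost(render_scale: float) -> float:
    return render_scale ** 2

for scale in (1.0, 0.8, 0.5):
    cost = relative_pixel_cost(scale)
    print(f"{scale:.0%} scale -> {cost:.0%} of native pixels (~{1/cost:.1f}x headroom)")
```

So 80% scale only shades about two-thirds of the pixels, which is where the framerate bump comes from while the upscaler fills in the rest.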
That said, I would rather have better actual performance than just perceived performance.
I wouldn’t say fuck upscaling entirely; especially at 4K it can be useful on older cards. FSR made it possible to play Hitman on my 1070. But yeah, if I’m going for 4K I probably want very good graphics too, e.g. in RDR2, and I don’t want any upscaling there. I’m so used to native 4K that I immediately spot anything else, even in Minecraft.
And frame generation is only useful in non-competitive games where you already have over 60 FPS; otherwise it will still feel extremely sluggish, in which case it’s not really useful anymore.
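The sluggishness follows from the frame-time math: interpolation-based FG has to hold back a rendered frame before it can display the generated ones in between, so input latency stays tied to the native frame time. A rough sketch with a simplified model (it ignores mitigations like Reflex):

```python
# Simplified model: interpolation buffers one native frame, so FG adds
# at least one native frame time of latency on top of the baseline.

def native_frame_time_ms(base_fps: float) -> float:
    return 1000.0 / base_fps

for base_fps in (30, 60, 120):
    ft = native_frame_time_ms(base_fps)
    print(f"{base_fps} fps base: {ft:.1f} ms/frame, FG adds >= {ft:.1f} ms latency")
```

At a 30 FPS base that’s 33 ms of added delay, which you feel immediately; at 120 FPS it’s only 8 ms, which is why FG only feels good when the native framerate is already high.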
The point is, hardware is powerful enough for native 4K, but instead of that power being used properly, games are made quickly and upscaling technology is slapped on at the end. DLSS has become a crutch, and Nvidia is happy to keep pushing it to give you a reason to buy their GPUs every generation, because otherwise we’re already at diminishing returns.
It’s useful on older hardware, yes, and I have no issue with that. My issue is with it being used on hardware that could otherwise easily run 4K at 120+ FPS with standard rasterization, and with it being marketed as a ‘must’.
Those genres aren’t really known for having brutal performance requirements. You have to play the bleeding-edge stuff that adds prototype graphics post-processing in its ultra or optional settings.
When you compare non-RT performance, the frame delta is tiny. When you compare RT, it’s a lot bigger. I think most RT implementations today are very flawed, and it’s largely snake oil so far, but some people are obsessed.
I will say you can probably undervolt / underclock / power throttle that 4090 and get great frames per watt.
I think both of these could be great movies, but they could also be done extremely poorly (especially Helldivers). I hope they get people who like and understand the IPs for these.
He’s a 50/50 actor. When he’s good, like in Kick-Ass and Bullet Train, he’s really good. But when he’s bad, he’s absolutely horrible, like in Godzilla and every superhero movie he’s been in.
Edit: I want to add that I didn’t even realize it was him in Bullet Train until halfway through. He’s pretty entertaining in it.
The movie itself suffers from the “I’m sooo smart and clever” syndrome that some movies have. Johnson and whoever played the girl are the good parts of that movie.
People think Xbox is shit at the moment, so making things more like Xbox isn’t received as positively as it used to be. When I was a Windows user, any time I saw something Xbox-related come up, I knew I was about to deal with some clunky bullshit that was unwanted and poorly implemented.
So, some time ago I saw a video about how Microsoft had made their OS so bad that nobody wants to use it anymore, by always going all-in on the newest big thing (VR and Windows 10 Mixed Reality, tablets and Windows 8), and I think the same is going to happen here: Microsoft will try to make their whole OS just like SteamOS and fail hard with both normal consumers and the handheld market.