Uh, this looks an awful lot like a guide written by an LLM (or at the very least, a copy-paste template) posted on an unofficial site intended to hoodwink people looking for the real one. Judging by search results, there are a few of these sites.
I don’t think we should repost content from them, or encourage them, personally. EDIT: Pretty sure this user’s a bot actually. Don’t click these links, folks.
Unlike Nvidia, they will not be artificially restricting production. Stock is already in back rooms and on shelves; they've been bringing cards over from manufacturers for months now. There will probably still be some scalping, but supply is expected to be enough to actually meet demand.
AMD has said the 9070 series is going to offer up to 4070 Ti performance. Leaks have also shown performance between the 7800 XT and 7900 XTX, so of course you can trust these numbers.
As for the price, AMD can say what they want, but how they've handled this launch so far doesn't sound promising. A vendor who already has the cards speculated about the price, and it was horrible.
Things have changed since then, but until AMD has released concrete numbers, all these leaks are useless.
I also don't know what you mean by the artificial restriction of production of the cards. Because NVIDIA is mainly producing AI cards for servers and workstations? AMD will be doing the exact same thing, since that's where the majority of the money is.
With the 5070 at a 550 MSRP, I wouldn't be surprised to see AMD matching that for similar performance. Given all the delay shenanigans, it'd be shocking for them to deliberately wait for the 5070 info and then launch with a more expensive part.
How much you end up having to pay to get one is anybody's guess, of course, as MSRP is increasingly meaningless. Since they've had cards with retailers for a while and have been delaying there may actually be some stock at launch, though. We'll see.
The idea that it would "smoke the 5070" and "nearly match the 5080" is probably just fanboyism or they wouldn't have ducked out from directly pitching it after the 5070 reveal (and if they had a 500 dollar 5080 competitor they wouldn't be cancelling their high end cards this gen).
In any case, it's immensely dumb to fanboy for multibillion dollar chip manufacturers. I just hope people can buy good, affordable GPUs from multiple manufacturers at some point. I own GPUs from Intel, AMD and Nvidia and would really want them all to remain competitive in as many pricing segments as possible.
Oh, they're absolutely not retaking a huge chunk of the dedicated GPU market. I think what's realistic to expect if they have a good launch (readily available stock, competitive performance and price) is that they may regain a couple points of desktop install base and at least get to sell that they're moving in the right direction instead of abandoning that space altogether. Maybe some growth on handhelds and competitive iGPUs for laptops and tablets so it makes sense for them to continue to develop the gaming GPU business aggressively at least.
I don't fanboy, I chase value. My house is a mix of AMD/Nvidia/Intel/ARM and I use the right architecture wherever I can.
If I'm guilty of anything it's hopium, but honestly all the leaks I have been following suggest that the 9070 and 9070 XT will offer the same or better performance vs 5070-class cards at prices far below street cost. Nvidia's MSRPs are a lie and the market shows that. No AIB can produce these cards for the prices Nvidia has them selling at without rebates on the back end. That's why you saw a very small number of MSRP models and no plans to restock. Everything else out there on the market is over MSRP.
We know AMD has stock in back rooms and warehouses; we have the pictures and confirmation from rank-and-file staff at places like Microcenter. We also know that AMD's Instinct MI series is not selling as well as Nvidia's, so they are not as incentivized to divert silicon to higher-margin products.
So, yeah, it might be another round of shitty Nvidia prices and stock, but we won't see the same scalping for AMD, and the value offered by AMD will embarrass Nvidia once you turn off DLSS. Why else would they lie and say the 5070 matches 4090 performance, other than to try to sell as much as they can before the 9070 forces them to drop prices?
That is a rather astonishing mix of really granular quoting of more or less accurate facts and borderline conspiracy theorist level misinformation. You rarely see this stuff outside political channels, I'm... mildly impressed.
AMD absolutely does have stock in back rooms, largely because they have been doing a somewhat undignified dance of waiting to see what Nvidia does to decide what they're pricing their current gen at. Most educated guesses out there are that they were going to price higher, were caught on the wrong foot with Nvidia's MSRP announcement and had to work out how to re-price cards that were already in the retail channel. And now Nvidia is in turn delaying the 5070 to interfere with AMD's new dates. Because both of these companies suck.
On the plus side for consumers, there's some hope that the 9070 will be repriced somewhat affordably and that it won't underperform against at least the 5070, if not the 5070Ti. We'll see what reviews have to say about it.
Your summary of why the launch was so light includes some real stuff (yeah, partners struggle to match Nvidia's aggressive pricing and have terrible margins), but that's not why there was no stock of the 5090: most reports suggest the GPUs simply weren't being manufactured early enough to provide chips to anybody. 5080s were both more readily available and less appealing, so they're easier to find, which kinda pokes big holes in that hypothesis. Manufacturing timelines also seem to explain why restocking will be slow.
I'm also very confused about why you'd "turn off DLSS". Are you allowing people to use FSR, at least? That's a weird proviso. The reason they would misrepresent the impact of MFG is obviously good old marketing. Even if AMD didn't exist, the 40 series does, and they have a big issue with justifying a lot of the 50 series line against it. With the 5080 falling well behind the 4090, they have a clear incentive for suggesting you can match the 4090 in cheaper cards. This doesn't tell you anything about the performance of the 9070 one way or the other. It does tell you a lot about the performance of the 5080, though.
See, this is why this sort of propagandistic speech works so well: it takes forever to even cover all the misrepresentations, and all this is going to do is get you to double down on some of these unsubstantiated statements and turn it into a "matter of opinion". It doesn't even need to be on purpose; it's just easier to produce than to counter.
Aaaand now I made myself sad.
In any case, here's hoping the 9070 is a competitive option and readily available. They've apparently scheduled that delayed event for the 28th, so I'll be curious to see what they bring to the table officially.
I don't think this is misinformation, just a difference of opinion and interpretation of what is known. I also openly admit that I have a tint on this release, because I am really hoping that AMD does it right with reasonable pricing and performance.
As for DLSS/FSR, I prefer not to use either, because where I actually want faster frames and would benefit, the added latency is not worth it, and where I don't care about the extra frames, I prefer higher-quality details. I find both frame-gen techs to be a poor service to the end user, and that's why I dislike Nvidia's marketing of the 5070 as equivalent to a 4090 when turning on DLSS. I also dislike their texture compression as an excuse to keep VRAM artificially low to prevent people from using consumer GPUs for running LLMs.
Ah, so you meant DLSS to mean specifically "DLSS Frame Generation". I agree that the fact that both upscaling and frame gen share the same brand name is confusing, but when I hear DLSS I typically think upscaling (which would actually improve your latency, all else being equal).
Frame gen is only useful in specific use cases, and I agree that when measuring performance you shouldn't do so with it on by default, particularly for anything below 100-ish fps. It certainly doesn't make a 5070 run like a 5090, no matter how many intermediate frames you generate.
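For what it's worth, the latency side of this has simple arithmetic behind it. Here's a minimal sketch, assuming plain two-frame interpolation and illustrative numbers, not any vendor's actual pipeline:

```python
# Back-of-the-envelope sketch: why interpolation-based frame generation adds
# input latency instead of reducing it. Numbers are illustrative assumptions,
# not measurements of DLSS or FSR frame gen specifically.

def frame_time_ms(fps):
    """Time between frames at a given framerate, in milliseconds."""
    return 1000.0 / fps

base_fps = 60                      # what the GPU actually renders
base_frame_ms = frame_time_ms(base_fps)

# Interpolation needs the *next* real frame before it can generate the
# in-between frame, so the newest real frame is held back roughly one
# real-frame interval before it reaches the screen.
displayed_fps = base_fps * 2
added_latency_ms = base_frame_ms

print(f"rendered: {base_fps} fps ({base_frame_ms:.1f} ms per real frame)")
print(f"displayed: {displayed_fps} fps, input latency up by ~{added_latency_ms:.1f} ms")
```

The displayed framerate doubles, but input is still sampled at the real render rate and the pipeline now holds frames back, which is why frame gen feels best when the base framerate is already high.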
But again, you keep going off on these conspiracy tangents on things that don't need a conspiracy to suck. Nvidia isn't keeping vram artificially low as a ploy to keep people from running LLMs, they're keeping vram low for cost cutting. You can run chatbots just fine on 16, let alone on 24 or 32 gigs for the halo tier cards, and there are (rather slow) ways around hard vram limits for larger models these days.
You don't need some weird conspiracy to keep local AI away from the masses. They just... want money and have people that will pay them more for all that fast ram elsewhere while the gaming bros will still shell out cash for the gaming GPUs with the lower RAM. Reality isn't any better than your take on it, it's just... more straightforward and boring.
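The VRAM point is easy to sanity-check with napkin math. A rough sketch of weight memory only (real usage adds KV cache, activations, and runtime overhead on top; the model sizes and quantization levels here are illustrative assumptions):

```python
# Rough lower bound on VRAM for LLM inference: parameter count * bytes per
# parameter. Ignores KV cache and activations, which add more on top.

def weight_vram_gib(params_billion, bits_per_param):
    """GiB needed just to hold the weights at a given quantization."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 1024**3

for params, bits in [(7, 16), (7, 4), (13, 4), (70, 4)]:
    print(f"{params}B at {bits}-bit: ~{weight_vram_gib(params, bits):.1f} GiB of weights")
```

A 7B model quantized to 4-bit is around 3.3 GiB of weights, which fits comfortably in 16 GB; it's the 70B-class models (30+ GiB even quantized) that blow past consumer cards.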
I dream that the reason AMD delayed their launch and are being so cryptic, is because they saw how underwhelming the 5080 was and decided to make a card (perhaps a 9070 XT) that matches its performance at the price of a 5070 or something.
Now I don’t think that will happen. Their previous market strategies have been very uninspired. But there’s certainly an opening here to make a play for market share and make Nvidia look like greedy fools.
AMD cards are not scalped like Nvidia's. During the pandemic, waiting outside of Microcenter, they always had AMD cards available. No one wants them!
I would never expect anything day 1, ever. Scalpers ruined day 1. Though I would be surprised if they were out of stock a week later. Unlike Nvidia, they actually want to sell cards.
Unfortunately it's pure capitalism. No matter what, the company got paid. They don't care whether users got the card they wanted, or even whether the scalpers make a profit or go bust; they already got paid.
I haven’t played the game at all since Seekers of the Storm came out but we would play it modded quite a bit before. The Samus character mod is so much fun.
Depends on the game. If it's not really demanding on reaction time and the framerate is locked, I'm fine with 30, like Okami. However, if the FPS isn't locked and I still can't hit at least 60 FPS on my 1440p monitor, I'd probably just play something else (because I know I could have a better experience if I could run it properly).
However, for shooters and reaction-heavy games I always aim to max out my 144 Hz monitor; even 60 FPS can feel sluggish to me.
Check out “Aquaria”. Not quite the same thing, but a Metroidvania playing as a mermaid with song powers. Lots of boss fights! And you can even breach the surface when you get there!
I undervolted my 5800X3D and 9800X3D and that helped with temps a lot.
I never undervolted my GPU; I generally go the other way with it. My 4080 lost the silicon lottery, couldn't get any more out of it. Not sure if I won it with my 5080, because a lot of people seem to be having large gains, but I got my boost clock to a little over 3 GHz and a +1 GHz to the memory.
Well, I first played Dragon Age Origins with the framerate fluctuating between 10 and 20 FPS. Wasn't the most fun I've ever had, but ever since, 30 - 60 has felt like luxury. So yeah, anywhere from 10 to 30 is fine for me, but the more active a game is, the closer I want a 30 FPS minimum with a target of 60.
I didn't have any luck undervolting my GPU; it would just crash with even the smallest voltage offset. That said, I have had success undervolting my CPU. I'd also suggest limiting the total power draw: no noticeable drop in performance for lower temps and reduced fan noise.
I wouldn't recommend it if you don't have fairly clean power. I've definitely run into issues where a voltage drop in the mains would just shut off my system.
I undervolted my 5800X3D (each core individually) and it cut the temps by quite a bit, without affecting performance. Actually, if anything, you could say the performance arguably increased, because it was no longer the hot little hog it was OOTB.
There definitely has been some scalping, but also, just, not a huge amount of inventory available (like sub 100 units available across cities with populations in the millions). A bit of a paper launch TBH.
TSMC only has so much throughput available and NVIDIA has other products they’re selling that they can make better margins on than consumer GPUs. I’m a little surprised they launched at all given how few they’re shipping.
I wonder how much of launching now was to generate buzz, to get studios to adopt methods of rendering that work best with their software and make it harder for competitors to compete on hardware.