Well, they will. Two things drive the trend, in my view:
1. Lack of informed opinions. If you don’t know that other options exist, you’ll buy whatever because you think it is the baseline.
2. Convenience. This one is a killer. People regularly give up a lot – even rights – in the name of convenience.
Between those two factors, it’s a hard sell for the average consumer to not support this kind of corpo garbage. A nihilistic view, maybe, but I think it’s an accurate one.
In a similar vein, it’s pretty easy to show someone that consoles have these needlessly expensive proprietary links, plus games which are very expensive for the same reason. But it is very hard to convince someone that the cool thing they saw on TV isn’t, in fact, “cool” because of the aforementioned reasons. And ultimately, people like having cool things, even if that coolness is subjective.
Historically, it’s been a push-pull between groups, but everyone has had a different vision of the future. Now that things are being consolidated wholesale – e.g. physical media going out the window because so many are happy to stream and never own anything – it is more necessary than ever to call out #1 and #2, since the market itself is changing for the worse.
This back and forth from the comments on the article is interesting:
What the article omits: the YouTuber in question has a long history of threatening smaller channels with various actions against them, from brigading to lawyers to copyright strikes, if they do something he doesn’t like and don’t bow to his will. So I’m not surprised someone was eventually fed up with him.
Two wrongs don’t make a right, as my nan used to say. This YouTuber being a bit of a grunt does not negate the fact that YouTube itself is happy taking a hands-off approach to a fundamental part of their business model, because the ones it affects are not the ones that give them the most money.
Of course it’s a problem, I just feel 0 sympathy in this case and I find it ironic that it’s him especially that got hit with the same treatment he threatens others with.
I don't think they can do much at all, actually. They're not allowed much wiggle room when it comes to being DMCA-compliant. They pretty much have to take every takedown request at face value, because DMCA requests are a legal process, and I imagine that any intervention on YouTube's side could be seen as arbitration. I doubt they could do much to interfere with an impersonator, since even a falsely-submitted DMCA complaint is still a legal request that has to be processed accordingly.
The DMCA needs to be gutted.
Nintendo can do something, though. They're the ones being impersonated, so they can actually take the guy to court.
Platforms actually do get more leeway than is usually thought with DMCA takedown requests. If they believe it to be fraudulent, they have every right to disregard it. That’s a fact they conveniently try to downplay because they want people to think they have no responsibility for their actions.
Yeah, that’s bullshit. You can tell by the fact that they don’t take down videos from big corporations when some nobody trolls with a fraudulent DMCA request. They only do it when it’s the other way around.
Weird, I have a regular old 2 TB (or maybe it was 1?) Western Digital plugged into the USB port on the back of my Series X, and it works fine; not sure I understand the need to spend a bunch on something like this. Edit: and before responding about speed… I haven’t noticed much, if any, difference in game performance between installing on the internal drive or the external, outside of the initial game loading (startup) time, so I’m not sure if that’s the only benefit of using the expansion slot.
Hmm, I’ll have to check this later, as I don’t remember ever running into that problem, since my Xbox’s internal drive has been full for a while. But I also wonder whether that applies to physical copies, since all my Series X games are physical. Unless Xbox does this automatically in the background without user intervention, in which case I may not have noticed.
The troll was able to spoof the Nintendo.co.jp domain because Nintendo didn’t set up their DMARC settings correctly. They have it set up, but with the policy “none” instead of “reject”. What a bunch of dumbasses. Such a big company doesn’t even protect its domains against spoofing. Probably why they got hacked – they don’t invest enough in IT security. Though this is typical for Japanese corporations.
The performance-improvement claims are a bit shady, as they compare the old frame-generation technique, which only creates one frame for every legit frame, with the next-gen FG, which can generate up to 3.
All the Nvidia performance plots I’ve seen mention this only at the bottom, making the comparison supposedly very favorable to the 5000-series GPUs.
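The mismatch being described comes down to simple arithmetic: if both generations render the same number of “real” frames, multi-frame generation inflates the headline FPS without any extra rendering work. A rough sketch with made-up numbers:

```python
def effective_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    """Headline FPS when each rendered frame is followed by N generated frames."""
    return rendered_fps * (1 + generated_per_rendered)

rendered = 40  # hypothetical: both cards render 40 real frames per second
old_fg = effective_fps(rendered, 1)  # old FG: 1 generated frame per real frame
new_fg = effective_fps(rendered, 3)  # next-gen FG: up to 3 generated frames
print(old_fg, new_fg)  # 80 vs 160 -- a "2x" headline gain from identical rendering
```

So a bar chart comparing the two can double the apparent performance even if the underlying raster/RT throughput barely moved.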
On the site with the performance graphs, Farcry and Plague Tale should be more representative, if you want to ignore FG. That’s still only two games, with first-party benchmarks, so wait for third-party anyway.
Eh, I’m pretty happy with the upscaling. I did several tests, and upscaling won out for me personally as a happy middle ground: rendering Hunt: Showdown at 4K with upscaling, versus running at 2K with great FPS and no upscaling, or at 4K with no upscaling but bad FPS.
Legitimate upscaling is fine, but this DLSS/FSR ML upscaling is dogshit and introduces so many artifacts. It has become a crutch for developers, so they don’t build their games properly anymore. Hardware is so strong, and yet games perform worse than they did 10 years ago.
I mean, this is FSR upscaling that I’m referring to. I did several comparisons and determined that upscaling from 2K to 4K using FSR looked significantly better than just running at 2K.
Hunt has other ghosting issues but they’re related to CryEngine’s fake ray tracing technology (unrelated to the Nvidia/AMD ray tracing) and they happen without any upscaling applied.
From personal experience, I’d say the end result for framegen is hit or miss. In some cases you get a much smoother framerate without any noticeable downsides, and in others your frame times are all over the place and the game looks choppy. For example, I couldn’t play CP2077 with framegen at all: I had more frames, but it actually felt like I had fewer. With Ark: Survival Ascended, I’m not seeing any downside, and it basically doubled my framerate.
Upscaling, I’m generally sold on. If you try to upscale from 1080p to 4K, it’s usually pretty obvious, but you can render at 80% of the resolution and upscale the last 20% and get a pretty big framerate bump while getting better visuals than rendering at 100% with reduced settings.
That said, I would rather have better actual performance than just perceived performance.
I wouldn’t say fuck upscaling entirely, especially for 4k it can be useful on older cards. FSR made it possible to play Hitman on my 1070. But yeah, if I’m going for 4k I probably want very good graphics too, eg. in RDR2, and I don’t want any upscaling there. I’m so used to native 4k that I immediately spot if it’s anything else - even in Minecraft.
And frame generation is only useful in non-competitive games where you already have over 60 FPS; otherwise it will still feel extremely sluggish – in which case it’s not really useful anymore.
The point is, hardware is powerful enough for native 4K, but instead of that power being used properly, games are made quickly and upscaling technology is slapped on at the end. DLSS has become a crutch, and Nvidia are happy to keep pushing it as a reason for you to buy their GPUs every generation, because otherwise we’re already at diminishing returns.
It's useful for use on older hardware, yes, I have no issue with that, I have issue with it being used on hardware that could otherwise easily run 4K 120FPS+ with standard rasterization and being marketed as a 'must'.
You still gotta use their shitty NVIDIA experience app (bloated with ads that make NVIDIA money when you open it), and you buying a used NVIDIA card increases demand (and thus prices) on all NVIDIA cards.
If you are a gamer and not doing AI stuff then buying a non-NVIDIA card is entirely an option.
Ok…they do. They get increased market share, which is measurable and valuable to shareholders, increasing stock value and increasing company liquidity.
Not every game has frame gen… not everybody wanna introduce lag to input. So 50% is 100% sketchy marketing. You can keep your 7 frames, Imma wait for 6090
Their whole gaming business model now is encouraging devs to stick in features that have no hope of rendering quickly, in order to sell this new frame-generation rubbish.
He’s a 50-50 actor. When he’s good, like in Kick-Ass and Bullet Train, he’s real good. But when he’s bad, he’s absolutely horrible, like in Godzilla and every superhero movie he’s been in.
Edit: I want to add that I didn’t even realize it was him in Bullet Train until halfway through. He’s pretty entertaining in it.
The movie itself suffers from the “I’m sooo smart and clever” syndrome that some movies have. Johnson and whoever played the girl are the good parts of that movie.
Maybe I’m stuck in the last decade, but these prices seem insane. I know we’ve yet to see what a 5050 (lol) or 5060 would be capable of or its price point. However launching at $549 as your lowest card feels like a significant amount of the consumer base won’t be able to buy any of these.
Sadly I think this is the new normal. You could buy a decent GPU, or you could buy an entire game console. Unless you have some other reason to need a strong PC, it just doesn’t seem worth the investment.
At least Intel are trying to keep their prices low. Until they either catch on, in which case they’ll raise prices to match, or they fade out and leave everyone with unsupported hardware.
Actually, AMD has said they’re ditching their high-end options and will also focus on budget and midrange cards. AMD has also promised better raytracing performance (compared to their older cards), so I don’t think it will be the new norm if AMD also prices their cards competitively with Intel’s. The high-end cards will be overpriced, as it seems the target audience doesn’t care that they’re paying a shitton of money. But budget and midrange options might slip away from Nvidia and get cheaper, especially if the upscaler crutch breaks and devs have to start doing actual optimization for their games.
Actually AMD has said they’re ditching their high end options
Which means there’s no more competition in the high-end range. AMD was lagging behind Nvidia in terms of pure performance, but the price/performance ratio was better. Now they’ve given up a segment of the market, and consumers lose out in the process.
the high end crowd showed there’s no price competition, there’s only performance competition and they’re willing to pay whatever to get the latest and greatest. Nvidia isn’t putting a 2k pricetag on the top of the line card because it’s worth that much, they’re putting that pricetag because they know the high end crowd will buy it anyway. The high end crowd has caused this situation.
You call that a loss for consumers; I’d say it’s a positive. The high-end cards make up maybe 15% (and I’m probably being generous here) of the market. AMD dropping the high end and focusing on mid-range and budget cards is much more beneficial for most users. Budget and mid-range cards make up the majority of what PC users own. If the mid-range and budget cards are affordable, that’s much more worthwhile to most people than having high-end cards be “affordable”.
But they’ve been selling mid-range and budget GPUs all this time. They’re not adding to the existing competition there, because they already have a share of that market. What they’re doing is pulling out of a segment where there was (a bit of) competition, leaving a monopoly behind. If they do that, we can only hope that Intel puts out high-end GPUs to compete in that market, otherwise it’s Nvidia or nothing.
Nvidia already had the biggest share of the high-end market, but now they’re the only player.
It’s already Nvidia or nothing. There’s no point fighting with Nvidia in the high end corner because unless you can beat Nvidia in performance there’s no winning with the high end cards. People who buy high end cards don’t care about a slightly worse and slightly cheaper card because they’ve already chosen to pay premium price for premium product. They want the best performance, not the best bang for the buck. The people who want the most bang for the buck at the high end are a minority of a minority.
But on the other hand, by dropping high end cards AMD can focus more on making their budget and mid-range cards better instead of diverting some of their focus on the high end cards that won’t sell anyway. It increases competition in the budget and mid-range section and mid-range absolutely needs stronger competition from AMD because Nvidia is slowly killing mid-range cards as well.
The Steam hardware survey puts the 4090 at 1.16% and the 7900 XTX at 0.54%. That means if we look at only the 4090s and 7900 XTXs, then just between the two of them, the 7900 XTX makes up about a third of the cards. So yeah, you are a minority of a minority.
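For what it’s worth, the “about a third” figure follows directly from those two survey numbers (percentages as quoted above):

```python
# Steam hardware survey shares quoted above (percent of all surveyed cards)
rtx_4090 = 1.16
rx_7900xtx = 0.54

# Share of the 7900 XTX among just these two top-end cards
share = rx_7900xtx / (rtx_4090 + rx_7900xtx)
print(round(share, 3))  # 0.318, i.e. roughly a third
```

Of course, this only compares the two flagships against each other; both remain a tiny slice of the overall market.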
As for this numbers jargon: I’m not exactly sure what you’re trying to prove here, but I’m sure you’re comparing an overclocked card to a stock card, and if you’re saying it matches the 4090D, then you’re not actually matching the 4090. The 4090D is weaker than the 4090; depending on the benchmark, it ranges from 5% to 30% weaker. If you were trying to prove that AMD cards can be as good as Nvidia cards, you’ve proven that even with overclocking, the top-of-the-line AMD card can’t beat a stock top-of-the-line Nvidia card.
They’ll sell out anyways due to lack of good competition. Intel is getting there but still has driver issues, and AMD hasn’t announced their GPU prices yet, but their entire strategy is following Nvidia and lowering the price by 10% or something.
Weird completely unrelated question. Do you have any idea why you write “Anyway” as “Anyways”?
It’s not just you, it’s a lot of people, but unlike most grammar/word modifications, it doesn’t really make sense to me. Most of the time the modification shortens the word in some way rather than lengthening it. I could be wrong, but I don’t remember people writing or saying “anyway” with an added “s” in any way but ironically 10–15 years ago, and I’m curious where it may be coming from.
Although considered informal, anyways is not wrong. In fact, there is much precedent in English for the adverbial -s suffix, which was common in Old and Middle English and survives today in words such as towards, once, always, and unawares. But while these words survive from a period of English in which the adverbial -s was common, anyways is a modern construction (though it is now several centuries old).
AMD has been taking over market share slowly but surely. And the console gaming market… and the portable gaming market… and their chips outperform Intel chips over and over. But ya sure.
I don't dispute that AMD is eating Intel's lunch, but performance-wise, AMD has nothing for nVidia. And that's what the discussion is about, performance.
So much of nvidia’s revenue is now datacenters, I wonder if they even care about consumer sales. Like their consumer level cards are more of an advertising afterthought than actual products.
Bought my first GPU, an R9 Fury X, for MSRP when it launched. The R9 300 series and GTX 900 series seemed fairly priced then (aside from the Titan X). Bought another for Crossfire and mining, holding on until I upgraded to a 7800 XT.
Comparing prices, all but the 5090 are within $150 of each other when accounting for inflation. The 5090 is stupid expensive. A $150 increase in price over a 10-year period probably isn’t that bad.
I’m still gonna complain about it and embrace my inner “old man yells at prices” though.