I got tired of the whole GPU, PC-building thing. It’s something everybody should do once in their lives, if only to learn how computers are put together. However, at a certain point I just want to come home, sit down, and play games without having to fiddle around with drivers, so I bought a console.
Right? I find solace in the fact that I can update individual parts of my PC over the years to play whatever new game catches my fancy. Buying a whole new console every generation seems wasteful.
I’m definitely not on board with pixel chasers upgrading graphics cards every year, though. That feels even more wasteful.
My RX 480 from 2016 is still kicking, if crashing a bit. At this rate, when it breaks I’ll just use my Steam Deck docked instead of selling my liver to buy a new GPU.
It’s been less than a year since I upgraded from my RX 580, and I only did it because I got an amazingly stupid deal on an RX 7600 ($175, about 3 months after launch). Otherwise, the card is fine and is, in fact, still being used in my cousin’s PC.
It’s been that way for years. There was a brief respite when people were switching to ASIC Bitcoin mining and away from GPU-intensive mining, and you could actually get a GPU for a fair retail price, without scalper price gouging.
Now it’s right back to basically unaffordable for a name brand GPU. Literally more than a mortgage payment.
Others were saying $2k… I am thankfully under $2,500. Though I forgot insurance is wrapped into my monthly payment, so maybe I’m even under $2k. I know rent around here is between $1k and $2k a month for a two-bedroom apt, though.
You know, I was thinking about upgrading my graphics card this year along with the rest of my PC, but I think I can squeeze a couple more years out of my 3060 Ti at this rate.
I just kept an eye on Micro Center’s refurbished cards for a few weeks and was able to snag a 3090 Ti late last year with a 3-yr warranty for the same price I paid for a 980 Ti in 2015.
I think that might be my plan too, but I’m still waiting a paycheck or two before I even start monitoring the situation. My 2070 is fine, and ultimately I just want to pass it down to a spare PC for the kids to mess around on as my oldest hits 3. I know by the time I hit 5 I was playing shit like Dune 2, admittedly with hacked save files my dad set up.
Crypto, followed by NFTs, followed by LLMs… The GPU market has been fucked for years now. Whenever something starts to drop off, another tech bro idea that requires 10,000 GPUs to process takes its place.
Truly just the brute-force solution. Need a shitload of compute? GPUs can do it! No one stops to think if we really need it. It’s all about coulda, not shoulda. Yeah, ML and AI have a place, but big tech just thinks “slap an LLM everywhere”. Just such horseshit.
The tech industry chases acquisition and investor money only now, not consumer demand or innovative achievement. It’s all just people trying to get rich, mostly so they can escape the effects of late stage capitalism.
The Crypto to AI transition was brutal. Just as demand for GPUs was coming down because people were starting to use ASICs to mine Bitcoin, along comes AI to drive up non-gaming demand again.
The only good news is that eventually, when the AI bubble pops, there will be massive R&D and manufacturing capacity geared towards producing GPUs. Unless something else comes along… But really, I can’t see that happening, because the AI bubble is so immense and is such an enormous part of the entire world’s economy.
The 9070s on eBay are getting cheaper and cheaper the further we get from launch. I think scalpers underestimated AMD’s stock and they are slowly discovering that.
Immediately after the launch the XT seemed to start at $1,200. Now they’re down to $800. The non-XT is down to $650.
Depends on how much stock AMD can provide in the coming weeks and months, but I’m still thinking I’ll be able to get one at MSRP this year.
Funny thing about AMD is that the MI300X is supposedly not selling well, largely because they price-gouge everything as badly as Nvidia, even where they aren’t competitive. Other than the Framework desktop, they are desperate to stay as uncompetitive in the GPU space as they possibly can, and not because the hardware is bad.
Wasn’t the Intel B580 a good launch, though? It seems to have gotten rave reviews, and it’s in stock, yet has exited the hype cycle.
I looked for months for a B580 for my wife’s PC. Couldn’t find one in stock at MSRP during that time. Finally caved and grabbed a used 6900 XT for $400. The Intel cards are awesome, if you can get one. I do hope Intel keeps up the mid-range offerings at sane prices.
Well the B580 is a budget / low-power GPU. All the discussions going around are for flagship and high-end GPUs. Intel isn’t in that space yet, but we can hope they have a B7xx lined up which makes some waves.
I’m having a good time on a laptop with no fancy graphics card and have no desire to buy one.
I also do not look for super high graphical fidelity, play mostly indies instead of AAA, and am like 5 years behind the industry, mostly buying old gems on sale, so my tastes probably enable this strategy as much as anything else.
I’ll be honest, I have never paid attention to GPUs and I don’t understand what your comment is trying to say or (this feels selfish to say) how it applies to me and my comment. Is this intended to mostly be a reply to me, or something to help others reading the thread?
Depending on what happens with GPUs for datacenters, external GPUs might be so rare that nobody does it anymore.
My impression right now is that for Nvidia, gamer cards are an afterthought. Millions of gamers can’t compete with every company in Silicon Valley building entire datacenters stacked with as many “GPUs” as they can find.
AMD isn’t the main choice for datacenter CPUs or GPUs. Maybe for them, gamers will be a focus, and there are some real advantages with APUs. For example, you’re not stuck with one particular amount of GPU RAM and a different amount of CPU RAM. Because you’re not multitasking as much when gaming, you need less CPU RAM, so you can dedicate more RAM to games and less to other apps. So, you can have the best of both worlds: tons of system RAM when you’re browsing websites and have a thousand tabs open, then start a game and you have gobs of RAM dedicated to the game.
It’s probably also more efficient to have one enormous cooler for a combined GPU and CPU vs. a GPU with one set of heatsinks and fans and a separate CPU heatsink and fan.
External GPUs are also a pain in the ass to manage. They’re getting bigger and heavier, and they take up more and more space in your case. Not to mention the problems their power draw is causing.
If I could get equivalent system performance from an APU vs. a separate CPU and GPU, I’d probably go for it, even with the upgradeability concerns. OTOH, soldered-in RAM is not appealing, because I’ve upgraded my RAM more often than any other component on my PCs, and having to buy a whole new motherboard just to get a RAM upgrade is a hard sell.
Thank you for explaining! I am not sure why people are reacting badly to my statement, is knowledge of GPUs something every gamer is expected to have and I am violating the social contract by being clueless?
Well, at one point, to be a computer gamer you basically needed to put together your own desktop PC.
Integrated GPUs basically were only capable of displaying a desktop, not doing anything a game would need, and desktop CPUs didn’t integrate graphics at all, generally.
So computer-building knowledge was a given. If you were a PC gamer, you had a custom computer for the purpose.
As a result, even as integrated GPUs became better and more capable, the general crowd of gamers didn’t trust them, because it was common knowledge they sucked.
It’s a lot like how older people go “They didn’t teach you CURSIVE?” about schools nowadays. Being a gamer and being a PC builder are fully separable now, but they learned PC building back when they weren’t, and therefore think you should have that knowledge, too.
It’s fine, don’t sweat it. You’re not missing out on anything, really, anyway. Especially given the current GPU situation, it’s never been a worse time to be a PC builder or enthusiast.
Oh boy. Thanks for the context, by the way! I did not know that about the history of PC gaming.
I did learn cursive, but I have been playing games on laptops since I was little too and was never told I had to learn PC building. And to be completely honest, although knowledge is good, I am very uninterested in doing that especially since I have an object that serves my needs.
I have enough perspective to realize that I have been on the “other side” of this, the WHAT DO YOU MEAN YOU’RE SATISFIED, LEARN MORE AND CHANGE TO BE LIKE US side, although I’m exaggerating because I don’t actually push others to take on my decisions. I don’t spam the uninterested to come to Linux, but I do want people whose needs are adequately served by Windows to jump to Linux anyways, because I want to see Windows 11, with even more forced telemetry, shoved-in AI, and things just made worse, fail. Even though that would actually be more work for satisfied Windows users.
But I would not downvote a happy Windows user for not wanting to switch; that kind of behavior is frowned upon. So is it just more acceptable to be outwardly disapproving of those who don’t know about GPUs and are satisfied with what they have, with zero desire to upgrade? Do I not have Sufficient Gamer Cred and am being shown the “not a Real Gamer” door? I think my comment was civil and polite, so I really don’t understand the disapproval. If it is just “not a Real Gamer” I’ll let it roll off my back, though I did think the Gaming community on Lemmy was better than that… I would understand the reaction if I rolled up to c/GPUs with “I don’t care about this :)” and got downvoted. Is Gaming secretly kind of also c/GPUs and I just did not know that?
Okay I literally just realized it is probably because I hopped on a thread about GPUs and do not know about the topic being posted about. Whoops. Sorry.
I have a powerful computer for my work with a 3090, i9, etc. I still prefer gaming on the Xbox: it doesn’t bother me with driver issues, CPU overheating, and the other random bugs I get on the big machine, and it’s in the living room, where I’m also spending time with my family rather than stuck in my cave upstairs.
People understandably love to hate Oblivion and Fallout 3, but I feel the side quest writing had heart, like groups of devs got to go wild within their own little dungeons. Their exploitable mechanics were kinda endearing.
…And I didn’t get that from Starfield? I really tried to overlook the nostalgia factor, but all the writing felt… corporate. Gameplay, animation, Bethesda jank without any of the fun. I abandoned it early and tried to see what I was missing on YouTube, but still don’t “get” what people see in that game.
If you want a big walking sandbox in that vein, I feel like No Man’s Sky would scratch the itch far better, no?
Meanwhile, BG3 and KC2 completely floored me. So did Cyberpunk 2077, though I only experienced it patched up and modded. Heck, even ME Andromeda felt more compelling to me.
I got Cyberpunk in December and KCD2 in February. At this point I’m convinced I’ve spoiled the entire RPG genre for myself for the next decade. I can’t imagine playing 2 great games back to back like that again.
Oblivion is my favorite Elder Scrolls. I actually played it again recently and thought it held up pretty well. I’m a sucker for wandering lush bucolic landscapes though.
They are literally sequels. 2 and 3. That removes any chance of them being unexpected now, doesn’t it, you dunce.
Ambitious, sure, if your definition of ambitious is delivering a complete game at release.
Weird? If you think these games are weird, I’ll absolutely punish your eyeballs with some stuff on Steam that will leave these two games looking utterly mainstream.
So… Where are all the realistic medieval sandbox RPGs? You know, of the kind set in an actual historical period?
Or… Or… How often has capturing the freedom and complexity of D&D in a videogame been attempted so accurately?
For something to even approach becoming a cliché, there’d have to be a lot of that particular something done in exactly that particular way. So please do give a nice long list of games exactly like Kingdom Come: Deliverance and Baldur’s Gate 3, because clearly everyone must’ve missed them.
The article totally misses the big intervening step between Skyrim/old BioWare and the failure of Starfield/Dragon Age: CD Projekt Red.
While those studios largely just made “more of the same”, CDPR made The Witcher 3 and then Cyberpunk 2077. Both games are far better narrative experiences and pushed the RPG genre forward. Starfield looks very dated in comparison to both, and Dragon Age failed to capture the magic. Baldur’s Gate 3 and Kingdom Come: Deliverance 2 are successes because they also bring strong narratives and emotional connections to their stories.
Starfield would have been huge if it had been released soon after Skyrim. But now it just looks old-fashioned, and I think the “wide as an ocean, deep as a puddle” analogy fits Starfield well. Meanwhile The Witcher 3 - which is 10 years old! - has quests and storylines with choices and emotional impact. BG3 and KC:D2 are heirs to The Witcher 3.
People like to write off CP2077, which is such a shame.
…And maybe this makes me a black sheep, but I bounced off Witcher 2/3? I dunno, I just didn’t like the combat and lore, and ended up watching some of the interesting quests on YouTube.