I’m having a good time on a laptop with no fancy graphics card and have no desire to buy one.
I also do not look for super high graphical fidelity, play mostly indies instead of AAA, and am like 5 years behind the industry, mostly buying old gems on sale, so my tastes probably enable this strategy as much as anything else.
I’ll be honest, I have never paid attention to GPUs and I don’t understand what your comment is trying to say or (this feels selfish to say) how it applies to me and my comment. Is this intended to mostly be a reply to me, or something to help others reading the thread?
Depending on what happens with datacenter GPUs, discrete GPUs might become so rare that nobody bothers with them anymore.
My impression right now is that gamer cards are an afterthought for Nvidia. Millions of gamers can’t compete with every company in Silicon Valley building entire datacenters stacked with as many “GPUs” as they can find.
AMD isn’t the main choice for datacenter CPUs or GPUs. Maybe for them, gamers will be a focus, and there are some real advantages with APUs. For example, you’re not stuck with one particular amount of GPU RAM and a different amount of CPU RAM. Because you’re not multitasking as much when gaming, you need less CPU RAM, so you can dedicate more RAM to games and less to other apps. So, you can have the best of both worlds: tons of system RAM when you’re browsing websites and have a thousand tabs open, then start a game and you have gobs of RAM dedicated to the game.
It’s probably also more efficient to have one enormous cooler for a combined GPU and CPU vs. a GPU with one set of heatsinks and fans and a separate CPU heatsink and fan.
Discrete GPUs are also a pain in the ass to manage. They’re getting bigger and heavier, and they take up more and more space in your case. Not to mention the problems their power draw is causing.
If I could get equivalent system performance from an APU vs. a separate CPU and GPU, I’d probably go for it, even with the upgradeability concerns. OTOH, soldered-in RAM is a downside: I’ve upgraded my RAM more often than any other component on my PCs, and having to buy a whole new motherboard to get a RAM upgrade is not appealing.
Thank you for explaining! I am not sure why people are reacting badly to my statement. Is knowledge of GPUs something every gamer is expected to have, and am I violating the social contract by being clueless?
Well at one point to be a computer gamer you basically needed to put together your own desktop PC.
Integrated GPUs were basically only capable of displaying a desktop, not doing anything a game would need, and desktop CPUs generally didn’t integrate graphics at all.
So computer-building knowledge was a given. If you were a PC gamer, you had a custom computer for the purpose.
As a result, even as integrated GPUs became better and more capable, the general crowd of gamers didn’t trust them, because it was common knowledge they sucked.
It’s a lot like how older people go “They didn’t teach you CURSIVE?” about schools nowadays. Being a gamer and being a PC builder are fully separable now, but they learned PC building back when the two weren’t, and therefore think you should have that knowledge, too.
It’s fine, don’t sweat it. You’re not missing out on anything, really, anyway. Especially given the current GPU situation, it’s never been a worse time to be a PC builder or enthusiast.
Oh boy. Thanks for the context, by the way! I did not know that about the history of PC gaming.
I did learn cursive, but I have been playing games on laptops since I was little too and was never told I had to learn PC building. And to be completely honest, although knowledge is good, I am very uninterested in doing that, especially since I have an object that serves my needs.
I have enough perspective to realize that I have been on the WHAT DO YOU MEAN YOU’RE SATISFIED, LEARN MORE AND CHANGE TO BE LIKE US side myself, although I’m exaggerating because I don’t actually push others to take on my decisions. I don’t spam the uninterested to come to Linux, but I do want people whose needs are adequately served by Windows to jump to Linux anyway, because I want to see Windows 11, with even more forced telemetry, shoved-in AI, and things just made worse, fail. Even though that would actually be more work for satisfied Windows users.
But I would not downvote a happy Windows user for not wanting to switch, and that kind of behavior is frowned upon. Is it just more acceptable to be outwardly disapproving of those who do not know about GPUs and are satisfied with what they have, with zero desire to upgrade? Do I lack Sufficient Gamer Cred, and am I being shown the “not a Real Gamer” door? I think my comment was civil and polite, so I really don’t understand the disapproval. If it is just “not a Real Gamer” I’ll let it roll off my back, though I did think the Gaming community on Lemmy was better than that… I would understand the reaction if I rolled up to c/GPUs with “I don’t care about this :)” and got downvoted. Is Gaming secretly kind of also c/GPUs and I just did not know that?
Okay I literally just realized it is probably because I hopped on a thread about GPUs and do not know about the topic being posted about. Whoops. Sorry.
Yeah, it’s pretty okay and all, but the hype made it out to be cooler than it was, in my opinion. I’ve been playing Foundation the last day or two and I find it way more addictive, satisfying, and unique, so far. Maybe I just need to revisit Manor Lords. The trailers made the combat out to be Mount and Blade-esque, so I think that’s what really underwhelmed me. It felt more like Civilization-style “throw a bunch of units at the bad guy” combat.
After you hit the 10-15 hour mark you’re just looking around like Travolta: that’s it? Yep, that’s it… no more content. The potential is there, but will the devs deliver it? Not so sure. At the moment the game is overpriced.
For context, it’s somewhat common here in Latin America to name markets after the owner’s name; doubly so in smaller cities. (The city where this happened has 9k inhabitants)
It’s also common to name supermarkets “Super [something]”, to highlight that it sells general goods instead of just produce.
With that out of the way: seriously? Nintendo going after a mum-and-dad market in a small Latin American city??? This only highlights that the current trademark and intellectual property laws across the world are toilet paper - they aren’t there to defend “healthy competition” or crap like that, but to ensure megacorps get their way. Screw this shit and screw Nintendo - might as well rename their company to Ninjigoku/任地獄, bloody hell.
I was hoping for Eskel or Letho, to be honest. Ciri is a safe choice but a bit on the boring side. Eskel would have been a much more multi-layered protagonist, and the dialogue would have been funnier.
I was praying for Letho even though we all knew it was going to be Ciri (for better or worse, we’ll see).
Leaning into the moral grey of the Witcher world with Letho would have been awesome. In one of the endings to his story he says he’ll head out east over the mountains, so we could have gotten some all-new environments, and he’s 100% CDPR IP, so they could have done whatever they wanted without trampling over the lore too much. Plus Letho is just an amazing character with huge protagonist energy. Oh well.
Nothing clever to say except that this is fascinating and it would be cool to see MMORPG-like environments used for other studies of virality, social phenomena, etc. Anyone know of anything else like this?
Not exactly the same, but Borderlands 3 had a minigame that helped map the gut microbiome.
Developed in conjunction with McGill University, Massively Multiplayer Online Science, and The Microsetta Initiative, Borderlands Science is a puzzle game that benefits the real-world scientific community as you play. Borderlands Science presents you with simple block puzzles based on strands of DNA, and by solving them you’re helping to map and compare the microbes contained therein. Completing these puzzles also earns you in-game currency.
In case you’re curious about the practical applications for the raw data gathered through Borderlands Science, the human gut is linked to numerous diseases and conditions, including diabetes, depression, autism, anxiety, obesity and more. By mapping these microbes, the hope is that scientists will be able to better understand these ecosystems, which may help guide future research into novel treatments and interventions.
There’s only one I can think of, the Falador Massacre: runescape.wiki/w/Falador_Massacre. Basically, a bug let players kill anyone they wanted, even in safe zones, which meant a lot of people lost a lot of gear. Fittingly, it occurred on 06/06/06.
Even DRM-free, all digital purchases are still just a license, legally speaking.
Pragmatically speaking, they can't forcibly take the bits off my hard drive. But it also bears pointing out that these days most games on Steam don't bother enabling Steamworks DRM either.
I don’t think streamers and video creators are more likely to be sex pests. You’re just more likely to hear about a sex pest if their career involves trying to be seen by as many people as possible.
That number is not the DbD team, but the Behaviour studio as a whole. DbD is their main breadwinner, but they also have several other active games that they maintain.
Also worth noting is their history as an IP mill. Dead by Daylight is a surprise hit among the many licensed deals they’ve taken over the last 20+ years to produce games that, in most cases, would nearly qualify as shovelware. DbD gives them some independence, but they’re still largely a “studio for hire” for anyone who needs them.
It was crazy how swiftly media moved to present tons of reasons to hate AI.
It really made me realize how the people with the strongest opinions have been given those opinions by media that they don’t even realize is a form of media.
pcgamer.com