Shame. It would be a good way to know if you are their favorite child lol. All joking aside, I think this is a compromise, as others have alluded to deep in the comments. Valve likely doesn’t care or enforce it, but they don’t want to be responsible for account transfers due to game licensing and other legal shenanigans.
I think the thing that bothers me most about Lemmy’s whirlwind of negativity about everything is that you get people like OP, who find something they want to spread negativity about and then post it to a bunch of communities.
Lemmy is small, so you see this one thing over and over and over again. It’s so tiring.
I get that this kind of stuff isn’t something to be positive about, I’m just getting so tired of Lemmy. At least Reddit didn’t have a constant stream of negativity multiplexed through every subreddit.
I’m convinced it’s this kind of thing that’s killing the entire thing. You can’t build communities on this, so there are fewer and fewer people looking every day. I know I look at Lemmy a lot less than I used to.
Isn’t every bigger piece of gaming news posted over 100 times across different communities and instances over a period of multiple days, not even including all the reposts from every different gaming outlet?
On Lemmy? Yes. And the small nature of it makes it obvious. But Lemmy also goes for the negative outrage stuff more than things of interest, and it’s smaller, so, like I was saying, one person making the same post to a bunch of communities makes it stand out a lot.
Honestly, I think it’s probably just time to delete the app. Lemmy isn’t what I hoped it would be. Well, it was at first, and for a good few months. But eventually the good people move away.
You know you have options to block and filter what you see, so you can live in your bubble of only positivity while the AAAA gaming industry continues to get worse.
Heck, I mean feel free to block me, since I tend to post positive and negative stories, if it bothers you so much. No skin off my back while I post positive stories about Arrowhead and Larian Studios until they start doing things that aren’t positive.
The problem with event sec (and I ran large events for a decade) is that organisers often don’t treat you like you’re anything but a bunch of fuckwits, and don’t tell you anything about planned stunts or anything beyond “Don’t let anyone past this point without a pass.” Asking questions and wanting to be in the loop often gets you told to “do your job,” which is why I was very anal retentive about what was and wasn’t our responsibility in our contracts and what we were/weren’t liable for. Often the “talent” or people attending will decide they want to do something off script, and you have to take a moment to decide whether or not to crash-tackle someone somewhere they shouldn’t be, who might also be a keynote speaker or the headlining artist’s manager… it’s almost always a shitshow of not enough info. I have a lovely story about a bass player for a band not wearing a lanyard and starting a fight with two of the security when they wouldn’t let him backstage… there’s an NDA involved though, because he broke one staff member’s cheekbone before getting knocked the fuck out.
The absolutely idiotic part was taking him to the ground; that outnumbered, and with him not being aggressive, it was unnecessary. Once it goes to the ground it almost always turns into a shitshow.
That bit about “every designer or artist has a bunch of shit ideas and just picks their best 5%” is SO TRUE. Remember this when you’re doing something creative and don’t feel great about it. Just keep doing it.
We’ll have to wait until it’s out to see. The statement that they want to minimize grind? Hoooooly crap, that’s the exact opposite of what RS was. To get to the higher levels and max a skill, it was basically a mental game of sticking to the best xp/tick strategies, and it could still take a month or more to max one skill. That was after they had introduced a bunch of new things. The original days? It was a third job on top of it already being a second job to do it in months.
It was also really fun for being so simplistic and had a good mix of self-aware humor, so I have hope for their new game.
RuneScape to me is about having an easy enough game that you always have something to do.
I’ve got a few 99s, but I still have plenty of others to do, and if I don’t want to do skilling, I can go do some of the plethora of quests I haven’t done yet. If I don’t want to do either of those, I can go find quite a lot of different things to do ranging from very easy to very difficult.
I’m sure most people stick around for the grind, but for me, it was always because there was SOMETHING for you to do. :)
The most misleading statement ever, though. It’s only halfway XP-wise. You will have unlocked much better training methods, gear, support skills, and possibly more money than earlier in the skill. Time-wise you’re definitely way past the halfway point.
This is the most crybaby thing to complain about. Reminds me of the reviews on Steam that are “I do not recommend” (this player has 3,432 hours logged)
The state of video games is wild to see. People will play a hundred hours of a game and say it’s lacking. Players expect endless content and it’s honestly unhealthy for gaming at large.
It’s completely unnecessary as well. We are absolutely spoiled for choice when it comes to video games. I pick up more for free than I have time to play, and with services like Game Pass, offers like Humble Bundle, and the ever-present Steam sales, there’s no reason to ever have to fork out big money for a game you feel you need to play a hundred hours in just to feel you’ve got your money’s worth. If you don’t like it after a few hours, then just move on to one of the myriad games in your backlog and you’ll soon forget the boring one.
They wouldn’t have had nearly as many problems as they did if they had waited another 6 months for the initial release. I have a PC with a 1060 card, and I bought the game relatively soon after launch. It was extremely buggy, and I could barely play even at low settings. I made it maybe a third of the way into the game before I just gave up and decided to wait until it was improved. I installed it again last week and started another playthrough, and even pre-2.0 it was markedly better; I could get a consistent 30+ fps on medium.
That’s, I think, the issue. 2.0 obviously contains many more bug fixes, but that’s not really what that release is about, and the game has been past just playable for a long time. I actually really like the idea of 2.0, which is not really a bug fix but a rethink of some gameplay mechanics that make a lot of sense. Like, it was always infuriating that the best armored clothes in the game often looked absolutely stupid, so I like them making clothing pretty much just cosmetic and moving armor to the ripperdoc upgrades. Sure, they could probably have figured that out for 1.0, but once things get into player hands you are always going to learn something. Conversely, Skyrim has shipped on practically every platform with a screen, and every time it ships with the same garbage-ass inventory system from 2011.
So yeah, they (the whole industry) should be releasing games that are fully baked, but I really don’t mind the idea that they’re going to take a game and iterate on it more like a platform. I could see Cyberpunk being something I’m still playing in 10 years as long as they keep adding content and iterating, in much the same way that people are still playing the shit out of GTAV.
Okay, but you being able to eke out fun from it doesn’t fundamentally change the fact that it was a buggy, broken, amputated mess that was released two years too early.
I’ve been enjoying Starfield - but the empty planets suck, especially without vehicles. The scanning thing is boring and dumb, worse than trying to get 100% on a NMS world. It’s a shame that fast travel disconnects you from the space feel of the game, but it makes the rest of the game playable. I like the game overall, but they have definitely dropped the ball on space travel. In theory it’d be cool to come across different “dungeons” while wandering around, as in Skyrim, but that doesn’t happen in Starfield because you’re generally not going to happen upon them. It’s not interesting to drop down to random planets.
Yeah, I started finding some neat stuff as I go further out. It’s not that it’s not there, it’s just that you don’t tend to stumble upon it. Like, I’ll go to a planet, do a mission, open up the scanner, and see some POI 1200m away. Now, do I really want to tediously run over empty nothingness to see if it’s a space hut or another pirate base? I definitely check out nearby POIs, especially if they’re on the way to where I need to go. (Still having fun in the game though, and I guess later having options to at least poke around in new places will be fun. I’m curious if the critters are fixed or procedural — will there be variants all over or just the same few species?)
Runs great on my 5000 series AMD CPU and 3000 series Nvidia GPU, those came out 2 years ago now, and that’s averaging about 50fps on a 4k monitor.
If that isn’t optimized, idk what is. Yes, I had high end stuff from 2 years ago, but now it’s solid middle range.
People are so damn entitled. There used to be a time in PC gaming where if you were more than a year out of date you’d have to scale it down to windowed 640x480. If you want “ultra” settings you need an “ultra” PC, which means swapping out parts every few years. Otherwise be content with High settings at 1080p, a very valid option.
I’m not saying it’s not an expensive hobby, it is. PC gaming on ultra is an incredibly expensive hobby. But that’s the price of the hobby. Saying that a game isn’t optimized because it doesn’t run ultra settings on hardware that came out 4+ years ago is nothing new, and to me it’s a weird thing to demand. If you want ultra, you pay for ultra prices. If you don’t want to/can’t, that’s 100% acceptable, but then just be content to play on High settings, maybe 1080p.
If PC gaming is too expensive in general that’s why consoles exist. You get a pretty great experience on a piece of hardware that’s only a few hundred dollars.
The 4090 is definitely nuts, but with inflation the 4080 is right about on par. As usual, team red is very close in comparison for a much lower cost. You don’t have to constantly run the highest of the high end to get those sweet graphics, but it’s about personal taste. Personally, paying 40% more for a 10% jump in graphics is not for me, but every 2-3 generations I usually step back and reanalyze. Tbh it’s usually a game like Starfield that makes me wonder if I should get a new one. Runs great for now though; I probably have at least one, hopefully two, more generations before I upgrade again.
4090 is definitely nuts, but with inflation the 4080 is right about on par.
On par with the competing product? Sure. On par with inflation? Not by a long shot. GPU prices tripled a couple years back. Inflation accounted for only a small fraction of that. They have come down somewhat since then, but nowhere close to where they should be even with inflation.
As usual, team red is very close in comparison
Indeed. Both brands being overpriced doesn’t make them any less overpriced. Cryptocurrency and scalping may be mostly gone now, but corporate greed persists.
That’s not Todd Howard’s fault, but when he makes a snarky comment expecting everyone to cough up that kind of money to play his game, it’s more than a little tone deaf.
I’ll admit I didn’t know the 4000 series was that high, but yeah, $1200 for the midrange card is too much. If it stays like this I may switch back to team red. I do believe costs are probably higher (I remember buying my first board with an AGP slot; the ones now are a bit more complicated and complex to make), but the jump from $800 in 2020 to $1200 in 2023 is too much.
Adjusted for inflation in the US, the 1080 Ti cost only $876 in today’s money when it came out. The 4080 launched at $1231 in today’s money. You are simply incorrect.
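(For the curious, inflation adjustment like this is just a CPI ratio. A minimal sketch, assuming approximate US CPI-U annual values and the commonly cited launch prices of $699 for the 1080 Ti and $1199 for the 4080; since the CPI figures here are rough assumptions, the results land near, not exactly on, the numbers quoted.)

```python
# Back-of-envelope inflation adjustment via a CPI ratio.
# CPI values below are approximate US CPI-U annual averages
# (assumptions for illustration, not official data).
CPI = {2017: 245.1, 2022: 292.7, 2023: 307.0}

def adjust(price, year_from, year_to=2023):
    """Convert a nominal price into year_to dollars."""
    return price * CPI[year_to] / CPI[year_from]

print(f"1080 Ti ($699 in 2017): ${adjust(699, 2017):.0f} today")
print(f"4080 ($1199 in 2022): ${adjust(1199, 2022):.0f} today")
```

With these rough numbers the 1080 Ti lands around the $876 figure above; the 4080 comes out a bit higher than $1231, which suggests the original poster used a different CPI month or series.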
The dude digs a hole and then grabs a bigger shovel
Some people just really love a company and will do anything to excuse their shortcomings
Starfield is poorly optimized and that’s really all there is to it. I’m sure in a few weeks modders will (once again) fix some obvious issues. Bethesda has no incentive to do the work themselves when the community will do it for free
Okay, I’ll admit I didn’t know that’s how much the 4080 was. The last time I checked was the 3000 series, and yeah, that’s a lot. (I thought it started around 8-900.) I stick to my points though: if you want ultra gaming, it’s going to cost an arm and a leg. My main point is still that you shouldn’t expect older hardware to get ultra settings, and that’s okay. You can play a game on medium settings and still have a blast.
I don’t know if you noticed, but everything became more expensive in the last year. Food, housing, etc, it’s called inflation and PC parts aren’t immune.
For only $300 more I have a mortgage on a 2,000 sq ft home in a large American city…
I have a 6900 XT because I got a promotion recently and wanted to treat myself to finally get off the R9 300 series, but it wasn’t $1600; I think I paid $1100.
I’m running it on a Ryzen 1600 AF and a 1070. NOT Ti. 1440 at 66% resolution. Mix of mostly low some medium. 100% GPU and 45% CPU usage. 30 fps solid in cities. I won’t complain at all. I’m just happy it runs at all solidly under minimum spec.
Doom eternal also came out 3.5 years ago now, and your card is nearly 5 years old. That’s the performance I would expect from a card that is that old playing a brand new game that was meant to be a stretch.
I’m sorry, but this is how PC gaming works. Brand new cards are really only awesome for about a year, then good for a few years after that, then you start getting some new releases that make you think it’s about time. I’ve had the 3000 series, the 1000 series, before that I was an ATI guy with some sapphire, and before that the ATI 5000 series. It’s just how it goes in PC gaming, this is nothing new
I mean, there isn’t one thing you can point to and say “ah ha, that’s causing all the lag”; things just take up more space, more compute power, more memory as they grow. As hardware capabilities grow, software will find a way to utilize them. But if you want a few things:
Textures are larger. Where 4K was just getting rolling in 2017 (pre-RDR2, after all), textures had to be scaled up to accommodate 4K (and remember, it’s width and height, so that’s 4x the memory and 4x the space on the drive).
Engines have generally grown to be higher fidelity, including more particles, more fog, raytracing (not in Starfield, but it’s younger than 2017), etc. All of these higher-fidelity items require more compute power. Things like anti-aliasing, for example: they’re always something like 8x, but that’s 8x the resolution, and resolutions have only gone up, again rising with time.
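(To put a number on the texture claim: doubling both dimensions quadruples the memory. A quick sketch, assuming uncompressed RGBA8 at 4 bytes per texel — real engines use block compression, which shrinks the totals but not the ratio.)

```python
# Texture memory grows with width * height, so doubling each
# side means 4x the memory. RGBA8 (4 bytes/texel) is an
# assumption for illustration; block-compressed formats are
# smaller overall but scale the same way.
def texture_bytes(width, height, bytes_per_texel=4):
    return width * height * bytes_per_texel

two_k = texture_bytes(2048, 2048)   # a "2K" texture
four_k = texture_bytes(4096, 4096)  # the same texture at "4K"
print(two_k // 2**20, "MiB ->", four_k // 2**20, "MiB")  # 16 MiB -> 64 MiB
```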
I don’t know, what do you want? A list of everything that’s happened since then? Entire engines have come and gone in that time. Engines we used back then are on at least a new version compared to then, Starfield included. I mean, I don’t understand what you’re asking, because to me it comes off as “Yeah well, Unreal 5 has the same settings as 4 to me, so it’s basically the same.”
Textures are larger. Where 4K was just getting rolling in 2017 (pre-RDR2, after all), textures had to be scaled up to accommodate 4K (and remember, it’s width and height, so that’s 4x the memory and 4x the space on the drive).
Texture resolution has not considerably affected performance since the 90s.
Changing graphics settings in this game barely affects performance anyway.
Things like anti-aliasing, for example: they’re always something like 8x, but that’s 8x the resolution, and resolutions have only gone up, again rising with time.
Wtf are you talking about, nobody uses SSAA these days. TAA has basically no performance penalty and FSR has a performance improvement when used.
If you’re going to try and argue this point at least understand what’s going on.
The game is not doing anything that other games haven’t achieved in a more performant way. They have created a teetering mess of a game that barely runs.
Texture resolution has not considerably affected performance since the 90s.
If this were true, there wouldn’t be low-resolution textures at lower settings; higher resolutions take up far more space, memory, and time to compute. I’m definitely not going to be re-learning what I know about games from Edgelord here.
Texture resolution has not considerably affected performance since the 90s.
Lol, try to play a game with 4K textures at 4K on an NVIDIA graphics card without enough VRAM and you’ll see how it affects your performance 😅
I wouldn’t say that Starfield is optimized as hell, but I think it runs reasonably, and many people will fall flat on their asses in the coming months when they realize that their beloved “high end rig” is mostly dated as fuck.
To run games on newer engines (like UE5) with acceptable framerates and details you need a combination of modern components and not just a “beefy” gpu…
So yeah get used to low framerates if you still have components from like 4 years ago
Changing graphics settings in this game barely affects performance anyway.
I don’t know and I don’t care what’s wrong with your system, but the AMD driver tells me I’m averaging 87fps with high details on a 5800X and a Radeon 6900, a system that is now two years old, and I think that’s just fine for 1440p.
So yeah, the game is not unoptimized. Sure, it could use a few patches, and performance will get better (remember, it’s a fucking Bethesda game, for christ’s sake…), but for many people the truth will be to upgrade their rig or play on Xbox.
People are entitled because they don’t want to spend thousands of dollars on components only for them to be outdated within a fraction of the lifecycle of a console?
How about all the people that have the minimum or recommended specs and still can’t run the game without constant stuttering? I meet the recommended specs and I’m playing on low everything with upscaling turned on and my game turns into a laggy mess and runs at 15fps if I have the gall to use the pause menu in a populated area. I shouldn’t have to save and reload the game just to get it to run smoothly.
Bethesda either lied about the minimum/recommended requirements or they lied about optimization. Let’s not forget about their history of janky PC releases, dating back to Oblivion, which was 6 games and 17 versions of Skyrim ago.
And no one is saying they have to; that’s my point that keeps getting overlooked. If someone wants to play at sick 4K 120fps, that’s awesome, but you’re going to pay a premium for it. If people are upset because they can’t play ultra settings on hardware that came out 5 years ago, to me that’s snobby behavior. The choice is either pay up for top-of-the-line hardware, or be happy with medium settings, and maybe you go back in a few years and play it on ultra.
If the game doesn’t play at all on lower hardware (like Cyberpunk on release), then that is not fair and needs to be addressed. Cyberpunk plain did not work on lower-end hardware, and that’s not fair at all; it wasn’t about how well it played, it’s that it didn’t play.
Idk what to tell you, mate. I’m on a 3080 at 1440p, and I’m getting an average of 60fps. My settings are all ultra except for a couple, with FSR on at 75% resolution scale. To me, that’s optimized; I don’t even expect 60fps in an RPG. In Cyberpunk I’ve never had higher than 50.
Consoles don’t even last their whole lifetime anymore; both machines required Pro models to keep up with performance last gen, and rumours have it Sony is gearing up for one this gen too.
I mean, yeah, but also by what metric? There are a thousand things that can affect performance, not just what we see. We know Starfield has a massive drive footprint, so most everything is probably high-end textures, shaders, etc. Then the world sizes themselves are large. I don’t know, how do you directly compare two games that look alike? Red Dead 2 still looks amazing, but at 5 years old it’s already starting to show its age; it also had a fixed map size and got away with a few things. Every game is going to have differences.
My ultimate point is that you can’t expect to get ultra settings on a brand new game unless you’re actively keeping up on hardware. There’s no rule saying that you have to play on 4K ultra settings, and people getting upset about that are nuts to me. It’s a brand new game; my original comment was me saying that I’m surprised it runs as well as it does on last-generation hardware.
I played Borderlands 1 on my old ATI card back in 2009 in windowed mode, at 800x600, on Low settings. My card was a few years old and that’s the best I could do, but I loved it. The expectation that a brand new game has to work flawlessly on older hardware is a new phenomenon to me, it’s definitely not how we got started in PC gaming.
I have a PC with 5800X, 3080 Ti, and 64 GB DDR4-3600. I play at 1440p with 80% render scale, Medium-High settings (mostly Medium) and it’s barely above 60 FPS outdoors. It runs like shit.
I’m curious. I have a 3080 as well, and I’m getting ultra across the board (maybe a setting or two at high), also at 1440p, and I average 60fps. Installed on an SSD, right? Render scale for me is 75%. The only other thing I can think of is that I overclocked my RAM, but I don’t think that’d account for that huge of a jump.
Oh, well then I’d readjust expectations. Doom and fast-paced shooters usually go that high because they have quick, fast-paced combat, but RPGs focus on fidelity over framerate. Hell, Skyrim at launch only offered 30fps, and in Cyberpunk, as I mentioned, I never got above 45. 60 in an RPG is really a good time; don’t let the number on the screen dictate your experience. Comparing a fast shooter and an RPG like this is apples and oranges.
I’m honestly shocked a game like this can run at 60fps. Below 45 and I start to get annoyed in RPGs. I’d expect that if you wanted framerates that high, you may need to window it at 1080p and lower the settings further.
Nah 60 is not good enough for me. I’m fine if it’s a mobile game or handheld. I have no problems getting 90 FPS minimum in A Plague Tale: Requiem and Cyberpunk 2077.
In Starfield, not even 720p with lowest settings will help because the game is very heavily dependent on CPU. Looking at HW Unboxed benchmarks, the 5800X only managed to do 57 FPS average. You need a 7800X3D or a 13600K to get 90 FPS average.
As long as you know you’re definitely not in the key demographic then; for RPGs, 60fps is pretty much the standard. Fine if you want more, but the game was not built as an FPS, it was built as an RPG. Those are the people I’m annoyed with: the ones complaining at Bethesda for not building an RPG to run like you describe on hardware that’s already several years out of date. That’s just not possible.
Bullshit, there’s no “standard” FPS for a certain genre. Also the 3080 Ti is a $1200 last gen GPU and the 5800X is a $450 last gen CPU. It’s ridiculous that they can’t even push 100+ FPS at the lowest settings. The CPU overhead in this game is insane. I used to target 120 FPS minimum for all games I play, hence the high-end build, but now even 90 FPS is too much? lmao
How about people with a Ryzen 5 5600 and RTX 3060 who want to play at 60 FPS? Keep in mind that we’re not talking about 120 FPS, just a measly 60 FPS, and those parts are barely 2 years old.
Why do people use “entitled” like it’s a bad thing? Why wouldn’t consumers be entitled, as opposed to spending money as though it’s an act of charity? Pretty weird how the mindset of gamers has shifted over the years in a way where the fact that they are consumers has been forgotten.
I say entitled because gamers should just be happy: be happy with the hardware you have even if it can’t put out 4K, turn off the FPS counter, play the game. If you’re enjoying it, who cares if it occasionally dips down to 55? The entitlement comes from expecting game makers to produce games that run flawlessly at ultra settings on hardware that’s several years old. If you want that luxury, you have to spend a shitload of money on top-of-the-line gear; otherwise just be happy with your rig.
Products are just products designed to get money out of people. I don’t have an appreciation for them like it’s some sports team. It comes down simply to whether something is worth spending money on or not. Being entitled is a good thing, since it discourages consumerist behavior, and a lot of people could use less frivolous spending in their lives.
You can try to spin it as a negative, but I find this hail corporation approach to consumerism very odd. Wanting more value for the money is a good standard to have.
I’m actually agreeing with you, people should be happy to play the games on their older hardware even if it can’t pull down the ultra specs. We don’t need to always be buying the latest generation of GPUs, it’s okay to play on medium specs. We don’t have to have the top of the line latest card/processor/drive, we can enjoy ours for years, even if it means newer games don’t play on ultra. If you have the funds to buy new ones every generation, more power to you, but I buy my cards to last 8-10 years. The flipside is just expect that the games won’t run on ultra.
People should expect more optimization from the games they look into and better price-to-performance offerings from hardware. Pushing what is acceptable further into the premium tier leads to worse consumer offerings over the long run. What is considered acceptable hardware has gotten further and further out of reach each generation, while disposable income has not kept up.
Complacency and a constantly falling scale of what is acceptable is what leads to worse standards. Bad prices and poor optimization should not get passes. “Be happy with your hardware or performance” PR management is not the job of consumers, aside from those who are paid to run those types of campaigns.
Hmm, I don’t know if you ever noticed, but there’s usually very little difference between ultra and high/very high, yet a lot of difference in performance. Ultra settings were always designed to make the PC sweat, and I assume it’s similar with Starfield. And there’s also the advent of 4K, which pushed this ridiculous standard even higher (which makes very little sense, especially on PC, unless you play on it like a console from your couch). In fact, old graphics cards still faring so well is an anomaly rather than the standard.
That’s the thing - I’d say this game is pretty well optimized. People have unrealistic expectations of what their hardware can do. That’s a better way of putting it than “entitled”.
None of the 3D Bethesda games played this well at release. I speak from first hand experience building PCs since 1999 and playing Oblivion, FO3, NV, Skyrim, and FO4 at release. Playing those games on years old hardware required lower than native resolutions and medium settings - exactly what you see in Starfield currently.
PC gamers enjoyed a bit of a respite from constantly needing to upgrade during the PS4/Xbone era. Those machines were fairly low end even at launch and with them being the primary development formats for most games, it was easy to optimize PC ports even on old hardware.
Then the new consoles came out that were a genuine jump in tech again as consoles used to be, and now PCs need to be upgraded to keep up and people that got used to the last decade on PC are upset they can’t rock hardware for multiple years anymore.
Runs great on my 5000 series AMD CPU and 3000 series Nvidia GPU
Just specifying the series doesn’t really say much. Based on that and the release year you could be running a 5600X and RTX 3060 or you could be running a 5950X and RTX 3090. There’s something like a ~2.5x performance gap between those.