I’m curious, I have a 3080 as well and I’m getting ultra across the board averaging 60fps, with maybe a setting or two at High, also at 1440p. Installed on an SSD, right? Render scale for me is 75%. The only other thing I can think of is that I overclocked my RAM, but I don’t think that’d account for that huge of a jump.
Texture resolution has not considerably affected performance since the 90s.
If this were true there wouldn’t be low-resolution textures at lower settings; higher-resolution textures take up far more space, memory, and time to process. I’m definitely not going to be re-learning what I know about games from Edgelord here.
I mean, yeah, but by what metric? There are a thousand things that can affect performance, not just what we see. We know Starfield has a massive drive footprint, so most of that is probably high-end textures, shaders, etc. Then the world sizes themselves are large. I don’t know how you directly compare two games that look alike. Red Dead 2 still looks amazing, but at 5 years old it’s already starting to show its age; it also had a fixed map size, so it got away with a few things. Every game is going to have differences.
My ultimate point is that you can’t expect to get ultra settings on a brand new game unless you’re actively keeping up on hardware. There are no rules saying that you have to play on 4K ultra settings, and people getting upset about that are nuts to me. It’s a brand new game; my original comment was me saying that I’m surprised it runs as well as it does on last-generation hardware.
I played Borderlands 1 on my old ATI card back in 2009 in windowed mode, at 800x600, on Low settings. My card was a few years old and that’s the best I could do, but I loved it. The expectation that a brand new game has to work flawlessly on older hardware is a new phenomenon to me, it’s definitely not how we got started in PC gaming.
And no one is saying they have to; that’s my point that keeps getting overlooked. If someone wants to play at a sick 4K 120fps, that’s awesome, but you’re going to pay a premium for it. If people are upset because they can’t play ultra settings on hardware that came out 5 years ago, to me that’s snobby behavior. The choice is either pay up for top-of-the-line hardware, or be happy with medium settings and maybe come back in a few years and play it on ultra.
If the game doesn’t play at all on lower-end hardware (like Cyberpunk did on release), that’s a different problem and needs to be addressed. That wasn’t about how well the game played; it plainly didn’t play at all on lower-end machines, and that isn’t fair.
I mean, there isn’t one thing you can point to and say “ah ha, that’s causing all the lag”; things just take up more space, more compute power, and more memory as games grow. As hardware capabilities grow, software will find a way to use them. But if you want a few specifics:
Textures are larger. 4K was just getting rolling in 2017 (pre-RDR2, after all), and to accommodate it, textures had to be scaled up in both width and height, so that’s 4x the memory and 4x the space on the drive.
Engines have generally grown more high-fidelity, with more particles, more fog, etc. (not in Starfield, but raytracing is also younger than 2017). All of these higher-fidelity features require more compute power. Take anti-aliasing, for example: a setting like 8x means something like 8 samples per pixel, on top of base resolutions that have only gone up, again rising with time.
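The “width and height, so 4x” point above is just arithmetic: doubling both dimensions quadruples the pixel count. A rough sketch (assuming uncompressed RGBA at 4 bytes per pixel; real games use compressed formats, so the absolute numbers are smaller, but the ratio holds):

```python
# Rough texture memory math: doubling both width and height
# quadruples the pixel count, so a 4096x4096 texture needs 4x
# the memory of a 2048x2048 one. RGBA8 = 4 bytes per pixel is
# an assumption for illustration; compressed formats shrink
# the absolute sizes but not the 4x ratio.
def texture_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

def as_mb(nbytes):
    return nbytes / (1024 * 1024)

print(as_mb(texture_bytes(2048, 2048)))  # 16.0 (MB)
print(as_mb(texture_bytes(4096, 4096)))  # 64.0 (MB)
```

Same reasoning applies to supersampled anti-aliasing: more samples per pixel multiplies the work the same way more pixels does.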
I don’t know, what do you want? A list of everything that’s happened since then? Entire engines have come and gone in that time, and the engines we used back then are at least one version newer now, Starfield’s included. I don’t understand what you’re asking, because to me it comes off as “Unreal 5 has the same settings as 4, so it’s basically the same.”
The 4090 is definitely nuts, but with inflation the 4080 is right about on par. As usual, team red is very close in comparison for a much lower cost. You don’t have to constantly run the highest of the high end to get those sweet graphics; it’s about personal taste. Personally, paying 40% more for a 10% jump in graphics isn’t for me, but every 2-3 generations is when I usually step back and reanalyze. Tbh it’s usually a game like Starfield that makes me think about whether I should get a new one. Runs great for now though; I probably have at least 1, hopefully 2, more generations before I upgrade again.
Doom eternal also came out 3.5 years ago now, and your card is nearly 5 years old. That’s the performance I would expect from a card that is that old playing a brand new game that was meant to be a stretch.
I’m sorry, but this is how PC gaming works. Brand new cards are really only awesome for about a year, then good for a few years after that, then you start getting some new releases that make you think it’s about time. I’ve had the 3000 series, the 1000 series, before that I was an ATI guy with a Sapphire card, and before that the ATI 5000 series. It’s just how it goes in PC gaming; this is nothing new.
I’m not saying it’s not an expensive hobby, it is. PC gaming on ultra is an incredibly expensive hobby. But that’s the price of the hobby. Saying that a game isn’t optimized because it doesn’t run ultra settings on hardware that came out 4+ years ago is nothing new, and to me it’s a weird thing to demand. If you want ultra, you pay for ultra prices. If you don’t want to/can’t, that’s 100% acceptable, but then just be content to play on High settings, maybe 1080p.
If PC gaming is too expensive in general that’s why consoles exist. You get a pretty great experience on a piece of hardware that’s only a few hundred dollars.
Runs great on my 5000 series AMD CPU and 3000 series Nvidia GPU, those came out 2 years ago now, and that’s averaging about 50fps on a 4k monitor.
If that isn’t optimized, idk what is. Yes, I had high end stuff from 2 years ago, but now it’s solid middle range.
People are so damn entitled. There used to be a time in PC gaming where if you were more than a year out of date you’d have to scale it down to windowed 640x480. If you want “ultra” settings you need an “ultra” PC, which means flipping out parts every few years. Otherwise be content with High settings at 1080p, a very valid option.
I actually really like what Starfield does. It’s a rolling scale: the more encumbered you are, the more you have to pause and “recharge” O2. Being over by 2 won’t affect you much, but being over by 100 sure will.
I’m not asking for free labor, it’s a mod, by definition I don’t need it. It’s a modification of a game. If I’m going to give anyone money it’ll be the creators of the game itself, they deserve my money. Not the person hacking the game a bit. If they want to, awesome, I’ll clone it and add the mod, but if they want to charge, well, I’ve never seen a mod I needed so badly that I’d pay for it.
Beautiful. I’m playing it now and gotta say, I hate everyone who has talked about it up until now. From the die-hard fanboys who say it has to be the best game ever created, to the anti-Bethesda circlejerkers who roam every gaming community telling people it’s a terrible game even though they haven’t played it.
I’m tired of everyone and their opinions about gaming. I bought it with mid expectations, and I am happy with my purchase.
Honestly, avoid Lemmy and Reddit for reviews on this game. The absolute vitriol it’s gotten here has just pushed me beyond trusting any of them. (and yes, they all end with “I mean I haven’t played it.”)
I have played it. 8 hours in so far, it’s fun. I won’t say it’s “redefining RPGs” for me or anything, but I’m having a good time playing around. To others here on Lemmy I am now the worst person on the planet.