Atomic,

Todd Howard. The CEO of the company whose games are world-famous for their bugs.

When you play their games, you learn to quicksave before doing anything, because you never know when opening a door will send a cheese wheel flying at Mach 5 straight into your face.

He’s the guy who says my PC is the problem? Not their shitty code? Okay.

ArchmageAzor,
@ArchmageAzor@lemmy.world avatar

Just upgrade your PC 4head

GBU_28,

Y’all are surprised the boss of a AAA studio suggested you buy hardware from companies he has a deeply vested interest in?

It’s all one big circle jerk of companies and anyone buying “cutting edge” gets what they deserve.

You’re the product in more ways than one

Zeppo,
@Zeppo@sh.itjust.works avatar

You’re literally the consumer in this instance. The game is the product. The computer is the product.

gearheart,

I expected this once everyone kept buying into Nvidia’s DLSS.

Nvidia and DLSS will be required to get titles to run decently.

Minimal game optimization will be done on the majority of future game titles.

Fml

TheFerrango,

Minimal game optimization will be done on the majority of future game titles.

That’s more optimisation than we get now

Pocketyeti,

Why upgrade when I’ll just pick it up on the PS7 ten years from now, along with the Skyrim bundle?

manastorm,

I have an i9-13900K and a Radeon 7900 XTX with 64 GB RAM, and I had to refund it on Steam because it kept crashing to desktop every few minutes. Sometimes I wouldn’t even get past the Bethesda intro logo before crashing. Very frustrating experience, to say the least.

entropicshart,

I have an i7-10700K, 32 GB RAM, and a 3080 Ti. Playing the game at 4K with all settings maxed (without motion blur, ofc), and with almost 80 hours into the game, I have yet to have a single crash or performance issue.

Only realized people were having issues when I saw posts and performance mods popping up.

AmosBurton_ThatGuy, (edited )
@AmosBurton_ThatGuy@lemmy.ca avatar

I mean, the game definitely runs like shit but if you keep crashing that sounds like a you problem. My 7600x/6700XT/32GB DDR5 build hasn’t crashed once in 15 hours of playtime and I’ve heard a ton of complaints about the game but barely any about crashing.

ADHDefy,
@ADHDefy@kbin.social avatar

Oh, only a 7900xtx? lol

thanevim,

If he's telling us this, does that mean we get to bill him for the upgrade?

qyron,

Not that I’ll be buying it anytime soon but if the hardware specifications I’ve read are true, no graphics card is worth €500+ to play a game. This is bonkers.

Sho,

What, Todd Howard is being a dipshit tool again? I’m shocked… shocked, I tell you…

Rheios, (edited )
@Rheios@ttrpg.network avatar

I’m a little shocked. Normally it’s Hines caught with his foot that deep in his own mouth.

speedstriker858,

Ridiculous statement. I’ve got an RX 7900 XTX and a Ryzen 7 7700X with 64 gigs of RAM @ 5600 MHz, and the fucking game barely ever hits 144 fps. Usually it sits around 100-110 fps, which is playable for sure, but literally every other game I’ve played on it has had no problem staying nailed at 144 fps. This is at low-medium settings, BTW (for Starfield).

Rykzon,

Ridiculous statement. 100-110fps is far above playable. Do people forget how Witcher, Crysis and others ran on release?

Neato, (edited )
@Neato@kbin.social avatar

It's on Game Pass, Todd. If it doesn't run well I'll just not play Skyrim: Space Edition.

My partner, who is interested, has a PS5 and an older PC. If her PC doesn't run it, she'll probably just keep playing Stardew Valley. Honestly, it's not like anyone is going to really be talking about Starfield in a month or two, except for ridiculous ship builds on social media.

nivenkos,

I bought a new PC just to play Starfield (and BG3 with fewer issues).

It looks alright overall. But it’s pretty crazy that even 30xx cards can’t run it well (I had a 1070 though).

circuitfarmer,
@circuitfarmer@lemmy.sdf.org avatar

I did a CPU/mobo/RAM upgrade for it – but I was quite overdue.

It looks alright overall.

That’s the thing. It looks alright, but it’s not the next-gen beauty fest that they want people to think it is. Plenty look better and run better. I enjoy the game, but the whole argument that it’s a graphical standout doesn’t really hold water.

circuitfarmer,
@circuitfarmer@lemmy.sdf.org avatar

I read this “Todd” like John Connor says it in T2: “She’s not my mother, Todd.”

comedy,
@comedy@kbin.social avatar

Wish my computer weren't dead, so I could at least try to play it. Although my 2070 wouldn't have survived. It runs nice on my Series X, but I hate playing this type of game with a controller.

circuitfarmer,
@circuitfarmer@lemmy.sdf.org avatar

I’m a PC gamer who likes playing with controllers generally (from the couch), but damn, I hate the way they adapted run and walk to the left analog stick. Feels horrible. I wish I just had autorun and could hold a button to walk. The key binding shuts off even if I try and force it with Steam controller config, because the game doesn’t technically support split inputs.

circuitfarmer, (edited )
@circuitfarmer@lemmy.sdf.org avatar

It’s BS though. People with top-of-the-line hardware are having issues. Those systems don’t underperform because the game is advanced or anything like that – the game underperforms because it is a new release that is poorly optimized. It’s also expected, because it’s running on a senior citizen of a game engine that likely needs a few other nudges.

Todd Howard forgets that PC users see this shit all the time, and it’s pretty obvious with this one. Hoping to see talk of optimization in a coming patch instead.

Edit: a good example – not hitting 60 fps in New Atlantis while, concurrently, CPU usage sits in the 50s and GPU usage in the 70s. That’s a sign of poor optimization.

Th3D3k0y,

My friend and I were just discussing the likelihood that some hardware producers pay game devs to purposely output bad optimizations so users are encouraged to spend more on upgrades.

circuitfarmer,
@circuitfarmer@lemmy.sdf.org avatar

In this case, you get Starfield free with the purchase of select AMD CPUs or GPUs.

But it’s weird for Todd Howard to come out with this push now, because it’s in response to those already playing the game.

Rheios, (edited )
@Rheios@ttrpg.network avatar

I mean, that’s probably why he would make the push. The bait’s in the mouth (people have the game), then comes the pull of the hook (they have to upgrade to try and handle its poor optimization, fulfilling the benefit of AMD backing them). And Beth doesn’t lose anything if it’s too frustrating and people stop playing over it, because they already have the money.

EDIT: Admittedly, I keep forgetting that Game Pass is a thing, but maybe even that doesn’t really matter to Microsoft if it got people onto Game Pass? That makes my earlier point a bit shakier.

circuitfarmer,
@circuitfarmer@lemmy.sdf.org avatar

Yeah, MS wins either way, so long as people still want to play the game.

huskypenguin,

…like not launching with DLSS. What a weird oversight.

circuitfarmer,
@circuitfarmer@lemmy.sdf.org avatar

AMD is the official sponsor. That’s the one thing that wasn’t a surprise.

hypelightfly,

It's not an oversight, they were paid to not include DLSS.

MentalEdge,
@MentalEdge@sopuli.xyz avatar

While I’m no fan of paid sponsorships holding back good games, this is untrue.

Neither Nvidia nor AMD block their partner devs from supporting competing tech in their games. They just won’t help them get it working, and obviously the other side won’t either, since that dev is sponsored. There are some games out there that support both, some of them even partnered.

So yes, it’s bullshit. But it’s not “literally paid” bullshit. Bethesda could have gone the extra mile, and didn’t.

hypelightfly, (edited )

AMD blocks partners from implementing DLSS. You're probably right that it's not paid bullshit, as the payout isn't monetary. But it's still being blocked due to the partnership.

This is hardly the first game to do this. Jedi Survivor and RE4 have the same problem: AMD-sponsored, FSR2 only. The work required to implement FSR2 or DLSS is basically the same (motion data). That's why DLSS mods were immediately available.

Since FSR2 was released, not a single AMD-sponsored game has had DLSS added. Even games built on engines like Unreal, where all the dev has to do is include the plugin.

SpaceNoodle,

Literally not the case here, as evidenced by public communications.

hypelightfly,

Yes, it is the case. Companies lie all the time.

HKayn, (edited )
@HKayn@dormi.zone avatar

Is there actual evidence for AMD blocking DLSS?

And no, AMD being a sponsor is not sufficient evidence.

hypelightfly,

There is circumstantial evidence, but no direct evidence, as contracts are not public. There is also no evidence (circumstantial or direct) that AMD is allowing partners to add DLSS.

Every single AMD-sponsored game released since FSR2 launched omits DLSS, despite it being trivial to add if the work is being done for FSR2. For Unreal Engine games it can be enabled by including a completely free plugin; the work is already done. Yet the AMD-sponsored games don't. There is even a game that announced DLSS support before release and then removed it after becoming AMD-sponsored (Boundary).

SpaceNoodle,

To be more accurate, they were paid to include AMD optimization instead of DLSS.

Alto,
@Alto@kbin.social avatar

I'm starting to think that maybe, just maybe, brute-forcing a 26-year-old engine that makes Skyrim have a stroke if you try to play above 30 fps isn't a good idea.

_waffle_, (edited )
@_waffle_@sh.itjust.works avatar

What game engine is 26 years old other than the Unreal engine?

Edit: stepped on some toes i guess lmfao

Xanvial,

Gamebryo, the base of the Creation Engine that Bethesda used for this.

_waffle_,
@_waffle_@sh.itjust.works avatar

Ah okay. Thank you for the actual answer

NewNewAccount,

Is it actually the same engine?

Animoscity,
@Animoscity@lemmy.world avatar

No. I'm not a fan of the game personally, but a quick search shows they are using Creation Engine 2, which is a newer version of their engine.

Alto,
@Alto@kbin.social avatar

I'll see if I can find it when I'm at my PC, but in an interview a dev said it was still using significant amounts of code from their Gamebryo engine from '97.

azertyfun,

They could have called it Creation Engine 129030129784.32985 for all that it matters. It’s just a name for an engine update, as they do for every new game. They didn’t rewrite it from scratch; that would be a billion-dollar venture.

From what I’ve read it’s the exact same engine as FO4 with better lighting (and of course, as with every new game, some improvements locally relevant to the gameplay). But fundamentally, underneath the fancy lights, it’s still the same engine. That explains the 2008-esque animations, the bugs, the performance issues, and the general flatness of the game. It can’t be more than “Skyrim in Space” because that’s what it technically is.

mordack550,

Because putting a 2 after the name makes a new engine, right? It’s just a new iteration of the same old engine that runs Fallout 3, Skyrim, and Fallout 4.

Nfntordr,

Runs fine for me. 5600X, RTX 3080 @ 1440p high-ultra settings native.

Joker,

Same here except I use a 6600 xt, which isn’t anywhere near as good as your GPU. I’m running medium settings at 4k and it’s fine. It even runs on the Steam Deck, although the graphics are not so good on there. Still, it’s playable and I will probably play there when it’s convenient.

IMO, ultra settings are for people with new, high end hardware and to future proof a game for at least a couple years. It’s not for people running a 2-3 year old rig with a 1080p GPU. Medium and high settings are generally good. Ultra is just like bonus mode for hardcore enthusiasts.

Nfntordr,

Yeah, the reason I mentioned my experience is that I keep finding people with better specs complaining, and I’m like: if we just turned the FPS counter off and enjoyed the game, I’m sure we’d barely notice when it dips below 60 at times.

cyanarchy,

Starfield also requires an SSD, a first for a modern triple-A PC game.

I recall the same being said about Cyberpunk 2077, and I’m not sure that was the first either.

hypelightfly,

Cyberpunk doesn't require an SSD; it had "SSD recommended" under its storage requirements, but not required. Starfield lists it as a requirement.

circuitfarmer,
@circuitfarmer@lemmy.sdf.org avatar

Cyberpunk also has a “HDD mode” in its options.

BruceTwarzen,

Because you load every time you walk through a door.

cyanarchy,

I stand corrected.

nivenkos,

BG3 has the same too.

hypelightfly,

No, it doesn't. Minimum requirements do not include an SSD.

https://larian.com/support/faqs/system-requirements_47

Naz,

To be fair, Cyberpunk 2077 came out at the peak of Covid GPU scarcity. I was still gaming on a GTX 1080 at its release, and the only way I could have a decent experience was running it at 50% resolution scale with 100% sharpening.
