Comments


conciselyverbose, (edited) in games on Baldur's Gate 3 "feels so alive" because it used mo-cap and 248 actors to bring its characters to life

I mean, it definitely helps. The production quality is insane. But the fact that choices (or mistakes) have real impacts on the game going forward is just as big a factor, as far as I'm concerned. I ended up having my hand forced into combat early on, which made a later encounter with a potential party member immediately hostile. That sucks, especially since I wasn't trying to do what happened in the earlier encounter. But in terms of a world feeling alive, having it actually react to what you do is pretty damn significant (unless "you're small and irrelevant" is intentional).

conciselyverbose, in gaming on Soulframe is Elden Ring meets Ghost of Tsushima, but with 'Disney princesses' - PCGamesN

Thanks for this. I was unaware, and that sounds like it has potential.

conciselyverbose, in games on Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

It's not that simple. Even using it as a base gets you into a legal gray area. Learning from a work and incorporating elements into your own work is legal, but copying someone else's legwork like this is legally murky even if you don't take the actual code.

conciselyverbose, in games on Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

Someone distributing it for free doesn't mean they can legally just put it in their code and sell it.

If it is licensed in a way they can use it, they'd still have to do a bunch of testing and validation to actually do it.

conciselyverbose, in gaming on Soulframe is Elden Ring meets Ghost of Tsushima, but with 'Disney princesses' - PCGamesN

No actual Disney princesses? What a letdown.

conciselyverbose, in gaming on Weekly “What are you playing” Thread || Week of August 27th

I started the first Blasphemous back up after finally getting around to figuring out cloud saves on the Steam Deck (I just forced the Windows build). I'd definitely like to get through it and into 2, but I feel like I'm bombarded with huge games I need to play this fall. It might work as a game I can play while watching TV. I started it back up while watching 24 (I loved it when I was younger, but watching it back, I realize I didn't appreciate at the time just how great the writing was).

I'm still hooked hard on BG3, though. A fucking lot of hours in and I'm still on act 1.

conciselyverbose, in games on Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

I sincerely don't mind their balance between bugs and ambition, but all this shit has me terrified.

conciselyverbose, (edited) in games on AMD claims there’s nothing stopping Starfield from adding Nvidia DLSS

Every single thing about what you're discussing literally guarantees that GPUs would be dogshit. There's no path to any of the features we're discussing getting accepted into open standards if AMD has input. Standards bodies only added them after Nvidia proved how much better they are than brute force by putting them in people's hands.

Standards do not and fundamentally cannot work when actual innovation is called for. Nvidia competing is exactly 100% of the reason we have the technology we have. We'd be a decade behind, bare minimum, if AMD had any input at all in a standards body that controlled what Nvidia can make.

We're not going to agree, though, so I'll stop here.

conciselyverbose, in games on AMD claims there’s nothing stopping Starfield from adding Nvidia DLSS

Completely and utterly irrelevant? They are explicitly for the purpose of communicating between two pieces of hardware from different manufacturers, and obscenely simple. The entire purpose is to do the same small thing faster. Standardizing communication costs zero.

The architecture of GPUs is many, many orders of magnitude more complex, solving problems many orders more complex than that. There isn't even a slim possibility that hardware ray tracing would exist if Nvidia hadn't unilaterally built it and said "this is happening now". We almost definitely wouldn't have refresh-rate-synced displays even today, either. It took Nvidia making a massive investment to show it was possible and worth doing, and a solid decade of completely unusable software solutions, before FreeSync became something that wasn't vomit-inducing.

There is no such thing as innovation on standards. It's worth the sacrifice for modular PCs. It's not remotely worth the sacrifice to graphics performance. We'd still be doing the "literally nothing but increasing core count and clocks" race that's all AMD can do for GPUs if Nvidia needed to involve other manufacturers in their giant leaps forward.

conciselyverbose, in gaming on Pokemon Sleep Hits 10 Million Downloads Milestone and grants users a special gift

Yeah. In retrospect, I think I was wrong. It would be "save $500 to unlock" on categories where the "50% off" version is still a 200% markup.

conciselyverbose, in games on AMD claims there’s nothing stopping Starfield from adding Nvidia DLSS

Going through a standards group is a massive compromise. It in and of itself completely kills the marriage between the hardware and software designs. Answering to anyone on architecture design is a huge downgrade that massively degrades the product.

conciselyverbose, in games on AMD claims there’s nothing stopping Starfield from adding Nvidia DLSS

CUDA was first, and worked well out of the gate. Diverting resources that could have been spent improving CUDA into an ecosystem that was outright bad for a long time didn't make sense.

G-Sync was first, and was better because it solved a hardware problem with hardware. It was a decade before displays shipped by default with hardware where solving it in software was anything short of laughable. There was nothing Nvidia could have done to make FreeSync better than dogshit. The approach was terrible.

DLSS was first, and was better because it came with hardware capable of actually solving the problem. FSR doesn't, and is inherently never going to be nearly as useful because of it. The cycles saved are offset significantly by the fact that it needs its own cycles on the same hardware to work.
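That trade-off can be put in rough numbers. A minimal sketch, with every frame time below hypothetical (chosen only to illustrate the argument, not measured from any real GPU or upscaler):

```python
# Illustrative frame-time budget for "render at lower resolution, then upscale".
# All numbers are hypothetical examples, not benchmarks of DLSS or FSR.

def effective_frame_time(native_ms, pixel_fraction, upscale_cost_ms):
    """Total frame time when rendering a fraction of native pixels plus an upscale pass.

    native_ms: hypothetical frame time at native resolution
    pixel_fraction: fraction of native pixels rendered (e.g. 0.25 for half res per axis)
    upscale_cost_ms: time the upscale pass itself takes
    """
    return native_ms * pixel_fraction + upscale_cost_ms

native = 20.0  # hypothetical 4K native frame time (50 fps)

# Upscaler running on dedicated units: the pass barely touches the shader budget.
dedicated = effective_frame_time(native, 0.25, 0.5)

# Upscaler sharing the same shader cores: its cost comes out of the very
# budget the lower-resolution render just freed up.
shared = effective_frame_time(native, 0.25, 2.5)

print(dedicated)  # 5.5 (ms)
print(shared)     # 7.5 (ms)
```

Both cases are faster than native here, but the shared-hardware version hands a chunk of the savings straight back, which is the "offset significantly" point above.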

Opening the standard sounds good, but it doesn't actually do much unless you also compromise the product massively for compatibility. If you let AMD call FSR "DLSS" because they badly implement the methods, consumers don't get anything better. AMD's "DLSS" still doesn't work, people now think DLSS is bad, and you get accused of gimping performance on AMD because their cards can't do the math, all while also making design compromises to facilitate interoperability. And that's if they even bother doing the work. There have been Nvidia technologies that could run on competitors' cards, and that's exactly what happened.

conciselyverbose, in games on AMD claims there’s nothing stopping Starfield from adding Nvidia DLSS

They can't improve OpenCL. They can make suggestions or proposals, but because broad compatibility is the priority, most of it wouldn't get added. They'd be stuck with a worse instruction set, with tooling that spends half its effort dealing with all the different hardware it has to be compatible with.

CUDA is better than OpenCL. G-Sync was better than FreeSync (though the gap has closed enough that FreeSync is viable now). DLSS is better than FSR. None of them are small advantages, and they were all created before there was anything else available to use, even if Nvidia had wanted to. Supporting any alternative in place of their own tech would have been a big step back, and would have meant abandoning what they had just sold their customers.

It's not "pro consumer". It absolutely is "pro technology", though. Nvidia has driven graphic and gpgpu massively forward. Open technology is nice, but it has limitations as well, and Nvidia's approach has been constant substantial improvement to what can be done.

conciselyverbose, in games on AMD claims there’s nothing stopping Starfield from adding Nvidia DLSS

It's a hardware level feature, though. The reason they didn't support hardware prior to RTX was because they didn't have the tensor cores to do the right math.

FSR is substantially less capable because it can't assume it has the right hardware to get the throughput DLSS needs to work. I know the "corporations suck" talking point is fun, and there's some truth to it, but most of the proprietary stuff Nvidia does is either first or better by a significant bit. They use the marriage of hardware and software to do things you can't do effectively with broad compatibility, because the software exploits the architecture of the cards it's designed for (and of the cards going forward) extremely effectively.

conciselyverbose, in games on AMD claims there’s nothing stopping Starfield from adding Nvidia DLSS

I guess they could just use FSR as a wrapper for DLSS, but they made DLSS because there was nothing like it available, and it leverages the hardware to blow the doors off of FSR. They're not comparable effects.
