games


altima_neo, in Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

I’ll believe it when I see it.

I mean, even Skyrim ran pretty nicely, till you played it long enough to start finding the bugs and jank. Of course, it helped that it had all the familiar jank from the previous games.

Gormadt, in Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

That’s a really low bar, NGL

I’m going to wait for launch and reviews for sure

boeman,

I was thinking the same thing. I’m sure launch will be a bit of a shit show, but at least we usually get some entertaining bugs.

NigelFrobisher, in Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

No fatal accidents in this workplace in over 30 days!

Pratai, in Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

What a sales pitch!

peopleproblems, in Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

I’m a little disappointed. I was looking forward to Skyrim bugs in Space.

Kolanaki, (edited)

The Giant Club space program could now actually send you to space, but there are no giants 😔

vaultdweller013,

That we know of

nanoUFO, in Killer Klowns from Outer Space: The Game - Official Gameplay Teaser Trailer

It’s wild that something was made from this IP in current year.

Lenny,

That is wild. Someone was talking about being scared of clowns as a kid and it reminded me of this movie. I probably only caught 30 min of it on cable but it was so wtf (to a young me) that it was enough to never forget it. Never thought I’d see it come back!

Hubi,

Absolutely. I first saw the movie a few years ago at a local horror festival and loved it. It has held up really well, in a very cheesy way. Still, a sequel would’ve surprised me… and a video game? I never would’ve guessed that.

FrankTheHealer, in Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

Do the NPCs in this game give anyone else a sort of uncanny valley feel?

Call_Me_Maple, in Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

I don’t believe you.

FoundTheVegan,

Seems like a standard marketing move to get ahead of the meme. We'll see how this article ages by next week, but I'm pretty sus. 😂

BallShapedMan, in Armored Core VI peaks at 150K peak concurrent Steam players on day 1, making it the 4th biggest launch of 2023. It's also the second biggest From Software launch ever, second only to Elden Ring

And this just in, I suck at it.

buffalo,

Something I’m struggling with: I don’t like it, but I want to.

I’ve only played like 2 hours, and I only bought it because it’s FromSoftware, but it looks soooo cool. I suck at most FromSoft games in the beginning, but the repetition and watching yourself get better as you master the mechanics was what got me.

But I don’t know with this one. I’m not complaining, everyone has their cup of tea. Just can’t decide if I’m going to give up on it or keep trying. I’m very very happy that it’s successful tho!

oxideseven,

Every single Armored Core.

I’ve wanted to like this series and I never really do lol

FordBeeblebrox,

Funny, I’ve been an AC fan my entire life but none of the Souls games have drawn me in. Different strokes I guess

oxideseven,

I despise the Souls games and Souls-likes. I just really want a mech game I can get into, and I never do get into them. Not since like Front Mission 3 or Xenogears or something.

Kolanaki, (edited)

Well… Front Mission and Xenogears, while featuring giant mechs, are RPGs/tactical strategy games and not “mech games.”

“Mech” games are typically what AC is, or more of a tank simulator where your tank just has 2 legs and arms.

Necromnomicon,

I am in the same boat as you. I’m so happy a new Armored Core game is here.

BallShapedMan,
@BallShapedMan@lemmy.world avatar

I’ve found the build of my AC has a lot to do with how successful I am. I’m rocking two shotguns and lasers on each shoulder and right now I’m murdering most things.

Kolanaki, (edited)

If I was able to beat Steel Battalion, I sure as shit can handle this. Even if the boss at the end of chapter 1 is a huge douche nozzle with his god damn wall of missiles, speedy movement for his ridiculous size, and god damn shield that sometimes doesn’t even take damage when I chip away at it; or my guns won’t fire despite not being mid-reload or overheated…

Plibbert, in Killer Klowns from Outer Space: The Game - Official Gameplay Teaser Trailer

Dude, this movie is what gave me a fear of clowns as a kid lol.

BlinkAndItsGone, in AMD claims there’s nothing stopping Starfield from adding Nvidia DLSS

Here’s the most important part IMO:

He admits that — in general — when AMD pays publishers to bundle their games with a new graphics card, AMD does expect them to prioritize AMD features in return. “Money absolutely exchanges hands,” he says. “When we do bundles, we ask them: ‘Are you willing to prioritize FSR?’”

But Azor says that — in general — it’s a request rather than a demand. “If they ask us for DLSS support, we always tell them yes.”

So developers aren’t forced contractually to exclude DLSS, but outside the contract language, they are pressured to ignore it in favor of FSR. That explains why these deals tend to result in DLSS being left out, and also why there are some exceptions (e.g. Sony games; I imagine Sony knows what features it wants its PC releases to have and has decided to push back on excluding DLSS). I think AMD is being honest this time, and I’m surprised it admitted publicly that it’s doing this. Hopefully the word about this will get out and more developers will insist on including DLSS.

rivalary,

I wish Nvidia and AMD would work together to create these features as open standards.

sugar_in_your_tea,

Well, FSR is open, as is FreeSync and most other AMD tech, it’s just that NVIDIA is so dominant that there’s really no reason for them to use anything other than their own proprietary tech. If Intel can eat away at NVIDIA market share, maybe we’ll see some more openness.

conciselyverbose,

I guess they could just use FSR as a wrapper for DLSS, but they made DLSS because there was nothing like it available, and it leverages the hardware to absolutely blow doors off of FSR. They're not comparable effects.

sugar_in_your_tea,

Last I checked, DLSS requires work by the developers to work properly, so it’s less “leveraging the hardware” and more “leveraging better data,” though maybe FSR 3 has a similar process.
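
For what it’s worth, here’s a minimal C++ sketch of what that “better data” looks like. Everything below is a hypothetical stand-in, not any vendor’s real SDK; the point is just the shape of the per-frame inputs a DLSS/FSR 2-style temporal upscaler needs the engine to provide:

    #include <cstdio>

    // Hypothetical stand-in for a GPU texture handle; a real integration
    // would pass resources from the rendering API.
    struct Texture2D {};

    struct UpscalerFrameInput {
        const Texture2D* color;         // low-resolution rendered frame
        const Texture2D* depth;         // per-pixel depth buffer
        const Texture2D* motionVectors; // screen-space motion per pixel
        float jitterX, jitterY;         // sub-pixel camera jitter this frame
    };

    struct Upscaler {
        // Temporal reconstruction: combine the current low-res frame with
        // history, guided by depth and motion vectors (stubbed here).
        void evaluate(const UpscalerFrameInput& in, Texture2D* fullResOut) {
            (void)in; (void)fullResOut;
            std::puts("upscaled one frame");
        }
    };

    int main() {
        Texture2D color, depth, motion, output;
        Upscaler upscaler;
        // The per-title developer work: the engine has to generate motion
        // vectors and jitter its projection matrix every single frame.
        UpscalerFrameInput frame{&color, &depth, &motion, 0.25f, -0.25f};
        upscaler.evaluate(frame, &output);
    }

Generating those motion vectors and jittered projection matrices is exactly the per-title integration work I mean.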

conciselyverbose,

It's a hardware-level feature, though. The reason they didn't support hardware prior to RTX was because they didn't have the tensor cores to do the right math.

FSR is substantially less capable because it can't assume it has the correct hardware to get the throughput DLSS needs to work. I know the "corporations suck" talking point is fun and there's some truth to it, but most of the proprietary stuff Nvidia does is either first or better by a significant margin. They use the marriage of hardware and software to do things you can't do effectively with broad compatibility, because they use the architecture of the cards it's designed for (and going forward) extremely effectively.

sugar_in_your_tea,

I think it’s more the other way around. They designed the feature around their new hardware as a form of competitive advantage. Most of the time, you can exchange cross-platform compatibility for better performance.

Look at CUDA vs OpenCL, for example. Instead of improving OpenCL or making CUDA an open standard, they instead double down on keeping it proprietary. They probably get a small performance advantage here, but the main reason they do this is to secure their monopoly. The same goes for GSync vs FreeSync, but it seems they are backing down and supporting FreeSync as well.
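
To make the lock-in concrete, here’s a minimal CUDA C++ sketch (the classic SAXPY example, compiled with nvcc). Source like this only builds for and runs on NVIDIA hardware; an OpenCL port would need the kernel rewritten in OpenCL C plus a pile of platform/device/queue boilerplate:

    #include <cstdio>

    // SAXPY: y = a*x + y. The __global__ qualifier and the <<<...>>>
    // launch syntax below are CUDA-specific extensions.
    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float)); // unified memory for brevity
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y); // one thread per element
        cudaDeviceSynchronize();                        // wait for the GPU

        printf("y[0] = %f\n", y[0]); // expect 4.0
        cudaFree(x);
        cudaFree(y);
    }

Once a codebase accumulates thousands of kernels like this, moving off NVIDIA means a rewrite, and that rewrite cost is the moat.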

They want you to think it’s a pro-consumer move, but really it’s just a way to keep their competition one step behind.

conciselyverbose,

They can't improve OpenCL. They can make suggestions or proposals, but because broad compatibility is the priority, most of it wouldn't get added. They'd be stuck with a worse instruction set, with tooling that spends half its time trying to figure out all the different hardware compatibility you have to deal with.

CUDA is better than OpenCL. GSync was better than FreeSync (though the gap has closed enough that FreeSync is viable now). DLSS is better than FSR. None of them are small advantages, and they were all created before there was anything else available, even if Nvidia had wanted an alternative. Supporting any of them in place of their own tech would have been a big step back and abandoning what they had just sold their customers.

It's not "pro consumer". It absolutely is "pro technology", though. Nvidia has driven graphic and gpgpu massively forward. Open technology is nice, but it has limitations as well, and Nvidia's approach has been constant substantial improvement to what can be done.

sugar_in_your_tea,

CUDA is only better because the industry has moved to it, and NVIDIA pumps money into its development. OpenCL could be just as good if the industry adopted it and card manufacturers invested in it. AMD and Intel aren’t going to invest as much in it as NVIDIA invests in CUDA because the market share just isn’t there.

Look at Vulkan: it has a ton of potential for greater performance, yet many games (at least Baldur’s Gate) work better with DirectX 12, and that’s because they’ve invested resources into making it work better. If those same resources were put into Vulkan development, Vulkan would outperform DirectX on those games.

The same goes for GSync vs FreeSync; most of the problems with FreeSync were poor implementations by monitors, or poor support from NVIDIA. More people had NVIDIA cards, so GSync monitors tended to work better. If NVIDIA and AMD had worked together at the start, variable refresh would’ve worked better from day one.

Look at web standards: when organizations worked well together (e.g. to overtake IE 6), the web progressed really well and you could largely say “use a modern browser” and things would tend to work well. Now that Chrome has a near monopoly, there’s a ton of little things that don’t work as nicely between Chrome and Firefox. Things were pretty good until Chrome became dominant, and now it’s getting worse.

It absolutely is “pro technology”

Kind of. It’s more of an excuse to be anti-consumer by locking out competition with a somewhat legitimate “pro technology” stance.

If they really were so “pro technology,” why not release DLSS, GSync, and CUDA as open standards? That way other companies could provide that technology in new ways to more segments of the market. But instead of that, they go the proprietary route, and the rest try to make open standards to oppose their monopoly on that tech.

I’m not proposing any solutions here, just pointing out that NVIDIA does this because it works to secure their dominant market share. If AMD and Intel drop out, they’d likely slow the pace of innovation. If AMD and Intel catch up, NVIDIA will likely adopt open standards. But as long as they have a dominant position, there’s no reason for them to play nicely.

conciselyverbose,

CUDA was first, and worked well out of the gate. Spending those resources instead on an ecosystem that was outright bad for a long time didn't make sense.

GSync was first, and was better because it solved a hardware problem with hardware. It was a decade before displays came by default with hardware where solving it in software was anything short of laughable. There was nothing Nvidia could have done to make FreeSync better than dogshit. The approach was terrible.

DLSS was first, and was better because it came with hardware capable of actually solving the problem. FSR doesn't, and is inherently never going to be nearly as useful because of it. The cycles saved are offset significantly by the fact that it needs its own cycles on the same hardware to work.

Opening the standard sounds good, but it doesn't actually do much unless you also compromise the product massively for compatibility. If you let AMD call FSR DLSS because they badly implement the methods, consumers don't get anything better. AMD's "DLSS" still doesn't work, people now think DLSS is bad, and you get accused of gimping performance on AMD because their cards can't do the math, all while also making design compromises to facilitate interoperability. And that's if they even bother doing the work. There have been Nvidia technologies that have been able to run on competitors' cards, and that's exactly what happened.

sugar_in_your_tea,

Opening the standard… compromise the product massively

Citation needed.

All NVIDIA needs to do is:

  1. release the spec with a license AMD and Intel can use
  2. form a standards group, or submit it to an existing one
  3. ensure any changes to the spec go through the standards group; they can be first to market, provided they agree on the spec change

That’s it. They don’t need to make changes to suit AMD and Intel’s hardware, that’s on those individual companies to make work correctly.

This works really well in many other areas of computing, such as compression algorithms, web standards, USB specs, etc. Once you have a standard, other products can target it and the consumer has a richer selection of compatible products.

Right now, if you want GPGPU, you need to choose between OpenCL and CUDA, and each choice will essentially lock you out of certain product categories. Just a few years ago, the same was true for FreeSync, though FreeSync seems to have won.

But NVIDIA seems to be allergic to open standards, even going so far as to make their own power cable when they could have worked with the existing relevant standards bodies.

conciselyverbose,

Going through a standards group is a massive compromise. It in and of itself completely kills the marriage between the hardware and software designs. Answering to anyone on architecture design is a huge downgrade that massively degrades the product.

sugar_in_your_tea,

How do you explain PCIe, DDR, and M.2 standards? Maybe we could’ve had similar performance sooner if motherboard vendors did their own thing, but with standardization, we get more variety and broader adoption.

If a company wants or needs a major change, they go through the standards body and all competitors benefit from that work. The time to market for an individual feature may be a little longer, but the overall pace is likely pretty similar, they just need to front load the I/O design work.

conciselyverbose,

Completely and utterly irrelevant? They are explicitly for the purpose of communicating between two pieces of hardware from different manufacturers, and obscenely simple. The entire purpose is to do the same small thing faster. Standardizing communication costs zero.

The architecture of GPUs is many, many orders of magnitude more complex, solving problems many orders more complex than that. There isn't even a slim possibility that hardware ray tracing would exist if Nvidia hadn't unilaterally done so and said "this is happening now". We almost definitely wouldn't have refresh-rate-synced displays even today, either. It took Nvidia making a massive investment to show it was possible and worth doing, through a solid decade of completely unusable software solutions, before FreeSync became something that wasn't vomit-inducing.

There is no such thing as innovation on standards. It's worth the sacrifice for modular PCs. It's not remotely worth the sacrifice to graphics performance. We'd still be doing the "literally nothing but increasing core count and clocks" race that's all AMD can do for GPUs if Nvidia needed to involve other manufacturers in their giant leaps forward.

sugar_in_your_tea,

communicating between two pieces of hardware from different manufacturers

  • like a GPU and a monitor? (FreeSync/GSync)
  • like a GPU and a PSU? (the 12v cable)

DLSS and RTX are the same way, but instead of communicating between two hardware products, it’s communicating between two software components, and then translating those messages onto commands for specialized hardware.

Both DLSS and RTX are simpler, more specific cases of GPGPU, so they likely could’ve opened and extended CUDA, extended OpenCL, or extended Vulkan/DirectX instead, with the hardware reporting whether it can handle DLSS or RTX extensions efficiently. CPUs do exactly that for things like SIMD instructions, and compilers change the code depending on the features that CPU exposes.
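
That CPU analogy is easy to show in code. Here’s a minimal sketch using GCC/Clang’s __builtin_cpu_supports on x86 (the two function names are made up for illustration; the builtin itself is real):

    #include <cstdio>

    // Two implementations of the same operation: a portable baseline and
    // a path that would use AVX2 SIMD intrinsics (stubbed here).
    void sum_scalar() { std::puts("portable scalar path"); }
    void sum_avx2()   { std::puts("AVX2 SIMD path"); }

    int main() {
        // Dispatch at runtime on what the CPU advertises via CPUID:
        // one binary, faster paths taken only where supported.
        if (__builtin_cpu_supports("avx2"))
            sum_avx2();
        else
            sum_scalar();
    }

A GPU API could expose DLSS- or RTX-style capabilities the same way: query for support, then take the accelerated path or fall back.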

But instead, in all of those cases, they went with proprietary tech and minimal documentation. That means it’s intentional: they don’t want competitors to compete directly using those technologies, and instead expect them to make their own competing APIs.

Here’s how the standards track should work:

  1. company proposes new API A for the standards track
  2. company builds a product based on proposal A
  3. standards body considers and debates proposal A
  4. company releases product based on A, ideally after the standards body agrees on A
  5. if there is a change needed to A, company releases a patch to support the new, agreed-upon standard, and competitors start building their own implementations of A

That’s it. Step 1 shouldn’t take much effort, and if they did a good job designing the standard, step 5 should be pretty small.

But instead, NVIDIA ignores the whole process and just does their own thing until either they get their way or they’re essentially forced to adopt the standard. They basically lost the GSync fight (after years of winning), and they seem to have lost the Wayland EGLStream proposal and have adopted the GBM standard. But they win more than they lose, so they keep doing it.

That’s why we need competition, not because NVIDIA isn’t innovating, but because NVIDIA is innovating in a way that locks out competition. If AMD and Intel can eat away at NVIDIA’s dominant market share, NVIDIA will be forced to play nice more often.

conciselyverbose, (edited)

Every single thing about what you're discussing literally guarantees that GPUs are dogshit. There's no path to any of the features we're discussing getting accepted to open standards if AMD has input. They only added them after Nvidia proved how much better they are than brute force by putting them in people's hands.

Standards do not and fundamentally cannot work when actual innovation is called for. Nvidia competing is exactly 100% of the reason we have the technology we have. We'd be a decade behind, bare minimum, if AMD had any input at all in a standards body that controlled what Nvidia can make.

We're not going to agree, though, so I'll stop here.

sugar_in_your_tea,

The process I detailed does not require consensus before a product can be released, it just allows for that consensus to happen eventually. So by definition, it won’t impede progress. It does encourage direct competition, and that’s something NVIDIA would rather avoid.

mindbleach,

Nvidia of all companies does not get to whine about this.

BlinkAndItsGone, (edited)

Well, Nvidia isn’t directly involved here at all; they’ve only commented on the issue once (to say that they don’t block other companies’ upscaling). The objections tend to come from users, the majority of whom have Nvidia cards and want to use what is widely considered the superior upscaling technology.

mindbleach,

Oh, are they annoyed by vendor-specific software, now that it affects them? My heart bleeds.

DrSleepless, in 2023 was the most watched Gamescom Opening Night Live in history. Viewership was up 66% over ONL 2022, and Gamescom had 320,000 attendees across the week in Cologne, Germany.

Starfield is why

mindbleach, in Roblox ‘operates illegal gambling ring that preys on children’: lawsuit

Forget the kids and ignore the odds. Any game taking real money is a scam.

(No that doesn’t mean buying games. No that doesn’t mean subscriptions. No that doesn’t mean expansions. No that doesn’t mean card games. No that doesn’t mean arcades. Jesus Christ, do people find a lot of ways to get mad about nonsense, whenever I say this.)

Nothing inside a video game should cost real money. Absolutely fucking nothing. All possible forms are abuse, built on how games by definition invent value for worthless elements that can be arbitrarily granted or withheld. That is what makes them games.

The business model is intolerable - and if we allow it to continue, there will be nothing else. It’s the dominant strategy. Your disgust and non-participation will never outweigh some tiny fraction of people getting taken for obscene quantities of real money in exchange for incrementing a variable. It’s in free mobile trash. It’s in $70 “AAA” flagship-franchise titles. It’s in single-player, multi-player, subscription MMOs - it’s in everything. There is zero incentive for them not to try robbing you like this. Companies that don’t rob you will make less money than companies that do.

Only legislation can fix this.

Ban the entire business model. (No that doesn’t mean games. No that doesn’t mean content. Jesus Christ, am I tired of dealing with pearl-clutching nonsense, just to say “fuck lootboxes.”)

Overt abuse gets disguised. It’s still abuse. All they’re getting better at is how deep the hooks can slide before people notice.

Content is the bait on this hook. All it’s doing is disguising the abuse. The abuse remains. The abuse is the entire point. The abuse is the only part that makes money.

This business model is a threat to the entire medium, and the only real solution is dead simple. We will be fine without it. We will only be fine, without it.

MomoTimeToDie,

deleted_by_author

mindbleach,

Lootboxes aren’t “buying content.” Buying a game, is. Buying DLC, is. Gambling on a hat that’s already in the game you’re playing is plainly something different, and increasingly, that’s the only source of revenue.

This is not theoretical. We’re already in a stupid sci-fi future where four-billion-dollar games can be """free""" and somehow convince people to spend thousands of dollars apiece on a deluge of random bullshit which is also allegedly free. And it’s not even possible to have a sane argument about this shit-show, because people pretend they don’t understand the thing all these games do.

I want video games to make money the way they did in 2008.

Do you have an opinion about that?

If it goes ‘then games would magically look like 2008 forever,’ stop.

If it goes ‘but then they’d make 2008 kinds of money,’ stop.

This is new. This is bad. This is spreading. We should stop it.

Kranerian, in Starfield is Bethesda's Least Buggiest Game to Date, Say Sources

That's such a low bar that it's clipped through the floor.

mindbleach,

That’s how they do door sills!

Disgusted_Tadpole, in 2023 was the most watched Gamescom Opening Night Live in history. Viewership was up 66% over ONL 2022, and Gamescom had 320,000 attendees across the week in Cologne, Germany.

I’m so glad to read such news. I do hope IRL games conventions won’t die any time soon.
