
XbSuper, in games, re: CD Projekt recommends starting a new game when Cyberpunk 2077 Update 2.0 drops: 'starting fresh will enhance your overall gameplay experience'

Cool, maybe I’ll actually pick this up next time it’s on sale (and includes the DLC).

Torque2101, in games, re: CD Projekt recommends starting a new game when Cyberpunk 2077 Update 2.0 drops: 'starting fresh will enhance your overall gameplay experience'

I'll take "things that should be obvious if you're not a gonk" for 100, Alex.

Donebrach,

Seriously, was anyone not gonna do this?

neokabuto,

I don’t have the free time anymore to start over if I want to play the DLC this year.

Retreaux,

Seriously, the amount of free time corpos assume their consumer base has is WILD, but it just feeds the ‘must have 100% market share’ mentality that drives the culture, as we lose every shred of our living moments on anything but living.

Chev, (edited) in games, re: CD Projekt recommends starting a new game when Cyberpunk 2077 Update 2.0 drops: 'starting fresh will enhance your overall gameplay experience'

In the best case, I’ll play the game after the first patch following 2.0 + PL. Good thing I’m still on my first playthrough of Baldur’s Gate 3. So if the patch comes around the weekend of October 6th, I should be fine.

candyman337, (edited) in games, re: Stacked 3D cache is coming to Intel CPUs, and gamers should be excited (should we?)

Oh boy, can’t wait to have CPUs that burn a hole right through their coolers.

I’d really love it if we just had a generation or two where they focused on making CPUs more efficient and less hot rather than ramping power every generation. Same with GPUs.

deranger,

This only got bad with the most recent generation of CPUs. AMD’s 5xxx series is very efficient, as demonstrated by Gamers Nexus. The Intel CPUs from the 2500K to, idk, the 8xxx series? were efficient until they started slapping on more cores and then cranking the power.

candyman337,

Yes, the second thing, about cranking power and cores, is what I’m talking about.

Also, as for GPUs, the 2000 series was ridiculously power hungry at the time, and it looks downright reasonable now. It’s like the Overton window of power consumption lol.


deranger, (edited)

I dunno, I ran a 2080 on the same PSU that I used on a 2013 build, a 650W Seasonic. Got some graphs? Power consumption didn’t seem to jump that badly until the latest gen.

My current 3090 is a power hog though; that’s when I’d say it started for Nvidia (the 3000 series). For AMD, the 7000 series CPUs, and I’m not really sure for Intel. The 9900K was the last Intel CPU I ran, and it seemed fine. I was running a 9900K/2080 on the same PSU as the 2500K/570 build.

candyman337, (edited)

As far as the 2080 goes, like I said, it was big FOR THE TIME and power hungry FOR THE TIME. It’s still reasonable, especially by today’s standards.

As for the last two gens, the 3000 and 4000 series are known to draw more than their rated power requirements. In terms of minimum recommended PSU wattage, the 3080 was 50 watts more than the 2080 (750W), and the 4080 was 100W more than that (850W).

To add to that, both of these gens of cards can overdraw power when doing graphics-intensive things like gaming, and have been known to cause hard shutdowns in PCs with PSUs rated even slightly above their minimum recommendation. Before these last two gens you could get away with a slightly lower-than-recommended wattage PSU and sacrifice a little performance, but that is definitely no longer the case.

And sure, the performance per watt is better on the 3080, but they also run 10+ degrees hotter, and the 4000 series even more so.

I just hope the 5000 series goes the route of refining power consumption rather than smashing more chips onto a board or VRAM fuckery like with the 4060. I’d be happy with similar performance from the 5000 series if it were less power hungry.

Fermion,

The 7000 series is more efficient than the 5000 series; it’s just programmed to go as fast as thermals allow. So the reviewers who put really powerful coolers on the CPUs saw really high power draw. If you instead set a power cap, you get higher performance per watt than the previous generations.

Having the clocks scale to a thermal limit is a nice feature to have, but I don’t think it should have been the default mode.

dudewitbow,

Intel became less efficient because of how long they were stuck on 14nm. To compensate and beat AMD in performance mindshare, they needed to push clocks hard.

Over time, CPUs have been shipping closer to their max clock, defeating the purpose of overclocking for many. Adding 1GHz used to be not out of the ordinary; now getting 0.5GHz is an achievement.

ono,

I felt the same when the current-gen CPUs were announced, but when I looked closer at AMD’s chips, I learned that they come with controls for greatly reducing the power use with very little performance loss. Some people even report a performance gain from using these controls, because their custom power limits avoid thermal throttling.

It seems like the extreme heat and power draw shown in the marketing materials are more like competitive grandstanding than a requirement, and those same chips can instead be tuned for pretty good efficiency.
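For anyone curious what those controls look like in practice, here is a minimal sketch of the idea, assuming a Linux box that exposes the kernel’s RAPL powercap interface. The intel-rapl domain name below is Intel’s; on AMD desktop chips the equivalent PPT/TDC/EDC caps usually live in the BIOS’s Precision Boost Overdrive menu instead, so treat the path as illustrative:

    # A minimal sketch, not a tested tool: cap CPU package power via
    # Linux's RAPL powercap sysfs interface (requires root). The
    # intel-rapl:0 domain name is an assumption; check what your own
    # system actually exposes under /sys/class/powercap.
    RAPL = "/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw"

    def set_package_cap_watts(watts: int) -> None:
        # The sysfs file takes the limit in microwatts.
        with open(RAPL, "w") as f:
            f.write(str(watts * 1_000_000))

    set_package_cap_watts(125)  # e.g. hold a 230 W part to 125 W

The exact knob differs by vendor, but the point stands: the cap is a policy choice, and a modest one costs surprisingly little performance.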

candyman337,

Yeah, I’m talking about Nvidia and Intel here, but tbh Ryzen 4000 CPUs run pretty hot. They also optimized Ryzen quite a bit before they changed to this new chipset, which makes sense to me. It seems like Nvidia and Intel sometimes worry about what looks good power-wise on paper rather than about optimization.

dudewitbow,

AMD used the 290/390 to compete with Nvidia’s 970; people bought Nvidia anyway, and the “shoulda bought a 390” meme was born after the 3.5GB VRAM controversy happened. AMD got mocked for high power consumption.

AMD released the 6000 series GPUs to compete with Nvidia’s Ampere line, with notably lower power draw; people still bought Nvidia.

Power draw was never part of the equation.

candyman337,

That’s because Nvidia still has a leg up on RTX, but that doesn’t mean Nvidia shouldn’t be thinking about it. I’m not talking about what the market directs them to do; I’m talking about what I personally hate.

dudewitbow, (edited)

I mean, they technically did this generation. All of the RTX 4000 cards sans the 4090 are fairly efficient… only because Nvidia moved the GPU names down a tier for everything that’s not the halo card.

Point is, you can’t have everything, and people generally prioritize performance first, because efficiency has rarely given either GPU company more profit.

If you cared about efficiency, Nvidia’s answer would be to buy their RTX 4000 SFF Ada (75W, ~3060 Ti performance) or RTX 6000 Ada… if you can afford it.

ivanafterall, in gaming, re: Microsoft would buy Valve 'if opportunity arises,' said Phil Spencer in leaked email

God, please, no. If ever you heed your humble servant...

ono,

Not to worry. I think this qualifies as a “cold day in Hell” situation.

Bizarroland,

Somebody please tell Gabe that even if he walks away with billions of dollars, he's going to lose his soul in the process.

It's just not worth it, tell Microsoft to go take a long walk off of a short pier into a vat of battery acid.

canis_majoris, in games, re: Stadia's death spiral, according to the Google employee in charge of mopping up after its murder

One of the main issues with Stadia is that they didn’t even do the basics. I saw basically no marketing, and on top of that, I heard all kinds of rumors about the business model that were entirely false. They made no effort to combat the misinformation. It was never the case that you literally had to purchase the game on top of the subscription fees, but that was like the number one issue brought up in every discussion.

Facebones,

It’s been how long now? TIL that was false. 🤷

canis_majoris,

I know, right? Service has been down almost an entire year.

Facebones,

The “pay for a sub, then buy games on top of that” model was 100% how I heard it worked, and I NEVER heard anything different from anywhere.

That’s kinda nuts.

conciselyverbose,

It was basically true.

There was a bad-experience version you could use without a subscription for games you purchased outright, and they included "free" games with your subscription, but to get a reasonable experience you had to pay for both.

Chozo,

The subscription was only necessary if you wanted to play in 4K or wanted "free" monthly games. Everything else worked just fine without the sub, with no change to performance.

conciselyverbose,

The subscription was absolutely required for performance not to be a complete dumpster fire.

The free tier wasn't mediocre. It was unplayable.

HarkMahlberg, (edited)

From everything I can see, you did have to buy games on Stadia. They would give you a free game a month, but if that wasn't the game you wanted to play, you had to buy it. The base version of Stadia was free, but the Pro version gave you a discount on games - it did not make them free.

This is the official support forum, and there are many Q&As about purchasing games:

https://community.stadia.com/t5/Payments-Billing/Can-t-buy-games-in-the-Store-OR-HDT-01/m-p/52482

Got my Stadia Pro account with a credit card...

... If you have an Android device, you can also try via the Stadia app to purchase games (once purchased, you can play them everywhere, on mobile, TV or PC).

Stamau123,

So it wasn’t bullshit? Well, in the end the environment was confusing, and thus it died.

conciselyverbose,

The "wrong" part was that you could theoretically play games you owned without the subscription active.

But it was downgraded heavily enough that it wasn't really worth doing.

Astroturfed,

I couldn’t figure out how to do anything with one without paying the subscription. The interface was horrible and clearly designed to force you into subscribing before you could even use the thing.

Molecular0079,

It was never the case that you literally had to purchase the game on top of the subscription fees

It depends on the game. There were a bunch of games under “Stadia Play” that came along with the subscription, GamePass style. And then there were games you had to outright purchase.

Trihilis,

The main problem with Stadia was Google. I knew it was doomed from the start, and that’s why I never bothered with it. I actually know a lot of people who didn’t bother with it because it was from Google. It’s basically a self-fulfilling prophecy at this point that most of their shit ends up in the Google graveyard.

A lot of people actually don’t trust Google anymore since they’ve already been screwed over many times by them.

DarkGamer, in games, re: Stadia's death spiral, according to the Google employee in charge of mopping up after its murder

Because everything ran locally at a datacenter, the real killer app of Stadia would have been a super-massively multiplayer game. There wouldn't be any problems with latency between game states (any lag would be between the server and the client). Imagine massive wars or medieval battles with thousands of participants. They never developed games that took advantage of what was unique about the platform.

merc,

AFAIK, MMOs keep all the game state on the servers already. The difference is that what they send to the client is deltas to key game state, which the client then renders. Stadia-type services instead render that on the datacenter side and send the client images.
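As a toy sketch of that difference (the Entity type and its fields here are invented for illustration, not any real game’s protocol), an MMO-style server sends something like the tiny delta below each tick, while a Stadia-style service would send a rendered frame instead:

    # Toy sketch of MMO-style state deltas; Entity and its fields are
    # made up for illustration.
    from dataclasses import dataclass

    @dataclass
    class Entity:
        id: int
        x: float
        y: float
        hp: int

    def delta(old: Entity, new: Entity) -> dict:
        # Send only the fields that changed, plus the entity id.
        changed = {k: v for k, v in vars(new).items()
                   if k != "id" and vars(old)[k] != v}
        changed["id"] = new.id
        return changed

    print(delta(Entity(7, 10.0, 4.0, 100), Entity(7, 10.5, 4.0, 93)))
    # {'x': 10.5, 'hp': 93, 'id': 7} -- a few bytes vs a whole video frame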

With their expertise at networking and so on, Google might have been able to get a slight advantage in server-to-server communication, but it wouldn’t have enabled anything on a whole different scale, AFAIK.

IMO, their real advantage was that they could have dealt with platform switching in a seamless way. So, take an addictive turn-based game like Civilization. Right now someone might play 20 turns before work, then commute in, think about it all day, then jump back in when they get home. With Stadia, they could have let you keep playing on your cell phone as you take the train into work. Play a few turns on a smoke break. Maybe play in a web browser on your work computer if it’s a slow day. Then play again on your commute home, then play on the TV at home, but if someone wanted to watch a show, you could go up and play on a PC, pull out your phone, or play on a laptop…

DarkGamer,

Larger-scale massively multiplayer capability was one of the features Google was touting at Stadia's launch:

Over time, Buser [Google’s director of games] says we should not only see additional exclusive games on Stadia, but also cross-platform games doing things on Stadia “that would be impossible to do on a console or PC.” Instead of dividing up virtual worlds into tiny "shards" where only 100 or 150 players can occupy the same space at a time because of the limitations of individual servers, he says Google’s internal network can support living, breathing virtual worlds filled with thousands of simultaneous players.
https://www.theverge.com/2019/6/6/18654632/google-stadia-price-release-date-games-bethesda-ea-doom-ubisoft-e3-2019

merc,

Sure, they claimed that, but it’s telling that nobody ever took them up on that.

Google’s internal network may be good, but it’s not going to be an order of magnitude better than you can get in any other datacenter. If getting thousands of people into the same virtual space were just a matter of networking, an MMO would have already done it.

A shard is going to be storing the position, orientation, and velocity of key entities (players, vehicles, etc.) in memory. If accessed frequently enough, they’ll be in the processor’s cache. There’s no way networking speeds can compare with the speed of accessing that data.

That doesn’t mean there couldn’t have been some kinds of innovations. Take a game like Star Citizen, where there are space battles. In theory you could store the position and orientation of everything inside a ship in one shard, and the position and orientation of the ships themselves in a second shard. Since people inside the ship aren’t going to interact directly with things outside the ship except via the ship, you could maybe afford a bit of latency and inaccuracy there. But if you’re just talking about a thousand-on-thousand melee, I think the latency between shards would be too great.
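Some back-of-envelope numbers make that gap concrete; these are commonly cited ballpark latencies, not measurements from Stadia or any real shard server:

    # Rough orders of magnitude for why cross-shard hops can't compete
    # with local memory; ballpark published figures, not measurements.
    L1_NS = 1              # on-die cache hit
    DRAM_NS = 100          # main-memory access
    NET_RTT_NS = 100_000   # ~0.1 ms server-to-server round trip

    print(f"network hop vs DRAM: {NET_RTT_NS / DRAM_NS:,.0f}x slower")
    print(f"network hop vs L1:   {NET_RTT_NS / L1_NS:,.0f}x slower")

Even an ideal datacenter network is roughly a thousand times slower than a local memory access, which is why a chatty cross-shard protocol for a single giant melee doesn’t pencil out.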

EnglishMobster,

You’d only be able to play with people local to you, in the same Stadia datacenter. If Stadia wanted to minimize latency, they would increase the number of datacenters (thus making fewer people per instance).

DrQuint, (edited) in games, re: Stadia's death spiral, according to the Google employee in charge of mopping up after its murder

I will never, ever understand why Stadia was something that had to be “ported to” at such high cost, especially for games that were ALREADY working on Linux. Like, what the fuck was the holdup? I read stories that it was basically like porting to a fourth console, and that just sounded outrageously stupid to me.

Whatever tech stack they had, they could have made it way more profitable by using generic Windows boxes that partially run your library elsewhere. I dunno if there’s some hubris or some licensing bullshit behind it, but the fact is, if I want to do this on GeForce Now, I can, no questions asked, and as the customer, that’s the beginning and end of my concerns.

redcalcium, (edited)

Google engineers always choose the hardest route to solve problems. Why wouldn’t they? If your products are going to be shut down in a few years anyway, might as well have a glowing resume from working on those products (resume-driven development).

Think about it: every time Google made a product with a sensible tech stack, that product was actually started outside Google and later bought by Google (Android, YouTube, etc.). If Google had made Android from scratch, there is no way they’d have used Java and Linux; they’d have invented a new language and made their own kernel instead (just like Fuchsia OS, which might be canned soon).

anemomylos,
  • Kotlin: "are you talking to me?"
hesusingthespiritbomb,

Kotlin was made by JetBrains and later adopted by Google.

sznio,

But Kotlin is actually an improvement over Java.

Golang thoooooo

atocci,

TIL Fuchsia hasn't been killed quite yet.

smeg,

Does it actually even exist? I feel like I’ve been getting whispered rumours about it for years and years, but never anything solid!

atocci,

Yes! Nest Hub devices run it

smeg,

Oh wow, I’ll have to have a read up

merc,

might as well have a glowing resume from working on those products (resume-driven development).

This is so true. Getting promoted requires showing impact. If you use off-the-shelf tools (that happen to be easily maintainable) that’s not an impressive impact. If you invent a new language (and make up a convincing reason it was necessary) and so-on, that’s really impressive and you can get promoted. The minefield you leave behind that makes maintaining your solution so difficult is just another opportunity for someone else to get promoted.

Zeth0s, (edited)

Only Microsoft can run Windows decently in decently big data centers, because they can tweak it, as they do for the Xbox OS as well. For everyone else, scaling Windows Server VMs or containers is a pain, because Windows is a bad, poorly optimized, resource-hungry OS developed with the main goal of making hardware obsolete every 3-5 years.

I don’t know what Nvidia is doing, but when I use it at my friends’ places, the lag is painful.

Linux was the right call in theory; in practice, the gaming industry is pretty broken on the PC side with its lock-in on Windows, as we see with every new AAA port… Let’s hope Valve can save it, but I have my doubts.

smeg,

I don’t think the people downvoting you have ever experienced the pain of dealing with Windows in a cloud environment

Pxtl, (edited)

No, we’re downvoting because of conspiracy theories about planned obsolescence.

Yes, it’s disappointing how hardware requirements climb for minimal appreciable improvement, but Hanlon’s Razor applies.

Zeth0s, (edited)

It is not a conspiracy, though. Planned obsolescence is a well-known, real thing. There’s a reason Unix computers last longer on average than Windows computers, and Linux is the stereotypical OS for old PCs.

If people are downvoting for this, they should learn how computers and operating systems work.

Zeth0s, (edited)

Don’t worry, I was expecting the downvotes. This place is full of angry Windows fanboys who believe they’re tech experts because they watch LTT and can install a Skyrim mod. Fewer than on Reddit, luckily.

Astroturfed,

The thing was clearly designed to force you into paying a subscription fee. You can’t let people have something they could easily use to play games that aren’t in your subscription if your entire purpose is to milk a monthly subscription from the users. Google, fuck you, capitalism woohoo.

spankinspinach, in games, re: CD Projekt recommends starting a new game when Cyberpunk 2077 Update 2.0 drops: 'starting fresh will enhance your overall gameplay experience'

I’m so excited to play this on PS5. I survived a 50-hour playthrough on PS4, liked the story enough, but mostly just ground my way through due to shitty performance. This time I wanna just enjoy it as it was meant to be played :)

Kaldo, (edited) in games, re: Cyberpunk 2077 2.0's revamped police force is finally good enough

Tbh I'm still not sure what the point of it is. In GTA V you get into trouble with the police if you rob shops, steal cars, or drive over pedestrians, among other things like scripted missions. In Saints Row it's about gang warfare and the cops being a nuisance during your city demolition. In Mafia you have to obey road laws and hide weapons from plain sight, and the police are generally a bigger threat.

You can't rob stuff or do heists in 2077; you can summon your own car for free at any point, so there's no need to steal one, and since you can fast travel you don't drive much anyway. The missions that do have car chases are heavily scripted and on rails.

Is this something just for people who want to go out of their way to fight endless waves of cops and that's it, or am I missing something that makes it such a hype-worthy feature?

SeedyOne,

It was a huge immersion breaker for anyone not going stealth/low-profile (as the author admits he does). In fact, it was the reason I hadn’t played until now. I guess I’m a patient gamer, and what was missing at launch irked me. I’d built my 2070 machine for this game years ago, and now I’m stoked to have a 3080 to break it in with.

Kaldo,

Seems more immersion-breaking to me that you can fight MaxTac and get away with it in the first place, or that they all still just forget about you if you hide out of sight for a minute or two, but we'll see. Maybe I'm just missing something and will appreciate it more in-game.

Potatos_are_not_friends,

Driving was a huge immersion breaker. After 15 feet, there were literally no pedestrians, and there were barely any other cars on the road.

TigrisMorte, in games, re: CD Projekt recommends starting a new game when Cyberpunk 2077 Update 2.0 drops: 'starting fresh will enhance your overall gameplay experience'

As they've rearranged and completely changed lots of the controls and menus, I'll need to start a new campaign just to learn to play it.

Murvel, (edited) in games, re: Cyberpunk 2077 2.0's revamped police force is finally good enough

That just leaves the other half of the game to be fixed. I swear I gave it another go a month ago, and it was as buggy as the first time I played it, just with new bugs this time around.

edit: it’s still a buggy fucking game

paddirn,

I never suffered from the game-stopping bugs others had, but IMO the other half of the game that needs fixing is the storyline itself. The character’s storyline is so linear that nothing you do really makes a difference; your background is just reduced to a set of extra dialogue choices. It’s all just window dressing over a very on-rails game. All they needed to do was copy GTA V more and it’d be an improvement over this game.

notaviking, in games, re: Stacked 3D cache is coming to Intel CPUs, and gamers should be excited (should we?)

Imagine a ~6GHz CPU with 3D cache. Now we just have to wait for LTT to fuck up the graphs.

KalabiYau, in games, re: Stacked 3D cache is coming to Intel CPUs, and gamers should be excited (should we?)

I think so. The AMD 3D cache CPUs are impressive in terms of gaming performance (though the inability to overclock them still leaves some benefit to non-3D-cache CPUs, which are also great for everything other than gaming).

Decoy321, (edited) in games, re: Former BioWare manager wishes Dragon Age had kept a 'PC-centric' and 'modding-driven' identity like Neverwinter Nights

You know what kind of news about Dragon Age would actually interest me? News on a goddamn release date. It’s been 9 years. They can either make the damn game or shut the fuck up.

micka190,

or shut the fuck up

I’d love some news about them going back to the dark fantasy and writing style of Origins/Awakening over the high fantasy BS Inquisition set them on, tbh.

Signtist, (edited)

Origins’ story was so good that it got me to go to the library at the height of my teenage “reading is lame” phase just to get more exposition from the books. I really wish they’d stayed in that vein for the sequels.

stopthatgirl7,

At this point, I won’t believe Dreadwolf is actually coming out until it’s on store shelves.

sugar_in_your_tea,

That’s how I treat all games, and even then I try to ignore it until a few weeks after launch when most of the worst bugs are patched.

ZOSTED,

There’s another DA game in the works??

Decoy321,

There’s been one the whole time. The question is whether or not they’ll actually finish it.
