wccftech.com

pory, to games on The Elder Scrolls VI Is at Least Five Years Away, and Is Likely to Launch on PC, Xbox Series X|S Only
@pory@lemmy.world avatar

No way in hell they’re gonna still be supporting the Series S in five years.

khornechips,

I have a Series X, and I sort of hope they stop supporting it in 5 years. 30 FPS is pretty rough in Starfield as it is.

Nfntordr, to games on Doom Studio id Software is Seemingly Working on new Version of its Game Engine - id Tech 8

I wish there were more games licensed to use id Tech.

mindbleach, to games on Xbox Partners With Alpine F1 Team For Multi-year Sponsor Deal As "Official Console Partner"

Anti-competitive exclusivity is the only thing propping up the facade of console platforms.

Weirdfish, to games on The Elder Scrolls VI Is at Least Five Years Away, and Is Likely to Launch on PC, Xbox Series X|S Only

I’ve always been a fan, but no way I’m buying a second system just for Bethesda games.

Yes, I bought a Switch to play Zelda, but that's where I draw the line.

Skyrim has been released on basically every platform that exists, so I have to assume Starfield and ES6 will eventually release on PS5. That is just too much money to leave on the table.

On the other hand, Demon's Souls was spawned out of a failed PS exclusive meant to go head to head against Oblivion, and I'd dare say the Souls series has given more to gaming than the past decade of Bethesda releases.

Lojcs, to games on The Elder Scrolls VI Is at Least Five Years Away, and Is Likely to Launch on PC, Xbox Series X|S Only

Xbox X Series +X user base in shambles

adriator, to games on The Elder Scrolls VI Is at Least Five Years Away, and Is Likely to Launch on PC, Xbox Series X|S Only

The official announcement teaser for The Elder Scrolls VI came out in June of 2018. That means Bethesda will most likely have started advertising the game a full decade before it comes out, if the game is at least five years away at this point.

azurefirefly, to games on [Rumor] Nintendo Switch 2 SOC May Be Produced on a 5nm Process Node; To Have Max Clock Speed Higher Than 2.5 GHz - Rumor
@azurefirefly@lemmy.basedcount.com avatar

They finally caught up to 2018 technology!

iHUNTcriminals,

I hope they release more articles comparing it to the PS5 while at the same time saying it shouldn't be compared to the PS5.

azurefirefly,
@azurefirefly@lemmy.basedcount.com avatar

Me too

verysoft, (edited )

Doesn't really matter; they don't need the Switch to have bleeding-edge performance, as that isn't why it sells. It has to be affordable, and using older processes helps achieve that.

(Besides, 5nm is only 3 years old.)

parrot-party,
@parrot-party@kbin.social avatar

No, but it does need enough performance to be capable of running games in low-quality modes. The Switch is so anemic that many big-budget games simply aren't even trying anymore, as acceptable performance can't be achieved without complete rewrites of engine code. So a better Switch that is at least a low-spec gaming computer will enable more big games to make the effort of trying to support it.

ForgotAboutDre,

A big issue with modern game developers is bad, inefficient code. Compare Nintendo titles' file sizes and performance to every other big game. I don't think any AAA PC/PS6/Xbox? title is going to run on the most powerful Switch in three years' time.

reallynotnick, to games on [Rumor] Nintendo Switch 2 SOC May Be Produced on a 5nm Process Node; To Have Max Clock Speed Higher Than 2.5 GHz - Rumor

Nintendo underclocked the X1, so all this tells us is some upper bound.

lustrum,

Big time. Their chassis design will dictate performance too. They will get the best chip they can within a cost budget, and then thermal/battery limits will dictate where that chip actually lands.

The Steam Deck is cool and a great device, but the Switch 2 will be sleeker, and Nintendo won't settle for a 90-minute battery or a whiny fan, and that has trade-offs.

parrot-party,
@parrot-party@kbin.social avatar

Hopefully they'll actually use an active dock with cooling this time rather than the simple stand that comes with the Switch 1.

lustrum,

Might actually be a good shout.

Switch 1 had a 720p screen with a 1080p max TV output. That's roughly a 2.25x increase in pixel throughput.

With Switch 2, it's expected to have a 1080p screen and 4K output; that's a 4x increase in pixel throughput. So a 2x output increase might not be adequate.

However, it is widely expected to have DLSS, which would greatly reduce that requirement.
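For anyone who wants to sanity-check those ratios, here's a quick sketch of the raw pixel counts (the Switch 2 figures are the rumored ones discussed above, not confirmed specs):

```python
# Quick check of the pixel-throughput ratios mentioned above.
# 720p/1080p/4K are the standard pixel grids; the Switch 2 figures are rumored.
def pixels(w, h):
    return w * h

p720, p1080, p4k = pixels(1280, 720), pixels(1920, 1080), pixels(3840, 2160)

print(f"Switch 1, handheld -> docked: {p1080 / p720:.2f}x the pixels")  # ~2.25x
print(f"Switch 2, handheld -> docked: {p4k / p1080:.2f}x the pixels")   # 4.00x
```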

ForgotAboutDre,

Cost is going to be a big factor. Nintendo doesn't want the best possible console. They want a good console that they can get into as many hands as possible. Even a simple active dock is going to add £10 to the price.

Laser, to games on [Rumor] Nintendo Switch 2 SOC May Be Produced on a 5nm Process Node; To Have Max Clock Speed Higher Than 2.5 GHz - Rumor

Rumors say it might be possible to run Pokemon games without abysmal frame rates.

geosoco,

That's just misinformation. That'll never happen.

XTornado, (edited )

Unless they are overclocking the developers I doubt it.

Oneeightnine, to games on [Rumor] Nintendo Switch 2 SOC May Be Produced on a 5nm Process Node; To Have Max Clock Speed Higher Than 2.5 GHz - Rumor
!deleted4231 avatar

Absolutely none of that makes any sense to me. What does it (likely) mean for the people who are going to be buying one of these?

geosoco,

basically it just means it's using newer chip-making processes to make the chip smaller and faster. It's sort of a no-brainer that a new chip would use some updated processes and likely run faster than one made 7 or 8 years ago.

NOT_RICK,
@NOT_RICK@lemmy.world avatar

Chips built on a smaller process node are both faster and more power-efficient than chips on a larger node. Smaller transistors have less capacitance and resistance to switch against, and that wasted switching energy is what makes processors hot. Less heat means less energy wasted and more headroom to run the processor at a faster clock.
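As a rough illustration of why that matters, the usual first-order rule of thumb is that dynamic power scales with capacitance, the square of voltage, and clock frequency. The numbers below are made up purely to show the shape of the trade-off, not measurements of any real chip:

```python
# First-order CMOS dynamic power: P ≈ C * V^2 * f
# The values below are illustrative only, not real chip figures.
def dynamic_power(capacitance, voltage, frequency_hz):
    return capacitance * voltage ** 2 * frequency_hz

old_node = dynamic_power(capacitance=1.0, voltage=1.00, frequency_hz=1.0e9)  # baseline
new_node = dynamic_power(capacitance=0.7, voltage=0.85, frequency_hz=1.5e9)  # smaller node: less C, lower V, higher clock

print(f"Relative power at the smaller node: {new_node / old_node:.2f}x, "
      f"despite a 1.5x clock increase")
```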

dudewitbow, (edited )

It's meant for people in the tech space who can cross-compare the numbers with devices already on the market. Some basic specs give you a ballpark estimate of what to expect. That said, this is from WCCFTech, which will post just about any rumor, so take it with a huge grain of salt.

For laypeople, I'll do some of the cross-comparison now.

5nm is the fabrication process used in AMD's current top-end GPUs and current-generation CPUs. In Apple terms, it's the same process used for its A14/A15 (iPhone 12-14) and M1/M2 (all current MacBook devices) chips (the only difference between the two generations is a bleeding-edge vs. a matured process, but they are effectively the same size).

For comparison's sake, the 5nm process is also used by Nvidia's current-generation RTX 4000 series GPUs, but as a special variant (customized basically for Nvidia). The rumored clocks likely refer to CPU clocks, so I will drop the discussion of GPUs here and move on to Nvidia's CPU offerings.

Nvidia essentially only puts CPUs on its enterprise and developer parts (the Tegra line, which is how the Switch ended up using one). Nvidia's "Thor" would be the only part using 5nm, but little is known about Thor, so I would refer to last-gen Orin, which has development boards already on the market (in the same way the Tegra X1 in the Switch also has development boards on the market).

Going by the raw numbers in Orin's Wikipedia section, the two middle SKUs, the two NX models, are the ones that would likely go into a Switch due to their TDP (10-25 W), as 10 W is the typical handheld TDP and 15-25 W tends to be the TDP of such devices when docked. Since last-gen Orin was capable of holding a 2.2 GHz CPU clock when docked, the new Switch SoC is, at least on paper, closer to the chip's full clocks than the older Tegra X1 in the Switch was (that one was clocked to essentially 1 GHz, almost half of the ~1.8 GHz it was designed for, which is what you see in the commercially available Nvidia Shield TV). The CPU is an Arm Cortex-A78, so I'd compare it to phones using that core, such as those with the Snapdragon 888, but downclocked. Also, I forgot to mention: Orin's GPU is roughly similar to the Nvidia RTX 2050 mobile, if you need some remote idea of how it would perform graphically.
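If it helps to see those clock figures side by side, here's a tiny sketch using only the numbers cited above (the 2.5 GHz value is just the rumored max from the headline, not a confirmed spec):

```python
# Clock figures as cited above; the 2.5 GHz value is the rumored max, not a confirmed spec.
tegra_x1_design_ghz = 1.8   # roughly what the X1 was designed for (Shield TV)
switch1_cpu_ghz     = 1.0   # what the original Switch actually runs it at
orin_nx_docked_ghz  = 2.2   # last-gen Orin NX docked-class CPU clock
rumored_max_ghz     = 2.5   # rumored Switch 2 max clock

print(f"Switch 1 used ~{switch1_cpu_ghz / tegra_x1_design_ghz:.0%} of the X1's designed clock")
print(f"Rumored max is ~{rumored_max_ghz / orin_nx_docked_ghz:.2f}x Orin NX's docked clock")
```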

Opinion post starts here:

I'm in the camp that believes Nvidia is going to use Orin (or a variant of Orin simply shrunk down to 5nm, as Orin is an 8nm product), because Nvidia does not like to do custom designs for any customer. It's the reason why Apple, for instance, dropped Nvidia; the last Nvidia GPU used in an Apple product, I believe, was the GTX 670. The choice sounds like a very Nintendo thing to do, because 1. Nintendo has a history of choosing the lower-end part nowadays, and 2. Nintendo prefers to have their consoles sold at a profit and not at a loss, so they're more inclined to pick the cheaper of any options. Given that Orin is an early-COVID-era design, the timeline makes sense, as it would be similar to the original Switch's (the Switch launched in 2017 and used the Tegra X1, which was in devices in 2015). Orin was produced in early 2022, and the next Switch would likely launch in 2024.

sugar_in_your_tea,

Nothing until they actually announce something. Rumors aren't to be trusted at all, and Nintendo has a history of disappointing on specs and making up for it with interesting gameplay.

Molecular0079, to games on Witchfire Q&A on Long Dev Phase, EGS Exclusivity, DLSS 3's Phenomenal Boost, FSR/XeSS Support and System Specs

Bah, I guess I am just gonna have to wait a year after release to play this game.

Blizzard,

Hope they’ll eventually release it on PS5.

Blizzard, to games on Witchfire Q&A on Long Dev Phase, EGS Exclusivity, DLSS 3's Phenomenal Boost, FSR/XeSS Support and System Specs

Glad to hear the game is still alive.

lustyargonian, (edited) to games on Forza Motorsport 2023 vs Forza Motorsport 7 Comparison Shows Visual Improvements

Looks cool!

Billboard trees and the lack of fog, detailed reflections, and shadows were trade-offs of last gen that at the time weren't as perceivable as they are now, when compared side by side. I wonder what the next gen will look like? Accurate RTAO, RTGI, RTR, and no ghosting artifacts? It definitely feels like we're near the end phase of graphical fidelity. I mean, we can improve infinitely, but it'll come with extremely diminishing returns and insane amounts of pixel peeping.

Maybe the focus would shift towards realistic animation blending and pixel accurate collision physics once everything is path traced and uses photogrammetry. Thanks for listening to my ramblings.

DmMacniel, to games on Starfield Is Seemingly Missing Entire Stars (the local 'sun') When Running On AMD Radeon GPUs

Waaaaait… it was a bug and not gross incompetence?

geosoco,

I don't think we know.

Makes me wonder if the dev team is on a much-needed vacation or if they only run Nvidia GPUs. lol

Hildegarde,

The game runs better on AMD, and Bethesda partnered with AMD in some way for this PC release.

geosoco,

That really just means AMD gave them a lot of money, and they just made sure FSR2 worked. lol

Naz,

I’ve got a 7900XTX Ultra, and FSR2 does literally nothing, which is hilarious.

100% resolution scale, 128 FPS.

75% resolution scale … 128 FPS.

50% resolution scale, looking like underwater potatoes … 128 FPS.

I don’t know how it’s possible to make an engine this way. It seems CPU-bound, and I’m lucky that I upgraded my CPU not too long ago; I’m outperforming my friend who has an RTX 4090 in literally all scenes: indoor, ship, and outdoor/planet.

He struggles to break 70 FPS on 1080p Ultra; meanwhile, I’m doing 4K Ultra.
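That flat frame rate across resolution scales is the textbook sign of a CPU bottleneck rather than a GPU one. Here's a rough sketch of that reasoning, using the FPS figures quoted above and an arbitrary 5% tolerance:

```python
# If FPS barely moves as the render resolution scale changes, the GPU isn't the
# limiting factor. The FPS figures below are the ones quoted in the comment above.
fps_by_scale = {1.00: 128, 0.75: 128, 0.50: 128}

def looks_cpu_bound(fps_by_scale, tolerance=0.05):
    """Crude heuristic: call it CPU-bound if FPS varies by less than 5% across scales."""
    lo, hi = min(fps_by_scale.values()), max(fps_by_scale.values())
    return (hi - lo) / hi < tolerance

print("Probably CPU-bound" if looks_cpu_bound(fps_by_scale) else "Probably GPU-bound")
```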

redcalcium,

The Creation Engine has always been CPU-bound, going back to the Gamebryo era.

Xperr7,
@Xperr7@kbin.social avatar

I have noticed its anti-aliasing is better than the forced TAA (once I forced that off).

geosoco, (edited )

Some of the benchmarks definitely pointed out that it was CPU-bound in many areas (e.g. the cities).

I think the HUB one mentioned that some of the forested planets were much more GPU-bound and better for testing.

I'm on a TV so I'm capped at 60 FPS, but I do see a power usage difference between FSR at 75% and FSR at 100% that's pretty substantial on my 7900 XT.

AnUnusualRelic,
@AnUnusualRelic@lemmy.world avatar

“fsr2.h”

Ok, can we have the monies please?

violetraven,
@violetraven@lemmy.blahaj.zone avatar

Does it run better by not rendering light emitting objects?

Hildegarde,

That’s one way to improve performance

Frog-Brawler,
@Frog-Brawler@kbin.social avatar

Perhaps. Who needs stars anyway?

booly,

All GPUs perform equally well at ray tracing when there are no rays to trace.

MooseLad,

I had no idea it was a problem on Radeon GPUs. I saw a few people complaining about not seeing the stars, but I didn’t have a clue what they were talking about since it was always fine for my Nvidia card.

hoshikarakitaridia,

If it’s down to very specific chipsets, that sounds like an unforeseeable bug.

Deceptichum,
@Deceptichum@kbin.social avatar

An unseeable unforeseeable bug?

hoshikarakitaridia, (edited )

Correction: someone pointed out they are literally interfacing with the graphics drivers the wrong way, so it’s still on their devs.

e-ratic,
@e-ratic@kbin.social avatar

"Bethesda's Bug", when you can't tell if something isn't working correctly or if it's just not implemented at all.

Hexarei,
@Hexarei@programming.dev avatar

It can be both

Pxtl, to games on Starfield Is Seemingly Missing Entire Stars (the local 'sun') When Running On AMD Radeon GPUs
@Pxtl@lemmy.ca avatar

Ugh. A part of me wants to give AMD a chance for my next upgrade and push back against Nvidia’s near-monopoly of GPUs but I really don’t want to deal with how everything kinda-sorta works on Radeons.

ruckblack,

I’ve exclusively been on AMD since like 2015 and my GPUs “kinda-sorta working” has not been my experience at all lol. Literally have never had brand-specific problems. The only brand-specific issues I’ve had were trying to get my laptop with an Nvidia GPU to work properly under Linux.

Sharkwellington,

I have a suspicion that developers do less testing, optimization, and bugfixing for AMD cards due to reduced market share and that’s why more of these brand-specific coding errors slip through for them. It’s unfortunate but I can’t deny I’ve seen some weird bugs in my time.

Pxtl,
@Pxtl@lemmy.ca avatar

Oh, of course. I don’t actually blame AMD for those kinds of bugs. But it’s the reality as a user, at least in my experience… though it’s been a stupid long time since I’ve used a machine with an AMD card.

Indicah,

Some games are built specifically for AMD from the ground up and have been optimized like crazy. Depends on the game and the devs mostly. And let’s not forget that if devs want it to run well on PS5 and Xbox Series x/s, then they better have good AMD optimization.

darkeox,

How can an AMD-sponsored game that literally runs better on AMD GPUs than on their NVIDIA counterparts, and doesn't ship any tech that might disadvantage AMD GPUs, be less QA'd on AMD GPUs because of market share?

This game IS better optimized for AMD. It has FSR2 enabled by default on all graphics presets. That particular take especially doesn't work for this game.

Squirrel,
@Squirrel@thelemmy.club avatar

And, being Bethesda, it’s not like bugs are unexpected.

JJROKCZ,

I’ve been red-only in my rig for over a decade, and the only problems I’ve had are that I play the same games as everyone else perfectly fine and have more money in my wallet from not spending as much on parts. That, and the Bulldozer-generation CPUs heated my house like crazy; there’s no denying that lol

XTornado, (edited )

Ugh… is the last part still happening? Like, do the new CPUs also run that hot, or however you'd put it?

I am tempted to build a new all-AMD PC for cost alone, although AM4 probably won’t last as long as AM3 did, sadly. But the summer is already terrible with my Intel… no need for more heat.

JJROKCZ,

No, Bulldozer chips have been gone for like 6-7 years. The last two Ryzen generations have been far more energy/heat efficient than Intel. Ryzen is the better choice by far right now.

ninjan,

Current Intel is worse than current AMD for CPU heat, and Nvidia is currently cooler than AMD on the GPU side. Also, we’re on AM5 now. AM4 lived for a relatively long time, and there’s no indication that AM5 won’t be a long runner as well. Intel changes sockets more often, so for longevity AMD is almost always the best, except at the tail end of a socket.

Resolved3874,

Huh. Didn’t even know they replaced AM4 until this comment 😂 My AM4 Ryzen 5 paired with an RX 6700 XT still does everything I want it to do. And if it starts slacking, I have plenty of upgrading left to do.

Frog-Brawler,
@Frog-Brawler@kbin.social avatar

I’ve exclusively used AMD GPUs since building my first PC 27 years ago. I’m not aware of things “kinda-sorta” not working.

dudewitbow,

You make it sound like Nvidia has never pushed out a kinda-sorta-works driver.

vikingtons,
@vikingtons@lemmy.world avatar

This issue also occurs on Intel Arc Alchemist and Nvidia Maxwell
