dudewitbow

@dudewitbow@lemmy.ml

A profile from a remote server may be incomplete. See more on the original instance.

dudewitbow,

Bit of both; people have less time on average, and it's hard to break into the MMO market when it's dominated by giants, due to the amount of content required to make a competent MMO.

There is still a huge market though.

dudewitbow, (edited )

They have the opposite mindset when it comes to how they handle hardware. Nintendo targets the cheapest demographic, willing to use old technology if it means turning a profit. Apple targets the complete opposite, always buying out bleeding-edge TSMC nodes. The only thing they have in common is that both sell their hardware at a profit.

dudewitbow,

Pretty much the biggest mistake made due to greed is the decision to retroactively apply the deals to already-existing titles. It's one thing to neuter titles in the future, but another to fuck over everyone who's already committed to using it under a different TOS.

$70 Mortal Kombat 1 Switch version called "robbery" as graphical comparisons flood the internet (www.eurogamer.net)

Fans have taken to the likes of X (formerly Twitter) and TikTok to question NetherRealm's decision to market Mortal Kombat 1 as a $70 Switch release. It has been called "robbery" and "disrespectful" to users.

dudewitbow,

Do people not realize that some, if not all, storefronts have a clause that the base price of a game has to be the same as on other storefronts?

I know it's true for Steam vs other PC storefronts, but I believe it's probably true for consoles as well.

Modder Turns Framework Laptop PCB Into a Handheld Gaming PC (www.tomshardware.com)

What if the modular computing evangelists at Framework decided to make a handheld? YouTuber Pitstoptech has largely answered this question by building a “fully upgradeable gaming handheld” around one of Framework’s upgradable motherboards....

dudewitbow,

Framework 16 PCBs wouldn't be ideal for handhelds.

If someone wants mid-range gaming on a handheld, price aside, they have to hope that AMD's Strix Halo (a 40 CU APU; the 6700 XT, for example, is a 40 CU GPU) is a real product next year.

dudewitbow, (edited )

It's meant for people in the tech space who can cross-compare the numbers with devices on the market; some basic specs give you a ballpark estimate of what to expect. That said, this is from WCCFTech, which will post just about any rumor, so take it with a huge grain of salt.

For laymen, I'll do some of the cross-comparison now.

5nm is the fabrication process used in AMD's current top-end, current-generation GPUs. In Apple terms, it's the same process used in its A14/A15 (iPhone 12-14) and M1/M2 (all current MacBook) chips (the only difference between the two generations is bleeding-edge vs matured process, but they are effectively the same size).

For comparison's sake, the 5nm process is also used by Nvidia's current-generation RTX 4000 series GPUs, albeit a special variant of it (basically customized for Nvidia). The clocks likely refer to CPU clocks, so I will drop the GPU discussion here and move on to Nvidia's CPU offerings.

Nvidia essentially only puts CPUs on its enterprise and developer parts (the Tegra line, which is how the Switch ended up using one). Nvidia's "Thor" would be the only device using 5nm, but little is known about Thor, so I would refer to last-gen Orin, which has development boards already on the market (in the same way the Tegra X1 in the Switch also has development boards on the market).

Going by the pure numbers in Orin's Wikipedia section, the two middle SKUs, the two NX models, are the ones that would likely go into a Switch due to their TDP (10-25 W), as 10 W is the typical handheld TDP and 15-25 W tends to be the TDP of these devices when docked. Since last-gen Orin was capable of holding 2.2 GHz on the CPU docked, the new Switch SOC, at least on paper, is closer to its full clocks than the older Tegra X1 in the Switch was (which was clocked to essentially 1000 MHz, almost half of the ~1800 MHz the chip was designed for, as seen in the commercially available Nvidia Shield TV). The CPU is an Arm Cortex-A78, so I'd compare it to phones using that core, such as those with the Snapdragon 888, but downclocked. Also, I forgot to mention: Orin's GPU is roughly similar to the Nvidia RTX 2050 mobile if you need some remote idea of how it would perform graphically.
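For a quick sanity check on that clock ratio, here's a minimal sketch using only the figures above (retail clocks vary slightly by firmware, so treat these as approximations):

    # Rough clock-ratio comparison using the figures above (all MHz, approximate)
    x1_switch = 1000       # Tegra X1 CPU clock in the original Switch
    x1_design = 1800       # Tegra X1 clock in the Nvidia Shield TV (its design target)
    orin_nx_docked = 2200  # Orin NX CPU clock when docked

    print(f"X1 in Switch: {x1_switch / x1_design:.0%} of its design clock")  # ~56%

So the original Switch ran its CPU at roughly 56% of the chip's design clock, which is the "almost half" in the comparison above.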

Opinion post starts here:

I'm in the boat that believes Nvidia is going to use Orin (or a variant of Orin just shrunk down to 5nm, as Orin is an 8nm product), as Nvidia does not like to do custom designs for any customer. It's the reason why Apple, for instance, dropped Nvidia; the last Nvidia GPU used in an Apple product, I believe, was the GTX 670. The choice sounds like a very Nintendo thing to do, because 1. Nintendo has a history of choosing the lower-end part nowadays, and 2. Nintendo prefers to have their consoles sold at a profit and not at a loss, so they're more inclined to pick the cheaper of any options. Given that Orin is an early-COVID design, the timeline makes sense, as it would be similar to the Switch's (the Switch launched in 2017 using the Tegra X1, which was in devices in 2015). Orin was produced in early 2022, and the next Switch would likely launch in 2024.

dudewitbow,

I mean, isn't this why people bought the HP Reverb?

It's partially a self-inflicted problem if you need Valve to do it.

dudewitbow,

Publicly traded companies*

Private ones don't always have CEOs chasing every penny, looking only for short-term gains.

dudewitbow, (edited )

It works for paid games; you'd have to apply it at the microtransaction level for F2P games, which are the real target of the change.

Starfield getting DLSS support, FOV slider, HDR calibration, and more (www.eurogamer.net)

With last week's Starfield launch slowly simmering down, Bethesda has started to cast its gaze forward, confirming a number of "community requested" features are on the way, including Nvidia DLSS support on PC, an FOV slider, and more....

dudewitbow,

HDR was the only option I was surprised not to see at launch. It's a space game, literally one of HDR's biggest use cases.

dudewitbow, (edited )

I installed a separate LUT and used ReShade's injected HDR effect to get a pseudo-HDR experience.

I absolutely hated the fact that even on the intro screen on my OLED, the space background was not black. I was like, what is this madness?

dudewitbow, (edited )

It's hard to hear about it in circles because it's a JRPG first (which, on its own, is a niche) and only on PS5 at the moment, which not everyone has. There are about 40m PS5s sold, and less than a tenth of owners have bought the game.

dudewitbow,

I'm aware it's closer to an action RPG, but it doesn't help that it's tied to the Final Fantasy name.

It's closer to what Secret of Mana is than to traditional Final Fantasy, but it chose to take the latter's name.

dudewitbow,

How I see it is as an alternative Fallout timeline set in the future. A lot of the basic game mechanics are straight upgrades from Fallout 4, with slightly better faction writing than 4 and slightly more RPG checks that make the experience feel better than 4. IMO I don't think it's better than New Vegas, but it's a direct upgrade from FO4.

dudewitbow,

Gearbox only owns the IP, not the studio. So although RoR could die with Gearbox, there's nothing stopping the dev from making a spiritual successor.

dudewitbow,

The game isn't ending, just the addition of new content is. That's like saying company X stopped updating a game I bought, so I should get a refund on it.

dudewitbow,

Well, the mainstream Nintendo ones do. Fire Emblem Heroes is one of Nintendo's longest-running mobile games, and is still the one that makes the most money (not including the Pokemon series, which is often considered separate from Nintendo's other IPs in terms of management).

dudewitbow,

IMO, that was intentional, to double-dip profits on mostly the same work.

I fully believe all of this is just a holdover until the next mainline Mario Kart title, which will be on the next Switch device within the first 3 months of its release next year.

Starfield Is Seemingly Missing Entire Stars (the local 'sun') When Running On AMD Radeon GPUs (wccftech.com)

So a user on Reddit (ed: u/Yoraxx ) posted on the Starfield subreddit that there was a problem in Starfield when running the game on an AMD Radeon GPU. The issue is very simple, the game just won't render a star in any solar system when you are at the dayside of a moon or even any planetary object. The issue only occurs on AMD...

dudewitbow,

You make it sound like Nvidia has never pushed out a kinda-sorta-works driver.

dudewitbow,

It's more so the Creation Kit. Without it, some mods are really hard to make.

dudewitbow,

The mods made with the Creation Kit and the mods made using script extenders aren't the same kind of mods.

The CK allows modding in custom NPCs, followers, and more models, makes modifying vanilla asset models in the game easier (part of the reason why the only things you see in the modding space right now are texture swaps and full model swaps), and enables quest mods. These are the types of mods you see in Skyrim that can be enabled on consoles as well.

Script extenders enable mods that rewrite how the game functions, be it physics, new interactions, and such.

dudewitbow,

Full model replacements are typically not hard, especially if they're replacing already-existing assets versus creating a new asset and item ID.

It's like the modding scene for both Brawl and UMvC3.

It starts off with only replacing already-existing assets, but it won't explode till people figure out how to add custom assets. Brawl's turning point was when Project M devs figured out how to add character slots; UMvC3's was when they figured out the same, plus making fully custom models/animations without having to borrow existing ones.

You need the tools to exist to get to the blowup point.

Consumer Nintendo Switch 2 rumored to have more RAM than the Xbox Series S (www.notebookcheck.net)

A new Nintendo Switch 2 rumor has surfaced claiming that the next-generation hybrid console could actually arrive with more memory than a powerful rival like the Microsoft Xbox Series S. The same source has also offered an update in regard to the Switch 2’s potential DLSS support and ray-tracing capabilities.

dudewitbow, (edited )

Neither is too new. Both features are technically available on Volta GPUs or newer. The Switch was Maxwell, and unless you fully believe the Switch 2 will use Pascal (2016), it is, at the very minimum, using Volta, which means it can use RTX/DLSS (but I don't expect it to ACTUALLY use RTX).

dudewitbow,

People suspect it's a VR headset, or VR-headset-adjacent tech like a more portable (relatively) higher-powered PC box you could use while carrying it, to do PCVR without being tethered to a stationary object.

[Rumor] Nintendo Switch 2 Will Come With 12 GB RAM, Ray Tracing Capabilities; Unreal Engine 5 Demo Ran With DLSS 3.1 (wccftech.com)

Speaking on X/Twitter, Necro Felipe, Universo Nintendo editor-in-chief who correctly revealed information regarding Nintendo products and games in the past before the official announcements, revealed that, according to their sources, the new console will be ray tracing capable and will come with 12 GB of RAM, which is a big step...

dudewitbow,

It was actually originally planned for 2 GB till Capcom convinced Nintendo to make it 4.

dudewitbow,

Starfield has the problem that horizontal-progression games, like horizontal-progression MMOs, have: a LOT of things you can do unlock after a certain point (reaching Constellation for the first time), but it doesn't handhold you through any of the other features.

People who get sidetracked easily don't have that problem, because they like picking and choosing what they want to do. People who need guidance get lost in the options.

dudewitbow,

Hence "but doesn't handhold you through them."

It has the features but tells you very little about them, because after the intro moments there are virtually no tutorials.

dudewitbow,

They tutorialize the basics of how to fly a ship and loot space items without informing you that some features are locked behind skills (target mode, thrusters for strafing).

dudewitbow,

It's not like Intel never had GPU drivers (they have had iGPUs forever); they just never needed to constantly update them for the gaming audience.

Let's not pretend features like Intel's Quick Sync, which came out on Sandy Bridge iGPUs to do video encoding, didn't reshape how companies did encoding for viewing (which would lead to NVENC and AMD VCE) or scrubbing in the case of professional use.

The GPU driver team has existed for a while now; it's just that it was never severely pressured to update specifically for gaming, as they really didn't have anything remotely game-ready till arguably Tiger Lake's iGPU.

dudewitbow,

I don't think it'll be high-powered; that's just the reporter adding something for clickbait.

I'm inclined to believe Bobby Kotick's mention that the Switch 2 is roughly the power of a PS4, as he was in contempt of court when his leak of its performance was discussed. The handheld likely has better CPU performance than the PS4 though, as it's basically on the same playing field as the Steam Deck; both companies can sit back and make their 30% cut from developers' games.

TL;DR: don't expect Series S perf; expect Steam Deck performance with better battery and DLSS support up to 4K (I personally believe it'll target 1080p60 and use the DLSS Performance preset to upscale to 4K, as 1440p TVs aren't common).
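For reference, a minimal sketch of the internal render resolutions the standard DLSS presets imply at a 4K output (the per-axis scale factors are Nvidia's published ones):

    # Internal render resolution per DLSS preset at a 4K (3840x2160) output
    output_w, output_h = 3840, 2160
    presets = {
        "Quality": 2 / 3,
        "Balanced": 0.58,
        "Performance": 0.50,
        "Ultra Performance": 1 / 3,
    }
    for name, scale in presets.items():
        print(f"{name}: {round(output_w * scale)}x{round(output_h * scale)}")

The Performance preset at 4K lands exactly on 1920x1080, which is why a 1080p60 internal target lines up cleanly with a 4K docked output.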

dudewitbow,

That's more due to Nvidia putting Frame Generation, upscaling, and the original use, anti-aliasing (the SS in DLSS is super sampling), under the same term.

Realistically, DLSS should be referred to as an anti-aliasing technique (like TAA is), but it was colloquially hijacked into an upscaling tech.

dudewitbow,

The problem is DLSS should be the AA (as intended), and not the reverse. DLAA should have stayed DLSS, and the upscaling should have gotten another name.

dudewitbow,

I mean, I'd argue Skyrim isn't that great to look at either, given it's a continent full of tombs and snow due to its climate. It's just gray and white.

Both Morrowind and Oblivion had more interesting places to look at than Skyrim did.

dudewitbow,

Are you really bringing up skies, of all things, against a space game with multiple skies due to having multiple planets?

dudewitbow,

Not the guy, but I have mine on a SATA SSD and I don't think my loading times are the same as his, so I'd suspect either a slow CPU or a hard drive (going against the minimum requirement that the game be played on an SSD).

What's wrong with the Saints Row reboot again?

I got it expecting to hate it, but as I kept playing, I found myself legitimately enjoying it. Not begrudgingly enjoying it, not enjoying it outside of one or two small details, but actually being engaged in the story and gameplay. Which leads me to wondering why people had a problem with this game in the first place again?

dudewitbow,

Outside of bug fixes, the content added to CP2077 from launch to the launch of the Edgerunners anime was minimal… it was functionally still the same game.

The people who hate on it generally didn't play the game.

dudewitbow,

GTA San Andreas was during an update in 2014, long before the remaster was released.

dudewitbow,

Side note: you know a rebranding failed when a news site uses the old Bandai Namco logo over the new one.

dudewitbow,

I'm not saying Reflex is bad or unused by esports pros. It's just that "theoretical" is not the best choice of word for the situation, as it does make a difference; it's just much harder to detect, similar to the latency difference between similar-but-not-identical framerates, or the experience of refresh rates that are close to each other. Especially at the high end, you stop being limited by framerate input properties and become bottlenecked by screen characteristics (why OLEDs are better than traditional IPS, but can be beaten by high-refresh-rate IPS/TN with BFI).

Regardless, the point is less about the tech and more about the idea that AMD doesn't innovate. It does, but it takes longer for people to see it, because they either choose not to use a specific feature or are completely unaware of it, whether because they don't use AMD or because they have a fixed channel where they get their news.

Let's not forget that about a decade ago, AMD's Mantle was what brought Vulkan/DX12-style performance to PC.

dudewitbow,

I wouldn't say compete, as the whole concept of frame generation is that it generates more frames when GPU resources are idle/underused because another part of the chain is holding the GPU back from rendering more frames. It's sorta like how I view hyperthreads on a CPU: they aren't a full core, but a thread that gets utilized when there are points in a CPU calculation that leave a resource unused (e.g., if a core is using the AVX2 unit to do some math, a hyperthread can, for example, use the ALU that might not be in use to do something else, because it's free).

It would only compete if the time it takes to generate one additional frame is longer than the time the GPU sits free due to some bottleneck in the chain.
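As a toy model of that condition (all numbers are hypothetical, in milliseconds per frame):

    # Toy model: does generating an extra frame compete with real rendering?
    cpu_frame_time = 16.6    # CPU-bound chain feeds the GPU a frame every 16.6 ms
    gpu_render_time = 10.0   # GPU takes 10 ms to render a real frame
    framegen_cost = 3.0      # GPU takes 3 ms to produce a generated frame

    gpu_idle_time = cpu_frame_time - gpu_render_time  # 6.6 ms idle per frame
    if framegen_cost <= gpu_idle_time:
        print("Generated frame fits inside the idle gap: effectively free")
    else:
        print("Generation overruns the gap: it competes with real rendering")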

dudewitbow,

Because AMD's GPU division is a much smaller division in an overall larger company. They physically can't push out as many features because of that. When they decide to make a drastic change to their hardware, it's rarely appreciated till it's considered old news. Take, for example, Maxwell and Pascal. You don't see a performance loss at the start, because games are designed for the hardware of the time, in particular whatever's the most popular.

Maxwell and Pascal had a notable trait allowing lower power consumption: the lack of a hardware scheduler, as Nvidia moved the scheduler onto the driver. This gave Nvidia more manual control of the GPU pipeline, allowing their GPUs to handle smaller pipelines better, compared to AMD, which had a hardware scheduler with multiple pipelines that needed an application to use them properly to maximize performance. It led to Maxwell/Pascal cards having better performance… till it didn't, as devs started to thread games better, and what used to be a good change for power consumption evolved into a CPU-overhead problem (something Nvidia still has to this day relative to AMD). AMD's innovations tend to be more on the hardware side of things, which is pretty hard to market.

It was like AMD's marketing for Smart Access Memory (again, a feature AMD got to first, and which to this day works slightly better on AMD systems than on others). It was hard to market because there isn't much of a wow factor to it, but it is an innovation.

dudewitbow,

Which then comes to the question of price/perf. It's not wrong that DLSS is better than FSR, but when you factor in price, some price tiers start to get funny, especially at the low end.

For the LONGEST time the RX 6600, which by default was about 15% faster than the 3050 and significantly cheaper, was still outsold by the 3050. Using DLSS to catch up to the performance another GPU delivers natively (native meaning objectively better: no artifacts, no added latency) is where the argument of never buying a GPU without DLSS becomes weak, as the issue in some price brackets is that what you could get at the same or a similar price might be significantly better.
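A quick perf-per-dollar sketch of that comparison (the ~15% gap is from above; the street prices are purely illustrative assumptions):

    # Hypothetical perf-per-dollar comparison (prices are illustrative only)
    cards = {
        # name: (relative raster performance, assumed street price in USD)
        "RTX 3050": (1.00, 280),
        "RX 6600":  (1.15, 220),
    }
    for name, (perf, price) in cards.items():
        print(f"{name}: {perf / price * 1000:.2f} perf per $1000")
    # Under these assumptions the RX 6600 delivers ~46% more perf per dollar.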

In terms of modern GPUs, the 4060 Ti is the one card everyone, for the most part, should avoid (unless you're a business in China that needs GPUs for AI due to the U.S. government limiting chip sales).

Sort of the same idea in RT performance too. Some people make it out like AMD can't do RT at all. Usually their performance is a gen behind, so situations like the 7900 XTX vs the 4080 could swing towards the 4080 for value, but in situations like the 7900 XT, which was at some point being sold for $700, its value, RT included, was significantly better than the 4070 Ti as an overall package.

dudewitbow,

Which is what I'm saying, on the condition, of course, that the GPUs are priced close enough (e.g., 4060 vs 7600). But when there's a deficiency in a card's spec (e.g., 8 GB GPUs) or a large discrepancy in price, it usually favors the AMD card.

It's why the 3050 was a terribly priced GPU for the longest time, and currently the 4060 Ti is the butt of the joke; someone shouldn't pick those over the AMD card in said price range due to both performance and hardware deficiency (VRAM in the case of the cheaper 4060 Ti).

dudewitbow,

In the case of the 4060 Ti 8 GB, turning on RT puts games past the 8 GB threshold, killing performance; hence hardware deficiency does matter in some cases.

dudewitbow,

Because I think the post assumes that the GPU is always using all of its resources during computation, when it isn't. There's a reason why benchmarks can make a GPU hotter than a game can, and why not all games pin GPU utilization at 100%. If a GPU is not pinned at 100%, there is a bottleneck in the presentation chain somewhere (which means unused resources on the GPU).
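A minimal way to eyeball this on an Nvidia system (assumes nvidia-smi is on the PATH; run it while the game is running):

    import subprocess, time

    # Poll GPU utilization once a second; sustained readings well below 100%
    # while a game is running point to a bottleneck elsewhere in the chain.
    for _ in range(10):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        print(f"GPU utilization: {out.stdout.strip()}%")
        time.sleep(1)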

dudewitbow,

I still think it's a matter of waiting for the results to show up later. AMD's RDNA3 does have an AI engine on it, and the gains it might bring to FSR3 might differ, in the same way XeSS differs with its branching logic. Too early to tell, given that all the test-suite results are on RDNA3 and that it doesn't officially launch till 2 weeks from now.
