Well, it's funny that I'm reading this, since I just watched this video https://www.youtube.com/watch?v=t_rzYnXEQlE about refactoring the entire source code of Super Mario 64. It's insane how much effort modders put into those things.
I remember picking this game up to replay it for a few bucks on Steam. I had no idea how bad the PC version was. I must have replayed it 3-5x back in the day on Xbox. I couldn't believe how broken and unplayable it was on PC.
Yeah, that makes sense. They probably can't properly support a video card they couldn't get their hands on, since Intel didn't ship it until late last year. They also aren't particularly powerful cards. Lastly, the Intel drivers are brand new; most engines haven't been tested against them, so there are a lot of corruption bugs. Which makes sense, because developers weren't able to get the cards early enough to support them. Since Intel has now discontinued their flagship Arc card not even a year after release, it's unlikely any games will really support Intel GPUs in the future.
For anyone still following this thread in confusion, the Limited Edition (LE) card is Intel’s equivalent of a Founder’s Edition card. Intel stopped producing LE cards, but their AIB partners are still producing their own SKUs.
That's a bit disingenuous. It's Intel's own Limited Edition A770 SKU that is discontinued, not the A770 as a model. They still ship the chip to AIB makers like ASRock etc. Their second generation, Battlemage, is still on track as well, so on the contrary I believe we'll see much better support for Intel GPUs in the coming years, since more game developers will have had adequate time with the hardware. Intel's cards are also priced competitively if we're looking at the entry-level cards, which is bound to make them end up in many cheaper pre-builts that parents buy for their younger kids. So I expect them to be quite commonly used for certain games in the coming years.
The Limited Edition wasn't limited in the sense that they planned to stop making them; it's their flagship. This is what I got from a few articles. Whether they are still shipping chips to partners wasn't clear from the places I read this. Additionally, the Battlemage information seems to all come from leaks.
Either way, with how shoddy the drivers have been and how little hardware has been available, placing blame on video game developers for not supporting their cards is silly.
I'm placing zero blame on developers here, but it's just a fact that Intel can't reasonably optimize the drivers for all games past and present in such a short time. And developers haven't had access to the card for even remotely long enough for it to be part of the testing for any game releasing this year or next (outside small titles maybe, but those generally don't need special treatment driver-wise). AMD and Nvidia have literal decades of head start. So while I would've wanted Intel to do a better job, I'm not trivializing the monstrous task either, and all things considered they've done OK. Not great, not horrible.
If it wasn’t clear in the articles you read then those places wanted the clicks and engagement that comes from vaguely implying that Intel is killing their GPU division.
"Falsehood flies, and the Truth comes limping after it." - Jonathan Swift
It's not like Intel never had GPU drivers (they've had iGPUs forever); they just never had to constantly update them for a gaming audience.
Let's not pretend features like Intel's Quick Sync, which came out on Sandy Bridge iGPUs to do video encoding, didn't reshape how companies did encoding for viewing (which would lead to NVENC and AMD's VCE) or scrubbing in the case of professional use.
The GPU driver team has existed for a while now; it's just that they were never severely pressured to update it specifically for gaming, as they really didn't have anything remotely game-ready until arguably Tiger Lake's iGPU.
I don’t know much about specs. I just find it fascinating that people are actually defending Bethesda in this post. Where’s the standard anti-Bethesda fandumb pile on?
I’ve probably seen it here more than on Reddit, but that’s because I spend more time in the general gaming community here, while on Reddit I was in the fan community specifically… particularly teslore, where “Duh, TES lore is stupid and random” doesn’t get much traction.
The problem is that no one actually follows the specs (or the specs don't define everything). So you can't just build to the spec and have your game work. You have to know all the ways the different hardware manufacturers cheat and adapt your stuff to their drivers.
If Intel still has issues in their drivers and implementation, developing to run correctly on their cards isn't trivial at all. It should still mostly work, but it's hard to catch every edge case without experience with how they do things.
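To make that concrete, here's a minimal sketch (C with Vulkan) of the kind of vendor sniffing engines end up doing; the `intel_workaround` toggle is hypothetical, just an illustration of keying behavior off the reported vendor ID rather than anything from a real engine:

```c
#include <stdio.h>
#include <vulkan/vulkan.h>

/* PCI vendor IDs as they appear in VkPhysicalDeviceProperties.vendorID. */
#define VENDOR_AMD    0x1002
#define VENDOR_NVIDIA 0x10DE
#define VENDOR_INTEL  0x8086

int main(void) {
    VkInstanceCreateInfo ci = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
    VkInstance inst;
    if (vkCreateInstance(&ci, NULL, &inst) != VK_SUCCESS) return 1;

    uint32_t n = 0;
    vkEnumeratePhysicalDevices(inst, &n, NULL);
    VkPhysicalDevice devs[8];
    if (n > 8) n = 8;
    vkEnumeratePhysicalDevices(inst, &n, devs);

    for (uint32_t i = 0; i < n; i++) {
        VkPhysicalDeviceProperties p;
        vkGetPhysicalDeviceProperties(devs[i], &p);
        /* Hypothetical engine-side toggle: enable a special code path when
         * a vendor's driver is known to misbehave somewhere. */
        int intel_workaround = (p.vendorID == VENDOR_INTEL);
        printf("%s (vendor 0x%04X)%s\n", p.deviceName, (unsigned)p.vendorID,
               intel_workaround ? " -> enabling workaround path" : "");
    }
    vkDestroyInstance(inst, NULL);
    return 0;
}
```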
There are 2 versions of XeSS: one which runs on most later Nvidia and AMD GPUs and gives roughly equivalent results to FSR1, and another which only runs on Intel GPUs because it uses their equivalent of tensor cores (thus more like DLSS). I don’t ever see a scenario where anyone is going to support the second one unless Intel starts sponsoring games. And for the first, what’s the advantage over FSR1?
There isn't one. At the moment, they aren't aiming that high. Their performance varies wildly from game to game, but at best, their most powerful card atm punches at about 3060 levels.
Yup, it's not in the specs that it supports Intel graphics.
These days it's expected that any DirectX/Vulkan-supporting card can run just about anything, with varying levels of performance. Back in the day, it was very, very specific what a 3D game engine supported: if your card wasn't on the list, it wasn't going to run outside of software mode unless a newer version of that card had backwards-compatibility features. Later on, you also had to worry about very specific shader and DirectX features being supported to even get the game to look right.
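That shift shows up in the APIs themselves: instead of shipping a hardcoded list of supported cards, a modern engine just asks the driver at runtime what the hardware can do. A minimal sketch in C with Vulkan; the two features queried here are arbitrary examples, not any particular game's requirements:

```c
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void) {
    VkInstanceCreateInfo ci = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
    VkInstance inst;
    if (vkCreateInstance(&ci, NULL, &inst) != VK_SUCCESS) return 1;

    /* Grab the first physical device, if any. */
    uint32_t n = 1;
    VkPhysicalDevice dev;
    vkEnumeratePhysicalDevices(inst, &n, &dev);
    if (n == 0) { puts("no Vulkan-capable GPU"); return 1; }

    /* Query capabilities instead of matching against a card list. */
    VkPhysicalDeviceFeatures f;
    vkGetPhysicalDeviceFeatures(dev, &f);
    printf("geometry shaders:       %s\n", f.geometryShader ? "yes" : "no");
    printf("BC texture compression: %s\n", f.textureCompressionBC ? "yes" : "no");

    vkDestroyInstance(inst, NULL);
    return 0;
}
```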
It's just a bit interesting how times change. They definitely should have worked with Intel a bit to get it running, or at least given them a copy early enough to work out driver support.
What makes you say that? Last time I checked (about a year ago), yuzu and ryujinx were way more performant and had fewer bugs in emulated titles compared to Xenia (Canary). Have there been such big improvements to Xenia since?
I remember the time when I was really excited about this game. The original writer and composer were both returning; it looked so promising. But we all know what happened, and after Rik Schaffer himself said the soul of the project left when Brian Mitsoda was fired, my expectations are firmly settled at the bottom.
Well, I do own an older Switch that'd be vulnerable to the easy exploits but I gave up when I was supposed to get some joycon-ish device to hack my switch... so "just works" is far from the truth unless I've overlooked something.
Hacking a console often involves a bit of work, and in some cases that can include physically altering the console. With older Switches you need a PC or Android phone, a USB cable, and a little thingy to jump two pins in the right Joy-Con rail. https://lemmy.world/pictrs/image/f0437388-4e33-4ae6-8b28-46a0595a1477.jpeg There's a bit of a process to it, but it really isn't too bad.
You don't need a Switch (hacked or otherwise) to use yuzu. The "dump the keys from your own console" stuff is cover-your-ass doctrine; the keys are easily available online.
Yuzu is an emulator. You don’t need a physical console to use it, unless you insist on dumping your own firmware/roms/keys.
Modding actual Switch hardware is certainly more involved. Those RCM jigs are annoying, and later revisions require a modchip, which is not an easy install.
Sounds like a lot of misconceptions have been given.
You don't need to get any weird Joy-Con; you definitely have everything you need: either a right Joy-Con or a paperclip.
I’ve done both (and broke my spare JC in the process). I recommend the paperclip. [2:24 tutorial]
What's actually happening is you're bridging a specific pin in your Switch's right Joy-Con rail. With that pin jumped, pressing the special recovery button combo (Volume-Up & Power) boots it to a black screen in RCM, the Tegra's recovery mode, which can then be exploited with some tech-wizardry.
There's some cool stuff like themes, homebrew, mods… I've been playing Smash Ultimate online for years with mods. However, if you have the means, the actual gaming experience on PC is typically better and easier to get into than on the Switch.
It's not that hard, but it can definitely be daunting if you're not too into computers. Really, the little RCM jig is just a plastic piece that slides into your right Joy-Con rail and jumps two pins together, which basically puts the console into developer/diagnostic mode. Then you need either a PC, an Android phone, or one of the portable payload injectors to send it the exploit payload. From there you can set it up so that it runs a virtual hacked operating system off of an SD card, and you can still boot into the stock firmware without altering your console at all.
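For the PC step, the payload sender is just talking to the Tegra's recovery mode over USB. Here's a minimal sketch in C with libusb of the detection part only; the actual exploit and payload injection are far more involved and are what tools like fusee-launcher handle. The 0x0955/0x7321 pair is the well-known Nvidia "APX" device ID a Switch reports in RCM:

```c
#include <stdio.h>
#include <libusb-1.0/libusb.h>

/* Nvidia Tegra "APX" recovery-mode USB IDs, i.e. a Switch sitting in RCM. */
#define RCM_VID 0x0955
#define RCM_PID 0x7321

int main(void) {
    libusb_context *ctx = NULL;
    if (libusb_init(&ctx) != 0) return 1;

    /* If the jig and button combo worked, the console shows up as this device. */
    libusb_device_handle *h =
        libusb_open_device_with_vid_pid(ctx, RCM_VID, RCM_PID);
    puts(h ? "Switch found in RCM - ready to receive a payload"
           : "no RCM device - re-seat the jig and retry the button combo");

    if (h) libusb_close(h);
    libusb_exit(ctx);
    return 0;
}
```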
It's relatively easy if you follow instructions and have an early Switch. The later models do require you to physically solder on a modchip, which I wouldn't have bothered with if I hadn't bought one of the early Switches.
The Switch's operating system is based on the OS from the 3DS. The ARM architecture was already well documented and emulated, and the Tegra has documentation from Nvidia.
With all that, making a Switch emulator was relatively “easy”. They took Citra, the 3DS emulator, and worked from there.
Xbox 360 is a different beast. Even its OS was only kinda Windows, so they couldn’t just take Wine and a PowerPC emulator and call it a day. Taking long is IMO not much of a surprise because of that.
IIRC the original Xbox has even worse emulation, to this day, despite being infamously close to a stock PC.
What makes RDR’s emulation struggles noteworthy is that it’s a highly desirable game that still took ages to unfuck. Most nightmare cases for emulators seem to be random D-list titles. Pinball Fantasies on Game Boy had incomprehensible crashes, early and reliably, for no discernible reason. True Crime New York on Gamecube was a white whale for Dolphin despite being absolute garbage.
RDR was a huge deal for its own sake - and it ran badly, looked worse, and stayed that way for a while. Back in the day it was common for emulators to only work properly for big-name games: NESticle and SNES9X absolutely cheated to run major titles, and early N64 emulation was nothing but game-specific hacks. So having this killer app refuse to work, year after year, was a lingering presence in people's minds.
Finally getting it working, only to have a nearly painless alternative drop, is pretty goddang funny.
Aside from poorly documented hardware, one reason why Xbox emulation is in such an early state was simply lack of interest. The Xbox had a meager first-party library, and what exclusives there were were already available to play on every Xbox released since, via backwards compatibility.
I think a more tragic case would be if MGS4 were ever re-released as part of the Master Collection ports. That game was designed from the ground up for the PS3, and it runs terribly even on the custom RPCS3 builds designed specifically for it.