If you have physical media, there are ways to rip discs using a commodity DVD/Blu-ray drive so you can back them up, load them in an emulator, and prevent data rot on the actual discs.
If you’re asking for a direct way to play straight from an Xbox handheld (native play), that’s unlikely to happen for the 360 because of its unique architecture, and the Xbox One and beyond I believe have some significant departures from standard x86 that would make it impractical (it would be more logical to give you Steam codes instead lol).
I kind of hated that whenever I was looking for ways to run PC Game Pass games on my Steam Deck, everyone said: “Install Edge, it works on Linux, and it will let you stream the game.”
Like, that was not what I was looking for. It also required a higher subscription tier.
Just make it a law that every game has to use the Decima engine because Death Stranding 2 runs at 60fps, loads in basically 1 second, and is a contender for best looking game of the generation.
They just need to keep the same graphics, improve optimization, and stop relying on DLSS and frame gen.
That shit works well, but I’d be really impressed to play a game that looks great, throw on ray tracing, and still not need DLSS to get above 100 fps.
It’s because of how the tech works: it uses the previous frame to help render the next, which leads to ghosting. It’s not as bad with DLSS 4, but it’s still there.
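A toy illustration of why reusing the previous frame causes ghosting (this is a made-up blending model for intuition, not actual DLSS internals; the function name and weights are mine):

```python
# Toy model: temporal reuse blends the previous output frame into the
# current one, so a moving bright pixel leaves a fading trail ("ghosting").
def temporal_blend(prev_output, current, alpha=0.8):
    # alpha weights the new frame; the remaining 1 - alpha keeps history.
    return [alpha * c + (1 - alpha) * p for c, p in zip(current, prev_output)]

# A single bright pixel moving one position right each frame.
frames = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
out = frames[0]
for f in frames[1:]:
    out = temporal_blend(out, f)
print(out)  # residual brightness lingers at the pixel's old positions
```

After three frames, the old positions still carry a faint residue of the moving pixel: that leftover history is the ghost trail, and newer DLSS versions mostly reduce how much history is kept when motion is detected.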
So many folks online seem upset that very few games are exclusive to the latest consoles. They want “next gen” games but fail to realize that the product they want would not have a large enough market to cover the development costs.
Nintendo still makes plenty of exclusives. They’re not yet obsessed with live service games. Sony and Microsoft have wasted insane amounts of time developing trash that will never see the light of day.
Moore’s Law was originally formulated as the cost per integrated component being cut in half every x months. The value of x was tweaked over the decades, but settled at 24.
That version of the law is completely dead. Density is still going up, but you pay more for it. You’re not going to build a console anymore for the same cost while increasing performance.
High-end PCs can still go up, but only by spending more money. This is why the only substantial performance gains of the last few GPU generations have come through big jumps in cost.
Important to note: current ~5nm-class fabrication processes are very close to limits imposed by the laws of physics. Unless a wholly different chip-making process is invented that can go even smaller, we might be looking at the actual limit of the tech.
People are criticizing Playstation, but I’m still really happy to have a Playstation 5 alongside my Steam Deck.
As much as I love playing old games on the Deck, and it has now become my main console, I’m happy to have my PlayStation 5 for demanding games (mostly sim racing) and to watch Blu-rays.
Having a powerful PC would not bring me the versatility my Steam Deck/Playstation 5/Surface Go 1 is giving me for now.
Although that might change if the PlayStation 6 comes without a Blu-ray drive.
I don’t understand why some PC people feel the need to say consoles are useless, when it’s only to them that they would be.
I have a PC and a PS5. I like my PS5 because I know that when I buy a game for it, it’s going to work. With my PC, on the other hand, I have to triple-check the specs and fiddle with the graphics settings, and the only way I can be confident it will play any game is to spend ~£1000 upgrading it. Or just get the game on the PlayStation.
Doesn’t surprise me to hear this. If Nvidia were really holding back, AMD would have passed them. I feel like GPUs are starting to experience what the CPU side saw when clocks hit 4 GHz and gains had to come from elsewhere. It took longer because GPU workloads parallelize easily, but it was bound to happen. I really wonder how both AMD and Nvidia will compare to their prior architectures next iteration. Will my 4080 still be faster than a 6070 (Ti)?
They very much can. I haven’t checked personally but just look at the average Digital Foundry video for all the tradeoffs and framerate drops. There is very much a market for getting a stable 1440/60 (or even 4k/160) that can then be upscaled/framegened to 4k/120 with the subsequent generation moving towards that. Similarly, higher fidelity assets DO make a difference when you are doing those side by sides.
That said: demonstrating framerate is a real challenge. Which… is probably why Sony has been revamping the PSN Store over the past year or so. Sidestep YouTube and its increasing use of generative content so that you can instead say, “Wow, I can totally see the difference between Red Dead on PS5 and PS6!” and so forth.
Do I personally care? Not overly. But while most people pick which console to buy based on performance (or, more often, which one their friends have), they’re buying the new console for the new Madden. And they’ll keep doing that… if they can afford a PS6.
Yeah, for me it’s not even just the creative freedom, but an actual fuzzy feeling that me and the devs are having fun together. Open-source games also hold a special place in my heart for that reason, no matter how scrungy they are.
Yeah, I might be showing my age, but my interpretation of “a better game” was right away “a more fun game”, which got followed up with the thought: Did it make them more fun?
I feel like we had fun figured out pretty well in the last century already. And in many ways, the higher specs are used to add realism and storytelling, which I know many people enjoy in their own way, but they’re often at odds with fun, or at least sit between the fun parts of a game.
Like, man, I watched a video of the newest Pokémon game and they played for more than an hour before the tutorial + plot exposition was over. Practically no fun occurred in that first hour.
Just imagine putting coins into an arcade cabinet and the first hour is an utter waste of time. You’d ask for your money back.
I would love to buy a game at a reasonable price that I actually have a chance of finishing in a weekend, or maybe one marathon session. A game with a great story and good gameplay that isn’t drawn out over 30 hours.
As for graphics, I’m quite happy if they drop all the shiny new bullshit that you have to watch a Digital Foundry video on just to even know it exists in the game, and rather focus on a good art style.
And it goes without saying: pay the people who make the games more, and the mega-corp CEOs less.
Shinobi: Art of Vengeance. It’s a 2D brawler with stages that end in a boss, but they have metroidvania elements to add value for replaying them. It’s about 10 hours start to finish while doing a lot of the backtracking stuff. Simple story but incredible combat and good level design (only one level I thought was a bit meh); the art and animations are fantastic as well. Very much a new take on the old games, and it has a demo to try as well.
The “paid more to work less” part is not tenable. The games that fit that bill that you’re thinking of represent less than 1% of their peers. They are outliers, not a sustainable industry; the exception, not the rule. For every Silksong there are maybe 100 that make just enough to make ends meet, and 1000 duds that will never pay for themselves that you’ve never heard of.
What you’re saying is you want fewer steady incomes and more lottery winners. Sure, that’d be nice, but it’s not a sustainable strategy.
Ex. Wildgate launched recently. They deliberately opted to sell the game for a flat $30 rather than going F2P/P2W. As a result, they regularly get reviewed negatively by people saying “dead game, greedy devs won’t lower the price to compete with F2P games” and “the cosmetics you unlock by playing look better than the ones you can buy” (yes, there are people unironically posting those as negative reviews).
So at least understand why the most common strategy is often exploitative, and why it’s actually not a simple solution that a bunch of armchair experts have figured out in a comments section.
The only case where I could buy that “the console makes the exclusives” is when costs are so high that the investors decide a $60 price tag isn’t enough.
That can be alleviated with DLC, or live service bullshit; or it can become an incentive to buy a particular console.
Then, when someone is braindead and doesn’t want a big epic award winning adventure, they’ll use that same console to play Fortnite. Thus, God of War helps sell VBucks or whatever.
It’s a weird analysis, but even though dropping console exclusives is seen as a pro-consumer move, I also think it was just a way for managers to boost one quarter’s revenue, and it wasn’t really good for the console ecosystem as a whole, especially considering how exclusivity would have funded future epics.
I’m guessing speed. For me, one of the best improvements over the PS4 was the load speed of everything. It felt like going back to the SNES days, when walking into the next room was instant, unlike on the PS1 for the same games.
Right? Consoles used to exist because specialized equipment would perform better than general consumer electronics. That hasn’t really been true for a long time now.
That’s not exactly accurate; it ignores that console hardware is mostly static. The consistency of, and limitations on, the hardware are undoubtedly an important part of that equation.
Everyone’s different, and you can get used to a lot.
So some people might not be able to tell 90 fps from 120 fps, but I definitely notice. If I played something at 90 for long enough, though, I’d get used to it and stop noticing how much worse it was than 120 fps.
I will say they don’t get nearly enough credit, not just for the adaptive triggers, but for them working on damn near any game that appears on PlayStation, even while on PC.
I bought a ps5 just for those triggers, and gave it away (but kept a controller) once it worked on PC.
That’s the direction PlayStation needs to go. If they made a new controller with Hall effect sticks and four back buttons, they’d absolutely clean up. They came so close with the Edge, but didn’t give a back button for half of the face buttons, and they went with “replaceable” sticks that will eventually break instead of Hall effect.
I can’t tell the diff in FPS performance. I wish I could.
When I got my PS5 and played some games I was like, wow, the future is here! Turns out I was playing PS4 games. Also, I turned ray tracing on and off in Spider-Man and couldn’t tell the difference either.
These comments are severely overestimating the level of autonomy players are given in this game. It’s just a branching story, where the branches one player is presented with are dependent on the branch another player chose. I imagine if only a single person plays this game, it will just make stuff up to make it seem like there are other players affecting the world.
Also, the cynicism on Lemmy is a stale meta at this point. Be the change you wanna see or stfu.
How are you defining “live service single player” game? This is a narrative adventure game. I will be surprised if you ever actually interact with another player directly at all. The dev has said that it supports completely offline play.
Edit: the devs have also specifically said you won’t interact with other players in real time. It’s about as “multiplayer” as the bloodstains in Dark Souls, just with a bigger effect on your narrative.
I think this is why a certain scrolling shooter at the endgame of a certain game closely located to a tomato didn’t emotionally work for me. I can do the math - it can’t just throw that many other players at the problem to get me through the enemy ships, and the game needed to be playable off the internet since little else of it was online.
The example you gave is not actual players and wasn’t meant to be. Some of the names are made up, but some may be real usernames of players who made the sacrifice at the end of the scrolling shooter.
I don’t understand this argument. When a game is considered very good, particularly by people already invested in the series, those people want remakes and remasters to be more or less exactly the same game, with only technical improvements such as graphics and framerate. The game is beloved, and changing it more than that often negatively affects the experience. This way new players and old players can discuss the game, because their experience is more or less the same. Changing the game means new players will have a totally different experience from old players, which ruins discussion between the two.
Why can they not make their new version a separate mode, like New Game Plus?
A remaster is what you describe - technical improvements such as graphics and framerate.
Remakes are (supposed to be) additive - improving the story, changing un-fun mechanics, implementing new stuff that still fits the themes of the game (or that they originally wanted to include, but couldn’t due to budget or time or publishing constraints).
If you’re looking through nostalgia lens, yea, a remaster is all you need. But, when it’s not a studio just looking for a cash grab, devs can have plenty of reasons for wanting a second crack at their game.
FF7 Remake is a great example. Sure, there’s been a lot of controversy around the changes. But I’ve really enjoyed a lot of them because it’s different from the original. It didn’t ruin the discussion - it added to the conversation.
A remaster is generally a re-release of an already existing game: a new build of the same game, on the same engine, with the same assets, the only difference being compatibility with new hardware, etc. In my opinion, a lazy cash grab that realistically shouldn’t even exist. Often these new builds aren’t even faithful, with many bugs not present in the original game that the remaster developers never fix.
A remake should always try to stay as close as possible to the original for its initial presentation. The intention of a remake is to become the current market replacement of an old product, for various reasons. Maybe it doesn’t run on new hardware or the original code was deleted/lost. Maybe the original game was poorly received and the developers want to try again with some QoL adjustments. Maybe the graphics haven’t aged well but the story is timeless. This is why a studio would opt for a remake instead of a lazy remaster.
The issue comes from something like Silent Hill 2 Remake. It did not include a “Classic Mode.” The remake alters some pretty important themes in the game, changes multiple story elements, and entirely changes the focus of the gameplay, putting a greater emphasis on action and combat than the original ever did. The remake shifted the tone away from a melancholic exploration of a character into a Hollywood action movie with an over-reliance on jump scares (basically every Bloober game, honestly).
This causes problems when fans try to talk about the game. Which version is each person talking about? People do not always specify. If one person mentions the coin puzzle in the apartments, for example, the clues, hints, and solutions are completely different between versions. Players of the original needed to get a crate of rotten juice cans and drop it down a trash chute to receive a coin for that puzzle, but that entire sequence was removed in the remake. This is only a minor example that doesn’t impact the story, but the discussion disconnect is apparent. You can imagine how confusing it gets with the other major changes that do impact the story later in the game.
These differences are fine if the developers add them as an “Arrange Mode” or “Remake Mode,” but not as the only way to experience the game. That effectively says “our new version is the only good version, because we won’t allow the players to directly compare the two with the same engine and graphics. If you want the old version, you can’t, because we definitely aren’t selling the original and pirating the original that we refuse to sell you is copyright infringement.”
A remake should always try to stay as close as possible to the original for its initial presentation. The intention of a remake is to become the current market replacement of an old product, for various reasons.
Reading your comment, it seems like you’re locked onto the idea that all remasters are lazy, low quality cash grabs and that remakes should actually just be high quality remasters.
Remasters don’t change the content of the game. Remakes do. And there’s a spectrum of quality for both. Life is Strange had a bad remaster: they updated the graphics, but the original aesthetic looked better than the uncanny “upgrade”. Skyrim: Special Edition had a better visual upgrade and fixed bugs.
Twin Snakes was a bad remake of Metal Gear Solid: they added unnecessary cutscenes and tried to shoehorn in mechanics from MGS2 just because it was newer. RE4 was a good one.
It sounds like you wanted a high quality remaster of Silent Hill 2, and instead they gave you a remake and never released a digital version of the original. So now everyone’s playing the remake and calling it Silent Hill 2, instead of properly differentiating it as Silent Hill 2 Remake/Silent Hill 2 (2024).
And I agree that the situation is ass for navigating online conversations.
But a remake should not “stay as close as possible to the original”. That’s what remasters are for.
Isn’t this discussion already muddied by the fact that Persona games re-release in upgraded versions even without remakes? As far as I know, they are actually remaking the Golden version, which is already notably different from the original (many small touches rather than a full-on overhaul).