If only they appealed to me as a consumer at all. If I want to play it, I have an Xbox and it’s much cheaper on there. What I really wanted was a PC port, which they didn’t launch. Instead, because of the Switch port, not only am I not paying them, I’m just slightly more likely to pirate it and emulate if I ever get the hankering to play.
these damn fools porting it to the two most popular platforms instead of the one you in particular wanted. they should fire whoever made that choice smh
I mean, yes? People have been waiting on this game for 13 years and PC has only gotten more popular since. Rather than do the three most popular platforms that the game isn’t on, they did two. If they’ve managed to port it to the Nintendo Switch of all things, they can spit out a half-baked PC port.
I mean you can ignore the slew of other older games getting ported to PC over the years, be it the Xbox exclusives, PlayStation exclusives, Capcom console exclusives (Dragon's Dogma, Dead Rising, Devil May Cry), absolutely countless numbers of JRPGs and similar, and more, but that doesn’t make me the ignorant one.
30, 60 or whatever fps is (or at least should be) a decision made very early in development. It’s only a case of poor optimization if it doesn’t reach the target they’ve set.
I don’t like it either, but an Unreal 5 game running at 30 fps (if that lol) on current gen is the norm.
Might just be my middle-aged eyes, but I recently went from a 75Hz monitor to a 160Hz one and I’ll be damned if I can see the difference in motion. Granted I don’t play much in the way of twitch-style shooters anymore, but for me the threshold of visual smoothness is closer to 60Hz than whatever bonkers 240Hz+ refresh rates current OLEDs are pushing.
I’ll agree that 30fps is pretty marginal for any sort of action gameplay, though historically console players have been more forgiving of mediocre performance in service of more eye candy.
Are you sure you have the refresh rate set correctly on your video card? The difference between 75Hz and 160Hz is very clear just from moving your mouse cursor around. Age shouldn't have anything to do with it.
Quite sure – and given that one game I’ve been playing lately (and the exception to the lack of shooters in my portfolio) is Selaco, I ought to have noticed by now.
There’s a very slight difference in smoothness when I’m rapidly waving a mouse cursor around on one screen versus the other, but it’s hardly the night-and-day difference that going from 30fps to 60fps was back in Ye Olden Days, and watching a small, fast-moving, high-contrast object doesn’t make up the bulk of gameplay in anything I play these days.
That's weird. I'm getting to the age where I wouldn't see the point in 4K; I'd need to have my head on top of the screen to see it. But refresh rate can be felt in fluid scrolling and the like, and even if only on an unconscious level, it definitely improves awareness in games too.
Hmm, I’ve found it quite noticeable. Perhaps turn an FPS counter on and see what it’s actually running at. If you have a game showing on both screens, it’ll likely limit the fps to suit the lowest display hz.
If your comp is good enough, absolutely. Strong PCs now can hold sub-5ms frame times (200+ fps) at 4K pretty regularly, especially in competitive games that aren’t designed to look incredible.
It really depends what one’s doing, also. For many things, including many games, 30fps is fine for me. But I need at least 60fps for mousing. Beyond that though I don’t notice the mouse getting smoother above 60fps, but some games I do have a better experience at 120fps. And I’m absolutely sold on 500+ fps for simulating paper.
In the interview they said they show the game as it is and focus on whatever part of development they’re on. Combat hadn’t been worked on yet when they first showed the game, and it now looks pretty reactive. They’re going to focus on sound next and performance last, and when they said 30 it came across as “a solid 30 is the bare minimum.” Given the feedback, there’s a chance they’ll try to get 60 fps in now.
While it’s a design decision, UE is also a bit more scalable generally, assuming it’s not all reliant on Lumen, Nanite and VSM.
Either way, they need to learn from previous 30 FPS launches and try to communicate better. Saying it doesn’t need 60 is dismissive of a large audience of gamers who don’t like trading frames for image quality.
If it has backwards compatibility then yes. If not then I guess it doesn’t really matter. I won’t have much interest if they don’t finally embrace backwards compatibility like everyone else.
There are games right now that could use a boost in performance. If they don't have backwards compatibility it will be a huge disappointment, and I doubt I'll be buying it anytime soon.
Agree that this needs to happen. But given the brand power of Nintendo, they will just do whatever they want to screw over their fans and we will still be willing to pay.
I don’t buy that the Xbox isn’t powerful enough to run it at anything beyond that.
There’s no way they can’t just lower the resolution and apply upscaling like every other game that has a quality and performance mode. They’re intentionally locking it to 30 for some bizarre reason.
“It’s 4K in the X. It’s 1440 on the S. We do lock it at 30, because we want that fidelity, we want all that stuff. We don’t want to sacrifice any of it.”
I’d hope it’s not for the same reason Bethesda locked their framerates, where the entire game’s physics and other systems would break when you unlocked it. I assume it’s not, since it’s only locked on Xbox, which would mean the console is just weak.
I mean… 30fps has been the single-player console experience for as long as I can remember. (Except for the PS4/XboxOne-native games – seemingly this entire generation – which get 60fps on current gen.)
Yes, PC can do 60fps+ if your rig is beefy enough. Yay.
Console wars bullshit is insufferable. Even when PC is one of the consoles.
Yeah but on PC you usually get graphics settings you can tune to whatever you like. I’d personally rather have a slightly worse looking game running at 60+fps, than a beautiful one at 30.
That was an option on console for most of the generation so far: Performance Mode vs. Quality Mode. But that’s mostly because nearly every game released so far has been a hastily ported last-gen title. It feels like this gen has really just barely started.
Single-player console games being 30fps is not new by any stretch. That’s basically what consoles do. And they’ve managed pretty well with it so far. If you want to spend 2-3x more on a beefy PC, you can get all the frames you want. More power to you.
Over a decade ago… Skyrim, Fallout, The Last of Us 1, GTA 4 and 5 on the PS3/360 gen. 30fps.
Last gen… God of War, Gears of War single-player, Fallout 4, The Last of Us 2 on PS4/Xbox One. Also 30fps.
Single-player console games being 30fps is not new by any stretch
Yeah I know, that’s why I never really got into console gaming unfortunately. As I said elsewhere, I genuinely have trouble making out objects while looking around in first-person games, if it’s running at 30fps.
Didn’t know about the current gen having performance settings, that’s pretty neat. Might actually consider getting one if I can actually run games at a reasonable framerate on them with a lower quality setting.
Just because 30FPS has been a standard on consoles for so long doesn’t mean it should stop there.
There’s no reason not to advance if they have the opportunity to do so; the entire gaming industry benefits from it.
Xbox is just not capable of handling the game at higher framerates, that has nothing to do with console wars or whatever, it’s just the limitation of the hardware and it being an underwhelming console in general.
Consoles are $500 gaming machines, generally capable of about 30fps in games. It’s no different for Microsoft or Sony.
And Nintendo… Well, Nintendo is Nintendo.
The bean counters have decided that people don’t want to spend more than that on videogame consoles. If you want more fps, luckily everything gets a PC port nowadays; and your almost-certainly-more-than-$500 rig can handle that.
I’d say 60+fps is especially necessary for first-person games. I seriously have issues making out objects and other things when looking around first-person at 30fps.
No wonder consoles are just not as appealing anymore.
We used to get systems that were purposefully designed to only play games, but did it phenomenally well. That shit absolutely defined an entire generation of gaming.
Now we get a crippled PC with Doritos ads on the dashboard.
Eh… Consoles used to be horribly crippled compared to a dedicated gaming PC of similar era, but people were more lenient about it because TVs were low-res and the hardware was vastly cheaper. Do you remember Perfect Dark multiplayer on N64, for instance? I do, and it was a slideshow – didn’t stop the game from being lauded as the apex of console shooters at the time. I remember Xbox 360 flagship titles upscaling from sub-720p resolutions in order to maintain a consistent 30fps.
The console model has always been cheap hardware masked by lenient output resolutions and a less discerning player base. Only in the era of 4K televisions and ubiquitous crossplay with PC has that become a problem.
At launch the 360 was on par graphically with contemporary high-end GPUs, you’re right. By even the midpoint of its seven-year lifespan, though, it was getting outclassed by midrange PC hardware. You’ve got to factor in the insanely long refresh cycles starting with the sixth and seventh console generations when you talk about processing power. Sony and Microsoft have tried to fix this with mid-cycle refresh consoles, but I think that has honestly hurt more than helped, since it breaks the basic promise of console gaming – that you buy the hardware and you’re promised a consistent experience with it for the whole lifecycle. Making multiple performance targets for developers to aim for complicates development and takes away from the consumer appeal.
Between last generation and this one, though, we’ve reached the point where consoles are more like prebuilts. Games have performance targets, and it’s up to users to decide when they feel like an upgrade. The only difference is that games (usually) won’t release for models that can’t run them well, compared to people who try to squeeze every frame they can out of their 10-year-old potato PCs, though every now and then you still get a Cyberpunk 2077 on consoles.
But there’s a reason some games still target the PS4 in 2024: if you’re a small-budget indie game that doesn’t need the full hardware of the PS5, why not? And since you don’t get locked out of older stuff when you upgrade anymore, which lets newer games keep releasing on older systems, anyone can hold on to a console until they run into a game worth upgrading for.
It’s playable and you can enjoy the game, but 30FPS is embarrassing. It makes me feel like I’m a kid playing on a PC assembled out of old leftover components. Which was tolerable when I was a cashless kid playing pirated games on inherited frankenPCs, but it feels so wrong when playing a bought game on its intended spec hardware.
I've been waiting for years. Since before RDR2 came out. But you're absolutely right. Wait a couple of months and I'm confident we'll see a real sale. The whole reissue/not-remaster game is not playing out the way they'd hoped. Remember the GTA Master collection or whatever the hell they call it?
Most definitely. Zelda TotK is the late-in-lifetime masterpiece — we are ready for the next generation.
One problem Nintendo faces here is that last time they had Nvidia Tegra, a chip that didn’t really find any other use at scale so Nintendo could source it for pennies. AMD owns the console grade SoC market, and won’t be selling Ryzen 8000 series for cheap — maybe Nintendo could again source something from the previous generation to keep the BOM down?
I think the more severe issue here is that an architecture change would make the new device incompatible with the Switch. So they should preferably stay with something ARM-based that can ideally mimic the original SoC closely.
Plus Nintendo usually has long partnerships with hardware partners. From the GameCube to the Wii U they used IBM’s PowerPC processors, and that was a long period of time. In 2016 Nvidia’s CEO Jen-Hsun Huang praised the partnership with Nintendo, expecting it to “last two decades”. Nintendo also wants the next-gen transition to be as smooth as possible, retaining the Switch’s massive user base. Therefore the company’s next console will most likely have an ARM SoC made by Nvidia; anything else would be a suicide mission for them.
Yeah, agreed that this remains by far the most likely scenario. I guess Switch sales alone are enough for Nvidia to keep going with Tegra, despite finding little use elsewhere.
It’s just been a long time since the first news about T234 (Orin) & T239 came out (mid 2021), with a rumoured & cancelled chip based on Lovelace after that — most likely it was too expensive for Nintendo.
That cancellation left me wondering whether there could be other plans in play. It’s an old chip by now, but that too tracks with Nintendo.
Nintendo’s solution to backwards compatibility has been interesting but straightforward in the past. All they’ve done before, with a few exceptions, is slap the old processor in the new device to make it backwards compatible. I’m curious what they would do this time.
I've basically been holding onto the hope I'll be able to play Hyrule Warriors: Age of Calamity on co-op without a drastic drop in resolution and framerate once this new console comes out. Because that was not a level of performance appropriate for an exclusive game.
Considering the stories of the code base being an absolute mess, I can see how there might be a floor in what they can charge due to the labour cost of making it functional on PS4 and Switch. Even if the port is the barest of bones, or even a hot mess.
At least it will make it easier to emulate on Linux 😄
No one is saying that it should be free. The issue is that $50 for a straight port of a 13 year-old game is ridiculous when Sony’s competitor offers free backwards compatibility—and 4K upscaling. Microsoft absorbed their expenses for that as the cost of doing business, and it would be a blip on Rockstar’s balance sheet as well, since they make a perpetual avalanche of money from GTA Online. There’s really no credible justification for a price this high.
Oh yeah, total agreement there. The price is hot garbo. But I figured there’s some marketing BS covering the real reason it’s so high. But yeah, it’s probably just corporate greed.
I'm not super worried about it. If it doesn't push the numbers they like, they'll put it on sale, just like the GTA Definitive Edition on Switch goes on sale semi-regularly. If you don't like the price, don't buy it. When it hits a price you deem acceptable, then go for it.
xenia_canary is coming along. I could play the whole thing in xenia_canary on Windows… only a matter of time.
That’s why I’ve been pushing for a whole rebuild of it. RDR2 already has New Austin and West Elizabeth fully built out in its map. Most of the characters already exist in RDR2. Honestly, for the amount of time it takes to “remaster” it, they should just rebuild it as a DLC to RDR2 and have the player immediately take off into it from the end of the game. Charge $60 for it and everyone would still buy it.
I already finished it on RPCS3 and can only recommend trying it out. Yes, there were a few places and cutscenes in the game that didn’t run great, but 98% of the time it was a smoother and better-looking experience than the PS3 version, thanks to upscaling and frame rate unlocking (at least on an AMD 5800X).