Huh, I guess my 1060 doesn't meet the minimum requirements either, but honestly it doesn't seem to play much worse than it does for people with much better hardware. I'm getting 40-60 fps depending on the map that's loaded.
Well, graphics aside, it's surprisingly an 8/10 game. There are no microtransactions, and they still patch the game despite the low player count on Steam (no idea how many players are on EA Play/Game Pass or console). On PC, if you meet the requirements, the game runs really smoothly. (I run a 6800XT, so the default 1440p/60fps is no issue at all.)
I originally bought it for science, to see how UE5 features from a released game run on my machine, and it surpassed my expectations. It's also a good look at what's currently possible with that UE5 tech.
It did have pretty unlucky release timing, but it works well as an "alt" game you can play without any commitment: no season pass, no battle pass, no grinding requirements or missions, and it's semi open world but the main plot can be finished without doing the exploration/puzzles. So I'd assume sales can keep going for a while as more people get modern PC builds that can play it.
All that's well and good, but the marketing simply failed. Literally the first I and many others heard of it was the 751 peak player count. Maybe it'll get more popular as time goes on, but that's a hell of a climb.
Lol, I bought it on EGS because I know the developer gets more money out of that purchase (Epic only takes 12%, and they don't have to pay the UE royalty for sales on EGS). I do this for pretty much all Unreal Engine games.
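Rough back-of-envelope of that point, as a minimal sketch: the numbers below assume Steam's baseline 30% cut, Epic's 12% cut, and the 5% UE royalty on gross revenue (waived for EGS sales, and only owed past $1M lifetime); real figures vary by revenue tier, region, and VAT.

```python
# Hypothetical $60 sale; baseline store cuts, ignoring VAT and revenue tiers.
price = 60.00

steam_net = price * (1 - 0.30) - price * 0.05  # 30% store cut plus 5% UE royalty on gross
egs_net = price * (1 - 0.12)                   # 12% store cut, UE royalty waived on EGS

print(f"Steam: ~${steam_net:.2f} per copy to the developer")  # ~$39.00
print(f"EGS:   ~${egs_net:.2f} per copy to the developer")    # ~$52.80
```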
But I think the honest hardware spec probably scares off a lot of potential PC buyers for this type of game. If they had a demo as a showcase plus a spec check, people might be more willing to try it or even buy it.
I've got a 7900 XTX running Ultra, and FSR2 does literally nothing, which is hilarious.
100% resolution scale, 128 FPS.
75% resolution scale … 128 FPS.
50% resolution scale, looking like underwater potatoes … 128 FPS.
I don't know how it's possible to build an engine this way. It seems CPU-bound, and I'm lucky I upgraded my CPU not too long ago: I'm outperforming my friend who has an RTX 4090 in literally every scene, indoor, ship, and outdoor/planet.
He struggles to break 70 FPS on 1080p Ultra, meanwhile I’m doing 4K Ultra.
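That resolution-scale test is basically the textbook way to spot a CPU (or engine) bottleneck: if dropping the internal render resolution barely moves the frame rate, the GPU isn't the limiter. A minimal sketch of that reasoning, using a hypothetical looks_cpu_bound helper and the numbers quoted above:

```python
def looks_cpu_bound(fps_by_scale: dict[float, float], tolerance: float = 0.05) -> bool:
    """fps_by_scale maps internal resolution scale (1.0, 0.75, 0.5, ...) to measured FPS."""
    fps = list(fps_by_scale.values())
    spread = (max(fps) - min(fps)) / max(fps)
    return spread < tolerance  # flat FPS across render scales => the GPU isn't the limiter

# The 7900 XTX numbers from above: identical FPS at every resolution scale.
print(looks_cpu_bound({1.0: 128, 0.75: 128, 0.5: 128}))  # True
```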
I had no idea it was a problem on Radeon GPUs. I saw a few people complaining about not seeing the stars, but I didn’t have a clue what they were talking about since it was always fine for my Nvidia card.
Ugh. A part of me wants to give AMD a chance for my next upgrade and push back against Nvidia’s near-monopoly of GPUs but I really don’t want to deal with how everything kinda-sorta works on Radeons.
I've been exclusively on AMD since like 2015, and my GPUs "kinda-sorta working" has not been my experience at all lol. I've literally never had brand-specific problems on the Radeon side; the only brand-specific issues I've had were trying to get my laptop with an Nvidia GPU to work properly under Linux.
I have a suspicion that developers do less testing, optimization, and bugfixing for AMD cards due to reduced market share and that’s why more of these brand-specific coding errors slip through for them. It’s unfortunate but I can’t deny I’ve seen some weird bugs in my time.
Oh, of course. I don't actually blame AMD for those kinds of bugs, but it's the reality as a user, at least in my experience… though it's been a stupidly long time since I've used a machine with an AMD card.
Some games are built specifically for AMD from the ground up and have been optimized like crazy. It mostly depends on the game and the devs. And let's not forget that if devs want it to run well on PS5 and Xbox Series X/S, they'd better have good AMD optimization.
How can an AMD-sponsored game, one that literally runs better on every AMD GPU than on its Nvidia counterpart and doesn't ship any tech that might disadvantage AMD GPUs, be less QA'd on AMD hardware because of market share?
This game IS better optimized on AMD. It has FSR2 enabled by default on all graphics presets. That particular take especially doesn't work for this game.
I've been red-only in my rig for over a decade, and the only problems I've had are that I play the same games as everyone else perfectly fine and have more money in my wallet from not spending as much on parts. That, and the Bulldozer-generation CPUs heated my house like crazy; there's no denying that lol
Ugh… is that last part still happening? Like, are the new CPUs also that hot, or whatever you'd call it?
I'm tempted to build a new all-AMD PC for cost alone, although AM4 probably won't last as long as AM3 did, sadly. But summer is already terrible with my Intel chip… no need for more heat.
No, Bulldozer chips have been gone for like 6-7 years. The last two Ryzen generations have been far more energy/heat efficient than Intel. Ryzen is the better choice by far right now.
Current Intel runs hotter than current AMD on the CPU side, and Nvidia currently runs cooler than AMD on the GPU side. Also, we're on AM5 now. AM4 lived for a relatively long time, and there's no indication that AM5 won't be a long runner as well. Intel changes sockets more often too, so for longevity AMD is almost always the best bet, except at the tail end of a socket.
Huh. Didn't even know they'd replaced AM4 until this comment 😂 My AM4 Ryzen 5 paired with an RX 6700 XT still does everything I want it to do, and if it starts slacking I have plenty of upgrade headroom left.
Billboard trees and the lack of fog, detailed reflections, and shadows were last-gen tradeoffs that weren't as perceivable at the time as they are now when compared side by side. I wonder what the next gen will look like. Accurate RTAO, RTGI, RTR and no ghosting artifacts? It definitely feels like we're near the end phase of graphical fidelity. I mean, we can improve infinitely, but it'll come with extremely diminishing returns and insane amounts of pixel peeping.
Maybe the focus would shift towards realistic animation blending and pixel accurate collision physics once everything is path traced and uses photogrammetry. Thanks for listening to my ramblings.