128GB microSD cards are like $12. 512GB is maybe $40. You can get a 1TB card for $100, but I think 512GB is a good middle ground between price and storage.
Yeah, 256 is around $20 last I looked, too. Not bad. Been considering getting one, probably not for anything with an install this large, but it’s nice to know I’d have the option.
Yes, a 256GB+ SD Card. Be sure to enable slow HDD mode in BG3 settings if you’re installing to an SD Card. (It will help loading screen times at the cost of using more RAM.)
Not currently, no. They burned enough dev cycles trying to get split screen co-op on the S that now BOTH the S and X versions are delayed, which I guess is better than “not happening at all.”
The S has every right to exist, but as soon as it starts interfering with Series X development (which it has been for a while now), it’s time for it to go.
Microsoft needs to cut it loose like the boat anchor it is and just release a discless Series X and call it good.
All sorts of “impossible” ports were being made for the Switch a few years ago. Witcher 3, Doom, Wolfenstein 2, etc. Games have moved on to the point where it’s not feasible anymore, but they would still put them on there if they could.
It’s one game. By and large, developers have managed to get games running pretty well and feature complete on the S. Some of the ports are really impressive, like the Cyberpunk version. Everyone is throwing the baby out with the bathwater over one game.
We’ve got games coming out right now that use Unreal Engine 5 and its next-gen features and still ship on the S, like Immortals or Remnant 2. They have reduced fidelity on the S, as expected, but they still run fine there. BG3 is literally held up over one issue, the split-screen, which they’re apparently still working on to see if they can patch it back in post-launch. MS clearly just let them launch without it to take a win back from Sony.
It’s literally 1/3rd more expensive, and that’s not an insignificant amount. If your rent increased by a third tomorrow you’d probably be pissed, and if you had a 33.33 percent chance of getting struck by lightning by stepping outside tomorrow, you’d probably stay indoors that day.
$100, plus the cost of the mandatory microSD or SSD you’ll need to add to even install the game on Deck, plus the $50 discount the Series S regularly gets if you have a modicum of patience. The difference is more like $175-200, and last year the Series S was $100 off for Black Friday. Assuming the game is targeting holiday 2023 for Xbox, you could potentially grab the Series S + BG3 for under $300.
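To make that $175-200 figure concrete, here’s a rough sketch of the math. The base prices are the commonly cited MSRPs at the time ($399 Deck 64GB vs. $299 Series S, which is the ~$100 gap above); the microSD price is just an assumption pulled from the range mentioned upthread, not something stated in this comment.

```python
# Back-of-envelope comparison, assuming the MSRPs above.
deck_base = 399              # Steam Deck 64GB MSRP (assumed)
series_s_base = 299          # Xbox Series S MSRP (assumed, ~$100 gap)
microsd = 25                 # assumed: cheap 256GB+ card, per the thread
series_s_sale_discount = 50  # the recurring sale discount mentioned above

deck_total = deck_base + microsd
series_s_total = series_s_base - series_s_sale_discount

print(f"Deck + card: ${deck_total}")                  # $424
print(f"Series S on sale: ${series_s_total}")         # $249
print(f"Difference: ${deck_total - series_s_total}")  # $175
```

Swap in a pricier SSD or a non-sale Series S and you land anywhere in that $175-200 range.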
I’ve played this on my Deck, and it is playable, but the frame rate was not stable unless I capped it at 30 and dialed the graphics back a bit. If the S can hit 60, then it’s already a better version.
I play at 900p60. Turn literally everything to low or off except textures at medium. Enable the AMD upscaling at its highest quality setting (forget what it’s called). Be sure to turn off antialiasing (you don’t really need it at high resolutions) and god rays. Turn off all the optional effects, but those two are the most important. Also, if BG3 is installed to an SD card, enable slow HDD mode.
It still stutters a little when transitioning to cut scenes, but I believe that exists in all PC versions.
Edit: And I have made it (what I think is) mostly through Act 2. I’ve also hosted an online session with my friend (who also plays on Steam Deck using my settings) and my husband (gaming laptop) with no issues.
It’s only able to hold a relatively stable 30fps in act 1. As soon as you hit act 2 it struggles to escape the teens, even on low settings. It was so bad that I had to abandon playing on Deck and move to my PC.
Am I misreading your comment? You’re saying Series S is not the cheapest because Steam Deck is more expensive? Did you have a typo? Am I suffering CO poisoning?
Microsoft did the right thing by softening their stance on system parity. Insisting on it would have hurt the Xbox further along the line, but now devs know they can still release on Xbox if they can’t get one or two features to run on the S.
I didn’t know it wasn’t on Xbox, that’s GOTTA be hurtin em. I’m sure they’ll learn from this and make whatever exceptions need to be made far earlier next time.
If I’m not mistaken, the only reason it’s not already on Xbox is that Microsoft insisted it had to have split-screen on all models, which proved to be problematic and eventually impossible on the S, but they refused to let it release on the X in the meantime.
It’s already been hurting them a lot it sounds like. I don’t think Baldur’s Gate is the first game to not release on Xbox because they couldn’t achieve system parity with the S. If they’ve really softened on it, then that’s a good idea. Better late than never.
Yeah feature parity made sense in the beginning so the S didn’t get left behind but at this point its place feels secure to me. It’s the cheap option. I think most gamers understand that and accept the trade-offs that are inherent in that choice.
Also, while it’s neat that they made the game as pretty as they did, this is at the end of the day an isometric turn-based CRPG. It shouldn’t be that hard to scale down.
It’s not exactly isometric, considering you can tilt and zoom the camera all the way down to over-the-shoulder adventure style, letting you see off into those beautiful vistas. It has some performance issues even on PC in some places, like the mountains and the namesake city.
It's not magic, it's an Nvidia server you're paying for on a time share. And it's decent, but frankly, as the kind of person that can tell when my 120Hz VRR display is hitting a flat frametime by eye it's nowhere near comparable to local play, even in optimal circumstances.
Streaming is a nice option when you need a hardware-independent, location-independent way to run a heavy game, or as a stopgap when your client hardware can't cut it with a modern release the cloud service covers, but it's not an optimal experience and it's problematic if it becomes a primary way to run games for a host of other reasons. I actually find GFN to be a solid idea, in terms of tapping into libraries you already own, but it's absolutely a secondary, value-added solution to either running games on client or even pushing your own stream from a server you own.
Nvidia absolutely does not care about dedicated gaming GPUs anymore, though; this is where their focus currently resides. Do you understand how much money Nvidia makes on their server hardware vs. the consumer graphics card market? It’s absolutely disgusting how much their server offerings are making them right now. They don’t care about anecdotal scenarios like yours, where you can tell a 120Hz VRR display is hitting a flat frame time. There will be a day when Nvidia just won’t offer midrange graphics cards anymore, because it will cost too much for too little return compared to their server hardware and subscriptions.
As a regular user of this service: it’s very good, although not perfect. A few bugs and some UX issues with the different game providers, but nothing unsolvable. Customer support is surprisingly responsive, too.
Disclaimer: tested on a 10Gb/s symmetric connection and an urban 5G network.
Geforce Now has been good the few times I've tried it (note that I have Cable internet with a direct ethernet connection to my computer and I don't use wi-fi) but there's just barely enough latency that those who are latency sensitive can notice it.
Fine for slower paced games, but potentially an issue for faster paced ones depending on how hard they are.
I legitimately don’t understand why people love this endless stream of the same game over and over and over and over for exorbitant prices every single year.
I have enjoyed the two new Modern Warfares. I played CoD through MW3 growing up, but only the campaigns, so MW2019 was my first MP CoD, and I had a lot of fun with it for what it was. Warzone was also the first BR that appealed to me, and I ended up playing a lot of it as well. I’d guess I completed three or four battle passes before I realized I wasn’t enjoying myself anymore.
I skipped BO3 and Vanguard, but by the time MW2 was coming out last year, I had the appetite to try CoD again. The campaign was okay (not as good as the original MW2), and I enjoyed the MP about two-thirds as much as I did MW2019. I had some fun with the extraction game mode, and finished the Season 1 battle pass, but haven’t come back since.
Another MW3 so soon feels like a misstep here. I am curious about the campaign, since it seems like they’re mixing in some cool set-piece levels from MW2 and MW3, but that’s not worth full price to me.
I’m not touching that game until they go and take the cowboy out of it. We’re not fools, Blizzard; that rename was just a cheap way to sweep things under the rug, and we already know who that guy is based on.
But like...the real guy isn't a cowboy. They renamed him. You'd remove an entire character from the game? Renaming him seems like the appropriate response.
I agree removing the character entirely seems too far. The characters are designed differently enough that you’d be leaving a fair number of people entirely without their favorite playstyle and with no suitable replacement.
I get people not being able to look past the history, but I'm not sure there's any more reasonable course of action that can be taken. The name was the homage, not the kit, art, or lines. And now that name is gone.
I think those who are still upset about it would have to accept they fall under "Don't like? Don't engage" rather than calling for further action.
So… could someone explain to me what makes this better than just using the Remote Play app on whatever device you want? That has existed for years and has always worked great.
Probably not much, especially if you already like the controller. I have a crap controller that I use with my phone, so it would probably be an upgrade over that... but even so, I’d rather just buy a better controller.
Yeah… Even then, it’s basically just a PS5 controller with a tablet stuck in the middle. If someone likes the controller here, they can use literally any other PS5 controller. They can’t even argue that it’s more portable when it’s a hefty 8" tablet lol. What a strange device…
Why couldn't this device have been the one to have "360" in the name? It would have worked (for once!).
"Why do they call it the PieceShyt360? Because you throw it away and it boomerangs right back to you!"
Seriously, though, how am I going to catch the Joker with this thing?
For reals: it looks like it has bad ergonomics, the software (just stock Android?) looked to be performing poorly in that one video, it will probably be super overpriced, and we don’t know how locked down it will be. If it’s cheap and has Play Store services, it might be an OK emulation device?