I would go so far as to say that I would count that as a negative if I considered buying it. Who wants to be the weirdo whose handheld device lights up their surroundings with changing colors?
There are people for everything. But this group ain’t big. Being a handheld also means staring into the shifting lights. Like facerolling your keyboard instead of looking at the screen.
Remember back when AAA games were released and didn’t need to be patched immediately before you could play, because the devs actually decided to ship a properly functioning game instead of going for greed?
I lived it and fucking NOPE so many broken games. Check out AVGN or any Games Done Quick glitch speed run for many examples.
Games had fewer moving parts back then, so they seemed like they worked, until you found out there were entire spells that didn’t work in Final Fantasy, or that you could jump just right and trigger a game-breaking bug that required a reset (SMB1 minus world). Or that uninstalling the game would uninstall Windows (Kohan? And Pool of Radiance 2)
Plus now they can make patches between the time they start pressing discs (going gold) and release, hence the “day 1 patch”. Personally I’m glad games can be fixed post-release, although I’d prefer them to be more complete/fixed at launch than they usually are.
Yeah, a mix of both would be ideal. The fact that we’re surprised that Starfield and Baldur’s Gate 3 were solid on release is a problem; all AAA releases should have at least that level of quality on release.
If games are consistently solid at release, I’d probably preorder like I used to. Now I wait and see because, more often than not, it’s a buggy mess the first few weeks.
I remember those days as having no Internet, let alone high-speed; I recall reading old ads for some PC games. So those days don’t seem as great as you’re implying, because these days a broken game will most likely get fixed with an easily accessible patch. Back then you’d have had to do a lot more work to get one, or in rare cases you just completely wasted your money.
Also, it’s usually the publisher who sets the release date. In this case I’m not sure whether it was decided in-house, so it may be a moot point, but since you called out the developers I figured I’d add it.
While this is awesome, we still need the same performance on Windows. Yes, some games run better through Proton for some reason, but that’s the minority. Hopefully Proton won’t be needed for new games in the future and we’ll get native builds like CS2.
If only it was just that… this bricks your console. You can’t play games on it, or update it.
It would work for piracy eventually probably, but without firmware updates you’re stuck with older games (unless there are new games released with firmware updates included and pirating those games works)
I mean, it is supposedly already a thing on Android. And there are rumors that Valve have been trying it out for Deckard. So this could very much come this year or so.
Probably not. Steam for macOS still has no SteamPlay support, so your best bet is installing the regular Steam through a separate Heroic prefix. Works great, but it does still require Rosetta.
That said, Box64 and FEX are both making a lot of progress, so it’d be awesome to see these in action officially soon.
Not very likely. Translating CPU architectures is completely different from what Wine/Proton does. A compatibility layer for ARM would be even more difficult and expensive, and would carry a performance penalty. They might plan that for further into the future, though, if ARM PCs take off. A Mac implementation would probably need a lot of Apple-specific work, and there aren’t many Mac gamers out there.
There already are some projects that make it work. I haven’t looked at the specifics yet, but as far as I understand it, everything that can be handled as a library call runs as native ARM code, and only pure x86 code is emulated. And since so much is abstracted away nowadays and the heavy lifting is done by Vulkan, the performance tends to be very good.
Asahi Linux already ships a VM to run Steam on MacBooks. And the VM isn’t even doing the heavy lifting: they do CPU instruction translation on the fly; the VM is just there to solve some memory allocation quirks.
On Windows you may be right. A buddy I game with regularly has had trouble with DX12 games crashing randomly.
On Linux they run just fine and frequently perform better than DX11 on Linux or DX12 on Windows.
Pretty hilarious if after years of being the scourge of Linux and FOSS advocates, complaining how they could never leave Windows because they need it to play, gamers become our greatest allies, switching in droves to get more out of their hardware and games.
Really, this isn’t entirely new, I remember some games were known to run better on Wine than Windows years ago already (Soldier of Fortune comes to mind).
Is that something that can be tweaked on the Windows side? Because if you can’t mess with it on Windows, one could argue the comparison is valid, since Linux allows you to tweak those settings.
They’re supposed to have the same settings; the Linux ones are just wrong and drawing more power than configured. It’s supposed to be capped at 17 W but is pulling over 20 W, versus 16 W on Windows. That’s a lot more power, resulting in higher FPS but lower battery life.
Cyber dopamine himself says he’s not a benchmark guy as well.
I like him a lot, he’s really passionate and a positive breath of fresh air online, but the guy is surely stoned nearly 100% of the time. No way I’m taking his technical tests at face value.
60-80% better framerates on Bazzite in Space Marine 2 was just too big a jump not to be an error.
Imagine saving every YouTube video to your hard drive. You don’t have enough space for that (or the time to download it all anyway). So the next best thing is to stream just the videos, and the parts of them, that you actually watch.
And that’s kind of how this game works: it only downloads, in the background, the parts of the world you’re currently visiting and need (which is what streaming means), because the whole thing won’t fit on your drive.
mbtrhcs wasn't saying that you specifically don't have a big enough hard drive, they're saying that MS Flight Simulator is simply too big of a game to completely store on a player's computer.
MS Flight Simulator has a fairly accurate 3D model of the entire earth. Like, the whole thing. So it's constantly downloading the parts that the player is currently in, and deleting the parts that they are not in.
I hope there’s a manual download function for your favorite areas, so you can play them offline and they don’t get deleted over time. Kind of like how offline maps on your phone work, just with a lot higher storage requirements.
Live information from the Earth, like weather and other data. If it’s raining in your city, then it will be raining at that place in the game too. Plus the game doesn’t have all the other data locally anyway, because the entire Earth is too big for your drive.
My first reply said it was streaming high-res data from the cloud. Considering it’s a flight simulator advertising to cover the entire world, most people would intuit that would include textures and 3d models.
I’m not going to sit here and argue with you, have a nice day.
My first reply said it was streaming high-res data from the cloud.
Your first reply also stated that it needed 180mbps to stream weather data.
Considering it’s a flight simulator advertising to cover the entire world, most people would intuit that would include textures and 3d models.
You can fit the entire world’s textures and 3D models in a super small file. The file size is entirely dependent on the level of detail of those textures and models. Hence the MS Paint analogy.
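Rough back-of-the-envelope numbers illustrate how strongly file size depends on detail. This is just a sketch: the surface-area constant and the bytes-per-texel figure are my own assumptions for a single uncompressed color layer, not anything from the game.

```python
EARTH_SURFACE_M2 = 5.1e14  # Earth's surface is roughly 510 million km^2


def global_texture_gb(meters_per_texel, bytes_per_texel=3):
    """Storage for one uncompressed global color texture layer, in GB."""
    texels = EARTH_SURFACE_M2 / meters_per_texel ** 2
    return texels * bytes_per_texel / 1e9


# 1 km per texel: a tiny file, but hopeless for anything near the ground
print(round(global_texture_gb(1000), 2))  # 1.53 (GB)
# 1 m per texel: about 1.5 million GB (~1.5 PB) for a single layer
print(round(global_texture_gb(1), 0))     # 1530000.0 (GB)
```

So both sides are right in a sense: “the Earth” fits in a few GB at postcard resolution, but photoreal detail blows up into petabytes.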
I appreciate you not arguing anymore, at least you know when to quit.
Then I have no idea what reply you were referring to. Your first reply to me was a snarky one about digital representation of the Earth. Maybe check usernames next time.
And your point about fitting the entire world’s albedo, normal, roughness, specular, height, etc etc textures as well as high-fidelity 3D models is laughably false.
It would be, had I made such a comment. But I didn’t. You just pulled that out of your ass. I made a comment about storing “The Earth” on your local machine.
you are literally the only person confused about this.
Confused because people like you are making me that way.
It’s not weather, it’s terrain and textures: a high-resolution stream of the area you’re flying over, so you don’t need to keep the whole Earth on your PC. The base install is supposed to be only ~30 GB of data; that’s not enough to see your house.
It’s dumb. I’d much rather have a 500GB install. They might as well just make the game a streaming service. It also ensures an early death for the game and no functionality without an internet connection.
I don’t think requiring online functionality is the death knell of a game in the year 2024. Personally, I’m excited. Their servers were so damn slow to download on initial install and I hated MSFS2020 taking up a quarter of my game drive.
I 100% disagree. Any game that requires connection to a remote server for single player functionality is dead to me. And any suggestion otherwise I take personal offense to.
This makes your local game dependent on someone else’s server. That someone else, at any time, can shut down that server with zero consequences. They can change the terms of the deal, with zero consequences. Their servers may unintentionally go down or experience other technical issues, depriving you of the product you paid for, with zero consequences. Also you simply cannot use it away from an internet connection.
You are at the mercy of the provider, who has absolutely no legal obligations to you.
Their servers were so damn slow to download on initial install
And you can’t see why that would be a massive problem while trying to livestream your game from their server?
Only the installs were slow. Terrain streaming worked just fine right from the start (I played it from day one) - and once it’s cached on your machine, they can shut down the servers all they want, it’s still on your machine.
More than that, actually. I measured well over 250 Mbit/s over large cities. Others have reported more than 300.
That’s not how cache works.
In this case, it does. The cache for this simulator is a disk cache - and it’s completely configurable. You can manually designate its size and which parts of the world it’ll permanently contain. There’s also a default rolling cache (also on SSD - this program doesn’t even support hard drives), which does get overwritten over time.
The CDN for downloading the initial files was slow; the in-game streaming was fine.
Yes, ownership sucks these days, but I don’t know how they’d technically pull this off without using a remote server. As a philosophy, if we’re purchasing games the only real choice is GOG; anything else ends up with us locked into some server-based licensing system.
FS 2020 had an offline mode. I don’t see why this one wouldn’t have one as well. It’s either using procedurally generated or cached data.
You can not get the same visual fidelity and low latency with game streaming. I’ve tried nearly every service there is (going as far back as OnLive - remember that one?) and they are all extremely subpar, including Microsoft’s own game streaming service.
FS 2020 is available for streaming, by the way, and FS 2024 is likely going to be as well. You’re only getting the console version though. Officially, the resolution is “up to” 1080p, but due to extremely heavy compression, it looks far worse than that. It’s comparable to 720p at best, which means that nearly all fine detail is lost behind huge compression artifacts. On anything larger than a smartphone screen, it looks horrible. That’s on top of connection issues and waiting times that are still plaguing this service.
But…that’s what you’re doing? Streaming the game at 180mbps…
No. Map and weather data is being streamed, cached on your SSD, and then the game engine loads it from there into RAM and uses it, in combination with other locally stored data and locally performed physics calculations, to render the game on your machine. You get an uncompressed, high-quality image and low-latency input, freshly baked by your graphics card for your eyes only. At 1080p and 60 fps, that’s already about 2.99 Gbit/s generated by your GPU and sent to the screen as-is. At 1440p we’re at 5.31 Gbit/s, and at 4K, 11.94 Gbit/s. DisplayPort can handle up to 20 Gbit/s per lane and use up to four lanes, by the way.
Xbox Cloud Streaming only uses up to 20 Mbit/s (and that’s very optimistic). At the advertised 1080p, this means that only about 0.67% of the data generated on the server is reaching your screen.
The problem with game streaming is that in order to limit latency, they have to compress the image and send it very quickly, 60 times per second, which means they have just 16.7 milliseconds per frame, and they must do this for potentially millions of users at the same time. This cannot physically be done at any decent level of quality. It is far easier to send much larger amounts of map data that is not time critical: it doesn’t matter if it arrives a few seconds late on your machine, since the game engine will render something with the data it already has. At worst, you get some building or terrain pop-in, whereas if even a single one of the 60 frames required for direct game streaming is dropped, you’ll immediately notice it as stuttering.
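The bandwidth and frame-budget figures in this comment can be checked with simple arithmetic. This Python sketch assumes 24 bits per pixel (8-bit RGB, no compression):

```python
def raw_video_gbits(width, height, fps=60, bits_per_pixel=24):
    """Uncompressed video bandwidth in Gbit/s for a given resolution and framerate."""
    return width * height * bits_per_pixel * fps / 1e9


print(round(raw_video_gbits(1920, 1080), 2))  # 2.99 Gbit/s at 1080p60
print(round(raw_video_gbits(2560, 1440), 2))  # 5.31 Gbit/s at 1440p60
print(round(raw_video_gbits(3840, 2160), 2))  # 11.94 Gbit/s at 4K60

# A 20 Mbit/s stream carries well under 1% of the raw 1080p60 data
print(round(0.020 / raw_video_gbits(1920, 1080) * 100, 2))  # 0.67 (%)

# Time budget per frame at 60 fps
print(round(1000 / 60, 1))  # 16.7 (ms)
```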
That sounds like a great reason not to buy this game.
If you don’t have the hardware to play this game locally, then I would not recommend it. If you have - and a base Xbox Series S is enough for a reasonable experience, which costs just 300 bucks new or about half as much used - then there is no reason for using the streaming service, unless you absolutely have to play it on your phone at work.
I can agree, but with two conditions: benchmarks must always be done at native resolution, and hardware capability / system requirements must not take any upscaling into account.
For example, if a studio publishes the requirements for playing at 1080p, 60 FPS, High RT, it must be native 1080p and not 1080p with upscaling.
Benchmarks should not be disconnected from actual games. If games aren’t played at native resolution, then benchmarks shouldn’t be limited to native resolution: they should check both native and upscaled rendering, and rate the quality of the upscaling.
RT + DLSS is less cheating than most other graphics effects, especially any other approach to lighting. The entire graphics pipeline for anything 3D has always been fake shortcut stacked on top of fake shortcut.
I’m installing Mint for the first time at this very moment. So far, it’s easier than I anticipated. Fuck You Microsoft.
Edit: bro, firstly, what the fuck, and where did all this performance come from?!?! I vastly underestimated how many resources Windows was hogging. I downloaded Steam (easy-peasy) and then Project Zomboid just as a test. This game runs like butter now. I was having major problems with it before, to the point I basically stopped playing. I know it’s just one example, but I feel like I haven’t had my machine run this well in several years. Also, got Spotify running. Super easy. I still need to figure out how to get my VPN set up (ProtonVPN), but so far I’m kind of in shock. I can’t wait to actually dig in and see what I can do with this new setup.
Windows 10 did that to us. My work workstation and my wife’s laptop both suffered under W10, so I looked for an alternative OS and found Linux. Luckily our CAD software had a Linux version, and I got my productivity back.
My wife’s 2010 laptop on W10 was not usable. It’s super fast with Linux, faster than my work-issued brand-new Lenovo laptop with W11. The only performance problems would be with rendering video or other hardcore tasks.
This is just how I felt when I first switched, also to Mint. I’ve experienced it a couple other times too when switching from some proprietary application to the FOSS option.
I like to describe it as feeling the different priorities of the teams working on each project. When one is made by passionate users who care about it being good software for its purpose, and the other is designed by a committee to hit as many different corporate metrics as possible, it shows.
Well yeah their business isn’t to “serve users.” It’s to “farm consumers.”
That’s why I’m glad I do embedded systems in a niche industry. I’m not trying to drive engagement across the globe. I’m just making a device that serves the needs of a user who has other important work to worry about.
It’s honestly surprising how bloated Windows has become, and for no clear reason either. Even with all of the obvious bloat disabled and resource-intensive features turned off there’s still a significant overhead, it’s just so constant that you don’t notice it. Then you load up Linux on the same hardware and realize what you’ve been missing.
How is device support? Direct-drive steering wheels, gamepads, VR, status LEDs or info displays (i.e. making your keyboard glow red at low health), and a bunch of other things like my Sound Blaster G6.
Hit and miss since those tend to not have actual standards and generally do their own thing. If it’s popular, there’s a decent chance someone has reverse engineered it and there’s at least partial support (mostly applies to simpler things like steering wheels), but there will be concessions to make until device manufacturers officially support Linux.
If you’re willing to replace equipment, there’s something that works for most of those categories, if not all.
Which one? Support varies wildly depending on manufacturer.
gamepad
I have never seen a gamepad that doesn’t work on Linux. You may not be able to update their firmware if they only provide a Windows tool but they work perfectly fine.
VR
Valve Index and HTC Vive work out of the box. SteamVR is pretty rough in Linux and plagued by issues but it works.
For any other headset you will have to depend on community support. Some work, some don’t.
Which ones? They usually use completely proprietary protocols.
Sound Blaster G6
It will work like any other bog-standard sound card has for years. You will lose any features that are custom to the sound card (dialogue mode, virtual surround, equalizer, …) but those are rarely necessary because there is lots of other software that achieves this for every sound card.
I recommend you boot Linux from a USB drive and take a look. No need to install anything; just boot from USB and see if your hardware works.