I don’t get where this “raw pixels are the best pixels” sentiment comes from. Judging from the thread, everyone has their own opinion but hasn’t actually looked at why people build upscalers in the first place. Well, bad news for you: games have been using virtual pixels for all kinds of effects for ages, and the broadcast your TV receives goes through upscalers too (4K broadcast still isn’t that common).
I play Rocket League with FSR upscaling from 1440p to 2160p and it looks practically the same as native 2160p, AND it feels more visually pleasing because the upscale also acts as an extra AA filter that smooths and sharpens “at the same time”. Frame rate is pretty important for older upscaler tech (or features like distance field AO), since many of these techniques rely on information from previous frames as well.
Traditionally, when the GPU had more power than the engine demanded, engines took the brute-force route: render at something like 4x resolution and downscale it for AA. Sure, it looks nice and sharp, but it’s a dumb way to approach the problem, and plenty of follow-up AA techniques proved more useful for gamedev. Upscaler tech is the same. It’s not meant for you to render at 320x240 and upscale all the way to 4K or 8K; it paves the way for better post-processing and lighting tech like Lumen or raytracing/pathtracing to actually become usable in games with a decent “final output”. (Remember PS4 Pro checkerboard 4K? That was genuinely good tech for getting more demanding games past the PS4 Pro’s hardware limits.)
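To put rough numbers on the difference (just a sketch using the 1440p-to-2160p example from above; “4x” here means 2x per axis):

```python
# Pixels shaded per frame: 4x supersampling a 2160p output vs. upscaling from 1440p.
native_4k   = 3840 * 2160              # ~8.3 M pixels
supersample = (3840 * 2) * (2160 * 2)  # 4x SSAA: ~33.2 M pixels shaded, then downscaled
upscaled    = 2560 * 1440              # 1440p internal render: ~3.7 M pixels, then upscaled

print(f"4x supersampling shades {supersample / native_4k:.0f}x the native pixel count")
print(f"a 1440p->2160p upscale shades only {upscaled / native_4k:.0%} of it")
```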
In the end, consumers vote with their wallets for nicer-looking games all the time; that’s what drives developers toward photoreal/feature-film-quality rendering. There are still plenty of studios going for stylized or pixel-art looks, and everyone flips their shit and praises them, while that tech mostly relies on the hardware advances pushed by the photoreal approach. They use the same pipeline, just bent toward their desired look: Octopath Traveler II used Unreal Engine.
Game rendering is always about trade-offs. We’ve come a LONG way and will keep pushing boundaries. Will upscaler tech become obsolete somewhere down the road? I have no idea; maybe AI will generate everything at native pixels, right?
I don’t have anything against upscaling per se, in fact I am surprised at how good FSR 2 can look even at 1080p. (And FSR is open source, at least. I can happily try it on my GTX 970)
What I hate about it is how Nvidia uses it as a tool to price gouge harder than they’ve ever done.
I mean, I didn’t say it looked great or anything. Just better than I expected.
But of course my expectations were extremely low when I saw so many comments like yours, so I was actually pleasantly surprised with what it can do for what it is.
Though to be fair to the Deck, the native resolution is already so low that there isn’t a whole lot FSR can work with.
My next card will be AMD, but that doesn’t change the fact that Nvidia is the biggest authority in this market. They do whatever they want, and AMD doing their best to only be slightly worse isn’t helping.
Nvidia uses their investors’ dollars really efficiently, which is what led them to today’s dominance, but it also makes them act like a bully toward their business partners (like EVGA, and who knows how other vendors are being treated).
Some of the early investments that pushed CUDA’s dominance:
NV directly funds research and provides equipment for accelerated computing (both graphics and non-graphics). In return, researchers get really familiar with CUDA, and their results feed back into CUDA’s design/driver/compiler. The AI training side eventually led to Tensor Cores.
NV then uses that work to help software developers integrate CUDA-accelerated applications: GPU renderers, GPU simulation, GPU deep learning, GPU denoisers, GPU video encoding.
NV also helps game developers implement or integrate tech like RTX and DLSS, or earlier ones like hair/PhysX, etc., plus those notorious game-specific driver enhancements, i.e. they basically work with the game and have ways to set driver-side parameters per title. These collaborations also lead to GeForce Experience’s “auto best quality settings for your PC” feature.
They also make CUDA-only cards for number crunching in data centers.
All of the above means that when you’re making a purchase, if you’re not just playing games and your work software also uses those CUDA features, the most cost-efficient choice is to buy NV.
The business plan and its results then form a positive feedback cycle. The crypto surge in sales and investment money was a bonus, but Nvidia did put it to good use, and the plan above makes more investors willing to pump money into NV. There is no better business than a monopoly business.
Then something happened on the consumer end; I don’t know exactly when or why, but they started selling flagships and cranking up their GPU prices. People would say, “dude, their used GPUs were selling for 3x~5x over MSRP during crypto, why wouldn’t they just raise prices and keep all that revenue themselves?” That may be “part” of the reason, but I think they were probably testing the waters on both fronts (their data-center number-crunching cards were way, way more expensive than even the top-tier consumer cards). They took the chance, used the global chip shortage and other “valid reasons” to raise prices, then watched how the market responded; now they have about two generations’ worth of “price gouging” market data to set their prices properly (plus the door-in-the-face effect). Note that big manufacturers sign component deals in years, not quarters. The chip shortage might hit other sectors hard, say laundry machines, but for NV you can bet your ass their supply is top priority.
They did lose out on the console front, and as many have already mentioned, NV’s CEO no longer has much passion for pushing game tech; he’s all in on AI now. Depending on where they aim the business, their gaming GPU side may not do anything really worth mentioning until AMD can put up a serious threat.
This is pretty normal behavior in response to any game published by an AAA studio.
Intel is trying to break into the home GPU market, and you’re surprised that they’re trying to make sure a game that has a lot of interest is able to be run on their GPU?
People who buy or recommend GPUs expect to be able to use them to run any software that relies upon a GPU. It’s already a bad look for Intel that this is a problem. The article says you can’t even launch the game at the moment.
Imagine if Word or Excel or Chrome failed to launch because of the GPU you had installed?
They always do. The main reason graphics drivers are so fucking huge is that they contain tons of game specific patches. Nvidia has what they call “game-ready” updates which are supposed to increase performance of popular games or patch specific bugs.
Why? They do that with pretty much every major release, especially demanding titles. People tend to build PCs around one specific game, so the major GPU vendors want to fill that high-end need.
In terms of looks, I will say the rocky textures are pretty nice. They also managed to map the actors’ faces without that weird bug-eye effect so many other games suffer from.
The character models seemed pretty simple for such a demanding game. I was hoping at least major characters would be a little more detailed. Then again, this was from watching a stream on my phone, so maybe it looks better in person.
Aside from looks, the voice acting I saw seemed a little odd. It could also just be a poor script, but it just didn’t seem all that great.
Overall, though, the game seemed pretty good, but not something I’m dying to run out and buy. I’ll have some more time this fall, so I’ll probably wait for a few patches to land.
This is pretty common: a graphics card company bragging that it can now run game X. Cyberpunk did this. Doom Eternal too. Hell, I remember when Dishonored 2 was the highlight a few years ago.
I had initially lowered it a bit at some point. I didn’t realize it was Steam recording for a while and spent a day or two trying driver updates and various things. Next time I have a chance I’ll try a significant decrease just for testing.
I would go so far as to say that I would count that as a negative if I considered buying it. Who wants to be the weirdo whose handheld device lights up their surroundings with changing colors?
There are people for everything. But this group ain’t big. Being a handheld also means staring into the shifting lights. Like facerolling your keyboard instead of looking at the screen.
Remember back when AAA games were released and didn’t need to be patched immediately before you could play, because the devs actually decided to ship a properly functioning game instead of going for greed?
I lived it and fucking NOPE so many broken games. Check out AVGN or any Games Done Quick glitch speed run for many examples.
Games had fewer moving parts back then, so they seemed like they worked, until you found out there were entire spells that didn’t work in Final Fantasy, or that you could jump just right and hit a game-breaking bug that required a reset (the SMB1 minus world). Or that uninstalling the game would uninstall Windows (Kohan? And Pool of Radiance 2).
Plus now they can make patches between the time they start pressing discs (going gold) and release, hence the “day 1 patch”. Personally I’m glad games can be fixed post-release, although I’d prefer they were more complete/fixed at launch than they usually are.
Yeah, a mix of both would be ideal. The fact that we’re surprised that Starfield and Baldur’s Gate 3 were solid on release is a problem, all AAA releases should have that level of quality at a minimum on release.
If games are consistently solid at release, I’d probably preorder like I used to. Now I wait and see because, more often than not, it’s a buggy mess the first few weeks.
I remember those days as having no Internet, let alone high-speed; I recall reading old ads for some PC games. So those days don’t seem as great as you’re implying, because at least these days the game will most likely get fixed with an easily accessible patch. Back then you’d have to do a lot more work to get one, or in rare cases you’d have completely wasted your money.
Also, publishers usually set the release date. In this case I’m not sure whether it was in-house or not, so it may not apply, but you called out the developers, so I figured I’d add it.
If only it was just that… this bricks your console. You can’t play games on it, or update it.
It would work for piracy eventually probably, but without firmware updates you’re stuck with older games (unless there are new games released with firmware updates included and pirating those games works)
I mean, it is supposedly already a thing on Android. And there are rumors that Valve have been trying it out for Deckard. So this could very much come this year or so.
Probably not. Steam for macOS still has no SteamPlay support, so your best bet is installing the regular Steam through a separate Heroic prefix. Works great, but it does still require Rosetta.
That said, Box64 and FEX are both making a lot of progress, so it’d be awesome to see these in action officially soon
Not very likely. Translating CPU architectures is completely different from what Wine/Proton does. A compatibility layer for ARM would be even more difficult and expensive, and would come with a performance penalty. They might plan it further into the future, though, if ARM PCs take off. A Mac implementation would probably need a lot of Apple-specific work, and there aren’t many Mac gamers out there.
There already are some projects that make it work. I haven’t looked at the specifics yet, but as far as I understand it, everything that can be handled as a library call runs as native ARM code, and only pure x86 code is emulated. And since so much is abstracted away nowadays, with Vulkan doing the heavy lifting, the performance tends to be very good.
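A toy sketch of that “thunk library calls to native code, emulate the rest” idea (the names and structure here are purely illustrative; real translators like Box64/FEX are far more involved):

```python
import math

# Pretend these are native ARM builds of common libraries that the
# translation layer ships alongside the emulator.
NATIVE_LIBS = {
    ("libm", "sqrt"): math.sqrt,
}

def emulate_x86(lib, symbol, args):
    # Stand-in for the slow path: instruction-by-instruction emulation
    # of the original x86 machine code.
    raise NotImplementedError(f"would emulate x86 code for {lib}.{symbol}{args}")

def call(lib, symbol, *args):
    """Dispatch a call: native ARM thunk if available, otherwise emulate x86."""
    native = NATIVE_LIBS.get((lib, symbol))
    if native is not None:
        return native(*args)               # fast path: runs natively on ARM
    return emulate_x86(lib, symbol, args)  # slow path: emulated x86

print(call("libm", "sqrt", 2.0))  # 1.414..., via the native fast path
```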
Asahi Linux already ships a VM to run Steam on MacBooks. And the VM isn’t even doing the heavy lifting: they do CPU instruction translation on the fly; the VM is just there to solve some memory allocation quirks.
On Windows you may be right. A buddy I game with regularly has had trouble with DX12 games crashing randomly.
On Linux they run just fine and frequently perform better than DX11 on Linux or DX12 on Windows.
Imagine you wanted to save every YouTube video to your hard drive. You don’t have enough space for that (or the time to download it all anyway). So the next best thing is to stream just the videos, and the parts of them, that you actually watch.
And this is kind of how the game works: it only delivers the parts you currently visit and need, downloading them in the background (which is what “streaming” means here), because you don’t have enough space on your drive for everything.
mbtrhcs wasn't saying that you specifically don't have a big enough hard drive, they're saying that MS Flight Simulator is simply too big of a game to completely store on a player's computer.
MS Flight Simulator has a fairly accurate 3D model of the entire earth. Like, the whole thing. So it's constantly downloading the parts that the player is currently in, and deleting the parts that they are not in.
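That “download the area you’re in, drop the areas you’ve left” behavior is essentially a rolling cache keyed by map tile. A tiny sketch of the idea (tile names and capacity here are made up, not MSFS’s actual format):

```python
from collections import OrderedDict

class RollingTileCache:
    """Keep the most recently used map tiles on disk, evict the oldest."""
    def __init__(self, max_tiles):
        self.max_tiles = max_tiles
        self.tiles = OrderedDict()  # tile id -> downloaded data

    def get(self, tile_id):
        if tile_id not in self.tiles:
            self.tiles[tile_id] = self._download(tile_id)
        self.tiles.move_to_end(tile_id)          # mark as recently used
        while len(self.tiles) > self.max_tiles:  # drop tiles you flew away from
            self.tiles.popitem(last=False)
        return self.tiles[tile_id]

    def _download(self, tile_id):
        return f"<terrain + textures for {tile_id}>"  # placeholder for streamed data

cache = RollingTileCache(max_tiles=2)
for tile in ["seattle", "portland", "boise", "seattle"]:
    cache.get(tile)
print(list(cache.tiles))  # ['boise', 'seattle'] - portland got evicted
```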
I hope there’s a manual download function for your favorite areas so you can play them offline and they don’t get deleted over time. Kind of like offline maps on your phone, just with much bigger requirements.
Live information from the real world, like weather and other data. If it’s raining in your city, it will be raining in that place in the game too. Plus the game doesn’t have all the other data locally anyway, because the entire earth is too big for your drive.
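To put some very rough numbers on “too big for your drive” (the per-pixel size and resolutions below are my own ballpark assumptions, not Microsoft’s figures):

```python
# Ballpark: storage needed for aerial imagery of the entire Earth at various resolutions.
EARTH_SURFACE_M2 = 5.1e14   # ~510 million km^2
BYTES_PER_PIXEL  = 0.5      # assumed: heavily compressed imagery

for meters_per_pixel in (10, 1, 0.3):
    pixels = EARTH_SURFACE_M2 / meters_per_pixel ** 2
    terabytes = pixels * BYTES_PER_PIXEL / 1e12
    print(f"{meters_per_pixel:>4} m/px -> ~{terabytes:,.0f} TB")
# 10 m/px -> ~3 TB, 1 m/px -> ~255 TB, 0.3 m/px -> ~2,833 TB (and that's before 3D models)
```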
My first reply said it was streaming high-res data from the cloud. Considering it’s a flight simulator advertising to cover the entire world, most people would intuit that would include textures and 3d models.
I’m not going to sit here and argue with you, have a nice day.
My first reply said it was streaming high-res data from the cloud.
Your first reply also stated that it needed 180mbps to stream weather data.
Considering it’s a flight simulator advertising to cover the entire world, most people would intuit that would include textures and 3d models.
You can fit the entire world’s textures and 3D models in a super small file. The file size depends entirely on the level of detail of those textures and models. Hence the MS Paint analogy.
I appreciate you not arguing anymore, at least you know when to quit.
Then I have no idea what reply you were referring to. Your first reply to me was a snarky one about digital representation of the Earth. Maybe check usernames next time.
And your point about fitting the entire world’s albedo, normal, roughness, specular, height, etc etc textures as well as high-fidelity 3D models is laughably false.
It would be, had I made such a comment. But I didn’t. You just pulled that out of your ass. I made a comment about storing “The Earth” on your local machine.
you are literally the only person confused about this.
Confused because people like you are making me that way.
It’s not weather, it’s terrain and textures. It’s a high-resolution stream of whatever you’re flying over, so you don’t need to keep the whole earth on your PC. The base install is supposed to be only ~30 GB of data; that’s nowhere near enough to see your house.
It’s dumb. I’d much rather have a 500GB install. They might as well just make the game a streaming service. It also ensures an early death for the game and no functionality without an internet connection.
I don’t think requiring online functionality is the death knell of a game in the year 2024. Personally, I’m excited. Their servers were so damn slow to download on initial install and I hated MSFS2020 taking up a quarter of my game drive.
I 100% disagree. Any game that requires connection to a remote server for single player functionality is dead to me. And any suggestion otherwise I take personal offense to.
This makes your local game dependent on someone else’s server. That someone else, at any time, can shut down that server with zero consequences. They can change the terms of the deal, with zero consequences. Their servers may unintentionally go down or experience other technical issues, depriving you of the product you paid for, with zero consequences. Also you simply cannot use it away from an internet connection.
You are at the mercy of the provider, who has absolutely no legal obligations to you.
Their servers were so damn slow to download on initial install
And you can’t see why that would be a massive problem while trying to livestream your game from their server?
Only the installs were slow. Terrain streaming worked just fine right from the start (I played it from day one) - and once it’s cached on your machine, they can shut down the servers all they want, it’s still on your machine.
More than that, actually. I measured well over 250 over large cities. Others have reported more than 300.
That’s not how cache works.
In this case, it does. The cache for this simulator is a disk cache - and it’s completely configurable. You can manually designate its size and which parts of the world it’ll permanently contain. There’s also a default rolling cache (also on SSD - this program doesn’t even support hard drives), which does get overwritten over time.
The CDN used to download the initial files was slow; the in-game streaming was fine.
Yes, ownership sucks these days, but I don’t know how they’d pull this off technically without using a remote server. As a philosophy, if we’re purchasing games the only real choice is GOG; anything else ends up with us locked into some server-based licensing system.
FS 2020 had an offline mode. I don’t see why this one wouldn’t have one as well. It’s either using procedurally generated or cached data.
You can not get the same visual fidelity and low latency with game streaming. I’ve tried nearly every service there is (going as far back as OnLive - remember that one?) and they are all extremely subpar, including Microsoft’s own game streaming service.
FS 2020 is available for streaming, by the way, and FS 2024 is likely going to be as well. You’re only getting the console version though. Officially, the resolution is “up to” 1080p, but due to extremely heavy compression, it looks far worse than that. It’s comparable to 720p at best, which means that nearly all fine detail is lost behind huge compression artifacts. On anything larger than a smartphone screen, it looks horrible. That’s on top of connection issues and waiting times that are still plaguing this service.
But…that’s what you’re doing? Streaming the game at 180mbps…
No. Map and weather data is streamed, cached on your SSD, and then the game engine loads it from there into RAM and uses it, together with other locally stored data and locally performed physics calculations, to render the game on your machine. You get an uncompressed, high-quality image and low-latency input, freshly baked by your graphics card for your eyes only. At 1080p and 60 fps, that’s already 2.99 Gbit/s generated by your GPU and sent to the screen as is. At 1440p, we’re at 5.31 Gbit/s, and at 4K, 11.94 Gbit/s. DisplayPort can handle up to 20 Gbit/s per lane and use up to four lanes, by the way.
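For reference, those figures are just raw pixel math (a quick sketch assuming 8-bit RGB, i.e. 24 bits per pixel):

```python
# Uncompressed video bandwidth at 24 bits per pixel and 60 fps.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
BITS_PER_PIXEL = 24
FPS = 60

for name, (w, h) in RESOLUTIONS.items():
    gbit_per_s = w * h * BITS_PER_PIXEL * FPS / 1e9
    print(f"{name}: {gbit_per_s:.2f} Gbit/s uncompressed")
# 1080p: 2.99 Gbit/s, 1440p: 5.31 Gbit/s, 4K: 11.94 Gbit/s
```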
Xbox Cloud Streaming only uses up to 20 Mbit/s (and that’s very optimistic). At the advertised 1080p, this means only about 0.7% of the data generated on the server actually reaches your screen.
The problem with game streaming is that in order to limit latency, they have to compress the image and send it very quickly, 60 times per second, which means they have just 16.7 milliseconds for each frame - and do this for potentially millions of users at the same time. This cannot physically be done at any decent level of quality. It is far easier to send much larger amounts of map data that is not time critical: it doesn’t matter if it arrives even a few seconds late on your machine, since the game engine will render something with the data it already has. At worst, you get some building or terrain pop-in, whereas if even a single one of the 60 frames required for direct game streaming gets dropped, you’ll immediately notice it as stuttering.
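A quick sketch of that frame budget, reusing the 20 Mbit/s and 1080p/60 figures from above:

```python
# Per-frame budget for a 20 Mbit/s, 60 fps cloud stream versus an uncompressed 1080p frame.
FPS = 60
STREAM_BITS_PER_S = 20e6
WIDTH, HEIGHT, BITS_PER_PIXEL = 1920, 1080, 24

frame_time_ms = 1000 / FPS                            # ~16.7 ms to encode, transmit and decode each frame
bits_per_streamed_frame = STREAM_BITS_PER_S / FPS     # ~333 kbit available per frame
bits_per_raw_frame = WIDTH * HEIGHT * BITS_PER_PIXEL  # ~49.8 Mbit in an uncompressed frame

print(f"frame budget: {frame_time_ms:.1f} ms")
print(f"required compression: ~{bits_per_raw_frame / bits_per_streamed_frame:.0f}x")  # ~149x
```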
That sounds like a great reason not to buy this game.
If you don’t have the hardware to play this game locally, then I would not recommend it. If you do - and a base Xbox Series S, which costs just 300 bucks new or about half that used, is enough for a reasonable experience - then there’s no reason to use the streaming service, unless you absolutely have to play it on your phone at work.