Are Unity and C# really that bad by themselves? I don’t have much experience with C# development, but I was under the impression that C# is a relatively fast language (not as fast as C++, but much, much faster than JS, Python, and even Java).
I see no justification for why CS 2 is this resource intensive.
It’s a heavy city simulation game, so high CPU usage is kind of expected (though I think it could be better), but what about the RAM and GPU requirements and actual usage?
The problem is, this sets a precedent in the gaming industry (and in consumers’ minds too) that it’s fine to consume 16 GB of RAM, not on a late-game megacity but on a new save.
Free all working sets, what the fucking hell??? No, no, no, I don’t want to send my whole browser to the swapfile just because of a greedy game. Loading all those memory pages back will take a long time when I want to switch back to the browser, and it will keep lagging for quite a while after that, until the not-so-frequently-used but still important pages are loaded back too. The same applies in reverse: swapping the game out and back in will take a ton of time, and then it will have lag spikes whenever it needs a dozen of the more rarely used pages that haven’t been loaded back with the rest. This nonsense of literally using all your RAM “as a cache”, but as working set, just makes everything slower in the end. It just cannot be justified. There’s a reason I’m using a multitasking PC instead of a single-tasking gaming console, which you can only use for one purpose at a time.
And don’t tell me to put my swapfile on my SSD. That’s the perfect way of killing it, by writing 16 GB of data every time you switch between windows.
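For reference, “freeing all working sets” is presumably a call to something like Win32’s EmptyWorkingSet on every process, which is what those “RAM cleaner” tools do. A minimal C# sketch of that behavior, assuming that’s the API in play:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class TrimAll
{
    // EmptyWorkingSet (psapi.dll) evicts as much of a process's working set
    // from RAM as possible; every evicted page must later be faulted back in.
    [DllImport("psapi.dll", SetLastError = true)]
    static extern bool EmptyWorkingSet(IntPtr hProcess);

    static void Main()
    {
        // What "free all working sets" tools do: trim every process they can open.
        foreach (var p in Process.GetProcesses())
        {
            try { EmptyWorkingSet(p.Handle); }
            catch (Exception) { /* no access to protected/system processes */ }
        }
    }
}
```

Every page this evicts has to be faulted back in one page fault at a time, which is exactly the post-switch lag described above.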
60fps doesn’t matter. It’s not a shooter. Even in CS1 I could only get 50ish on a new map, and that’s with hardware that’s 6 years newer than the game
That does not sound like 50 FPS on six-year-old hardware. Maybe half of that?
RAM should be used. For gaming it would be wasteful not to use it.
Don’t be afraid, I do use my RAM. Like, it’s full of other important programs and filesystem cache.
But the game shouldn’t take it away from other programs, and it should also be aware that Windows starts swapping programs out once RAM usage reaches ~70%. That will significantly affect any programs you run simultaneously, but the game itself too, because its less-used memory pages will be swapped out more. And random access when reading swapped pages back is much slower than loading the resources sequentially in smaller groups.
16 GB of usage sounds like the game has loaded ALL of its models and resources, even those that are not needed (not in view, and probably not even accessible to the player), and probably keeps multiple copies of most of them at different resolutions and such.
Loading that much data into RAM would be fine if they managed to load it only into a cache that can be released for other programs, but I don’t think you can do that in any way other than using the filesystem cache, at which point the RAM usage doesn’t even count against your process, or as usage at all.
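Memory-mapped files are essentially the way to get that: the pages are backed by the file, live in the OS filesystem cache, and can simply be dropped under memory pressure instead of being written to the swapfile. A minimal C# sketch, where the pack file name is made up:

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MappedAssets
{
    static void Main()
    {
        // "assets.pak" is a made-up file name. A mapped file's pages are
        // file-backed: the OS keeps hot pages in the filesystem cache and can
        // simply drop cold ones under pressure, with no swapfile writes.
        using var mmf = MemoryMappedFile.CreateFromFile(
            "assets.pak", FileMode.Open, null, 0, MemoryMappedFileAccess.Read);
        using var view = mmf.CreateViewAccessor(0, 0, MemoryMappedFileAccess.Read);

        // First touch faults the page in; repeated reads hit the cache.
        byte first = view.ReadByte(0);
        Console.WriteLine($"first byte: {first}");
    }
}
```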
If you aren’t using all your RAM then you’re loading textures, shaders, and everything from disk, which is thousands of times slower, and that would lead to stutter.
Obviously the game does not have to use all the RAM. It only needs to preload the textures and models that are useful on your system (based on graphics settings) and are in use right now or may be needed very soon.
Also, loading from disk is not as slow as you make it sound. Yes, it is if your users install games to a drive that’s bad for that purpose (like SMR hard drives), or if you haven’t placed the resources strategically, by which I mean grouping them so that commonly-used-together resources sit sequentially on disk for one quick, efficient read; see the sketch below.
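A toy illustration of the idea; the packer, its format, and all the names here are hypothetical:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Hypothetical packer: sorting assets by the group they are used in turns each
// group into one contiguous run in the pack file, so loading a group becomes a
// single sequential read instead of many scattered seeks.
static class AssetPacker
{
    public static void WritePack(string packPath,
                                 IEnumerable<(string Group, string Path)> assets)
    {
        using var pack = File.Create(packPath);
        foreach (var (_, path) in assets.OrderBy(a => a.Group))
        {
            using var src = File.OpenRead(path);
            src.CopyTo(pack); // a real packer would also write an offset index
        }
    }
}
```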
The first problem shouldn’t be your concern: the player shouldn’t expect top performance from hardware that was designed for a totally opposite task.
Marketers are paid to lie.
Yes, but they shouldn’t touch any technical information, including the hardware requirements section. Marketers don’t know shit about the game, just that they want to sell as many licenses as humanly possible.
The hardware requirements, however, should be defined by people who do know the game. Preferably core developers or performance testers, who have an idea of the game’s inner workings and of how much it is expected to use on average and in the worst case.
I find that people who watch reviewers are exponentially more disappointed in games because they let reviewers tell them how to feel.
I can agree with that, and with your point on Cyberpunk. I haven’t played that game, but not because I’m not interested; it looked fun in the content I have seen.
But the performance concerns sound like a genuinely huge problem.
I like that so far it has been described as a solid launch except for land leveling and performance, because the first one can probably be addressed in a few months at most if they want to. But even the published hardware requirements were disappointing, and that is a signal that the game will hardly run any better than that, if it can even reach it.
I don’t get much FPS in CS 1, and it’s not pleasant. It’s probably somewhere between 20 and 30. But the news above means that I shouldn’t even dream of running CS 2 on this hardware, because it runs much worse than the first game, and worse compared to other games too.
Honestly, I was expecting CS 2 to run better than 1. I still have a little hope that they will fix their shit, but for now I don’t expect significant improvements over the first game’s performance.
I think this is rather about checking the MAC “from the inside”, as a program running on the computer. That will work on a PC, as I think neither Windows nor Linux restricts reading the MAC addresses of network interfaces, at least by default. On phones, I don’t know. But the point is that now the “attacker” is not on the Wi-Fi network you want to connect to but inside your computer, and Wi-Fi MAC randomization is worthless there. Not only might they have access to the original MAC of the Wi-Fi interface; what about the MACs of other interfaces, like the cellular data interface or Ethernet (over USB, where it’s supported)? And then there’s a ton of other info too by which they can identify the device.
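To show how low the bar is, here is a minimal C# sketch; NetworkInterface is the standard .NET API for this:

```csharp
using System;
using System.Net.NetworkInformation;

class ListMacs
{
    static void Main()
    {
        // Any ordinary process can enumerate the machine's interfaces and read
        // their MAC addresses; no special privileges are needed on Windows or
        // on typical Linux setups.
        foreach (var nic in NetworkInterface.GetAllNetworkInterfaces())
        {
            Console.WriteLine(
                $"{nic.Name} ({nic.NetworkInterfaceType}): {nic.GetPhysicalAddress()}");
        }
    }
}
```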
Consider upping the cache for YouTube. By default it is 2 minutes, which is mostly fine, except that if you are speeding through a part at 2x or faster, you can quickly run out of that small cache, because it only fills at a rate somewhere between 1x and 2x.
And then you can also set up saving the cache to disk instead of RAM, because it may get quite a bit larger. It’s a single config option.
You can make these apply to YouTube videos only with conditional auto profiles. The docs have an example of an automatic YouTube profile; it’s perfect.
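Putting it together, and assuming this is about mpv (the “conditional auto profiles” wording matches its manual), a sketch of what that could look like in mpv.conf; the option names are from the mpv manual, but the values and the URL-matching pattern are my own assumptions to tune:

```ini
# Applies automatically whenever the path looks like a YouTube URL.
[youtube]
# Lua pattern: matches both youtube.com and youtu.be URLs.
profile-cond=path:find('youtu%.?be')
# Undo these options again for the next non-matching file.
profile-restore=copy
cache=yes
# Much larger forward cache than the default.
demuxer-max-bytes=800MiB
demuxer-readahead-secs=600
# Keep the (now bigger) cache on disk instead of in RAM.
cache-on-disk=yes
```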