startrek.website

7heo, to gaming in It's very rude, Toad.

Why does Mario look like a better-looking version of Musk?

TheRealLinga,

I came here to say this! The internet is a strange place

caseyweederman,

Limmy is a better version of Musk.

RedIce25, to gaming in It's very rude, Toad.

MY EYES

MystikIncarnate, to gaming in Then vs Now

I see stuff like this and I don’t blame developers/coders for all the shit that’s happening. If you objectively look at gameplay and such, most games are actually pretty decent on their own. The graphics are usually really nice and the story is adequate, if not quite good, the controls are sensible and responsive…

A lot of the major complaints about modern games aren’t necessarily about what the devs are making; they’re more about what the garbage company demands be done as part of the whole thing. Online-only single player is entirely about control: keeping you from pirating the game (or at least trying to), plus spying on you and serving you ads and such… Bad releases happen because stuff gets pushed out the door before it’s ready, because the company needs more numbers for their profit reports, so things that haven’t been given enough time and need more work get pushed onto paying customers. Day one patches are normal because between the time they seed the game to distributors like Valve and Microsoft, and the time the game unlocks for launch day, stuff is still being actively worked on and fixed.

The large game studios have turned the whole thing into a meat grinder to just pump money out of their customers as much as possible and as often as possible, and they’ve basically ruined a lot of the simple expectations for game releases, like having a game that works and that performs adequately and doesn’t crash or need huge extras (like updates) to work on day 1…

Developers themselves aren’t the problem. Studios are the problem and they keep consolidating into a horrible mass of consumer hostile policies.

Shepy, to gaming in It's very rude, Toad.

That’s Limmy all day long

MurphysPaw,

Blahem!

Psaldorn,

Needs a muffit o’tea

daniyeg, to gaming in Then vs Now

games made by agile teams with passion are probably good, regardless of when they were made. i’m young, but growing up i only had access to really old computers, and i saw that most of the stuff made back in the day was just garbage shovelware. it was hard not to get buried in it.

most triple-A developers today are far more skilled at both writing and optimizing code. however, when management is forcing you to work long hours, you’re gonna make more mistakes, and with tight deadlines, if you’re doing testing and bug fixing after developing the entire game, then it’s going to be the first thing that gets cut.

that being said, i wish they really did something about the massive size games take up on disk. my screen is 1080p, and my hardware can barely handle your game on low at 1080p, so everything is gonna get downscaled regardless. and despite how hard you wanna ignore it, data caps are still here. why am i forced to get all assets and textures in 4k/8k? make it optional, goddammit.

Soleos,

AAA games are turning into luxury/super cars. At the top end, they’re just not made for average consumers anymore; you need money for infrastructure just to drive the thing. But then you also have plenty of indie/AA studios creating games that surpass the AAA titles of 10-15 years ago with much smaller teams, because tools and skills make it feasible. Of course, there are also the StarCrafts and the Counter-Strikes that are over 20 years old and will never die, the Toyota Camrys and Honda Civics of games; they just get perpetually refreshed.

hemmes, to gaming in It's very rude, Toad.

This is hilarious, upvote earned

WindowsEnjoyer, to gaming in Then vs Now

This is so true. Also, let’s not forget when a game is almost unplayable and constantly crashing on release.

MonkderZweite, to gaming in How can it be so bad?

Steam can’t be run in minimal mode anymore; the internal browser always eats a GB of RAM if you want to play a Steam game.

FlyingSquid, to gaming in How can it be so bad?

I decided about 10 years ago that I just can’t afford to keep updating my computer every couple of years just so I can play new titles. So now I just play old titles. And if you play old enough titles, you don’t even have to go to a torrent site. You can go directly to the Internet Archive.

I keep rediscovering games I loved when I was younger. I’ve been playing Skyroads lately. I still love the music. You know how long it takes to load Skyroads on a computer from 2015? I have no idea either because it loads faster than I can measure time.

This has been your lecture from a crotchety old man.

MonkderZweite, to gaming in How can it be so bad?

That thing triggered a “too many open files” error for me, while the games didn’t.

ICastFist, to gaming in Then vs Now

For those who are unaware, the second chad is most likely referring to .kkrieger. Not a full game, but a demo (from the demoscene) whose purpose was to make a fully playable game with a max size of 96 KB. Even going very slowly, you won’t need more than 5 minutes to finish it.

The startup is very CPU-heavy and takes a while, even on modern systems, because it generates all the geometry, textures, lighting and whatnot from stored procedural routines.
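To make the trade-off concrete, here is a toy sketch in Python of what “generate instead of store” means. It is purely illustrative (.kkrieger itself is C++ and shader code, and the function below is made up): the shipped artifact is a tiny recipe, and the pixels are computed at startup, which is exactly where the CPU time goes.

```python
# Toy illustration of procedural texture generation: instead of shipping
# a stored bitmap, ship a tiny "recipe" and compute the pixels at startup.
# (Illustrative only -- not .kkrieger's actual code or algorithm.)
import math

def generate_texture(width, height):
    """Build a grayscale texture from a formula rather than from disk."""
    pixels = []
    for y in range(height):
        row = []
        for x in range(width):
            # A few nested sine waves stand in for a real noise function.
            v = math.sin(x * 0.1) + math.sin((x * 0.05 + y * 0.08) * 3.0)
            row.append(int((v + 2.0) / 4.0 * 255))  # normalize to 0..255
        pixels.append(row)
    return pixels

tex = generate_texture(256, 256)
# The "recipe" is a few hundred bytes of code; the same 256x256 texture
# stored raw at one byte per pixel would be 64 KB on disk.
print(len(tex) * len(tex[0]), "pixels generated from a few lines of code")
```

Scale that up to every texture, mesh, and sound in the demo and you get both the 96 KB download and the long, CPU-bound startup.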

KuroiKaze,

I remember beating that and just being really surprised at how well it worked

Suavevillain, to gaming in Then vs Now

I never thought game patches would become such a terrible thing. But the state some games have been released in has been crazy.

Blackmist, to gaming in How can it be so bad?

Well, now it opens to a black screen in two minutes.

Might take a restart if you want some content in it.

mtchristo, to gaming in Then vs Now

There used to be a time when game devs wrote their masterpieces using assembly. Now it’s all crap Unreal Engine

olmium,

What’s wrong with Unreal Engine? 🤔

mtchristo,

Enormous resource hog

olmium,

It also has A LOT of benefits and can run very demanding games well while other engines struggle.

echodot,

It’s incredibly well optimized for what it’s doing.

ICastFist,

Most devs either don’t or can’t be bothered with proper optimization. It’s a problem as old as Unreal Engine 3, at least; I remember Unreal Tournament 3 running butter-smooth on relatively weak computers, while other games made with UE3 would be choppy and laggy on the same rigs, despite having less graphical clutter.

olmium,

That doesn’t sound like an engine problem tho

LouNeko,

I could write a whole essay on what’s wrong with UE from a player’s perspective. But here’s the skinny:
light bloom, distance haze, TAA and upscaling, no visual clarity, Roboto font for 90% of all UIs, lower framerate for distant objects, no performance difference between the highest and lowest graphical settings.
The only good-looking and optimized UE games come from Epic themselves, so basically just Fortnite (RIP Paragon). Most of the games released by third parties are primo garbagio. They run like ass and look like ass.

ricdeh,

What’s wrong with Roboto though lol? It’s my favourite font

LouNeko,

There’s nothing wrong with it, it just doesn’t fit everywhere. There’s a thematic difference between action platformers, horror games, and milsims, yet they all use the same font and UI. Imagine if most games used Naughty Dog’s “yellow ledges”. It would get old very quickly.

farngis_mcgiles,

predecessor is shaping up to be a good replacement for paragon. i’m hoping it doesn’t die before they get out of early access

people_are_cute,

Literally all of that is under the developers’ control. Don’t blame the tools.

LouNeko,

But UE is the common denominator for all those problems. I actually don’t know of any positive examples of UE. Satisfactory, maybe, but it still ticks most of those boxes; they’re just less prevalent because the game itself is good.

people_are_cute,

Ruiner, Ghostrunner and DmC are the only UE titles I have played and they are all FLAWLESS.

rdri,

Some devs just enable raytracing and make it a requirement so they don’t have to care about properly optimized alternative lighting and shadows.

olmium,

Doesn’t sound like a game engine problem

rdri,

Same as how using AI in games is not an AI problem.

olmium,

Correct. If you build a house with cheap labour and bad materials, it’s the builder’s fault. That doesn’t make all houses bad and unreliable.

rdri,

I mean, if the world makes it very convenient to use such instruments and call the task finished, that’s not okay. I wish at some point we would come to the conclusion that we need to optimize code and software products to reduce CO2 emissions or something, so that devs’ laziness finally becomes less tolerated.

Bloodyhog, to gaming in Then vs Now

Ok, that got me. I still remember the days of the ZX and that funny noise… But I do have a question about one part of the meme: can someone explain to me why on Earth updates now weigh tens of gigs? I can accept that hi-res textures and other assets can take that space, but those are most likely not the bits being updated most of the time. Why don’t devs just take the code they actually updated and send that our way?

PsychedSy,

I’ve got 2gig fiber, not 56k dialup. It’s Steam’s bandwidth now. They paid Valve their 30%. Why bother with insane compression that just makes it feel slow for us?

Bloodyhog,

That is also a factor I don’t understand. Bandwidth costs the storefront money; wouldn’t Steam and the others want to decrease that load? And well done you with that fiber, you dog! I also have a fiber line, but I see no reason to upgrade from my tariff (150 Mbit, I think?) that covers everything, just to shave off that one hour of download time a year.

xX_fnord_Xx,

The trick is to download the Fitgirl repack. Cheaper on your wallet and your hard drive.

Bloodyhog,

I am perfectly fine with paying developers, as I buy only the games I like after some testing ) Going the repack route is unpredictable: no updates, and it may contain whatever viruses the repacker is interested in adding (and given that this particular one is likely Russian, I do have my reservations at this crazy time…), etc.

xX_fnord_Xx,

Joking aside, when you download an update, many times it is completely replacing chunks of the game, not just a couple lines of code.

ICastFist,

I mean, I understand when they chuck everything into a single file, but they used to know how to make their updaters unpack and replace only the stuff that needed updating, instead of just throwing the whole fucking file at you, redundancy be damned.

For instance, stuff in the Quake 3 engine is kept in .pk3 files. You don’t need to download the full, newest .pk3; you send a command to remove/replace files X, Y and Z within it and call it a day.
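.pk3 files are ordinary ZIP archives under a different extension, so that kind of surgical patching is easy to sketch. A minimal Python sketch follows (the member names are hypothetical); since ZIP members can’t be deleted in place, the usual trick is to rewrite the archive, carrying over every entry except the ones being replaced:

```python
# Sketch of a pk3 (ZIP) patcher: only the changed members ship in the
# patch; everything else is copied over from the archive already on disk.
import os
import zipfile

def patch_pk3(pk3_path, replacements):
    """replacements: dict mapping archive member name -> path of new file."""
    tmp_path = pk3_path + ".tmp"
    with zipfile.ZipFile(pk3_path) as old, \
         zipfile.ZipFile(tmp_path, "w", zipfile.ZIP_DEFLATED) as new:
        for item in old.infolist():
            if item.filename not in replacements:
                # Carry over unchanged entries from the existing archive.
                new.writestr(item, old.read(item.filename))
        for name, src in replacements.items():
            new.write(src, arcname=name)  # splice in the updated files
    os.replace(tmp_path, pk3_path)

# A patch download then only needs the handful of changed files, e.g.:
# patch_pk3("pak0.pk3", {"textures/wall/metal1.tga": "patch/metal1.tga"})
```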

echodot,

Yeah, but then people go completely ballistic when games require you to install their own launchers. I don’t think Steam would necessarily be able to handle the myriad of different formats that would be needed to make that work. So either you have custom launchers or you don’t have particularly efficient patches.

I guess most people care more about the launcher.

ICastFist,

That’s because games don’t need “their own launcher” to apply updates like that. Ask anyone who’s been playing on PC: patches were these self-extracting files or “mini installers” that you just needed to point at the installed game’s folder. Even vanilla World of Warcraft let people download patches for offline install, and it even included a text file with all the changes applied.

echodot,

People don’t want any of that fiddling. I just want to be able to install the patch and have it be there.

That’s why launchers are a thing; there’s no other reason to have them.

You might enjoy the technical solution, but 99% of people don’t care. I never understand why people seem to think that the 1% of the most experienced users are the standard when they are anything but. Most gamers don’t build their own PCs; they don’t want to have to understand file systems and formats and compiling. They just want it to work, and then they want to play their game.

MystikIncarnate,

For modern games, from what I’ve seen, they’ve taken a more modular approach to how assets are saved. So you’ll have large data files which are essentially full of compressed textures or something. Depending on how many textures you’re using and how many versions of each texture are available (for different detail levels), it can be a lot of assets, even if all the assets in a given file are, say, all wall textures.

So the problem becomes that the updaters/installers are not complex enough to update a single texture inside a single compressed texture dataset file. The solution is instead to replace the entire dataset with one that contains the new information. So when you’re adding an item or changing how something looks, you’re basically sending not only that item but also all similar items (everything in the same set) again, even though 90% of it didn’t change. The files can easily reach into the tens of gigabytes due to how many assets are needed. Adding a map? The dataset file for all maps needs to be sent. Adding a weapon, or changing the look/feel/animation of a weapon? Here’s the entire weapon dataset again.
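A rough sketch of that pack-granularity effect (the pack names and sizes below are invented for illustration): the download size is driven by which packs were touched, not by how much actually changed.

```python
# Hypothetical asset packs: touching any one asset "dirties" its whole pack.
packs = {
    "textures_walls.pak": {"size_gb": 18.0, "assets": ["brick_01", "brick_02", "plaster_01"]},
    "weapons.pak":        {"size_gb":  9.5, "assets": ["rifle_a", "pistol_b"]},
    "maps.pak":           {"size_gb": 22.0, "assets": ["harbor", "factory"]},
}

def update_download_gb(changed_assets):
    """GB the player must download when only `changed_assets` changed."""
    dirty = {name for name, pack in packs.items()
             if any(a in pack["assets"] for a in changed_assets)}
    return sum(packs[name]["size_gb"] for name in dirty)

# Retouching a single brick texture re-ships the entire 18 GB wall pack.
print(update_download_gb({"brick_01"}), "GB for a one-texture fix")
```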

Though not nearly as horrible, the same can be said for the libraries and executable binaries of the game logic. A variable was added? Well, here’s that entire binary file with the change (not just the change). Binaries tend to be a lot smaller than the assets, so it’s less problematic.

The entirety of the game content is likely stored in a handful (maybe a few dozen at most) of dataset files, so if any one of them changes for any reason, end users now need to download 5-10% of the installed size of the game to get the update.

Is there a better way? Probably. But it may be too complex to accomplish. Basically, write a small patching program to unpack the dataset, replace/insert the new assets, then repack it. It would reduce the download size but increase the amount of work the end user’s system needs to do for the update, which may or may not be viable depending on the system you’ve made the game for. PC games should support it, but what happens if you’re coding across PC, Xbox, PlayStation, and Nintendo Switch? Do those consoles allow your game the read/write access it needs to the storage to do the unpacking and repacking? Do they have the space for that?
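One common shape for such a patcher, sketched very loosely below, is block-level delta patching: diff the old and new dataset in fixed-size chunks, ship only the chunks that differ, and splice them in on the client. This is a simplified stand-in for what rsync- or bsdiff-style tools do far more cleverly, with a made-up 1 MiB chunk size:

```python
# Minimal block-level delta patch: ship only the chunks that changed.
CHUNK = 1024 * 1024  # 1 MiB (arbitrary choice for this sketch)

def make_patch(old: bytes, new: bytes):
    """Server side: list of (offset, changed_chunk) pairs."""
    patch = []
    for off in range(0, len(new), CHUNK):
        new_chunk = new[off:off + CHUNK]
        if old[off:off + CHUNK] != new_chunk:
            patch.append((off, new_chunk))
    return patch

def apply_patch(old: bytes, patch, new_len: int) -> bytes:
    """Client side: rebuild the new dataset from the old file plus the patch."""
    buf = bytearray(old[:new_len].ljust(new_len, b"\0"))
    for off, chunk in patch:
        buf[off:off + len(chunk)] = chunk
    return bytes(buf)

old = bytes(10 * CHUNK)                                     # 10 MiB dataset
new = old[:3 * CHUNK] + b"\x01" * CHUNK + old[4 * CHUNK:]   # one chunk edited
patch = make_patch(old, new)
assert apply_patch(old, patch, len(new)) == new
print(f"patch ships {sum(len(c) for _, c in patch)} of {len(new)} bytes")
```

The catch is exactly the one described above: the client needs CPU time and scratch space to rebuild the file, and every target platform has to allow it.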

It becomes a risk, and doing it the way they do now, if you have enough room to download the update, then no more space is needed, since the update manager simply copies the updated dataset wholesale over the old one.

It’s a game of choices and variables, risks and rewards. Developers definitely don’t want to get into the business of custom updates per platform based on capabilities, so you have to find a solution that works for everyone who might be running the game. The current solution wastes bandwidth, but it has the merit of being cross-compatible and consistent: the process is the same for every platform.

Bloodyhog,

The console argument actually makes a lot of sense to me, thank you for the detailed response. It would still (seemingly) be possible to structure the project in a way that allows replacing only what you actually need to replace, but that requires more investment in the architecture and would likely cause more errors due to the added complexity. Still, I cannot forgive the BG3 coders for making me redownload those 120 GB or so! )

MystikIncarnate,

The issue is the compression. There are hundreds of individual assets, and the process to compress, or more accurately uncompress, the assets for use takes processor resources. Usually it only really needs to be done a few times, when the game starts and loads the assets required. Basically, when you get to a loading screen, the game is unpacking the relevant assets from those dataset files. Every time the game opens one of those datasets, it takes time to create the connection to the dataset file on the host system, then unpack the index of the dataset, and finally go and retrieve the assets needed.

Two things about this process: first, securing access to the file and getting the index is a fairly slow process. Allocating anything takes significant time (relative to the other steps in the process) and accomplishes nothing except preparing to load the relevant assets; it’s basically just wasted time. Second, compressed files are most efficient at making the total size smaller when there’s more data in the file.

Very basically, the simplest compression, zip (aka “compressed folders” in Windows), looks through the files for repeating sections of data, then replaces all that repeated content with a reference to the original data. The reference is much smaller than the data it replaces. This can also be referred to as de-duplication. This way, if you had a set of files that all contained mostly the same data, say text files with mostly the same repeating messages, the resulting compression would be very high (a much smaller size); this method works well for things like log files, since there are many repeating dates, times, and messages with only a few unique variances from line to line. This is an extremely basic description of one common style of compression, certainly not the only way, and not necessarily the method being used, or the only method being used.
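The de-duplication effect is easy to see directly with Python’s zlib: repetitive, log-like data collapses to almost nothing, while random data, which has nothing to de-duplicate, doesn’t shrink at all.

```python
# Repetitive data compresses dramatically; random data does not.
import os
import zlib

log_like = b"2024-01-01 12:00:00 INFO request handled OK\n" * 20_000
random_data = os.urandom(len(log_like))

print("log-like:", len(log_like), "->", len(zlib.compress(log_like)))
print("random  :", len(random_data), "->", len(zlib.compress(random_data)))
# Typical result: the log-like data shrinks to a fraction of a percent of
# its size, while the random data comes out slightly *larger* than it began.
```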

If there’s less content per compressed dataset file, there are going to be fewer opportunities for the compression to make the content smaller, so large, similar datasets are preferable to smaller ones containing more diverse data.

This, combined with the relatively long open time per file, means that programmers will want as few datasets as possible, both to keep the system from needing to open many files to retrieve the required data during load times and to boost the efficiency of those compressed files to optimal levels.
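That preference for large, similar datasets can also be demonstrated: compressing similar files together (“solid”, in archiver jargon) lets back-references reach across file boundaries, while compressing each file separately cannot. The “assets” here are synthetic stand-ins, a shared blob plus a small unique tail per file:

```python
# Fewer, larger archives compress better when the contents are similar.
import os
import zlib

shared = os.urandom(4000)                               # data common to every asset
assets = [shared + os.urandom(100) for _ in range(50)]  # 50 near-identical assets

separate = sum(len(zlib.compress(a)) for a in assets)   # one archive per asset
solid = len(zlib.compress(b"".join(assets)))            # one big archive

print("compressed separately:", separate, "bytes")
print("compressed together  :", solid, "bytes")
# The solid archive stores the shared data once and back-references it for
# the other 49 assets -- but touching any single asset now means rebuilding
# (and re-shipping) the whole archive.
```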

If, for example, many smaller files were used, then yes, updates would be smaller. However, loading times could end up doubled or tripled. Given that you would, in theory, be loading data many times over (every time you load into a game or a map or something) compared to how frequently you perform updates, the right choice is to have updates take longer with more data to download, so that your intra-session loads are much faster.

With the integration of solid-state storage in most modern systems, loading times have also been dramatically reduced, due to the sheer speed at which files can be locked, opened, and streamed into working memory, but it’s still a trade-off that needs to be taken into account. This is especially true for PC releases, since PCs can have wildly different hardware and not everyone is using SSDs or similarly fast flash storage, whether because they’re on older systems or because they simply prefer the less expensive space of spinning-platter hard disks.

All of this must be counterbalanced to provide the best possible experience for the end user, and I assure you that all aspects of this process are heavily scrutinized by the people who design the game. Often these decisions are made early on, so that the rest of the loading system can be designed around them consistently and doesn’t need to be reworked partway through the lifecycle of the game. It’s very likely that even as systems and standards change, the loading system in the game will not; so if the game was designed with optimizations for hard disks (not SSDs) in mind, that will not change until at least the next major release in the game’s franchise.

What isn’t really excusable is when the next game in a franchise gets a large overhaul but the old loading system (with all of its obsolete optimizations) is reused for the more modern title, which is something I’m certain happens at most AAA studios. They reuse a lot of the existing systems and code to reduce how much work is required to go from concept to release, and hopefully to shorten the time (and effort) needed to get to launch. Such systems should be under scrutiny at all times whenever possible, to further streamline the process and optimize it for the majority of players. If that means outlier customers trying to play the latest game on their WD Green spinning disk have a worse time because they haven’t purchased an SSD, while the 90%+ who have at least a SATA SSD all get the benefits of the newer loading system, then so be it.

But I’m starting to cross over into my opinions a bit more than I intended, so I’ll stop there. I hope that helps at least make sense of what’s happening and why such decisions are made. As always, if anyone reads this and knows more than I do, please speak up and correct me. I’m just some guy on the internet, and I’m not perfect. I don’t make games and I’m not a developer. I am a systems administrator, so I see these issues constantly; I know how the subsystems work and I have a deep understanding of the underlying technology, but I haven’t done any serious coding work in a long, long time. I may be wrong or inaccurate on a few points, and I welcome any corrections that anyone can share.

Have a good day.
