A lot of comments in this thread are really talking about visual design rather than graphics, strictly speaking, although the two are related.
Visual design is what gives a game a visual identity. The level of graphical fidelity and realism that’s achievable plays into what the design may be, although it’s not a direct correlation.
I do think there is a trend for higher and higher visual fidelity to result in games with blander visual design. That’s probably because realism comes with artistic restrictions, and development time gets sucked away from creative art into supporting realism.
My subjective opinion is that for first-person games, we long ago hit the point of diminishing returns with something like the Source engine. Sure, there was plenty to improve on from there (even games on Source like HL2 have gotten updates so they don’t look like they did back in the day), but the engine was realistic enough. Faces moved like faces and communicated emotion. Objects looked like objects.
Things should have improved since then, and they have, but graphical improvements really should have been the sideshow to gameplay and good visual design.
I don’t need a game where I can see the individual follicles on a character’s face. I don’t need subsurface scattering on skin. I won’t notice any of that in the heat of gameplay, only in cutscenes. With such high fidelity, game developers are more and more forcing me to watch cutscenes or “play” sections that may as well be cutscenes.
I don’t want all that. I want good visual design. I want creatively made worlds in games. I want interesting looking characters. I want gameplay where I can read at a glance what is happening. None of that requires high fidelity.
Visuals are very important in games, but Nintendo pursues clear and readable designs. Their games are easy to look at, and they age more gracefully than games pursuing realism.
Not as much as you’d think. I keep my soldiers faceless and unattached until they are fairly leveled up. By the time they get customized, they tend not to get meatgrindered. Usually.
Keep in mind Freelancer was released after Microsoft acquired Roberts’ company, kicked him out of a leadership role, and drastically slashed the scope.
Star Citizen is what happens when there is nobody above Roberts to say no, and now, after years, there are plenty of people under him with an interest in keeping development churning.
People are buying the dream. There is personal investment now: this isn’t a game, this is their game. Supporters tend to talk like this is a community project, not a transaction between a customer and a studio.
Whenever the studio finally folds, I guarantee there will be whales lamenting that if they’d only spent a little more they’d have kept the game afloat.
Wikipedia isn’t the be-all and end-all, but in this case I think it provides a working definition.
Enshittification (alternately, crapification and platform decay) is a pattern in which online products and services decline in quality. Initially, vendors create high-quality offerings to attract users, then they degrade those offerings to better serve business customers, and finally degrade their services to users and business customers to maximize profits for shareholders.
There’s a danger in any game that is largely designed and marketed to be one thing and then has lengthy mandatory sections where it becomes another.
Poorly made stealth sections are a prime example. Game designers want to change things up, but if the game isn’t built to do stealth, it can easily turn into an annoying mess. There are a few (not a ton, but a few) games where the mandatory stealth sections are well liked, but those sections were carefully made to take advantage of the game’s strengths and knew when to end.