How hard is it for them to realize this? Graphics are a nice-to-have; they're great, but they do not hold up an entire game. Star Wars Outlaws looked great, but the story was boring. If they took just a fraction of the money they spent on realism, gave it to writers, and then let the writers do their job freely without getting in their way, they could make some truly great games.
It’s hard for them to realize because good graphics used to effectively sell lots of copies of games. If they spent their graphics budget on writers, they’d have spent way too much on writing.
Yep, it’s a byproduct of the “bit wars” in the gaming culture of the '80s and '90s, where each successive console generation brought a major graphical upgrade without sacrificing too much in other technical aspects like framerate/performance. Nowadays if you want that kind of upgrade you’re better off making a big investment in a beefy gaming rig, because consoles have a realistic price point to consider, and even then we’re reaching a point of diminishing returns when it comes to really noticeable graphical differences. Even back in the '80s/'90s the most powerful consoles of the time (such as the Neo Geo) were prohibitively expensive for most people.

Either way, the most lauded games of the past few years have been the ones that put the biggest focus on aspects like engaging gameplay and/or immersive story and setting. One of the strongest candidates for this year’s Game of the Year could probably run on a potato and was basically poker with some interesting twists: essentially the opposite of a big-studio AAA game. Baldur’s Gate 3 showed studios that gamers are looking for an actual complete game for their $60, and indie hits such as the aforementioned Balatro are showing them that you can make games look and play great without super-realistic graphics or an immense budget if you have solid gameplay, story/setting, and art style. Call of Duty Black Ops 48393, with the only real “innovation” being more realistic sun glare on your rifle, is just asking for failure.
Baldur’s Gate 3 showed studios that gamers are looking for an actual complete game for their $60
This language always loses me. Every game I buy is complete. Adding an expansion to it later doesn’t make it less complete, and it’s not like BG3 was without major bugs.
I think we’ve landed in a situation where some people don't understand the difference between graphical style and graphical quality. You can have high-quality graphics that are still very simplistic. The important part is that they serve their purpose for the title you're making.

Obviously some games benefit from more realistic graphics, like TLoU Part 2, depicted in the thumbnail and briefly mentioned. The graphics help convey a lot of what the game tries to tell you. You can see the brutality of the world they are forced to live in through the realistic depiction of gore. But you can also see the raw emotion, the trauma on the characters' faces, which tells you what the reality of this world truly looks like.

But there's plenty of games with VERY simplistic graphic styles that are still high quality. CrossCode was one of the surprise hits for me a couple years ago and became one of my favorite RPGs, probably only topped by the old SNES title Terranigma. They both have simple yet beautiful graphics that serve them just as well as the realistic graphics of TLoU.

The suits/publishers especially will make this mistake, since they are very detached from the actual gaming community and just look at numbers instead, getting trapped in various fallacies and then wondering why things don't go as well as they calculated.
Sure you can, just do like “reviewers/players gush about ‘riveting plot’ and ‘characters that feel real’ and ‘a truly compelling story’” or whatever it is.
Look, I’m gonna be real with you, the pool of writers who are exceptionally good at specifically writing for games is really damn small.
Everyone is trained on novels and movies, and so many games try to hamfist in a three-act arc because they haven’t figured out that this is an entirely different medium and needs its own set of rules for how art plays out.
Traditional filmmaking ideas include stuff like the direction a character is moving on the screen affecting what the scene “means.” Stuff like that is basically impossible to cultivate in, say, a first- or third-person game where you can’t be sure what direction characters will be seen moving. Thus, games need their own narrative rules.
I think the first person to really crack those rules was Yoko Taro, that guy knows how to write for a game specifically.
I don’t think any amount of money could’ve saved the writing for Outlaws. People should not expect great writing from studios like Ubisoft. Not to say that Ubisoft doesn’t have great talent, but it’s a “too many cooks” situation there.
I have a computer from 2017. It’s also a Mac, so I can’t play recent games. The whole emphasis on better graphics, and the need to spend ridiculous amounts of money on either a console or a really good graphics card for a PC, has turned me off of mainstream gaming completely.
Mostly I just go play games I played when I was a kid these days. 1980s graphics, and yet I still haven’t gotten tired of many of them…
I can think of many older games in dire need of facelifts, but the thing is they don’t need a facelift into photo-realistic territory. Just enough to bring the vision out from developers reaching just a little further than their old tech could support. I’m thinking of a lot of early 3D games. Many of the older sprite based games still hold up great.
The AAA gaming industry has gone off the rails trying to wow us with graphics and the novelty has long worn off.
I would argue they don’t even need to be updated. They were fun already in their time. I wish people would just come up with totally new ideas. I don’t need the same characters in every game I play. Same with movies now too: everything is a remake or a sequel.
I think the issue is a bit more nuanced. Graphics have gotten so good that it is relatively easy to get character animations which sit in the uncanny valley.
The uncanny valley is bad. You can have beautiful, photorealistic graphics everywhere, but if your characters are in the uncanny valley, the overall aesthetic is more similar to a game which didn’t have the photorealism at all.
In the past, the goalpost was at a different spot, so putting all the resources towards realism still wouldn’t get you into the valley, and everyone just thought it looked great.
I spoke against the need for realistic graphics last time the topic came up, and I'll say a word in favour of it now: It's pretty awesome having realistic lighting and shadows when you're admiring the scenery in Skyrim. My 6600 can barely keep up, but the work it's doing there is fully aesthetically worthwhile. The same can't be said for every GPU-hungry game that comes out, and it may not have the central importance that it used to, but nice graphics are still nice to have. I say that as someone who appreciates NetHack at least as much as any new AAA game.
And I don’t think games have to look that good either… I’m currently playing MGSV, and that game’s 8 years old, runs at 60 fps on the Deck, and looks amazing. It feels like hundreds of millions are being burned on diminishing returns nowadays…
It’s bullshit accounting; they’re not spending it on the devs or the games, they’re spending it on advertising and the C-levels’ paydays. There are a ton of really good-looking games that had what would be considered shoestring budgets, but these companies bitching about it aren’t actually in it for the games anymore; it’s just for the money.
$36 million for KC:D, one of the prettiest games of its time… and that includes marketing.
The cost of some of these “AAAA” titles is a complete joke.
Another one: before THQ took over, Metro 2033, estimated at $10–20 million, and while it’s aged a bit, when it was first released it had a good bit of eye candy for a horror game.
Another one would be the first STALKER. It’s not what we’d call visual candy today, being over a decade old now, but it was a pretty game when it came out. Cost $5 million to make.
Another one: Crysis, $22 million, and if you remember, it was the benchmark for the longest time… “Can it run Crysis?”
Alright, not like for like exactly, and at 34M, we’re stretching the definition of shoestring. I’ll bet KC:D’s sequel spent far more, for one. I’m with you that more of these studios ought to be aiming for reasonable fidelity in a game that can be made cheaply, but when each of those studios took more than 5 years to build their sequels, that becomes more and more unlikely.
34 mil is nothing when you start looking at the cost of some of these other games, even Skyrim was over 100 million. Like GTA5, with marketing, was like 250 million. Just insanely expensive, and I guarantee you the devs are not pulling in a mil or two a year.
It’s true, and I’d certainly like to see some of these studios target making many games at that budget rather than a single game at ten times that every 7 or 8 years, but even the “cheaper” games you listed still took a long time to make, and I think that’s the problem to be solved. Games came out at a really rapid clip 20–25 years ago, when you’d often get 3 games in a series 3 years in a row. We can argue about the relative quality of those games compared to what people make now, and how much crunch was involved, but if the typical game is taking more than 3 years to make, that still says to me that maybe their ambitions got out of hand. The time involved in making a game is what balloons a lot of these budgets, and whereas you could sell 3 full-priced games 3 years in a row back in the day, now you’re selling 1 every 6 years, and you need to sell way, way more of them to make the math work out.
Games had a lot less in them as well, though, and even then games still took time. OoT, one of the biggest games of its era, released in ’98 after two and a half years of dev time. Games still required time, maybe less, but they also had less in them.
I’d make that trade, easily. More often I find games these days are too long to their own detriment than that they felt like they ought to be that long. Your mileage may vary on a game by game basis, but in general, that’s how it’s been lately.
When I say less, I mean as in assets and things to do. Side quests alone in a lot of games these days are damn near as many hours as the main story is.
Unpopular opinion, but I’d prefer a game’s graphics be absolute trash as long as the OST is awesome. I can easily forget how many individual hairs are on a 3D model, but a good OST will live in my mind and heart forever.
The Wii was a fantastic example of this: less capable hardware used in very imaginative ways, and it had the capacity to bring older people into gaming.
This is why so many indie games are awesome. The graphics don’t need to be great when the soundtracks and gameplay more than make up for it. Those are what actually matter. I have most of Undertale’s OST committed to memory at this point lol
A lot of comments in this thread are really talking about visual design rather than graphics, strictly speaking, although the two are related.
Visual design is what gives a game a visual identity. The level of graphical fidelity and realism that’s achievable plays into what the design may be, although it’s not a direct correlation.
I do think there is a trend for higher and higher visual fidelity to result in games with blander visual design. That’s probably because realism comes with artistic restrictions, and development time gets sucked away from creative art to support realism.
My subjective opinion is that for first person games, we long ago hit the point of diminishing returns with something like the Source engine. Sure there was plenty to improve on from there (even games on Source like HL2 have gotten updates so they don’t look like they did back in the day), but the engine was realistic enough. Faces moved like faces and communicated emotion. Objects looked like objects.
Things have improved since then, and should have, but graphical improvements really should have been a sideshow to gameplay and good visual design.
I don’t need a game where I can see the individual follicles on a character’s face. I don’t need subsurface light diffusion on skin. I won’t notice any of that in the heat of gameplay, but only in cutscenes. With such high fidelity game developers are more and more forcing me to watch cutscenes or “play” sections that may as well be cutscenes.
I don’t want all that. I want good visual design. I want creatively made worlds in games. I want interesting looking characters. I want gameplay where I can read at a glance what is happening. None of that requires high fidelity.
I’ve seen a lot of cool indie games pop up out of heavily modified classic idTech engines like the DOOM and Quake engines. They’re definitely not high fidelity, but a lot of them scratch an itch that slower paced modern games can’t seem to scratch.
There are a number of theories why gamers have turned their backs on realism. One hypothesis is that players got tired of seeing the same artistic style in major releases.
Whoosh.
We learned all the way back in the Team Fortress 2 and Psychonauts days that hyper-realistic graphics will always age poorly, whereas stylized art always ages well. (Psychonauts aged so well that its 16-year-later sequel kept and refined the style, which went from limitations of hardware to straight up muppets)
There’s a reason Overwatch followed the stylized art path that TF2 had already tread, because the art style will age well as technology progresses.
Anyway, I thought this phenomenon was well known. Working within the limitations of the technology you have available can push you toward brilliant design. It’s like when Twitter first appeared: I had comedy-writing friends who used the 140-character limit as a tool for writing tighter comedy, forcing themselves to fit a whole joke into it.
Working within your limitations can actually make your art better, which just complements the fact that stylized art lasts longer before it looks ugly.
Others speculate that cinematic graphics require so much time and money to develop that gameplay suffers, leaving customers with a hollow experience.
Also, as others have pointed out, it’s capitalism and the desire for endless shareholder value increase year after year.
Cyberpunk 2077 is a perfect example: a stunningly beautiful technical achievement where they had to cut tons of planned content (like wall-running) because they simply couldn’t get it working before investors demanded the game be put out. As people saw with Phantom Liberty, given enough time, Cyberpunk 2077 could have been a masterpiece on release, but the investors simply didn’t give CD Projekt Red enough time before they cut the purse strings and said “we want our money back… now.” It’s a choice to release too early.
…but on the other hand it’s also a choice to release too late after languishing in development hell a la Duke Nukem Forever.
I honestly feel like this with Genshin Impact. It looks absolutely breathtaking and in 20 years it will still be beautiful. It runs on a damn potato. I personally like the lighting in a lot of scenes way better than the lighting in some titles that have path tracing.
I have always liked art styles in games better than realism.
In what world does Genshin run well on a potato? Unless you have a different definition of potato than me. My Galaxy S10e can barely play the game, and it’s not even slow enough to be called a potato.
Haha, opposite experience for me! I don't play it but know some people that do, and I only ever heard about them playing it on their PCs, so it was their comment that made me realize it was also available on phones :P
Unfortunately, Cyberpunk is exactly the kind of product that is going to keep driving the realistic approach. It’s four years later now and the game’s visuals are still state-of-the-art in many areas. Even after earning as much backlash on release as any game in recent memory, it was a massively profitable project in the end.
This is why Sony, Microsoft, and the big third parties like Ubisoft keep taking shots in this realm.
Borderlands 1 and 2 still look great in comparison to a lot of games that came out around the same time. The stylized cel-shaded textures help hide the lower-poly environments and really make the world stand out. Most games at the time were trying to go for a “realistic” look that just resulted in bland brown and gray environments that look terrible.
Shout out to Borderlands 1, one of the last games to have some of its best comedy delivered by text instead of audio.
I actually am in the minority of preferring 1 over 2 because 2 is just so fucking loud. Handsome Jack in my fucking ear for hours on end, refusing to shut the fuck up and let me play the game.
I much much much preferred the quiet reading of Borderlands 1.
Yeah, I did read the article. That’s why I know what it’s about: he’s complaining that graphical fidelity in games isn’t delivering a profit benefit. Clearly AAA studios aren’t actually having this issue because, like I said, the winner of the Game Awards this year was a cartoony game, so they’re well aware that graphics aren’t everything.
I’d say it’s less about imagination than gameplay. I’m reminded of old action figures. Some of them were articulated at the knees, elbows, feet, wrists, and head. Very posable, but you could see all the joints. Then you had the bigger and more detailed figures, but they were barely more than statues. Looked great but you couldn’t really do anything with them.
And then you had themed Lego sets. Only a vague passing resemblance to the IP, but your imagination is the limit on what you do with them.
I may be an outsider, but lower-fidelity horror games actually work better for me, because imagination fills the gaps better than an engine rendering plastic-looking tentacles can.
I just played Dragon Age Veilguard, and I’m now playing Dragon Age Origins, which was released 15 years ago. The difference in graphics and animation is startling. And it has a big effect on my enjoyment of the game. Origins is considered by many to be the best in the series, and I can see that they poured a ton into story options and such. But it doesn’t feel nearly as good as playing Veilguard.
Amazing graphics might not make or break a game, but the minimum level of what’s acceptable is always rising. Couple that with higher resolutions and other hardware advances, and art budgets are going to keep going up.
Agreed; Veilguard has pretty okay graphics. Not great, but acceptable - the high mark for me is BG3. But moving back to the earlier entries, they may have had stories that felt more ‘real’ (e.g., the setting felt more internally consistent) and gave more options, but the graphics and gameplay haven’t aged well.
Similarly, Fallout: New Vegas hasn’t aged so well. It was a great game, but it looks pretty rough now, unless you load it down with hi-res mods.
I don’t demand photorealism, but I’d like better visuals than PS3-level graphics.
Yeah, that frustrates me a lot, too. They almost had it right: they need to go beyond realism to make truly good-looking games. But in practice, they say that only to show you the most boring-ass graphics known to humanity. I don’t need your pebbles to cast shadows. I can walk outside and find a pebble that casts shadows in a minute, tops. Make the pebbles cast light instead; that could look cool. Or make them cast a basketball game. That’s at least something I haven’t seen yet.
I like the way you think. The logic of video games and what they display don’t have to be limited by anything in the real world. They can invent entirely new forms of perception even (like that Devil Daggers sequel that lets you see behind yourself using colour overlays).