I spoke against the need for realistic graphics last time the topic came up, and I'll say a word in favour of it now: It's pretty awesome having realistic lighting and shadows when you're admiring the scenery in Skyrim. My 6600 can barely keep up, but the work it's doing there is fully aesthetically worthwhile. The same can't be said for every GPU-hungry game that comes out, and it may not have the central importance that it used to, but nice graphics are still nice to have. I say that as someone who appreciates NetHack at least as much as any new AAA game.
A lot of comments in this thread are really talking about visual design rather than graphics, strictly speaking, although the two are related.
Visual design is what gives a game a visual identity. The level of graphical fidelity and realism that’s achievable plays into what the design may be, although it’s not a direct correlation.
I do think there is a trend for higher and higher visual fidelity to result in games with blander visual design. That’s probably because realism comes with artistic restrictions, and development time gets sucked away from doing creative art into supporting realism.
My subjective opinion is that for first person games, we long ago hit the point of diminishing returns with something like the Source engine. Sure there was plenty to improve on from there (even games on Source like HL2 have gotten updates so they don’t look like they did back in the day), but the engine was realistic enough. Faces moved like faces and communicated emotion. Objects looked like objects.
Things should have improved since then, and they have, but graphical improvements should really have been a sideshow to gameplay and good visual design.
I don’t need a game where I can see the individual follicles on a character’s face. I don’t need subsurface scattering on skin. I won’t notice any of that in the heat of gameplay, only in cutscenes. With such high fidelity, game developers are more and more forcing me to watch cutscenes or “play” sections that may as well be cutscenes.
I don’t want all that. I want good visual design. I want creatively made worlds in games. I want interesting looking characters. I want gameplay where I can read at a glance what is happening. None of that requires high fidelity.
Realistic does not equal good-looking. For example, Zelda: Breath of the Wild looks good, but it’s hardly realistic. And if all games are very realistic, it gets a little boring, as they all start to look the same. The AAA gaming industry is too focused on lip sync, realistic faces, grass, and puddles. I don’t feel like I’m getting lost in a game; it’s more like watching a movie. It’s so boring to me (I’m looking at you, Red Dead Redemption 2).
I disagree. The art design and realism were among the reasons it was so good. It’s still one of the best-looking games of all time. It also proves that you can make a good-looking game that is also fun and fulfilling. It’s honestly a success story all around.
I’ve always disliked how washed out BotW looks. It’s like they could only process limited colours so they reduced the contrast and everything is light grey with a hint of colour.
It’s actually a deliberate stylistic choice. The colors are washed out with a post-processing filter. Textures are actually much more colorful. You can fix this in an emulator, but the problem is that it’s difficult to find a color preset that works in all lighting conditions. BotW has a consistent, almost painterly art style, even if it’s relatively muted.
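Roughly, a preset is just a global color grade layered on top of the rendered frame. Here’s a minimal sketch (illustrative only, with made-up numbers; not actual graphics-pack code) of the kind of saturation and contrast push such a preset applies:

```python
import numpy as np

def apply_preset(frame: np.ndarray, saturation: float = 1.3, contrast: float = 1.1) -> np.ndarray:
    """frame: H x W x 3 float RGB array with values in [0, 1]. Constants are illustrative."""
    gray = frame.mean(axis=-1, keepdims=True)      # rough per-pixel luminance
    boosted = gray + (frame - gray) * saturation   # push colors away from gray
    graded = 0.5 + (boosted - 0.5) * contrast      # spread values around mid-gray
    return np.clip(graded, 0.0, 1.0)
```

Fixed constants like these are exactly why one preset rarely works everywhere: a boost tuned for bright daylight scenes can easily crush detail in already-dark dusk or night lighting.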
Yep, did this, and it looked great; I loved it. BotW modded was a great experience. Just skipping all those unskippable cutscenes was already worth it (teleport animations, sign repair dialogue, etc.).
Good games don’t automatically sell; quite the contrary. Your average Ubisoft open-world slop is “good”, but that’s not enough. Even very good, exceptional games don’t automatically sell. Game development is inherently risky. Large publishers tried to game the system by making “safe” bets, offering spectacle in combination with tried-and-true mechanics and narratives. This worked for a long time, but with market conditions changing, the core audience for these types of games getting tired of them, and younger gamers not caring about the presentation, these publishers are spending more and more on a shrinking segment of the market.
The problem is that they maneuvered themselves into a corner. They have built huge, art-heavy studios in expensive cities to make large games that bring in large sums of money that finance this costly development. You can’t easily downsize this kind of operation, you can’t easily change your modus operandi after having built entire companies around it. I’m convinced that this will result in the death of most large publishers and developers. Ubisoft is only the start.
Why should EA, Microsoft or Sony fare any differently? Each can only hope that enough of their major competitors die so that they don’t have to fight over the same segment of the market anymore. They are all fundamentally unable to meaningfully capture the P2W and gacha markets (same thing, really), especially in Asia, a segment where companies that were built to serve these types of games are truly at home. Those will slowly take over, until they too are too large and bloated to respond to changing market conditions, or until some event outside their control, like a major conflict and/or economic crisis, wipes them off the map, paving the way for someone else entirely to lead the industry. The only thing that will remain constant is millions of small indies fighting for scraps, with a tiny handful having the right combination of luck and skill (although mostly the former) to make a decent living.
I’ve seen a lot of cool indie games pop up out of heavily modified classic idTech engines like the DOOM and Quake engines. They’re definitely not high fidelity, but a lot of them scratch an itch that slower paced modern games can’t seem to scratch.
I have a computer from 2017. It’s also a Mac. I can’t play recent games, and I’ve gotten more and more turned off by the whole emphasis on better graphics. The need to spend ridiculous amounts of money on either a console or a really good graphics card for a PC has turned me off of mainstream gaming completely.
These days I mostly just play games I played when I was a kid. 1980s graphics, and yet I have yet to tire of many of them…
I can think of many older games in dire need of facelifts, but the thing is, they don’t need a facelift into photo-realistic territory. Just enough to bring out the vision of developers who were reaching a little further than their old tech could support. I’m thinking of a lot of early 3D games. Many of the older sprite-based games still hold up great.
The AAA gaming industry has gone off the rails trying to wow us with graphics and the novelty has long worn off.
I would argue they don’t even need to be updated. They were already fun in their time. I wish people would just come up with totally new ideas. I don’t need the same characters in every game I play. Same with movies now, too: everything is a remake or a sequel.
I mean, look at Nintendo. Obviously aggressive legal tactics aside, they make some damn fun games because they know that gameplay matters more than graphics.
Oh, don’t dismiss that they’re also graphics and programming wizards. They don’t work with the cutting edge, but they run circles around anyone on the lower end; making games look and run better on potato hardware is no easy feat.
I’d argue the optimization required to make something like that happen is significantly more skillful than all of the crap AAA stuff that takes 250 GB and requires shader compilation on every boot.
What a group of wizards. Xenoblade games are great JRPGs, but I just can’t get over how bad they look at times, and performance is often horrendous. This is only good as long as you don’t care.
The Xenoblade series is made by a developer that is owned by Nintendo. If Nintendo doesn’t want people to rag on their products, they should make them better.
They call this design philosophy “Lateral Thinking with Withered Technology”: basically, “using old tech we understand very well in new and innovative ways.” For example, they were slower to get their 16-bit console to market, but while working on it, they used their expertise in 8-bit consoles to release the first cartridge-based handheld system.
Visuals are very important in games, but Nintendo pursues clear and readable designs. Their games are easy to look at, and they age more gracefully than games pursuing realism.
This is a good example. The cartoony graphics work well for Nintendo because they fit the hardware better as well.
For my personal example: I can still play Star Fox 64 easily, but GoldenEye (one of my favorite childhood games) literally gives me a headache to look at. GoldenEye was going for a more realistic look on the engine of the time and aged terribly. Star Fox is all big, bright cartoon designs.
I have spent years trying to find a Super Mario World or Super Mario Galaxy feel to games. I am not looking for photo realistic. I am looking for a game.
Breath of the Wild is a technical masterpiece, though. The way they’ve managed to do lights, shadows, LODs, distant effects. And they’ve managed to add even more to TotK, plus physics-based audio, plus physics objects interacting better than in any modern AAA game on “big” consoles. They squeezed every last bit of performance the Switch could provide to make these games look as good as humanly possible.
They work with what they have in terms of hardware, and care a lot about gameplay, but they also do invest heavily into graphics and other technical aspects of their games.
The worst thing is that some brilliant sound design is held back by folks who will buy a top-of-the-line video card but cheap, shitty headphones.
Cheap shitty headphones, when the Koss KSC75 exist for $20 and sound better than anything I had bought before. I have better headphones now, but $20 is $20, and I still like how small they are. Despite having HD600s and HE1000s, they’re still my go-to for the average use case.
EDIT: Here’s a list of headphones worse than $20 funny disc with ear clip:
All Bluetooth headphones (and your $500 AirPods Max)
All gaming headsets
All in-store headphones that aren’t that one set of Audio-Technicas
In other words, 100% of what the average consumer buys. Get them in on this simple trick.
In my experience it’s both cheap shitty headsets and expensive shitty headsets. The argument is always wireless + mic, so people go and spend ridiculous money on something that can be done better with cheap, good headphones like the Koss plus a dedicated mic; it may not be as convenient to use, but it will sound much better.
Still, good sound design can be enjoyed even on a shitty headset; it’s just that good audio can add more to gameplay than graphics alone. For example, walking around the main hub area in Starfield is soul-numbingly hilarious… it does not feel alive at all, even when you see NPCs walking around all the time.
Gifted my kids, both of them already young adults, one of those retro gaming sticks. An absolute bang-for-buck wonder, full of retro emulators and ROMs. Christmas Day at grandma’s was a retro fest, with even grandma playing: Pac-Man, Frogger, Space Invaders, Galaga, Donkey Kong, early console games… Retro gaming has amazing games, where gameplay and concepts had to make do with limited resources.
My son has a Steam Deck, but he had a blast with the rest.
I’d say it’s less about imagination than gameplay. I’m reminded of old action figures. Some of them were articulated at the knees, elbows, feet, wrists, and head. Very posable, but you could see all the joints. Then you had the bigger and more detailed figures, but they were barely more than statues. Looked great but you couldn’t really do anything with them.
And then you had themed Lego sets. Only a vague passing resemblance to the IP, but your imagination is the limit on what you do with them.
I may be an outsider, but lower-fidelity horror games actually work better for me, because imagination fills the gaps better than an engine rendering plastic-looking tentacles can.
Yeah, I did read the article. That’s why I know what the article is about, and that he’s complaining about graphical fidelity in games not delivering a profit benefit. Clearly AAA studios aren’t actually having this issue because, like I said, the winner of The Game Awards this year was a cartoony game, so clearly they are well aware that graphics aren’t everything.
One way to understand the video game industry’s current crisis is by looking closely at Spider-Man’s spandex.
For decades, companies like Sony and Microsoft have bet that realistic graphics were the key to attracting bigger audiences. By investing in technology, they have elevated flat pixelated worlds into experiences that often feel like stepping into a movie.
Designers of last year’s Marvel’s Spider-Man 2 used the processing power of the PlayStation 5 so Peter Parker’s outfits would be rendered with realistic textures and skyscraper windows could reflect rays of sunlight.
That level of detail did not come cheap.
Insomniac Games, which is owned by Sony, spent about $300 million to develop Spider-Man 2, according to leaked documents, more than triple the budget of the first game in the series, which was released five years earlier. Chasing Hollywood realism requires Hollywood budgets, and even though Spider-Man 2 sold more than 11 million copies, several members of Insomniac lost their jobs when Sony announced 900 layoffs in February.
Cinematic games are getting so expensive and time-consuming to make that the video game industry has started to acknowledge that investing in graphics is providing diminished financial returns.
“It’s very clear that high-fidelity visuals are only moving the needle for a vocal class of gamers in their 40s and 50s,” said Jacob Navok, a former executive at Square Enix who left that studio, known for the Final Fantasy series, in 2016 to start his own media company. “But what does my 7-year-old son play? Minecraft. Roblox. Fortnite.”
Joost van Dreunen, a market analyst and professor at New York University, said it was clear what younger generations value in their video games: “Playing is an excuse for hanging out with other people.”
When millions are happy to play old games with outdated graphics — including Roblox (2006), Minecraft (2009) and Fortnite (2017) — it creates challenges for studios that make blockbuster single-player titles. The industry’s audience has slightly shrunk for the first time in decades. Studios are rapidly closing and sweeping layoffs have affected more than 20,000 employees in the past two years, including more than 2,500 Microsoft workers.
Many video game developers built their careers during an era that glorified graphical fidelity. They marveled at a scene from The Last of Us: Part II in which Ellie, the protagonist, pulls a shirt over her head to reveal bruises and scrapes on her back without any technical glitches.
But a few years later, costly graphical upgrades are often barely noticeable.
When the studio Naughty Dog released a remastered version of The Last of Us: Part II this year, light bounced off lakes and puddles with a more realistic shimmer. In a November ad for the PlayStation 5 Pro, an enhanced version of the Sony console that retails for almost $700, the billboards in Spider-Man 2’s Manhattan featured crisper letters.
Optimizing cinematic games for a narrow group of consumers who have spent hundreds of dollars on a console or computer may no longer make financial sense. Studios are increasingly prioritizing games with basic graphics that can be played on the smartphones already in everyone’s pocket.
“They essentially run on toasters,” said Matthew Ball, an entrepreneur and video game analyst, talking about games like Roblox and League of Legends. “The developers aren’t chasing graphics but the social connections that players have built over time.”

Going Hollywood
Developers had long taught players to equate realism with excellence, but this new toaster generation of gamers is upsetting industry orthodoxies. The developer behind Animal Well, which received extensive praise this year, said the game’s file size was smaller than many of the screenshots used to promote it.
A company like Nintendo was once the exception that proved the rule, telling its audiences over the past 40 years that graphics were not a priority.
That strategy had shown weaknesses through the 1990s and 2000s, when the Nintendo 64 and GameCube had weaker visuals and sold fewer copies than Sony consoles. But now the tables have turned. Industry figures joke about how a cartoony game like Luigi’s Mansion 3 on the Nintendo Switch considerably outsells gorgeous cinematic narratives on the PlayStation 5 like Final Fantasy VII Rebirth.
There are a number of theories why gamers have turned their backs on realism. One hypothesis is that players got tired of seeing the same artistic style in major releases. Others speculate that cinematic graphics require so much time and money to develop that gameplay suffers, leaving customers with a hollow experience.
Another theory is that major studios have spent recent years reshaping themselves in Hollywood’s image, pursuing crossover deals that have given audiences “The Super Mario Bros. Movie” and “The Last of Us” on HBO. Not only have companies like Ubisoft opened divisions to produce films, but their games include an astonishing amount of scenes where players watch the story unfold.
In 2007, the first Assassin’s Creed provided more than 2.5 hours of footage for a fan edit of the game’s narrative. As the series progressed, so did Ubisoft’s taste for cinema. Like many studios, it increasingly leaned on motion-capture animators who could create scenes using human actors on soundstages. A fan edit of Assassin’s Creed: Valhalla, which was released in 2020, lasted about 23 hours — longer than two seasons of “Game of Thrones.”
Gamers and journalists began talking about how the franchise’s entries had gotten too bloated and expensive. Ubisoft developers advertised last year’s Assassin’s Creed Mirage, which had about five hours of cut scenes, as “more intimate.”
The immersive graphics of virtual reality can also be prohibitive for gamers; the Meta Quest Pro sells for $1,000 and the Apple Vision Pro for $3,500. This year, the chief executive of Ubisoft, Yves Guillemot, told the company’s investors that because the virtual reality version of Assassin’s Creed did not meet sales expectations, the company was not increasing its investment in the technology.

[Image caption: Live service games that are playable on mobile devices, like Genshin Impact, can generate large amounts of revenue.]
Many studios have instead turned to the live service model, where graphics are less important than a regular drip of new content that keeps players engaged. Genshin Impact, by the studio Hoyoverse, makes roughly $2 billion every year on mobile platforms alone, according to the data tracker Sensor Tower.

Going Broke?
It was clear this year, however, that the live service strategy carries its own risks. Warner Bros. Discovery took a $200 million loss on Suicide Squad: Kill the Justice League, according to Bloomberg. Sony closed the studio behind Concord, its attempt to compete with team-based shooters like Overwatch and Apex Legends, one month after the game released to a minuscule player base.
“We have a market that has been in growth mode for decades,” Ball said. “Now we are in a mature market where instead of making bets on growth, companies need to try and steal shares from each other.”
Some industry professionals believe there is a path for superb-looking games to survive the cost crunch.
“I used to be a high-fidelity guy; I would log into games and if it didn’t look hyperrealistic, then it was not so interesting,” said David Reitman, a managing director at PricewaterhouseCoopers, where he leads the consulting firm’s games division. “There was a race to hyperrealism, and it’s tough to pivot away. You have set expectations.”
Reitman sees a future where most of the heavy costs associated with cutting-edge graphics are handled by artificial intelligence. He said that manufacturers were working on creating A.I. chips for consoles that would facilitate those changes, and that some game studios were already using smart algorithms to improve graphics further than anything previously seen.
He expects that sports games will be the first genre to see considerable improvements because developers have access to hundreds of hours of game footage. “They can take feeds from leagues and transpose them into graphical renderings,” Reitman said, “leveraging language models to generate the incremental movements and facial expressions of players.”
Some independent developers are less convinced. “The idea that there will be content from A.I. before we figure out how it works and where it will source data from is really hard,” said Rami Ismail, a game developer in the Netherlands.
Ismail is worried that major studios are in a tight spot where traditional games have become too expensive but live service games have become too risky. He pointed to recent games that had both jaw-dropping realism — Avatar: Frontiers of Pandora (individual pebbles of gravel cast shadows) and Senua’s Saga: Hellblade II (rays of sunlight flicker through the trees) — and lackluster sales.
He recalled a question that emerged early in the coronavirus pandemic and has become something of an unofficial motto in the video game industry.
In a lot of cases, I find I’ve already read the underlying content or skipped it with my reader and therefore can go right to the comments. But ymmv of course.
Maybe it’s just me, but I like the style it’s presented in, and I have major adblockers in service so I’m not sure how it’s a drug fueled hellscape. It basically becomes a normal NYT article after a half-page of scrolling. Not all their readers are familiar with these games, so the NYT is doing its diligence by trying to show what they’re talking about, so their readers have a frame of reference. (Remember the NYT is actually aimed at an investor class who owns a second house in the Hamptons and may not be gamers at all. Go look at their Lifestyle section sometime.)
I think it’s fine, but I guess I’m in the minority; maybe it’s also less bad for me because of uBlock/Pi-hole/Bypass Paywalls Clean.
Unpopular opinion, but I’d prefer a game’s graphics to be absolute trash and the OST to be awesome. I can easily forget how many individual hairs are in a 3D model, but a good OST will live in my mind and heart forever.
The Wii was a fantastic example of this. Less capable hardware used in very imaginative ways, and it had the capacity to bring older people into gaming.
This is why so many indie games are awesome. The graphics don’t need to be great when the soundtracks and gameplay more than make up for it. Those are what actually matter. I have most of Undertale’s OST committed to memory at this point lol
GSC, in my opinion, ruined Stalker 2 in the chase for “next-gen” graphics. And modern graphics are now so dependent on upscaling and frame gen; sad to see, but trailers sell.