It’s worth noting too that trash mobs aren’t limited to random encounters. Baldur’s Gate 1 and 2 are littered with trash mobs, and none of them are random except for maybe traversing between towns.
Random encounters tend to be trash mobs, and I hate trash mobs. I know even in the late 90s, there were some prehistoric internet memes about FF7, and having just played it recently, I remember why. There were so many of them. You’d easily forget where you were going and what you were doing because you’d be interrupted by random encounter trash mobs every couple of seconds. They weren’t too hard, so you didn’t have to think very much to get through them, which made them uninteresting, and they also, like you said, just kind of screwed with the flow of the game. So generally, I don’t like them.
The average person has absolutely no idea how much it costs to make a game, so any report that comes out for any game is enlightening. When the Skullgirls developers told people that it would cost $150k to make a single new character, and other fighting game developers weighed in and said, “actually, that’s insanely cheap,” it level-set expectations for what a customer can actually expect out of a producer. The largest productions of the original Xbox and PS2 era typically didn’t even come in at $50M per game. There are a lot of reasons why it can’t be exactly that anymore, but ballooning budgets are why the industry is in this wholly unsustainable spot: if you’re spending hundreds of millions of dollars and you didn’t make one of the most successful games in the history of the medium, it won’t make its money back.
Iterating on a trend is smart business. Iterating on a trend over the course of 6 to 8 years is not, not only because it makes the game more expensive to make and raises the floor for success, but also because the audience for that trend has likely moved on. If Concord truly cost $400M to make, it adds one more data point for people to understand how much a game can cost, and maybe, just maybe, it will push more companies to focus on building a game they know they can afford to make rather than going all or nothing on one of the riskiest projects in history. That keeps people employed, rather than fueling rapid expansion from bubble investment followed by hundreds of layoffs when the project goes south.
Why would I do what you suggest? So that games journalism can continue to beat a dead horse?
Because the truth is worth knowing, and it sucks that this stuff is obfuscated the way it is compared to something like the movie industry. If true, I’d call it constructive reporting if the message becomes clear that this is an example of what’s ravaging the industry: trend chasing with absurd amounts of money, designed to extract some mythical amount of money from people, rather than building good products on sane budgets that keep people employed. But the point is moot if you not only don’t agree but also aren’t in a position to refute it.
They also outsourced a ton to make CG cut-scenes and such, which can rack up a bill very quickly. ProbablyMonsters was an incubator, not a parent company, as I understand it. I too am skeptical of there only being one source in Colin Moriarty, but I trust Jordan Middler to vet the story, even if he isn’t corroborating it, and as others have mentioned, the credits are literally over an hour long, which is evidence that supports the high costs.
I could take one look at those models and animations and tell you they weren’t cheap. A lot of money probably also went into those CG cut-scenes that were intended to roll out weekly.
I played it on Xbox and then PC even back in the day, and I’d 100% believe that it’s poorly optimized; they patched it a few months after launch to remove a lot of extraneous, unseen detail on the map that was hurting performance. It’s still surprising if you can’t run a 10-year-old game well on a modest modern PC.
Unity was the best, in my opinion. Origins, Odyssey, and Valhalla all follow the new Assassin’s Creed design, which earned its own set of fans, but they’re so different from what came before with their faux RPG trappings. The fantasy is broken for me when I sneak up behind someone, stab them in the neck, and their health bar only goes down a little bit.
The first Assassin’s Creed game was very repetitive, but it gave you small assassination missions where you had to figure out how to get in, kill your target, and get out. The next several games in the series were better in every way except perhaps for the missions that mattered most, which were turned into extremely linear, scripted action sequences.
Unity (set in Paris in the late 1700s) was an answer to those frustrations. There was a point in the dialogue where they specifically called it out. “So what’s the plan?” “The plan? Come up with your own plan. I’m not here to hold your hand.” They gave you expansive areas to carry out your mission, and you could find your own way in, kill your target, and get out. The game has some of its own baggage, like the loot system taking any challenge out of the combat later in the game, when the whole idea was that you were so squishy that you should avoid combat, but it delivered on that experience better than any game since the first.
Then Syndicate came out, and it highlighted the different ways to do your assassination as if you were a big dummy, and it made a significant part of the game about street brawling, so I gave it a hard pass. The next game in the series was Origins, which brings us to the modern faux RPG era.
When the game had a free beta, there was hardly anyone playing it. At some point you’ve just got server costs and promises of live-service content rollouts that can only cost you money.