This is selection bias. You remember Metal Gear Solid, but do you remember Iron & Blood: Warriors of Ravenloft? Do you remember Mortal Kombat Mythologies: Sub-Zero? Bubsy 3D? The million-and-one licensed games that were churned out like baseball cards back then?
and make gamers replay the game with unlockable features based on skills, not money
If we’re going to say that a full-price game today costs $70, Metal Gear Solid would have cost the equivalent of $95. Not only that, but that was very much the Blockbuster and strategy guide era. Games would often put one of their best levels up front so that you could see what made the game good, but then level 2 or 3 would hit a huge difficulty spike…just enough to make you rent the game multiple times, or cave in and buy it when you couldn’t beat it in a weekend. Or you’d have something like Final Fantasy VII, which I just finished for the first time recently, and let me tell you: games that big were designed to sell strategy guides (or hint hotlines) as a revenue stream. There would be some esoteric riddle, or some obscure corner of the map that you needed to happen upon in order to progress the game. The business model always, at every step of the medium’s history, affects the game design.
“Value” is going to be a very subjective thing, but for better or worse, the equivalent game today is far more packed full of “stuff” to do, even when you discount the ones that get there just by adding grinding. There are things I miss about the old days too, but try to keep it in perspective.
There’s just so much of everything nowadays. There’s tons of great new music and tons of great new games buried in all the new stuff that’s being pumped out, and it’s hard to find the gems. There’s lots of passionate people out there taking the time and effort to try and make the best games they can.
“Value” is going to be a very subjective thing, but for better or worse, the equivalent game today is far more packed full of “stuff” to do, even when you discount the ones that get there just by adding grinding. There are things I miss about the old days too, but try to keep it in perspective.
Exactly this.
Games back then were pricier - once you account for inflation.
Games back then did expect you to pay extra - in fact quite a few were deliberately designed to have unsolvable moments without either having the official strategy guide or at least a friend who had it who could tell you.
Games back then were pricier - once you account for inflation.
That's commonly said but ignores other economic factors such as income, unspent money, and cost-of-living.
Though lots of things are better now: the entire back-catalogue of games, more access to review/forums, free games (and also ability to create your own games without doing so from nothing) etc. Aside from when video store rental was applicable, early gaming was more take-what-you-can-get (niche hardware/platforms might still have that feel somewhat).
That’s commonly said but ignores other economic factors such as income, unspent money, and cost-of-living.
Inflation is derived by indexing all of those things. Some things are far more expensive or far cheaper relative to each other, but we approximate the buying power of a dollar by looking at all of it.
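As a rough sketch of how that indexing cashes out for the $95 figure mentioned earlier in the thread, here's a hypothetical CPI-style adjustment; the index values below are approximations I'm assuming for illustration, not official figures:

```java
// Hypothetical CPI-based inflation adjustment (illustrative numbers only).
public class InflationSketch {
    public static void main(String[] args) {
        double cpi1998 = 163.0;   // assumed average price index for 1998 (approximation)
        double cpi2023 = 305.0;   // assumed average price index for 2023 (approximation)
        double price1998 = 50.0;  // a full-price game in 1998

        // Buying power scales by the ratio of the two index values.
        double adjusted = price1998 * (cpi2023 / cpi1998);
        System.out.printf("$%.2f in 1998 is roughly $%.2f today%n", price1998, adjusted);
    }
}
```

With these assumed index values the adjustment lands near $94, in the same ballpark as the figure quoted above; swapping in exact official index values shifts the result only slightly.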
a few were deliberately designed to have unsolvable moments without either having the official strategy guide or at least a friend who had it who could tell you.
Do you have an example?
I knew kids that bought strategy guides, I worked at a game shop that sold strategy guides, and as far as I could tell they were for chumps. People who had more money than creativity.
Cosmetic DLC feels like it’s for chumps too, but it’s lucrative. The best example is going to be Simon’s Quest, without a doubt. The strategy guide was in an issue of Nintendo Power. I’m sure they were also happy to let social pressures on the playground either sell the strategy guides or the game just by word of mouth as kids discussed how to progress in the game. A Link to the Past is full of this stuff too. The game grinds to a halt at several points until you happen to find a macguffin that the game doesn’t even tell you that you need. Without the strategy guide, you could end up finding those things by spending tons of hours exploring every corner of the map, but by today’s standards, we’d call that padding.
I forgot about hint hotlines. They’d charge per minute and did everything they could to keep you on the phone. I called a hotline once and my parents weren’t too happy about it.
That was a large part of the charm for me in Tunic. The core mechanic was collecting pages of the instruction booklet as you adventured so you could learn the mechanics of the game. The other part of that being the manual was written in an unknown language and you’d need to infer what the instructions meant using context clues. It was an absolute blast and hit the dopamine button when I figured out some puzzles.
That’s some survivorship bias shit right here. I can’t tell you how many shitty, buggy games I played in the days of early console and PC gaming. Even games that were revolutionary and objectively good games sometimes had game-breaking bugs, but often it was harder to find them without the internet.
Plus, don’t you remember expansion packs? That was the original form of DLC.
Yeah, if a DLC isn’t just content taken out of the main game (in a way that makes the main game worse) and is reasonably priced for the amount of content it contains, then it’s a good way for developers to get paid for continuing to develop a game that was already finished at launch.
Oh man, while I was reading the first part of your comment I was thinking of the Witcher 3 DLCs the whole time, I’m so glad that you mentioned them at the end there!
I don’t see how the amount of “completeness” can even be measured. Is it really so much worse that you can buy extra fighters for the Street Fighter 6 you already own, rather than buying Super, Turbo, and then Super Turbo at full price every time? Or that you can choose to buy just the stuff you want for Cities: Skylines at half the price, instead of paying twice as much to get stuff you don’t care about along with it? Plus, expansions like Phantom Liberty and Shadow of the Erdtree are bigger than most entire video games from the 90s.
Except for when they did not, which was actually somewhat common.
But it also became quickly known, and stores stopped stocking buggy games. So in return, larger publishers tried their utmost to ensure that games shipped without major bugs (the Nintendo Seal of Quality, for example, was one such certification).
But make no mistake, tons of games you fondly remember from your childhood were bugged to hell and back. You just didn’t notice, and yes, the bigger crashes-to-desktop and such didn’t happen as much.
PC:
It was just flat-out worse back then. But we also thought about it the reverse way: not “Oh, this doesn’t work on my specific configuration, wtf?!” but “Oh damn, I forgot I need a specific VESA card for this, not just any. Gonna take this to my friend who has that card to play it.”
Counterpoint: budget re-releases of games (e.g. ‘Platinum’ on PlayStation) were often an opportunity to fix bugs, or sometimes even add new features. A few examples:
Space Invaders 1500 was a re-release of Space Invaders 2000, with a few new game modes.
Spyro: Year of the Dragon’s ‘Greatest Hits’ release added a bunch of music that was missing in the original release.
Ridge Racer Type 4 came with a disc containing an updated version of the first Ridge Racer, which ran at 60fps.
Super Mario 64’s ‘Shindou Edition’ added rumble pak support, as well as fixing a whole bunch of bugs (famously, the backwards long jump).
Those are just off the top of my head. I’m certain there are more re-releases that represent the true ‘final’ version of a game.
That’s the exception rather than the rule. If you have the opportunity to make some changes in a new batch, why not take it?
Generally, when the game was released, it had to be done. If there were any major bugs, then people would be returning their copies and probably not buying an updated release. It’d also hurt the reputation of the developer, the publisher, and even the console’s company if it was too prevalent of a problem.
I don’t think anybody I knew ever got an update to a console game without just happening to buy v1.2 or something. There were updated rereleases, but aside from PC gaming, I don’t think most console gamers back then ever thought “I hope they fix this bug with an update”.
A few days ago, I found out that one of the first games I ever owned, The Broken Land, was abandonware. I knew it was generally considered a bad Diablo knock-off, but I remembered at least the items and enemies as being ‘meaningful’ in ways I don’t see in games today anymore.
Lots of games just look formulaic and predictable to me now. Like, there’s a small and a medium potion, yeah alright game, I’m slowly getting too large of a health pool for you to not give me the big potions.
Well, I looked a little closer at the screenshots, and yeah, fuck me, the game doesn’t even try to hide how formulaic it is. Health potions are literally just PNGs with a number attached, in small, medium, and big variants. There are like 10 different PNGs of armor. And you’ll frequently have just one or two enemy types copy-pasted all over an area.
I guess that’s why people call it a bad Diablo knock-off. But having been a kid without expectations when I played it, I remembered specifically that part as comparatively good, when it was objectively pretty bad…
When I was a kid, some piece of computer hardware came with some game demos. There was one called Taskmaker. It was not good graphically, but I really enjoyed it. It allowed access to three areas, I think. I played so much that I was able to beef up my character significantly. I was eventually given the full game. I played it so much. I tried bribing all of the NPCs to see what they’d say/do. There was a text box you could type spells into. The normal progression of the game didn’t really give you many of them, but bestowing stuff on NPCs was one way to learn some.
Anyway, I found an abandonware version of it a while back and installed it on an old Mac virtual desktop. It still holds much of the same magic for me. I don’t have time to bribe every NPC now, but I remember a lot, and Google helps me with the rest.
Games back then were also typically made by two dudes: one programmer and one artist. Heck, the original Doom was made by five dudes: two programmers, two artists, and one designer. I wonder what kind of NES games could have been made back then with AAA budgets like modern games have.
One of those programmers was John Carmack, who happens to be ridiculously talented, and actually revolutionized PC gaming multiple times throughout his career.
Games were definitely buggy and I honestly think people forget how much better the quality is nowadays.
I also think there is something to it just being the 90s or so and not having much choice. If you only have one game to play then of course you’re going to replay it to death. If I have a steam library of 1000 games then I’m much less likely to.
A lot of this is just nostalgia for the past and the environment as opposed to games being any better.
I mean, technically games are better now and can easily be patched, but I think that’s exactly why games had better gameplay in the past: to make up for players having no access to patches.
You’re saying that because games couldn’t be patched, they had better gameplay? That makes no sense at all.
Lots of games had crap gameplay. There are more junk vintage games than good ones. The gameplay was simple because it had to be. The consoles didn’t have the power to do more. Chips were expensive. So they had to invent simple gameplay that could fit in 4k of ROM. If dirt simple gameplay is your thing, great. The Atari joystick had one stinking button for crying out loud.
You think Space Invaders has better gameplay than Sky Force Reloaded? Or Strider has better gameplay than Hollow Knight? You’re insane.
E.T. for the 2600 had gameplay so bad it crashed the entire video game industry.
Double Dragon on NES had a jump that was impossible to make forcing the company to make a new cart and give refunds.
I might be misremembering what game it was. I was just a kid when I learned about it. I can’t seem to find anything about it other than an impossible jump in the PC port of TMNT.
It’s a nostalgia thing - I don’t remember the games where I got stuck on the first level and could never finish the game (which happened). Or were just boring so I quit after a half hour.
I do remember Donkey Kong Country, Super Mario Bros., Sonic, etc., which all worked well and were fun.
Yeah, quality has improved massively. Maybe not in the initial release, but 90% of games I recently played were regarded as buggy messes on release. After years of updates they mostly work.
I’m unfamiliar with that game. Was World Games buggy or just bad? The quality the OP referred to was bugs, not gameplay.
Even the worst AAA game today has better game play than anything from 30 years ago. It’s the nature of extreme complexity that allowing players freedom makes complete debugging impossible.
Actually, OP very explicitly said to ignore bugs and was only talking about gameplay. Which is why they talk about extreme replayability being the requirement on old games.
I just realized you were talking about who I responded to, not OP. But still, they weren’t only talking about bugginess.
The basic mechanics of a game (e.g. Mario) better be fun, and those first couple of levels better be fun, because that’s what you’ll be doing a lot. It’s similar to how the swinging in Spider-Man better be fun, because you’ll be doing it a lot. But it also has more complex fighting, side content, and a story. You can mess up a lot more while there’s still enough to keep it entertaining.
But people don’t remember the majority of games that were not very good. World Games was just a game that came to mind on the NES as being not very fun, but more importantly forgotten.
Hehe. World Games was an Olympic-event type of game for the NES and other systems back in the late ’80s.
It was actually a well-reviewed and enjoyed game, so I’m not sure why he decided to use it as an example when there were so many other actually bad games back then. It also spawned a “spoof” game on the NES called Caveman Games, which had a similar game style but was set in caveman times with caveman events. I preferred Caveman Games as a kid, and still do. Racing against a friend on who can rub sticks together and blow on the smoke to make fire first is still a blast. So is beating the other guy with a caveman club. Good times.
World Games was so good they made a spoof sequel of sorts called Caveman Games. A lot of people remember World Games; it was a well-received game. You had so many actually forgettable garbage games to choose from…
I have never heard anyone talk about that game, ever. But I remembered hating it as a kid. But social media wasn’t a thing back then. So I don’t know if it was talked about elsewhere.
If that was a well received game, I guess it speaks volumes about the rest of the NES library.
It’s because it wasn’t really a young kids’ game. It was aimed at a bit older of a crowd. They made a later version of it called Caveman Games that was geared more towards kids, and it was a lot of fun, with mostly the same game mechanics.
What games were buggy for you? I’ve been replaying a lot of older games I used to play from my childhood (SNES to Xbox 360/PS3/Wii era) and not coming up with a lot of bugs except from emulation.
They weren’t as buggy. People making excuses classify exploits as bugs, ignoring that modern games have more of both bugs and exploits.
I played Atari 2600 games like space invaders, adventure, and pitfall for thousands of hours without ever running into a bug. The only game with an exploit was Combat where you could put your tank muzzle into a corner and make it loop across the map. But both players could do it.
I grew up with a PS1 and a handful of PC games, and I don’t remember any of them being more bugged than modern games. The only exception was Digimon World 1, a notoriously buggy game (though to be fair, half of those bugs were introduced by the inept translation team).
I know people nowadays know and use a bunch of glitches for speedruns and challenge runs (out-of-bounds glitches being the norm for such runs), but rarely, if ever, could those glitches be triggered by playing through the game normally, to the point that I don’t remember finding any game-breaking bug in any of the games I played in my childhood (barring the aforementioned Digimon World).
A couple years back I found my old Game Boy Advance. I tried to play Kirby on it and I was taken aback by how much it sucked. The screen was way smaller than I remembered, and there was no backlight, which meant I had to play in a well-lit room. I don’t think I could ever go back to those days.
Nah, in the ’80s we had hundreds, probably thousands, of games for the Commodore 64 and later the Amiga 500, all of them pirated. The piracy scene was huge, and the games were often free as we just copied them from friends.
I think it’s because people only remember the good games and not the stinkers.
I played a lot of shit games I can’t recall because I played for 30 minutes max. There was one game I never passed the first level as I couldn’t figure out what to do, I think something to do with jelly beans and a blob. How is that good gameplay lol?
But of course myself and others can tell you about the games we played for hours like Super Mario Bros which didn’t really have bugs and were good.
A Boy and His Blob! That was a great game! But it did not hold your hand at all; you had to figure out what every different jelly bean did to your blob. It was a good enough game that there was a modern remake; I think it’s on the Nintendo Virtual Console.
But yeah, that was a legitimately hard game for a kid. And mind you, it wasn’t buggy; the gameplay was just different from anything else people were familiar with, and it didn’t explain itself.
The difference is back in the day the great games were the highly advertised “big ones” and the “stinkers” usually fell flat. Now you have a mountain of AAA stinkers and have to go scavenging for indie gems.
Not sure that’s right - before the internet I had no clue what was supposedly good or not. I’d rent games from Blockbuster and just try them one by one. Lots of shitty games, and I had no idea that Mario or Sonic or anything was meant to be good.
Now it’s a lot easier just based on metacritic or steam reviews to figure out if something is good or not.
Well yes, maybe going that far back it was kind of a shot in the dark, but the late ’90s to early ’10s were a period where you had the internet (or at least TV/magazines) to know which games were “popular”, most of those were actually well done, and you’d rarely have an AAA title launch as a bug-ridden mess.
Reviews are also hit-or-miss because they’re highly subjective. The Steam review system sucks as well, being only positive/negative and with troll reviews always at the top.
I also think there is something to it just being the 90s or so and not having much choice.
Absolutely. I enjoyed and played a lot of King of Dragon Pass back in the day. Yesterday I sat down to finally play its spiritual successor, Six Ages: Ride Like the Wind. From what I remember of KoDP, it plays exactly the same (at least during the first hour). Yet I couldn’t force myself to keep playing it. In the same way, nowadays I can’t seem to get hooked by genres I used to play a ton as a kid: RTS games like Age of Empires II and Warcraft 3, life sims like The Sims, point & click graphic adventures like Monkey Island, traditional roguelikes, city builders, etc. Other genres I try to get back into, and I do manage to play a ton of hours, but I’m never able to finish like when I was young (e.g. JRPGs).
When I try to play many of those games I tend to feel kinda impatient and wanting to use my limited time to play something else that I feel I might enjoy better. A good modern 4X game with lots of mod support like Stellaris or Civ6 instead of RTS games which have always felt a bit clunky to me. Short narrative games like Citizen Sleeper or Roadwarden instead of longer ones I’m not able to finish. Any addictive modern roguelite, especially if it features mechanics I particularly like (like deckbuilding and turn-based combat). If I ever feel interested to play a life sim or a city builder nowadays it has to feature more RPG elements and/or iterative elements and/or deckbuilding and a very compelling setting to me. And so on.
It feels like many of the newer genres (or the updated versions of old genres) are just more polished and fine-tuned than genres that used to be popular in the 90s and the 2000s. They just feel better to play. And to be fair in some cases they might be engineered to be more addicting, too. Like, I did finish Thimbleweed Park some years ago but I feel like nowadays no one is going to play witty point & click graphic adventure games with obscure puzzles if they can play a nice-looking adventure game filled with gacha waifus.
Genuine question: why does Bedrock exist? What does it bring? Why is there a choice between Java, Bedrock, and “Minecraft for Windows”?
How do you fuck up this badly?
I tried using the launcher to move a Java install from C: to another drive, and it just points there and doesn’t do anything? Steam had this stuff figured out years ago.
Minecraft rewritten for better performance, with platform interoperability in mind, and so on. Essentially what could’ve or should’ve been a replacement for Minecraft if done right. It was not done right. Quite the opposite.
What’s that got to do with making things cross-platform? Java programs only need a Java runtime environment, of which there’s one for basically everything. If you make something that runs in a JRE, it’ll be able to run on any device with a JRE that’s up to date for it.
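For what it's worth, that claim is easy to demonstrate with a trivial sketch: the same compiled bytecode runs unmodified on any compliant JRE, and only finds out what the host platform is at runtime (the class name here is just an illustrative placeholder).

```java
// Minimal sketch: compile once, run anywhere a JRE exists.
// The OS and JVM version are only discovered at runtime via standard
// system properties; the bytecode itself is platform-independent.
public class PortabilityDemo {
    public static void main(String[] args) {
        String os = System.getProperty("os.name");
        String vm = System.getProperty("java.version");
        System.out.println("Same bytecode, running on " + os + " with Java " + vm);
    }
}
```

The same `.class` file prints a different OS name on Windows, macOS, or Linux without being recompiled, which is the "write once, run anywhere" pitch in miniature.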
Given how many targets are supported by LLVM, there’s really little difference in cross-platform support aside from building artifacts for the specific target platform. Wrapping package delivery in a package manager removes the additional complexity for the end user.
Oh yeah that totally explains why it’s always been perfectly fine for me as long as I’m not looking at a giant wall of those shelves that display their contents from whatever modpack that was.
I wasn’t strictly talking about cross platform. I was talking about performance, which is tangential to the cross platform thing.
If you’re planning on making a game cross platform, you should choose a language that performs well for gaming on all platforms. Java ain’t that. Which answers your question:
In what world is C++ better for cross-platform than Java?
Because Bedrock runs on phones, tablets, consoles, and a host of other random crap, and does so relatively well. Because of that the install base and playtime especially among younger players is actually massively skewed toward Bedrock being the more used. Add to that rumors that the Java codebase at least was a terrible mess, and the performance issues Java edition still has to this day and it’s no wonder they wanted to do a full rewrite, especially after having to make things like the console editions and even one for the 3DS.
There’s also the fact that Bedrock patches bugs that the Java community freaks out about patching. Several chunk update glitches and undesirable redstone behavior are exploited by the Java players, and they go nuts over the idea of fixing the issues. Bedrock, being a new codebase, obviously didn’t port over old crusty bugs and therefore doesn’t have to carry over those expectations.
To be fair, I’d call it a wash. Bedrock fixes a lot of weird stuff, like quasi-connectivity and being able to push things like chests with pistons, but it also introduces its own bugs, like weird timing issues and randomly taking fall damage. There are also odd differences, like what you can do with cauldrons, or just minor texture differences that they are slowly bringing into sync.
Yes, exactly. Java runs on Windows, macOS, Linux, and any operating system that supports the Java runtime environment. Minecraft Bedrock removes support for all of those but Windows.
Go into desktop mode, there’s a bedrock launcher in the package manager store thing, I forgot the names of both of those things, but search “Minecraft” and you should find it. Anyway, it basically loads the Android version of the game. It works pretty well. I play bedrock because everyone I play with is on Xbox
Also, it swapped performance issues for more bugs that were there years ago and still persist because they are almost impossible to fix.
In the end, we all know it was done either because they wanted to shove microtransactions down our throat and/or had some kind of deal to maintain the Java edition without microtransactions.
Or, maybe, just maybe, they thought “it’s just a block game, how hard could it be to rewrite?” and absolutely failed at what a single person got right almost immediately back in the day (like not falling through the floor all the time).
I haven’t modded as of yet, but I started off with Java and am now a C# dev, and the transition wasn’t too hard since most of the same principles apply to both languages. Unity games, which are often written in C#, are to me the most moddable ones, especially considering that there’s an ecosystem for Unity mods out of the box.
Minecraft. Runescape. Mindustry. Slay the Spire. Project Zomboid. Doodle Jump. Shattered Pixel Dungeon. Delver. Lots of mobile games. Also, it’s ridiculous to say Java is inappropriate for games when C# is used for games via Unity (Unity is the value proposition there; C# is very similar to Java).
Only if you’re incompetent. Otherwise just not optimal.
Starsector, Rise to Ruins, and Project Zomboid run well and are made in Java, for example. It’s harder to pull off, but it can be done. (They still need native libraries, though.)
Is there even a choice? You now get both games when you buy one (and you get the other game for free if you already own one), and you can play on Java servers from Bedrock with GeyserMC.
I accidentally bought Minecraft for Windows for someone when I wanted Java… so that was fun.
The launcher just says “here are the options”; it doesn’t say why you might want to choose one over another. I’ve played since day 0, so I’ve always gone with Java.
Just felt like if Bedrock was meant to replace Java, they should have just done it and dealt with it, instead of having so many choices.
Honestly? Most large companies are more like high school. It’s all friend groups, people rubbing each other’s backs, and in-fighting between departments.
A lot of VERY LARGE decisions get made for the stupidest reason.
Yeah, back in the day I dreamt about a Minecraft that didn’t run on Java and thus ran better on the low-end hardware I had. In my dreams it still had all the benefits of the Java edition, which is why I now dream of old Java Minecraft.
Yeah, the infuriating part is not the mere existence of Bedrock, but the fact that they purposefully made it suck. It could have been much better than the Java version if they’d done it right.
So just fuck everyone who doesn’t play on PC? There are aspects of Bedrock that should be gotten rid of, but its existence is what makes cross-play possible. I play both versions, but Bedrock made making a cross-platform server for my friends and me (who all play on different consoles) possible.
I will not touch Bedrock edition,
especially since it requires you to sign in to Windows with a M$ account, while my Windows KVM is Ameliorated, which strips the ability to do so; nor would I want to if I could.
You don’t need to sign in; you can just sideload the appx package (it’s likely to fail due to license verification, but there are ways around it, like stopping the licensing service).
The mods are shit too. I don’t know what their API is like, but it’s clearly not good if, with this entire legacy of modded Minecraft to build on, a game that is (presumably) way better programmed, and people being actively paid to do it, they can barely accomplish a tenth of the quality.
Even if they were good, you’d have to interact with that horse-shit mobile-game premium currency model (which absolutely should be made illegal), where you have to buy currency in packs, with bigger packs having a discount, and in sizes that never match a single purchase. Having to pay for mods is contentious enough as is, but putting them behind abusive MTX is going to be a deal breaker for the rest.
We’ll see if the trend holds in a month, a year, and a decade. I think the flaws holding it back will prevent any growth; charging money for mods is radioactive to the community, as Railcraft proved before they were forced by law not to paywall their updates.
Best case scenario, we get the Google Play Store, where people don’t make stuff because they want to, but because they want to make money; but like I pointed out, that MTX scheme is absolutely going to result in bad and confusing payouts, which will drive away even those people. If it turns out they’re paid in scrip- I mean, Minecoins, then at best you’re getting a bunch of kids who don’t understand labour exploitation yet.
EDIT: I looked into it and it’s mostly just kids who don’t understand how exploitative the whole thing is. The API is also extremely lacking.