I always assumed that NPCs represented more than one actual citizen, because otherwise the world would become far too cluttered, and the system requirements far too high, to manage literally thousands of NPCs that exist for no reason.
Yeah, from a historical perspective it makes sense to cut down. Maybe not really as of Skyrim and later games, though. GTA games, including ones almost a decade older than Skyrim, manage to have a reasonably high "population" of NPCs.
GTA and Skyrim are very different games and therefore need different approaches. GTA has some key characters who drive the story. The other NPCs are 'set dressing': fairly simple, nothing interesting to say, no home. They are just spawned in where needed. Skyrim has a different approach. Except for soldiers/guards, pretty much every NPC has a 'life': they have daily routines, unique voice lines, and a place to sleep that they go to every night, unless they are on a quest themselves.
GTA and Skyrim go for a different feeling, which is why they need different solutions.
You're aware nearly every NPC has a name, I wager? The ones who don't are typically guards or bandits; half the NPCs in any given tavern have names but don't have quests associated with them.
I recall ages ago having read a theory about this concept of compression: that most game worlds we see aren't literal, but rather compressions of the world the characters experience. A city that we see might have just 5 streets, but that's just the city being compressed to a manageable size; for what the characters experience, there'd be hundreds of streets. And the same goes for NPCs, as you put it. We mostly only see the important NPCs and a small sample of others, but there are many more NPCs that are really there for storytelling purposes; they just aren't shown.
It’s a really good technique if pulled off well. After all, it’s really hard to have cities in game. You have to do something to limit it. Either padding it out, making most of it unvisitable, “compressing” it, or… just not having cities. Every option has downsides, but at least the compression approach optimizes for gameplay and your time.
Starfield proves that with New Atlantis, as an example. It's annoyingly huge, pointlessly spread out (for gameplay purposes; obviously a capital city is going to be huge lore-wise/realistically), and all-around irritating to find stuff in.
We want cities that feel big and vast, while being manageable and navigable for players playing a game.
We want cities that feel full of life and bristling with NPCs, without actually having so many NPCs that you'd need a Cray supercomputer to process it all.
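To make that concrete, here's a tiny made-up sketch of the kind of simulation LOD that makes this work: only NPCs near the player get a full update, everyone else just snaps along a coarse daily schedule. None of the names come from any actual engine, it's just to show the shape of the idea.

```python
# Hypothetical sketch: "compress" the simulation, not just the city layout.
# NPCs close to the player get a full update; distant ones only follow a
# coarse hour-by-hour schedule, so a town can have hundreds of "lives"
# without hundreds of fully simulated agents.
from dataclasses import dataclass, field

FULL_SIM_RADIUS = 50.0  # metres; illustrative value

@dataclass
class NPC:
    name: str
    x: float
    y: float
    schedule: dict = field(default_factory=dict)  # hour -> activity, e.g. {8: "smithy", 22: "home"}
    activity: str = "idle"

    def full_tick(self, dt: float) -> None:
        # Pathfinding, animation, dialogue checks, etc. would live here.
        pass

    def coarse_tick(self, hour: int) -> None:
        # Off-screen: just jump to whatever the schedule says.
        self.activity = self.schedule.get(hour, self.activity)

def tick_world(npcs, player_x: float, player_y: float, hour: int, dt: float) -> None:
    for npc in npcs:
        near = (npc.x - player_x) ** 2 + (npc.y - player_y) ** 2 <= FULL_SIM_RADIUS ** 2
        if near:
            npc.full_tick(dt)
        else:
            npc.coarse_tick(hour)
```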
But how are we going to emulate proprietary online services for games relying on them?
Games preservation should be legally enshrined, and require client and server source code to be published if a provider decides to stop running the online services required to play.
Not that most modern multiplayer games are worth preserving due to their toxic design, but this isn’t a huge issue. BF2 servers started back up thanks to Russians loving the shit out of that game. Warcraft 3 is still very much playable online and NOT on battlenet thanks to W3Connect. Fightcade made 90s 2D fighters playable online. Numerous console emulators support netplay.
This is a fairly lofty and unrealistic goal. Unfortunately, the right of companies to keep their source code private isn't going to go away anytime soon, and even if they were legally compelled to release binaries, setting up a modern cloud-based online experience is not for the faint of heart.
A more realistic goal would be to say that all products should be usable offline (with exceptions for impossibilities like an instant messenger or something)
If the online servers don't exist anymore, there should be a path to functionality without them. For everything, especially given the rise of IoT. If there's a path to functionality without the online service, there's a path to preserving the game.
Private servers don't really happen much, or at all, anymore; the "here's an .exe you can run" idea doesn't scale well to modern online infrastructure.
Emulation is typically very difficult, often requiring cracking the original game to get it to work with unofficial servers, and also mapping and building out all the online subsystems. It's rare.
There are lots of examples of online services being reverse-engineered (REd) to bring old games back to life, but doing it after the service has been killed off is A LOT of work.
If more people captured the network traffic of these services before they're killed, it would probably make REing the service much easier later.
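Even something as simple as this would give future RE efforts something to work from (just a sketch, assuming scapy is installed; the hostname is a placeholder, and you need root/admin to sniff):

```python
# Capture a game's traffic to a pcap while the official service still exists,
# so the protocol can be studied later. Hostname below is made up.
from scapy.all import sniff, wrpcap

packets = sniff(filter="host login.example-game.com", count=500)
wrpcap("game_login_traffic.pcap", packets)
```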
Well, if your game is popular enough, someone may start a revival project or create these custom servers.
Back in the late 2000s I remember my brother playing WoW on a private server (unaffiliated with Blizzard); these unofficial servers were mostly popular for MMO games back then.
Nowadays you have something like OpenSpy, which emulates the GameSpy servers and is run by communities. It all depends on how deep you want to venture into each game.
What you can't preserve is the joy of a period-correct experience :)
I find the video from LTT with the 96-core Threadripper kinda hilarious: breaking records in Cinebench, but Cities: Skylines 2 still runs like shit (in a 1 million pop city).
Because chances are the 7800X3D will be faster due to the cache.
Real-world applications can often only be parallelized so much before you start hitting diminishing returns, for many reasons. A lot of it is about the technical design as much as the technical execution (you can't parallelize two operations if one depends on the result of the other).
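That's basically Amdahl's law. A quick back-of-the-envelope (the 90% parallel fraction is just an illustrative guess, not a measurement of any real game):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the fraction of the work that can actually run in parallel.
def speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (8, 16, 64, 96):
    print(cores, round(speedup(0.9, cores), 2))
# 8 -> 4.71, 16 -> 6.4, 64 -> 8.77, 96 -> 9.14
```

If 10% of each frame is inherently serial, you never get past 10x no matter how many cores you throw at it, which is why a 96-core chip can lose to one with a fat cache.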
That's the point, I think. I haven't delved into the specifics too much, but the 7800X3D favors a big cache (at all levels, IIRC) over core/thread count. So it should fare better with games that aren't very optimized for multithreading (i.e., most games).
Except you’re wrong. The controller is on the BACK of the head, you know, where you’re looking. So both x and y should be inverted. Anything else makes 0 sense
Well the analogy doesn't perfectly work with pad controllers.
It does work with flight sticks. Vertical controls pitch (up and down), horizontal controls roll (tilt left and right). You've got pedals/stick twist for yaw (turn right or left), or the hat/thumb stick for changing the view angle.
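Put into code, "inverted" really is just a sign flip in the mapping from stick to view; a rough sketch with made-up names:

```python
# Illustrative only: map stick deflection to camera/aircraft rotation.
def apply_look(stick_x: float, stick_y: float, yaw: float, pitch: float,
               sensitivity: float = 2.0, invert_y: bool = True):
    yaw += stick_x * sensitivity
    # Flight-stick convention: pushing forward (positive stick_y here)
    # pitches the nose down, i.e. "inverted" for FPS players.
    pitch += stick_y * sensitivity * (-1.0 if invert_y else 1.0)
    return yaw, pitch
```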
I got familiar with it from TimeSplitters 2, but that’s basically same devs.
Edit: not sure how the control schemes of the FPS sections in Banjo Tooie were, but I could have acquainted myself with it back then. I guess it’s similar due to being a Rare game.
Played a bunch of flight simulators and similar games back in the day. If it's universally considered the best way to steer a goddamn aircraft safely and accurately, who am I to argue.
I used to hate inverted as a kid, but a few games had them by default and my brain switched and could never go back. I’m playing Outer Wilds now and I had to immediately switch it.
I get it's a meme, but I usually play at 144 fps, and when I go back to 60 fps I literally don't notice a difference; even down to like 40-45 I barely see much difference. 30 is noticeable and a bit shit, but my eyes get used to it after like 30 mins, so not a big deal.
I think it takes significantly more mental extrapolation between frames, and general adjustment to your eyes not receiving frames at as quick a rate, but if the frame rate is fairly stable the human brain adapts.
The brain's visual processing is so powerful that the difference between 30 fps and 144 fps on paper is much smaller in reality, especially if your brain has already learned the muscle memory of "upscaling" a low framerate to work with its perception of a 3D environment.
Competitively, for games like arena shooters or Rocket League the frame rate is real, but for most games it's a matter of whether the smoothness happens on the physical monitor screen or at some level of mental image processing. What someone who has let the mental skill of processing a lower frame rate atrophy sees is a temporary sensation, like putting on colored glasses for a while, then taking them off and seeing everything washed out in a particular color. Weird, uncomfortable, but temporary.
The real problem is an inconsistent framerate, where the rhythm your brain has gotten used to, new visual information arriving with each frame, is slow enough to be on the edge of perception but keeps speeding up or slowing down chaotically. Your brain can't just train a static layer of mental image processing to smooth that out. Time almost feels like it's speeding up and slowing down a little, and it becomes emotionally discouraging when every time something fun happens the framerate dips and reality gets choppier.
Yeah no, a game I regularly play just had 120Hz support added, and I'm never changing that back. I once even tried editing the .ini config just to change the framerate after coming back to it from a 120Hz game. It's just night and day, both in the input lag and the smoothness of the image.
Someone already mentioned those graphics were optimized for old CRT TVs, but also consider the fact that it was simply the best we'd seen, and it blew our minds.
Just imagine what top notch realism will be 20 years from now, assuming it’s not all DLC for the same old stuff, obviously.
It's still happening; there's just so much of it now, and we're so much more aware of what the improvements actually are on a technical level, that we've come to expect it even before the latest thing releases. And it's mostly disappointing now because we're chasing that same high.
The changes are more incremental now too. It's slightly better textures here, better lighting there, maybe a studio puts extra effort into motion and animations. But it's not leaps and bounds better every generation anymore like it used to be.
There was that video going around a couple days ago comparing Arkham Knight to Suicide Squad and that’s a great example of graphics not getting noticeably better if a studio doesn’t really try for it.
But I’ll bet games that start coming out with the latest Unreal Engine, like Senua’s Saga, are going to give some of that feeling of amazement again.
Honestly, a good CRT shader is a real game changer for emulation. Many emulators can add a mesh grid over the top of the image, but that's just about the worst way to try to emulate a CRT: it doesn't actually emulate CRT pixels, and the black grid laid on top of everything simply reduces the overall image brightness.
For an example of a good CRT shader, look into CRT Royale. The benefit of a shader is that it runs each frame through a calculation before it reaches your screen, so it can emulate a CRT properly: the individual red/green/blue phosphors, the bloom around white text, the smearing that occurs across large color differences, etc. It really does make old games much more pleasant to look at.
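To give a feel for the difference, here's a crude CPU-side sketch of the kind of per-pixel work a CRT shader does each frame (this is not CRT Royale's actual code, just an illustration using numpy):

```python
# Rough illustration of a CRT-style filter: phosphor-ish sub-pixel mask,
# scanlines, and a tiny bit of bloom, applied to the whole frame instead of
# just drawing a black grid over it. Expects an RGB float image in [0, 1].
import numpy as np

def crt_filter(frame: np.ndarray) -> np.ndarray:
    h, w, _ = frame.shape
    out = frame.copy()

    # Phosphor pattern: each column favours one of R/G/B instead of being
    # uniformly darkened, so brightness survives better than with a grid.
    mask = np.zeros((1, w, 3))
    mask[0, 0::3, 0] = 1.0
    mask[0, 1::3, 1] = 1.0
    mask[0, 2::3, 2] = 1.0
    out = out * (0.7 + 0.3 * mask)

    # Scanlines: slightly darken every other row.
    out[1::2, :, :] *= 0.85

    # Very crude bloom: let bright pixels bleed into horizontal neighbours.
    bright = np.clip(out - 0.8, 0.0, None)
    out[:, :-1, :] += 0.5 * bright[:, 1:, :]
    out[:, 1:, :] += 0.5 * bright[:, :-1, :]

    return np.clip(out, 0.0, 1.0)
```

A real shader does this (and much more, like curvature and halation) on the GPU every frame, which is why it looks so much better than a static overlay.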
This is even earlier, the 80s, but I remember getting a not especially good game called The Halley Project for my Apple II, but I would load the game over and over again because the intro had a song with real vocals and guitar, something basically unheard of on an Apple II, or virtually any other computer at the time.
We hit diminishing returns a while ago. It will be much harder to find improvements, both in terms of techniques and computation.
Consider that there is ten years between Atari Pitfall and Wolfenstein 3D, ten years between that and Metroid Prime, and ten years between that and Mass Effect 3, and then about ten years between that and now. There’s definitely improvement between all those, but once past Metroid Prime, it becomes far less obvious.
We’ve hit the point where artistic style is more important than taking advantage of every clock cycle of the GPU.
Graphics 20 years from now will be incrementally better, but not mind-blowingly so. We’re rapidly approaching games that are 20 years old still looking pretty decent today.
I see stuff like this and I don't blame developers/coders for all the shit that's happening. If you look objectively at the gameplay and such, most games are actually pretty decent on their own. The graphics are usually really nice, the story is adequate if not outright good, and the controls are sensible and responsive…
A lot of the major complaints about modern games aren't necessarily about what the devs are making; they're more about what the garbage company demands be done as part of the whole thing. Online-only single player is entirely about control: keeping you from pirating the game (or at least trying to), plus spying on you and serving you ads and such… Bad releases happen because stuff gets pushed out the door before it's ready, because the company needs more numbers for their profit reports, so things that haven't been given enough time and need more work get pushed onto paying customers. Day-one patches are normal because between the time they seed the game to distributors like Valve and Microsoft and the time the game unlocks for launch day, stuff is still being actively worked on and fixed.
The large game studios have turned the whole thing into a meat grinder to just pump money out of their customers as much as possible and as often as possible, and they’ve basically ruined a lot of the simple expectations for game releases, like having a game that works and that performs adequately and doesn’t crash or need huge extras (like updates) to work on day 1…
Developers themselves aren’t the problem. Studios are the problem and they keep consolidating into a horrible mass of consumer hostile policies.
Sometimes a good game is just a good game regardless of age.
And graphics in particular are often overrated, as design is often more important than resolution. 4k isn’t going to help a game that’s bland and uninspired.
I always use Dark Souls as an example: zoom into the textures and it looks awful, but look at it as a whole and it looks incredible.
I use dedicated CPU cores and other tweaks on my setup to reduce game input latency. Steam is always the one fucking program that randomly starves my remaining cores for absolutely no reason.
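For anyone curious, the core-pinning part of that kind of setup can be as simple as this on Linux (core numbers are made up for illustration; serious setups also isolate cores at boot):

```python
# Pin this process to a fixed set of CPU cores so background apps scheduled
# on the other cores can't steal them. Linux-only; core IDs are illustrative.
import os

GAME_CORES = {2, 3, 4, 5}
os.sched_setaffinity(0, GAME_CORES)   # 0 means "the current process"
print(os.sched_getaffinity(0))
```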
It’s such a garbage-tier app and always has been. Credit to Valve for busting open online app stores, but I have no idea why people like Steam so much.
Absolutely hated it when I was forced to use it. Nowadays I don’t mind that much anymore. But if a game needs to launch another launcher first, that drives me crazy.
Looking at you, EA and Ubishit.
I got into steam because I could (using modded files) download valve games for free. It was like piracy but without the torrenting and gameboxart.jpg.exe shenanigans.
Then I liked Counter Strike, and portal was coming out… and now I have a few thousand bucks in games.
Still don’t like having to run a nanny program to be “allowed” to play the shit I paid for. But steam is the best of that garbage pile.
I got into it because I got Total War Napoleon (another annoying launcher with the Total War series nowadays…) and didn't realize beforehand that you needed a Steam account for it.
I came to appreciate the storefront and library for my purchased digital games. But as you said, I don't want to have to run a nanny program to play my games. Especially my single-player games.
That’s what I like about gog though, I can just download an offline installer for my games from them. Although I think by now even they have games on their store that require launchers.
That’s what I always find funny about the EGS hate crowd… They complain about the features it didn’t have at launch (that it now has) but in the end having to launch any launcher pisses me off, might as well use the one with the least garbage! No Valve, I don’t need fucking trading cards with my games!
Steam has a whole lot of useless features compared to Epic or GOG Galaxy, it’s the most bloated launcher available at the moment. Do you need cards and tradable items linked to your account to play your games?
What’s funny is that if the roles were reversed Epic would be accused of trying to monetize whales and gamblers, but no one bats an eye because it’s Valve doing it.
It’s clearly not part of this discussion because you can’t compare the user experience on Linux when the product you want to compare to doesn’t exist on it.
Also, you can just shut up if you can’t have a discussion without insulting the people you’re talking to, no one asked you to take part in the first place since clearly you’re trying to derail the discussion.
In a discussion where we’re comparing the user experience of two products we need to compare them in an environment where both products are working so the user can experience them both.
I don’t know how you can’t figure that out but here we are.
Does it? Would you say "that's one of the ever I had" about a steak sandwich, if it was one of the best? It seems like it needs a best/worst/mediocre sort of descriptive word in there to make sense. I mean, you can guess from the scene and tone, but I'd never write it out or even say it that way. Of course, I never did well in English and am glad to learn new things.
I read it correctly and then re-read because I felt like I must have missed the word and it would actually be there. Then I read it a third time to really make sure.