Not to mention the mob of henchmen. I just slogged through a small country’s worth of peons meant to serve as a thousand papercuts, whittling my health and resources down… Then this boss comes walking in popping healing kits/potions/whatever like candy.
Traditional roguelikes tend to give NPCs/enemies stats comparable to the player's. The benefit the player has is a “better AI”, i.e. an intelligent human behind them.
It is my opinion that we reached peak graphics 6 or 7 years ago, when the GTX 1080 was king. Why?
Games from that era look gorgeous (e.g. Shadow of the Tomb Raider), yet were well optimized to run high/ultra at FHD on an RX 570.
We didn't need to rely on fakery like DLSS and frame generation to get playable frame rates. If anything, people used to supersample for the ultimate picture quality. Even upping the rendering scale to 1.25 made everything so crisp.
MSAA and SMAA antialiasing look better, but somehow even TAA from that era doesn't seem as blurry as today's. At this point you might as well use FXAA.
Graphics today seem ass-backward to me: render at 60-70% scale to get good framerates, FX are often rendered at even lower resolution, slap on overly blurry TAA to hide the jaggies, then use some upsample trickery to get to the native resolution. And it's still blurry, so squirt some sharpening and noise on top to create an illusion of detail. And it still runs like crap, so throw in frame interpolation for the illusion of a higher frame rate.
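To make those steps concrete, here's a toy sketch of the "render low, upscale, sharpen" part of that flow in plain Python/NumPy. It's purely illustrative: the "frame" is just a noise array, the upscale is nearest-neighbour rather than any real DLSS/FSR-style reconstruction, and frame interpolation is left out.

```python
# Toy sketch of the pipeline described above; not any real engine's code.
import numpy as np

def render(res):
    # Stand-in for rasterizing a frame at the given (height, width).
    h, w = res
    return np.random.rand(h, w)

def upscale(frame, res):
    # Crude nearest-neighbour upscale standing in for upsample trickery.
    h, w = res
    ys = np.arange(h) * frame.shape[0] // h
    xs = np.arange(w) * frame.shape[1] // w
    return frame[np.ix_(ys, xs)]

def sharpen(frame, amount=0.5):
    # Unsharp mask: the "squirt some sharpening on top" step.
    blurred = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
               + np.roll(frame, 1, 1) + np.roll(frame, -1, 1)) / 4
    return frame + amount * (frame - blurred)

native = (2160, 3840)                              # 4k output
internal = (int(2160 * 0.65), int(3840 * 0.65))    # ~65% render scale
frame = sharpen(upscale(render(internal), native))
print(frame.shape)  # (2160, 3840): native-res output, sub-native detail
```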
I think it's high time we should be able to run non-raytracing graphics at 4k native and raytracing at 2.5k native on 500€ MSRP GPUs with no trickery involved.
GPUs are getting better, but demand from the crypto and ML/AI markets means vendors can simply price every new card higher than the last, so prices have stopped dropping with each new generation.
We didn’t need to rely on fakery like DLSS and frame generation to get playable frame rates.
If you truly believe what you wrote, then you should never look into the details of how a game world is rendered. It's fakery stacked upon fakery that somehow looks great. If anything, the current move to ray tracing with upscaling is less fakery than what came before.
There's a saying in computer graphics: if it looks right, it is right. Meaning you shouldn't worry if the technique makes a mockery of how light actually works, as long as the viewer won't notice.
But there's a stark difference between optimization like culling, occlusion planes, LODs, half-res rendering of costly FX (like AO) and using a crutch like lowering the rendering resolution of the whole frame to try and make up for bad optimization or crap hardware. DLSS has its place for 150-200€ entry-level GPUs trying to drive a 2.5k monitor, not 700€ "midrange" cards.
But there's a stark difference between optimization like culling, occlusion planes, LODs, half-res rendering of costly FX (like AO) and using a crutch like lowering the rendering resolution of the whole frame to try and make up for bad optimization or crap hardware.
There is not a stark difference if you describe the techniques objectively instead of twisting them to match what you feel they're like.
There are so many steps in the render pipeline where native resolution isn't used. Yet I don't hear the crowd complaining about shadow map size or how reflections are half-res. Upscaling is just another tool that allows us to create better-looking frames at playable refresh rates. Compare Alan Wake or Avatar with DLSS against any other game without DLSS and they will still come out on top.
DLSS has its place for 150-200€ entry-level GPUs trying to drive a 2.5k monitor, not 700€ “midrange” cards.
Just because you’re unhappy with Nvidia’s pricing strategy doesn’t mean you should slander new render techniques. You’re mixing two different topics.
For those that are unaware, the second chad is most likely referring to .kkrieger. Not a full game, but a demo (from the demoscene) whose purpose was to make a fully playable game within a max size of 96 KB. Even going very slowly, you won't need more than 5 minutes to finish it.
The startup is very CPU-heavy and takes a while, even on modern systems, because it generates all the geometry, textures, lighting and whatnot procedurally from compact stored routines.
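To give a feel for why that front-loads so much CPU work, here's a minimal value-noise texture generator in plain Python. This is my own toy illustration of the procedural-asset idea, not .kkrieger's actual code: a few lines of generator stand in for kilobytes of stored image data, at the cost of computing every texel at startup.

```python
import random

def value_noise_texture(size, cell, seed=42):
    # Random values on a coarse grid, bilinearly interpolated per texel.
    rng = random.Random(seed)
    n = size // cell + 2
    grid = [[rng.random() for _ in range(n)] for _ in range(n)]
    tex = []
    for y in range(size):
        gy, fy = divmod(y, cell)
        ty = fy / cell
        row = []
        for x in range(size):
            gx, fx = divmod(x, cell)
            tx = fx / cell
            top = grid[gy][gx] * (1 - tx) + grid[gy][gx + 1] * tx
            bot = grid[gy + 1][gx] * (1 - tx) + grid[gy + 1][gx + 1] * tx
            row.append(top * (1 - ty) + bot * ty)
        tex.append(row)
    return tex

# The generator is ~20 lines of code; the equivalent raw 256x256 greyscale
# image would be 64 KB, two thirds of .kkrieger's entire budget.
tex = value_noise_texture(256, 32)
print(len(tex), len(tex[0]))  # 256 256
```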
Yes, and this isn't necessary, because the two sides are completely identical. There are no differences in pieces or terrain or anything, so there's no need to change a piece to make it stronger or weaker.
Actually, it has had balance changes. The chess clock, for instance, is a balance update between the players, but there has also been balancing between pieces: en passant and castling, but also changes to how the pieces work (the bishop, for example).
Despite the obvious symmetry of the game there’s still a lot to balance.
You are quite correct that an asymmetrical game is much harder to balance.
However having identical sides and a symmetric playing field doesn’t always guarantee a balanced game. For example, if one piece or position dominates all others it can lead to a lack of viable options and just one way to play, making the game uninteresting. You don’t just want the players to have equal strength, you also want the universe of possible playing strategies to contain many different strong options.
I used to have a subscription to Game Informer magazine. I very specifically remember the multi-page preview for the upcoming game, Oblivion. The pictures they had in there, I swear to God, were actually pictures of trees and grass. The fidelity was unparalleled and it was the peak of what games could do. Idk why that article sticks out so much, but it felt like the top of the mountain.
Hah, I get that, but for me it was Half-Life 1, and I thought the graphics were amazing. Rainbow Six Rogue Spear was my first PC game and I thought that was the pinnacle of graphics… fuck I'm old.
For me it was reading in PlayStation Magazine that there were melting ice cubes in the then-upcoming Metal Gear Solid 2. I'm not even sure the PS2 had been released yet at the time, so I was just awestruck, thinking wow, it's getting so powerful and detailed that even ice cubes in a sink are accounted for.
Man, for me it was playing Halo CE on the original Xbox, you could see the individual blades of grass on the ground texture! I was absolutely blown away haha
I remember upgrading to a 3dfx Voodoo card around the time transparent water became possible in Quake. The graphics blew me away, and the ability to see players in the water gave a ridiculous advantage.
I remember the graphics “blue” me away too - I mean that lovingly: the colours looked much cooler compared to on the Riva TNT (actually this is my memory of Quake 2, particularly Q2DM1).
I can relate, but by the time Oblivion came out I was already starting to get jaded about graphical fidelity. What I can tell you is that I ogled over a similar preview for Morrowind, and actually built my first PC specifically targeting the recommended specs to run it in all its glorious glory!
Christmas of probably 98 or 99, my older brother gave my younger brother and me his PlayStation. He had Final Fantasy VII, and that was probably when I popped my graphics cherry. I was astounded when I went back to play it years later.
Tested 5 clients on my PC, 3 times each. Times were more or less consistent on each run; the biggest variation seemed to be Uplay's.
Setup:
- You are already logged in and there are no pending updates.
- The client is terminated after each run (I did not see a significant time difference between repeated runs and the first run after logging in).
- The logged-in Windows account has admin rights, so no time is wasted entering a password (EGS and Uplay require admin rights to launch).
- Timing stops once the launcher is usable.
EGS: 8-10 seconds
Steam: 20 seconds
GOG: 11 seconds
Uplay: 20-24 seconds
Heroic: 5 seconds
System: Ryzen 2600 with Samsung 970 EVO (2400 MB/s R/W as per Samsung Magician benchmark)
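For anyone wanting to reproduce this, here's a minimal sketch of the stopwatch method described above, assuming you press Enter the moment the launcher becomes usable. The Steam path is just an example; point it at whichever client you're testing.

```python
import subprocess
import time

# Example path; substitute the launcher you want to measure.
LAUNCHER = r"C:\Program Files (x86)\Steam\steam.exe"

start = time.perf_counter()
subprocess.Popen([LAUNCHER])  # start the client without waiting for it to exit
input("Press Enter the moment the launcher is usable... ")
print(f"Startup took {time.perf_counter() - start:.1f} s")
```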
Really exposing the Valve fanboyism in the room. Steam genuinely takes quite a long time to start. People pretend it's quick because they leave it running, while they cold-boot Epic/alternatives and then complain.
Y'know… even going into 3D, the “big” cities haven't felt as actually big as Castelia did. I think the number of places you could go into and the secrets you could find in it really helped.
My brother or sister in pixels, this is not the same. I'm not a graphics snob. I still play pixelated, barely discernible nonsense games. When I upgraded from 30 to 144, it was a whole new world. Now even 60 can feel sluggish. This is not a graphical fidelity argument. It's input and response time and motion perception. Open your mind, man. Accept the frames.
And that matters for certain games, a lot. But it doesn’t functionally matter at all for others. Same as the transition to polygons. My point, which I thought I stated clearly, was not “FPS BAD!!”, it was “FPS generally good, but stop acting like it’s the single most important factor in modern gaming.”
Simply put, if everything was 144fps then it would be easier on the eyes and motions would feel more natural. Even if it’s just navigating menus in a pixel style game.
Real life has infinite frames per second. In a world where high fps gaming becomes the norm, a low 24 fps game could be a great art style and win awards for its ‘bold art direction’.
That article states people can perceive images as rapidly as once every 13 milliseconds, which they math out to 75 fps (1000 ms / 13 ms ≈ 77), 25% higher than 60.
Looking at the study itself, they were testing whether participants could pick out a picture that displayed for 13-80 ms when “masked” by other brief pictures, with a focus on whether it made a difference if the participant was told what image they were looking for before or after seeing the images. What they found was that participants could pick out the image as low as the 13 ms mark (albeit with less accuracy) and could generally do so better if told what to look for beforehand.
What this tells me is that your source has nothing to say about anything over 75 fps. It was also testing in a fundamentally different environment than a video game, where your brain constantly expects an image similar to, and stemming from, the image before it, rather than a completely different one. If you were to draw conclusions from the study despite the differences, it would suggest that knowing what to look for, as your brain does when gaming, makes you better able to pick out individual frames. This leads me to think that your source does not support your assertion, and that in a game you could perceive frame rates higher than 75 fps at a minimum.
From my own knowledge, there's also a fundamental difference between perceiving reality and computer screens, in the form of motion blur. Objects moving in real life leave a faint blur behind that your brain uses to fill in any blanks it may have missed, making reality appear smoother than it is. For an example of this, wobble a pencil back and forth to make it “bend.” Movies filmed at 24 fps capture this minute motion blur as they film, which makes it easier for our brains to watch them despite the lower frame rate. Real-time rendered video games do not have this effect, as there are no after-images to fill in the blanks (unless you turn on motion blur, which doesn't do a good job of emulating this).
This means video games need to compensate, and the best way to do that is more frames per second so your brain doesn’t need to fill in the blanks with the motion blur it’s used to seeing in the real world. You’ll obviously get diminishing returns from the same increase, but there will still be returns.
That's more or less the placebo effect at work, though. Most people cannot see “faster” than 60 FPS; the only actual upside of running a higher FPS rate is that you don't drop below 60 if the game starts to lag for whatever reason. Now, you may be one of the few who actually perceive changes better than normal, but for the vast majority, it's more or less just placebo.
That’s just wrong. I couldn’t go back to my 60Hz phone after getting a 120Hz new one. It’s far from placebo, and saying otherwise is demonstrably false.
Yeah, as much as I can give a shit about ray tracing or better shadows or whatever, as a budget gamer, frame rate is really fucking me up. I have a very low-end PC, so 60 is basically max. Moving back to 30 on the PS4 honestly feels like I'm playing PS2. I had the [mis]fortune of hanging out at a friend's house and playing his PC rig with a 40-series card, 240Hz monitor, etc, and suffice it to say it took a few days before I could get back to playing on my shit without everything feeling broken.
it depends on if it's a good 30 or not. if inputs are quick and responsive, and the framerate stays at 30, then it's fine. but if my device is struggling to run the game and it's stuttering and unresponsive then it's awful
sm64 comes to mind as the best 30fps experience i've had, and i am spoiled rotten on high refresh rate games
One of the most insufferable aspects of video game culture (PC gaming in particular), other than the relentless toxic masculinity from insecure nerds, is the obsessive focus on having powerful hardware and shitting on people who think they are getting a good experience when they don't have good hardware.
The point is to own a computer that other people don't have, so you can play a game and get an experience other people don't get; the point isn't to celebrate a diversity of gaming experiences and value accessibility for those without the money for a nice computer. It really doesn't matter whether these people intend this consciously or not; this is a story as old as time. It is the exact same bullshit as guitar people who only think special exotic or vintage guitars are beautiful: they claim to absolutely love guitar, but never once in their lives have they stopped to think about how much more beautiful it is that any random chump can get an objectively wonderful-sounding guitar for a couple of hundred dollars than it is that they own some stupid special-edition guitar with a magic paint job that cost as much as my shitty car.
Good thing these people don't fully dictate the flow of all video game development, but they will never, ever learn, because this is the kind of pattern that arises not from conscious intention but from people uninterested in critically examining their own motivations.
It is the same damn nauseating thing with photography too…
That's a leap. EA sucks. There are plenty of games out there, and there are plenty of ways of managing them better than Steam. Especially if you know your way around a PC.
It'd be another big player; it's not going to be some indie startup that suddenly breaks out into the light to dazzle everyone. And all the other players in this space have their own, worse, storefronts and launchers…
Just imagine Uplay, but with Steam/Valve's loyal userbase, and therefore everyone else selling their games on it. shudders
I mean, who cares? Like, having 15 launchers is not a real thing for most gamers I know. It's all saved on a cloud server under your profile. Just uninstall after you're done with a game.
I can second the hijacking thing. Steam installs its own controller input driver (?) which sometimes clashes with the actual drivers for the controller, which can lead to games registering input twice or not at all, etc.
You can disable it but sometimes it randomly seems to re-enable itself and it’s super annoying.
On the other hand, though, Steam Input is really powerful for remapping inputs and setting up controller maps to use with keyboard-and-mouse games. I've never had trouble with it, apart from an old handheld PC that registers its built-in controller as an Xbox 360 controller instead of an Xbox One controller.
That’s totally fine but it should be opt in, or just prompt you when you launch the game for the first time. It doesn’t ask and then actively breaks things for some users.
Yeah, it could definitely be more interactive, but then it would annoy users with the entirely opposite problem. I can see the complaints now in my head: “Steam prompts me about Steam Input before every single game I play! I get it, Valve” or similar (and you and I both know the person would be ignoring some “don't remind me” option).
You don't even need one for GOG - GOG Galaxy is totally optional, and you can download absolutely run-of-the-mill offline installers for all games on GOG, which do not require the client or any network access, and you even keep them forever and ever (so all your old games are still installable and run on an old gaming computer with the old OS, so long as the hardware hasn't died).
I get where you’re coming from, but ads don’t just want people talking, they want to turn people into buyers.
After the success of the PS2, virtually their entire market knew about the PS3. But even with those weird ads and their previous massive success with the PS2, the PS3 was selling terribly around the time of the baby ad, IIRC.
I think I flipped through the manual of The Legend of Zelda on the NES multiple times. The artwork in that thing is so great, and all the lore and information in there was awesome.