My brother or sister in pixels, this is not the same. I’m not a graphics snob. I still play pixelated, barely discernible nonsense games. When I upgraded from 30 to 144, it was a whole new world. Now even 60 can feel sluggish. This is not a graphical fidelity argument. It’s about input latency, response time, and motion perception. Open your mind, man. Accept the frames.
And that matters for certain games, a lot. But it doesn’t functionally matter at all for others. Same as the transition to polygons. My point, which I thought I stated clearly, was not “FPS BAD!!”, it was “FPS generally good, but stop acting like it’s the single most important factor in modern gaming.”
Simply put, if everything ran at 144fps, it would be easier on the eyes and motion would feel more natural. Even if it’s just navigating menus in a pixel-style game.
Real life has effectively infinite frames per second. In a world where high-fps gaming becomes the norm, a low 24 fps game could be a great art style and win awards for its ‘bold art direction’.
That article states people can perceive images as rapidly as once every 13 milliseconds, which they math out to 75 fps, 25% higher than 60.
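For what it’s worth, the arithmetic is just inverting the frame time, and a quick sketch shows 1000/13 actually comes out closer to 77, so the article’s 75 looks like a round-down:

```python
# Convert the study's shortest image duration (13 ms) into an equivalent rate.
ms_per_image = 13
print(f"{ms_per_image} ms per image -> {1000 / ms_per_image:.1f} images per second")  # ~76.9

# Frame times at common refresh rates, for comparison:
for hz in (30, 60, 75, 120, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
```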
Looking at the study itself, they were testing whether participants could pick out a picture that displayed for 13-80 ms when “masked” by other brief pictures, with a focus on whether it made a difference if the participant was told what image they were looking for before or after seeing the images. What they found was that participants could pick out the image as low as the 13 ms mark (albeit with less accuracy) and could generally do so better if told what to look for beforehand.
What this tells me is that your source has nothing to say about anything over 75 fps. It was also testing a fundamentally different environment than a video game, where your brain constantly expects an image similar to, and stemming from, the image before it rather than a completely different one. If you were to draw conclusions from the study despite the differences, it would suggest that knowing what to look for, as your brain does when gaming, makes you better able to pick out individual frames. That makes me think your source does not support your assertion, and that in a game you could perceive frame rates higher than 75 fps at a minimum.
From my own knowledge, there’s also a fundamental difference between perceiving reality and computer screens in the form of motion blur. Objects moving in real life leave a faint blur behind them that your brain uses to fill in any blanks it may have missed, making reality appear smoother than it is. For an example of this, wobble a pencil back and forth to make it “bend.” Movies filmed at 24 fps capture this minute motion blur as they film, which makes it easier for our brains to watch them despite the lower frame rate. Real-time rendered video games do not have this effect, as there are no after-images to fill in the blanks (unless you turn on motion blur, which doesn’t do a good job emulating this).
This means video games need to compensate, and the best way to do that is more frames per second, so your brain doesn’t need to fill in the blanks with the motion blur it’s used to seeing in the real world. You’ll obviously get diminishing returns from each successive increase, but there will still be returns.
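If you want to see the mechanism, here’s a toy sketch of the difference (my own illustration, not a real renderer; assumes a 1D “image”, a 180-degree shutter at 24 fps, and a made-up sample count):

```python
import numpy as np

def render(position, width=64):
    """Toy one-row 'frame': a single bright dot at the given position."""
    frame = np.zeros(width)
    frame[int(position) % width] = 1.0
    return frame

def film_frame(t, shutter=1 / 48, samples=16, speed=240):
    """A film camera integrates light while the shutter is open (here a
    180-degree shutter at 24 fps), so a moving dot smears across pixels."""
    sub = [render(speed * (t + shutter * i / samples)) for i in range(samples)]
    return np.mean(sub, axis=0)

def game_frame(t, speed=240):
    """A real-time renderer samples a single instant, so the dot is a hard
    point with no smear connecting it to the next frame."""
    return render(speed * t)

print("film:", np.nonzero(film_frame(0.0))[0])  # dot smeared over several pixels
print("game:", np.nonzero(game_frame(0.0))[0])  # dot at exactly one pixel
```

The smeared dot is the “blank-filling” cue your brain gets from film; the single hard dot is what a game hands it every frame, which is why more frames per second helps.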
That’s more or less the placebo effect at work, though. Most people cannot see “faster” than 60FPS; the only actual upside of running a higher frame rate is that you don’t drop below 60 if the game starts to lag for whatever reason. Now, you may be one of the few who actually perceive changes better than normal, but for the vast majority, it’s more or less just placebo.
That’s just wrong. I couldn’t go back to my 60Hz phone after getting a new 120Hz one. It’s far from placebo, and saying otherwise is demonstrably false.
Yeah, as much as I can give a shit about ray tracing or better shadows or whatever, as a budget gamer, frame rate is really fucking me up. I have a very low-end PC, so 60 is basically max. Moving back to 30 on the PS4 honestly feels like I’m playing PS2. I had the [mis]fortune of hanging out at a friend’s house and playing his PC rig with a 40-series card, 240Hz monitor, etc., and suffice it to say it took a few days before I could get back to playing on my shit without everything feeling broken.
it depends on if it’s a good 30 or not. if inputs are quick and responsive, and the framerate stays at 30, then it’s fine. but if my device is struggling to run the game and it’s stuttering and unresponsive, then it’s awful
sm64 comes to mind as the best 30fps experience i’ve had, and i am spoiled rotten on high refresh rate games
One of the most insufferable aspects of video game culture (PC gaming in particular), other than the relentless toxic masculinity from insecure nerds, is an obsessive focus on having powerful hardware and shitting on people who think they are getting a good experience when they don’t have good hardware.
The point is to own a computer that other people don’t have so you can play a game and get an experience other people don’t get; the point isn’t to celebrate a diversity of gaming experiences and value accessibility for those without the money for a nice computer. It really doesn’t matter whether these people are doing this consciously or not; this is a story as old as time. It is the same exact bullshit as guitar people who only think special exotic or vintage guitars are beautiful. They claim to absolutely love guitar, but never once in their life have they stopped to think about how much more beautiful it is that any random chump can get an objectively wonderful-sounding guitar for a couple hundred dollars than that they own some stupid special-edition guitar with a magic paint job that cost as much as my shitty car.
Good thing these people don’t fully dictate the flow of video game development, but they will never, ever learn, because this is the kind of pattern that arises not from conscious intention but from people uninterested in critically examining their own motivations.
It is the same damn nauseating thing with photography too…
I spent days in Wraith wandering around on the army map with just my party before finally figuring out I had an actual army too. Now I can’t find my camp to rest up in lol. It might be a failed campaign.
I can go down to 30, probably a bit lower, as long as it’s consistent; that is the most important part.
It might also have a bit to do with me powering through Watch Dogs at 1 frame per second in some parts. You never notice how good 25-30 is until your frame rate starts camping in the single digits.
I remember playing Assassin’s Creed II on PC with a 9500GT, getting sub-20fps constantly, to the point I had to wait for character animations to catch up with the dialogue so the next person could talk. Halfway through the game I upgraded to a GTX 560 and was astounded that everything was in sync and oh so smooth. I always remember that when I start getting annoyed that I can’t get over 90fps in a game. As long as it’s playable!
It pretty much only makes a difference in FPS games, where you’re constantly switching back and forth between crosshair focus and peripheral-vision flick reactions. At 144Hz, motion blur between frames is largely eliminated, so you get more accurate flicks and your vision at the crosshair is much sharper.
You can’t play racing games if you believe that. I’d far rather play an FPS at 30 than a racing game at 60. Low frame rates can give me motion sickness at high camera speeds.
People always go on and on about framerate, but I’d take 4k at 60fps over 1080p at 144fps any day. I never really noticed a difference over 60fps, but the resolution makes a massive difference.
They both make a difference. 1080p is the bare minimum for console TV gaming; a good PC can do much better. I like my ultra-widescreen 144Hz monitor more than a plain boring 16:9 high-res monitor.
But even games at 3440x1440 @ 144Hz don’t look as good as full-on VR at 120Hz and up. Spend much time gaming in VR and gaming on a monitor starts to feel obsolete.
For me it depends on the game. A menu game from Paradox like Crusader Kings? 4k 60fps. A competitive shooter? Ideally the max resolution (for greater pinpoint accuracy) and 144fps, but between the two I’d want maximum fps for the reaction speed and responsiveness. A pretty game like Ori and the Will of the Wisps? Crank the graphics up and I’m happy with 60fps.
Even most console games run at 60 now, with an option to turn on some RT graphical wankery and run at 30.
I often turn it on to see what it looks like, and then decide it’s not worth it. Ratchet and Clank actually played decently at 30, and one of the Ghostwire Tokyo options allowed you to have RT and decent framerates with a minor hit to resolution.
Gsync/Freesync/VRR is a game changer for lower-end hardware, because all those dips below 60 get smoothed out to an even 45 or so. I’ve spent a lot less time fucking about with settings on PC since getting a monitor that supports it.
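To make the “smoothed out to an even 45” concrete, here’s a toy model of frame pacing (my own sketch, not any real driver API): with fixed 60Hz vsync, a 22 ms frame has to wait for the next refresh tick, so on-screen frame times bounce between 16.7 and 33.3 ms, while with VRR the display refreshes the moment the frame is ready.

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval, ~16.7 ms

def display_times(render_times_ms, vrr=False):
    """Toy model of when each finished frame actually hits the screen.
    With vsync, a frame waits for the next fixed refresh tick;
    with VRR, the display refreshes as soon as the frame is ready."""
    shown, t = [], 0.0
    for r in render_times_ms:
        t += r  # frame finishes rendering at time t
        shown.append(t if vrr else math.ceil(t / REFRESH_MS) * REFRESH_MS)
    return shown

# A game rendering a steady ~22 ms per frame (about 45 fps).
renders = [22.0] * 8
for label, vrr in (("vsync", False), ("vrr  ", True)):
    times = display_times(renders, vrr=vrr)
    gaps = [round(b - a, 1) for a, b in zip([0.0] + times[:-1], times)]
    print(label, gaps)  # vsync: juddery mix of 16.7 and 33.3 ms; vrr: steady 22.0 ms
```

Same average frame rate either way; the difference is that VRR delivers it evenly, which is exactly why the dips stop feeling like stutter.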