Reality Check - Do we need 60 FPS on PS4 and Xbox One?
Cam investigates frame rates and wonders if the next generation truly needs 60 fps.
GameSpot, please make a video discussing "cinematic" experiences in gaming and why 30 fps in games is closer to a 24 fps movie than 60 fps is (speaking strictly visually, not about response time or input lag).
One correction about this video: he says that tearing is caused by the fps going over the refresh rate, which is a common misconception. Tearing can happen at *any* framerate, even framerates significantly below the refresh rate, and it will always happen when vsync isn't used, regardless of framerate.
Tearing occurs any time the GPU sends a frame to the monitor out of sync with a monitor refresh; the result is parts of several frames being displayed at once, which is the tearing effect. The only way to stop this is to force the GPU to wait on the monitor and only send frames in sync with its refreshes (vsync).
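A toy sketch of that point (not real graphics code; the 60 Hz refresh rate and 47 fps figure are arbitrary numbers chosen for illustration): if the GPU flips the buffer the instant each frame is done, the flip lands partway through a scanout at nearly any frame rate, including one below the refresh rate.

```python
# Simulate where unsynchronized buffer flips land relative to the monitor's
# scanout. Any flip that lands mid-scanout produces a visible tear line.
REFRESH_HZ = 60           # monitor refresh rate (assumed)
FPS = 47                  # GPU frame rate, deliberately BELOW the refresh rate

scanout = 1 / REFRESH_HZ  # duration of one scanout pass
t = 0.0
for frame in range(8):
    t += 1 / FPS                          # GPU flips as soon as the frame is done
    tear_row = (t % scanout) / scanout    # fraction of the screen height where the tear sits
    print(f"frame {frame}: flip lands {tear_row:.0%} of the way down the screen")
```

Almost every flip lands mid-scanout, which is why tearing shows up below the refresh rate too.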
A lot of electrical engineers and software engineers must be commenting here, judging by all these thoroughly understood and well-researched "facts".
60 FPS is expected. I'm getting to the point in my gaming life where if it's not 60, it looks like garbage. It's similar to how people got used to 720p (well, 720p upscaled to 1080p, to be accurate) during this console generation, and going back to the original Xbox (not the new Xbox One, the first Xbox) is jarring now.
I don't want to come off as pompous or privileged; I'm not. But my passion is gaming, and as such I've always saved up to get the absolute best hardware. I game on my TV with my PC through Steam, so you know the setup. My TV is capable of 240 Hz, 3D, and ultra HD (it's pretty much 4K, but it's not; when I tested them side by side, the 4K set didn't have any noticeable advantage over the one I chose, and yes, I brought my laptop into the store because this is what I was using it for). My question is twofold:
Since there was no visual or performance difference between my TV (which is NOT 4K, but did just as well as a brand-new Sony 4K set; my TV is also made by Sony) and the "higher resolution" TV, does that mean my eye just couldn't detect the difference, and thus any improvement on the "better" TV was entirely lost on me and therefore superfluous?
Also, are there ANY games that actually take advantage of and USE all of the hardware that we put all this money into obtaining? Will there ever be?
30 FPS is all you need. 60 FPS starts to make me feel sick with too much motion. After a while it catches up with you.
It's quite an interesting topic
If your monitor/TV is 60 Hz, it shows 60 frames per second.
That means a frame is on screen for 1/60 s, which is ~17 ms.
At 30 fps a frame is on screen for 1/30 s, which is ~33 ms.
At 120 fps (on a 120 Hz display) a frame is on screen for 1/120 s, which is ~8 ms.
So the latency basically halves going from 30 fps to 60 fps, and again from 60 fps to 120 fps, which is why increasing the frame rate makes the game feel much smoother. See the quick calculation below.
There's also the monitor/TV's response time to take into account: your average monitor/TV adds another 5-10 ms, while a gaming monitor is around 1 ms.
The difference is much more noticeable in first-person shooters, where slow reactions get you killed.
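A quick sketch of the arithmetic in this comment (plain Python, assuming nothing beyond the frame rates listed):

```python
# Per-frame display time at each frame rate: 1000 ms divided by fps.
for fps in (30, 60, 120):
    frame_time_ms = 1000 / fps
    print(f"{fps} fps -> each frame is on screen for ~{frame_time_ms:.1f} ms")
# Prints ~33.3 ms, ~16.7 ms, and ~8.3 ms: the time roughly halves at each step.
```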
Lol! Did anybody notice any difference at 3:45?... No?... Well, that's good, because there isn't any! AC4 on the PS4 runs at 30 fps all the time!!!
Honestly, there is a difference, but it's not DRASTIC. 30 fps looks smooth to me and 60 fps looks perfectly smooth. I think as long as you can maintain a constant 40 fps, that's good enough; Nvidia actually states that if your framerate is always above 40, you're in the optimal zone. If your framerate never drops below 40, then it's probably at 60 most of the time and only dips to 40. GeForce Experience is actually pretty good at giving you settings that strike a great balance between quality and performance.
My feeble brain can only perceive 8 fps... no one can perceive anything greater... errrrr... But seriously, the refresh-rate comparison is just semantics. A higher frame rate is just smoother, period, end of story. 120 fps max; anything more is probably overkill.
@t_degan This whole "cinematic experience" comes from our familiarity with movies being 24 fps. We got so used to 24 fps in movies that 60 fps video just seems awkward. Obviously 30 fps is closer to 24 fps, which explains why you get a "cinematic experience" (if such a thing even exists). With movies like The Hobbit pushing that boundary, I'm sure people will have a cinematic experience at 60 fps in the future.
@kagerancid Not on consoles it isn't, lmao!
@bwat47 There's another way to prevent tearing without slowing down the game: force the monitor to wait for the GPU to send the frame. This is what Nvidia does with G-Sync.
Not every genre favors such a high frame rate. Racing, sure. Simulators, sure. MMOs, perhaps. But characters on screen, like in AC or other third-person titles, end up looking fake, like really bad movie CGI, because the small nuances (muscle and facial reactions) are missing; a developer can't capture those with code alone.
Not to mention how jerky they feel, almost as if the characters have been sped up. I'd say 30 fps is perfect. Maybe 45 would be better, but 60 is too high. It's like movies shot at 48 fps: they might render vast landscapes better, but in close-up character interaction everything feels unnaturally quick, almost sped up.
@saren_dredd Dude, there is no current cable standard capable of passing 4k content faster than 60fps. DisplayPort 1.2, or HDMI 2.0, it doesn't matter.
@perphektxero drawbacks? I'm glad you think you're funny.
@x2rufff4u So you get sick from a more responsive and smooth gameplay experience? Also, do you get sick when you're not gaming because real life is more than 60FPS? I think you didn't even watch the video.
@cristi1990an That's your browser struggling to run 60FPS on your bad PC. There's a huge huge huge difference.
@cristi1990an It clearly shows the difference... if you can't see it, you're blind or your computer is crap and can't play smooth videos. But you can test how many frames it has :P Download the video, open it in a video editing program, and step through frame by frame; if it moves on every frame, there's your 60 fps.
@cristi1990an Although I'm not saying that's the case here, you can hook a PS3 or 4 controller up to your PC.
Also, as someone who started gaming on PC, switched to consoles in '07, then about six months ago built a fairly high-end gaming rig (http://pcpartpicker.com/b/EII) and ditched my 360, I can tell you that there's a very noticeable difference between 30 and 60 fps.
@Bryjoered07 It's better to aim above 60 FPS. The reason is that you typically want VSync on to get rid of screen tearing (which I find incredibly distracting). If you're running at 40 FPS, then maybe you'll be at 60 FPS most of the time, but when it dips even slightly below 60, VSync will hard-drop you down to 30 FPS and it looks incredibly stuttery. See the sketch below.
But if you aim *above* 60 FPS, VSync will basically just cap you at 60 FPS.
This could all change with Nvidia's G-Sync technology, which adapts the monitor's refresh rate to the graphics card, so the game can leave VSync off without suffering annoying screen tearing.
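A simplified model of why double-buffered VSync behaves that way (it ignores triple buffering, and the 60 Hz refresh and the sample frame rates are assumptions for illustration): with VSync on, a finished frame has to wait for the next refresh, so its display time rounds up to a whole number of refresh intervals.

```python
import math

REFRESH_HZ = 60
refresh_ms = 1000 / REFRESH_HZ         # ~16.7 ms per refresh

for render_fps in (40, 55, 60, 75):
    frame_ms = 1000 / render_fps
    # Double-buffered VSync: the frame waits for the next refresh boundary,
    # so its on-screen time rounds UP to a multiple of the refresh interval.
    displayed_ms = math.ceil(frame_ms / refresh_ms) * refresh_ms
    print(f"rendering at {render_fps} fps -> displayed at {1000 / displayed_ms:.0f} fps")
```

Rendering at 40 or 55 fps both display at 30 fps, while anything at or above 60 fps displays at 60 fps, which is why aiming above 60 helps.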
@wowwow27 We can keep going higher. 240fps is possible and would be ideal for a game that's moving extremely fast.
@wowwow27 There is no difference.
@Fireblader70 @moaznasr @x2rufff4u I don't think you know how motion sickness works. It occurs when there's a delay between your input and the response on screen: you press a button but the character doesn't move instantly. That was the problem with the Oculus Rift DK1. So if anything, you get more motion sick the lower the framerate is. That's why Oculus is trying to hit the highest FPS possible, around 120 FPS.