Reality Check - Do we need 60 FPS on PS4 and Xbox One?

Cam investigates frame rates and wonders if the next generation truly needs 60 fps.

1538 Comments

PCGameboy

Yes, it does. It's not controversial: 60 fps looks smoother, looks better, and helps players far more than 30 fps, in every genre, whether first- or third-person. The Xbox One X looks set to still run most games at 30 fps as well.

drumjod

This video is just as relevant now as it was when it was posted. After I posted a review on Steam, some users argued that my complaint about the game running at a max of 30 fps was invalid. I was happy to refer them to this video.

Cam, you summed it up pretty well here! Good luck on the Cam and Seb YouTube channel :)

t_degan

GameSpot, please make a video discussing "cinematic" experiences in gaming and why 30 fps in games is closer to a 24 fps movie than 60 fps in games is (speaking strictly visually, not about response time or input lag).

SangminLee

@t_degan This whole "cinematic experience" thing comes from our familiarity with movies at 24 fps. We got so used to 24 fps in movies that 60 fps video just seems awkward. Obviously 30 fps is closer to 24 fps, which explains why you get a "cinematic experience" (if such a thing even exists). With movies like The Hobbit pushing that boundary, I'm sure people will eventually have a cinematic experience at 60 fps too.

Kagerancid

There should be no discussion on this topic; 60 fps is the standard now.

Kagerancid

@walloftruth @kagerancid 60 fps is still the standard; the fact that consoles aren't reaching it consistently doesn't make it any less the standard for them.

Darkcore-X

@kagerancid @walloftruth Consoles can reach it. In fact, we already have 60 fps games. Mostly it's the developers' fault, because some graphics engines eat up a lot of power. But the developers who make exclusives for the PS4 can achieve 60 fps, because they know the PS4 inside and out, and thanks to its (for now unique) GDDR5 memory. Look at The Last of Us, Uncharted, or Killzone. Look at Bloodborne and Infamous. These are exclusives that run at 60 fps. Even AC Unity could have run at 60 fps on PS4, but thanks to the Xbox we have to play at 30 fps too. So the consoles, or at least the PS4, have the power; developers just need to learn how to use it. Compare PS3 games from 2007 with today's: big difference, and the PS4 is only a year old. And don't forget the upcoming DirectX 12, which will almost double video card performance. I'm a PC player who has had enough of constantly upgrading; it's better to have a console.

bwat47

One correction about this video: he says tearing is caused by the fps going over the refresh rate, which is a common misconception. Tearing can happen at *any* framerate, even framerates significantly below the refresh rate, and is always possible when vsync isn't used, regardless of framerate.

Tearing occurs whenever the GPU sends a frame to the monitor out of sync with a monitor refresh; the result is parts of several frames being displayed at once, which produces the tearing effect. The only way to stop it is to force the GPU to wait for the monitor and send frames only in sync with its refreshes (vsync).
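A quick way to convince yourself of this is to model the flip timing numerically. This is a toy sketch, not how any real driver works: it assumes perfectly regular frame pacing and treats the refresh as an instantaneous boundary, and the names are my own.

```python
# Toy model: tearing can occur at any framerate when vsync is off.
# A 60 Hz monitor starts a new scanout every 1/60 s. Without vsync,
# the GPU flips buffers the moment a frame finishes; a flip that lands
# mid-scanout splits that refresh across two frames (a visible tear).

REFRESH_HZ = 60
SCANOUT = 1 / REFRESH_HZ
EPS = 1e-9  # tolerance for "flip aligned with a refresh boundary"

def count_tears(fps, seconds=10):
    """Count refreshes torn by an unsynced flip, assuming perfect frame pacing."""
    frame_time = 1 / fps
    tears = 0
    for i in range(int(seconds * fps)):
        phase = (i * frame_time) % SCANOUT
        if EPS < phase < SCANOUT - EPS:  # flip landed mid-scanout
            tears += 1
    return tears

print(count_tears(45))  # plenty of tears even though 45 fps < 60 Hz
print(count_tears(60))  # idealized perfect alignment: no tears
```

At 45 fps only every third flip happens to line up with a refresh boundary; the other two thirds land mid-scanout and tear, which matches the point above that tearing is not a "too many fps" problem.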

Smokanci11

@bwat47 There's another way to prevent tearing without slowing down the game: force the monitor to wait for the GPU to send the frame. This is what Nvidia does with G-Sync.
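The difference between the two approaches can be sketched as a toy latency model. This ignores scanout time and real adaptive-sync limits (minimum refresh rates, etc.), and the function names are my own:

```python
# Sketch contrasting vsync with adaptive sync (e.g. Nvidia G-Sync).
# With vsync, a finished frame waits for the next fixed 60 Hz refresh;
# with adaptive sync, the monitor refreshes the moment the frame is ready.
import math

REFRESH = 1 / 60  # fixed refresh interval for vsync, in seconds

def vsync_display_time(render_done):
    """Frame is shown at the next refresh boundary after it finishes."""
    return math.ceil(render_done / REFRESH) * REFRESH

def adaptive_display_time(render_done):
    """Monitor waits for the GPU: frame is shown as soon as it's done."""
    return render_done

t = 0.020  # a frame that finishes 20 ms in, just missing the 16.7 ms refresh
print(vsync_display_time(t) - t)     # extra wait until the 33.3 ms refresh
print(adaptive_display_time(t) - t)  # zero added wait
```

That stall (a frame narrowly missing a refresh and waiting a full interval) is exactly the "slowing down the game" cost of vsync that adaptive sync removes.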

0o-Saxon

A lot of electrical engineers and computer software engineers must be commenting here, judging by all these thoroughly understood and well-researched facts.

jlwilliams1981

60 FPS is expected. I'm getting to the point in my gaming life where if it's not 60, it looks like garbage. It's similar to the way people got used to 720p (well, 720p upscaled to 1080p, to be accurate) during this console generation, so that going back to the Xbox One (not the new Xbox One, the original Xbox One) feels like a step backward.

adon_cabre

@jlwilliams1981 Not every genre favors that high a frame rate. Racing, sure. Simulators, sure. MMOs, perhaps. But characters on screen, like in AC or other third-person titles, appear to look fake, like really bad movie CGI, because there are none of the small nuances (muscle and facial reactions) that a developer can capture with just code.

Not to mention how jerky they feel, almost as if the characters have been sped up. I'd say 30 fps is perfect. Maybe 45 would be better, but 60 is too high. It's like movies captured at 48 frames per second: they might render vast landscapes better, but in close-camera character interaction everything feels unnaturally quick, almost sped up.

PCGameboy

@adon_cabre: AC games look far better at 30 fps? Also, "bad movie CGI"? They're not movies, they're games, and they explained the difference between movies and games in this video. Third-person and first-person games both look much better at 60 fps.

dexter-404

@adon_cabre @jlwilliams1981 Forza Horizon 2 has been purposely capped at 30 fps.


Saren_Dredd

I don't want to come off as pompous or privileged; I'm not. But my passion is gaming, and as such I've always saved up to get the absolute best in hardware. I game on my TV with my PC through Steam, so you know the method. My TV is capable of 240 Hz, 3D, and ultra HD (it's pretty much 4K, but it's not; when I tested them side by side, the 4K set didn't have any noticeable advantage over the one I chose, and yes, I brought my laptop into the store because this is what I was using it for). My question is twofold:

Since there was no visual or performance difference between my TV (which is NOT 4K, but did just as well as a brand new Sony 4K set; my TV is also made by Sony) and the "higher resolution" TV, does that mean my eye just couldn't detect the difference, and thus any response on the "better" TV was entirely ignored and therefore superfluous?

Also, are there ANY games that take advantage of and actually USE all of the hardware we put all this money into obtaining? Will there ever be? (<<<-- main question)

I've played Crysis 1-3 and use them as my graphics testers, and I've maxed out Skyrim with ultra-HD texture mods and SMIM, etc. Neither of them, I feel, truly takes advantage of everything I have, and I do NOT have the best GPU. I have an ASUS Republic of Gamers laptop with an ATI HD 6800-series card (I upgraded from the HD 5870). The 5870 did the same job as the 6xxx even though they are vastly different in terms of rendering. I haven't upgraded since, because no game out there even challenges what I have. This is 2011-2012 technology I'm talking about, and much better hardware has come out since. My real question is: why upgrade when what I have now does the trick?

I don't understand why companies tout specs and performance enhancements like they're so important when the games don't take advantage of them. I've heard the argument about the eye not detecting anything over a certain FPS; I used to both build AND sell the TVs and GPUs/CPUs that made these enhancements possible, and I heard those things every day. I know how they work. What I don't know is why they're made when nothing seems able to take advantage of them. I'm not talking about "next gen" (which, let's face it, is really just a mid-powered PC); I'm talking about everything as a whole.

I'm sorry this has run on so long, but this is pretty heady stuff to talk about. I've heard there is no difference between 120 FPS and 240, and having reviewed both, there doesn't seem to be a difference at all. Even from 60 to 120, there's a negligible difference. The problem is NOT the hardware; the problem is the sources feeding the hardware. Since our sources aren't matched to the hardware (movies are shot at 24 fps, meaning any 60-240 fps rendering is done by the machine and is artificial), what is the point of all the advances? If we don't change our source, how is the machine (source: video camera; machine: your TV) ever truly tested to its limits and used?
Quazar87

@saren_dredd Dude, there is no current cable standard capable of carrying 4K content faster than 60 fps. DisplayPort 1.2 or HDMI 2.0, it doesn't matter.

0o-Saxon

@quazar87 @saren_dredd Lies. How does your home Internet pass content higher than 60 Hz, then?

0o-Saxon

@quazar87 @saren_dredd I'd like to see your proof of that. A 240 Hz Sony TV? A 60 Hz max cable speed? Frequency is not limited to 60 Hz. Research, boys... guessing and a limited understanding of hardware and software have hurt your logic.

Anarchyz11

@0o-saxon @quazar87 @saren_dredd

DisplayPort 1.2 is currently the best cable connection you can get, and it only supports 4K resolution at 60 Hz (so 60 fps). If you have a TV/monitor that claims to display 4K at a higher refresh rate, it's internally generating that refresh rate through interpolation; it is not native. That, or the same is being done with the resolution.

DisplayPort 1.3 is set for release this year and can carry 4K at 120 Hz, but I highly doubt any normal consumer, if any at all, has that.
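The bandwidth side of this claim is easy to sanity-check with back-of-the-envelope numbers. This sketch counts active pixels only (blanking intervals ignored) at 24-bit color, against DisplayPort 1.2's roughly 17.28 Gbit/s of effective HBR2 bandwidth after 8b/10b encoding; the function name is my own:

```python
# Rough sanity check of the DisplayPort 1.2 bandwidth claim above.
# Active-pixel payload only (blanking ignored), 24 bits per pixel.

DP12_EFFECTIVE_GBPS = 17.28  # HBR2, 4 lanes, after 8b/10b encoding

def video_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel bandwidth of a video mode in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

print(video_gbps(3840, 2160, 60))   # ~11.9 Gbit/s: fits within DP 1.2
print(video_gbps(3840, 2160, 120))  # ~23.9 Gbit/s: exceeds DP 1.2
```

So 4K at 60 Hz fits comfortably, while 4K at 120 Hz needs roughly 24 Gbit/s, well past what DP 1.2 can carry, which is consistent with the comment above.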

perphektxero

This needs to be re-addressed, along with the drawbacks of 60 fps.

morkymouse

@perphektxero drawbacks? I'm glad you think you're funny.

MarmiteYeast

This whole video is 60fps.

Fireblader70

@moaznasr @x2rufff4u *cough*motionsickness*cough*

MoazNasr

@Fireblader70 @moaznasr @x2rufff4u I don't think you know how motion sickness works. It occurs when there's a delay in your input: you press a button but the character doesn't move instantly. That was the problem with the Oculus Rift DK1. So if anything, you get more motion sick the lower the framerate. That's why Oculus is trying to get the highest FPS possible, around 120 FPS.

MarmiteYeast

@moaznasr @Fireblader70 @x2rufff4u Actually, between 300 and 1000 fps is what we'd need to make it feel like reality.


JAIBOT

@marmiteyeast @moaznasr @Fireblader70 @x2rufff4u What about epileptics?

Moaz13

@JAIBOT @marmiteyeast @moaznasr @Fireblader70 @x2rufff4u What about them? Framerates won't change flashing lights and colors.

Fireblader70

@moaznasr @Fireblader70 @x2rufff4u Motion sickness can happen at any frame rate. It's not just input lag; it can be as simple as seeing motion on screen that doesn't match what your body is telling you. And FOV matters too.

MoazNasr

@Fireblader70 @moaznasr @x2rufff4u Yeah, sure, that's true. So going by what you said, 60 FPS eliminates most motion sickness, because the motion you see on screen is exactly what you told the game to do, thanks to the smoother input.

Fireblader70

@moaznasr @Fireblader70 @x2rufff4u I'm not saying it doesn't alleviate it; my point is that it can indeed happen at that frame rate. It isn't just about input, it's about the visuals and your sense of balance telling you different things.

x2rufff4u

30 FPS is all you need. 60 FPS starts to make me feel sick with all that extra motion. After a while it will catch up with you.

MoazNasr

@x2rufff4u So you get sick from a more responsive and smoother gameplay experience? And do you get sick when you're not gaming, given that real life runs at more than 60 FPS? I don't think you even watched the video.

dexter-404

@moaznasr @x2rufff4u There is barely any difference between 30 and 60.

Moaz13

@dexter-404 @moaznasr @x2rufff4u You have to be blind not to notice the huge difference.

Upvote • 
Avatar image for Moaz13
Moaz13

25

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

@dexter-404 @moaznasr @x2rufff4u You have to be blind to not notice the huge difference

Upvote • 
Avatar image for Moaz13
Moaz13

25

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

@dexter-404 @moaznasr @x2rufff4u You have to be blind not to notice the huge difference.

Upvote • 
Avatar image for RevLux
RevLux

28

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

@ESPM400 @cristi1990an Yep, we perceive games as running much smoother at higher frame rates. You also needn't have a giant TV or sit very close to get the benefits of higher fps. (Of course you may not get the full effect, depending on screen size and/or viewing distance.)

VinceLongman

It's quite an interesting topic.

If your monitor/TV is 60 Hz, it shows 60 frames per second.

That means each frame is on screen for 1/60 s, which is ~17 ms.

At 30 fps a frame is on for 1/30 s, which is ~34 ms.

At 120 fps a frame is on for 1/120 s, which is ~8 ms.

So the latency basically halves going from 30 fps to 60 fps to 120 fps, which is why increasing the framerate makes the game feel much smoother.

There is also the monitor/TV's response time to take into account. Your average monitor/TV adds another 5-10 ms; a gaming monitor is about 1 ms.

The difference is much more noticeable in first-person shooters, where slow reactions get you killed.
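Those per-frame numbers are simple arithmetic and easy to verify; a trivial sketch (the helper name is my own):

```python
# Per-frame display time at common framerates: 1000 ms divided by fps.

def frame_time_ms(fps):
    """Milliseconds each frame stays on screen at a steady framerate."""
    return 1000 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 -> ~33.3 ms, 60 -> ~16.7 ms, 120 -> ~8.3 ms: each doubling halves it.
```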


http://30vs60.com/

http://boallen.com/fps-compare.html

http://gfycat.com/MerrySpiritedBass

http://gfycat.com/OblongTautDragonfly
