Oculus VR Founder: “30fps Is Not a Good Artistic Decision, It's a Failure”

Palmer Luckey explains what frame rates virtual reality requires.

While many console games still run at 30 frames per second, Oculus VR’s founder Palmer Luckey has said he doesn’t think of it as a choice, but a failure.

Speaking with LinusTechTips, Luckey explained that virtual reality requires much higher frame rates than a typical screen does. He said that 60fps is enough, but that we’ll see huge improvements going up to 90Hz or 120Hz, and even small improvements beyond that.

“VR is going to need much higher frame rates than consoles, although even for consoles or traditional PC games, I don’t think 30fps is smart,” Luckey said. “It’s not a good artistic decision, it’s a failure.”

Last month, developer Ready at Dawn said that its PlayStation 4 exclusive The Order: 1886 will run at 30fps by design, as it’s seeking to deliver a "filmic look." As director Dana Jan explained: “60fps is really responsive and really cool. I enjoy playing games in 60fps, but one thing that really changes is the aesthetic of the game in 60fps.” Jan said that 60fps would cause the game to look like “something on the Discovery Channel, like an HDTV kind of segment or a sci-fi original movie maybe. Which doesn't quite have the kind of look and texture that we want from a movie.”

Luckey said that he starts to see diminishing returns between 90Hz and 120Hz, and that VR is probably going to end up somewhere in that range, at least for the foreseeable future.
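To put those targets in perspective, the per-frame time budget shrinks by a smaller amount with each step up in refresh rate; the quick calculation below is simple frame-time arithmetic, not a figure from the interview.

```python
# Per-frame rendering budget at the refresh rates discussed above.
# budget_ms = 1000 ms / frames per second
for fps in (30, 60, 90, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")

# Prints roughly:
#  30 fps ->  33.3 ms per frame
#  60 fps ->  16.7 ms per frame
#  90 fps ->  11.1 ms per frame
# 120 fps ->   8.3 ms per frame
```

Going from 30fps to 60fps shortens each frame by about 16.7ms, while going from 90Hz to 120Hz trims only another 2.8ms or so, which is one way to see why the returns diminish at the high end.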

As for VR experiences on mobile devices (which Oculus is reportedly working on), Luckey said he thinks we’ll see more simple gaming and communication experiences, like panoramic photo capture and virtual movie theaters.

Emanuel Maiberg is a freelance writer. You can follow him on Twitter @emanuelmaiberg and Google+.

Got a news tip or want to contact us directly? Email news@gamespot.com

Discussion

borlng

I wonder which one kills your eyesight faster...

Pff, who cares? We're talking about video games here.

LordCrash88

And another sensationalist headline... Jesus, GameSpot...

Luckey is of course talking about framerates in connection with VR. It's not even news that VR needs at least 60 FPS, but that doesn't apply to each and every game out there. There is no need for a game to offer 60 FPS if it doesn't even want to offer VR.

There is no reason why Tetris should run at 60 FPS. It just makes no sense at all.

93ChevyNut

This all boils down to your preference and the type of game you're playing. I remember playing the first Need for Speed on my PC in college. There wasn't much you could do in the way of adjusting video options, but interlacing was one such option. It looked so much better displaying all lines, but when I set it to interlaced, the framerate jumped so much that I never wanted to play it any other way.

For high-speed action games, IMO, 60fps is more important than graphical fidelity. For RPGs, I would choose visuals over framerate.

To the untrained eye, I think a poor frame rate is far easier to spot than poor "graphics". If you have a sharp enough eye, then you may choose graphics over FPS.

PAL360

”I don’t think 30fps is smart,” Luckey said. “It’s not a good artistic decision, it’s a failure.”

I agree. 1080p/60fps should be mandatory on current-gen consoles. It's perfectly possible, after all. A few months into the gen, and we already have great-looking games like Ground Zeroes at 1080p and a flawless 60fps, or Second Son at 1080p and around 40fps.

dorjebodh

I think it all comes down to preference and budget.

I do prefer higher fps, but can I afford it? I would say no. (I'm talking about PC.)

inuyashagalo

Complaining about 30fps is hypocrisy for most people. The same people can't notice when they're being deceived right in front of them, and then they come here stating the differences between 60 and 30fps... lol
Alright, enough joking: many comments here are bullshit. A few people, like Aaronp2k, said what should be said.

dani3po

'30fps in a game is perfectly playable' is like saying 'a movie on VHS is perfectly watchable.' It is, obviously, but I prefer to watch it in 1080p. We are in 2014!

crushbrain

Just because it doesn't work on his pet project doesn't mean it is a failure.

jacquelineferre

Yes, every game on the ps2 and before was a failure and I was completely unable to enjoy them.

Martyr77

The only people saying 60fps doesn't matter are Xbox fanboys, because the Xbox One struggles to reach it, being a weaker system than the PS4. 60fps should be a standard at this point.

Aaronp2k

60fps is hugely overrated. Yes, it does make a difference; I should know, since I'm a PC gamer. But I also have no problem playing games at 30fps, as long as they are capped at 30fps and don't dip or increase in frame rate. Also, 30fps = more power left for graphics. A 30fps game will be more visually stunning than a 60fps game, and the difference in smoothness is hard to tell. It's only when frames dip under 30fps that you start to notice things like input lag and so on. For multiplayer, 60fps is critical, since it requires faster reaction times; players are generally faster-acting and more accurate than AI. But for a single-player game it's not necessary. Take Skyrim, for example: the movement in that game is quite slow-paced, so even if you had it at 60fps you'd probably never notice.

With all this being said, however, I agree that from a VR perspective 30fps would be no good. 60fps is more lifelike, and anything lower than that is unacceptable for virtual reality.

arqe

60 fps > 30 fps.

Why are console gamers still trying to convince people by saying "it only matters when you go multiplayer"?

So you can get away with "less" animation and jumpy visuals because you are playing alone? Who are you trying to convince? People playing on PC, or yourself, for paying more and getting less on a console?

demrocks

Just leaving this here for the ignorant console kiddies who can't tell the difference in-game between 30 and 60 frames per second.

http://30vs60.com/

Dasein808

'sup Martin Gaston?

How's your disgusting boil and repugnant personality?

tomservo51

2 billion dollars per second is better.

klonoa53

Hey, do you like video games? Because I sure do! Why not just play them instead of fighting like children over the last candy bar? I mean, all this useless arguing could be time you could be using to make games.

jecomans

'30fps is more filmic'. I don't think I've read a quote from a dev much closer to pure, unadulterated BS. 

Floymin

@LordCrash88 The point of VR is to enhance the FPS (and maybe all open-world) experience. I have always detested open-world games that had frame-rate issues on consoles. I assumed consoles were meant to be FREE of frame-rate issues.

The reason I turn off the shadow, v-sync, and AA settings in my PC games is to get a crisp picture. Why have games on a PC if they are gonna look like they belong on a pre-millennial TV?

Zloth2

@93ChevyNut  It's not quite high speed, it's high speed PLUS accuracy.  If all you need to do is click a button ASAP then I think 30 would be fine.  If you have to get a crosshair to point at one tiny area or, in your case, steer a wheel to an exact position then you need to see the screen update more frequently.  When the framerate gets too low, you end up moving past the target/oversteering.

I'm not so sure it would be the same on consoles.  Those analog sticks aren't all that accurate so developers often have to cheat a bit for them anyway.  Maybe they can just fudge it a little more to make up for the framerate as well.  In return, they get prettier graphics and prettier trailers (which are stuck at 24 frames per second anyway) which means prettier pre-order numbers.

93ChevyNut

I just reread my post and it's way too mature for the comments section.  Let me see if I can fix that:

XBOTS/SONY PONIES/PC MASTER RACE/PC ELITIST/BASEMENT DWELLERS/CONSOLE PEASANTS!!!!!!

That should just about cover it.

a7x_RoCk3r

@inuyashagalo He didn't say anything that needed to be said. He just stated his own opinion that 30 frames is good enough for him. It was also full of bullshit, like the claim that you can't tell 30 frames stutters like a horse just because his arbitrary cut-off is 29 or lower. You would absolutely notice Skyrim at 60 over 30.

And who are these 'most' people who can't tell the difference between 30 and 60? You can't support your argument if you're going to lie like that. You are spreading uninformed propaganda just like Mr. Aaron, and you should be ashamed.

georgekaplin197

@dani3po Tell that to the people who couldn't stand watching The Hobbit in HFR. At some point it does become an artistic decision.

arqe

@crushbrain A $400-$500 device that is the so-called "ultimate gaming machine" and also so-called "NEXT-GEN" can't handle the graphics and framerates that PCs have been using for almost 10 years now.

Yes, it's a big fking FAILURE!

shafe-man

@jacquelineferre He means the fact that current consoles aren't running at 60 FPS is a failure for those consoles.

Look at it this way: would Sony or Microsoft be happier if they could have all their games run at 60 FPS? Of course. The fact that they have to settle for 30 FPS means they FAILED to reach the ideal goal of 60 FPS.

a7x_RoCk3r

@jacquelineferre PlayStation 2 performance was more acceptable when it was out. Have you heard of the word 'advancing'? That's a straw man argument.

arqe

@jacquelineferre Re.tards should stay away from these kinds of articles. Those PS2 graphics were almost better than PC back in the day. You know the console players who used to say "consoles are better"? Yeah, they were the PS2 users, their brains are still the same size right now, and they're still calling consoles better.

lonesamurai1

@Martyr77 The PS4 also struggles to produce games at 60fps; both consoles are less than spectacular in terms of graphics.

shafe-man

@Aaronp2k "30 fps = more power left towards graphics"

That's exactly what this guy's talking about. None of the current-gen consoles should have to sacrifice visual fidelity for 60 FPS. The fact that some games do just means the consoles are FAILING to offer the smooth 60 FPS both Sony and Microsoft would love to show off.

HUGH_JABALSE

@Aaronp2k Agreed. Also, any image latency in VR equates to user nausea, so that's probably why Palmer is pushing game devs to strive for 60fps...

amar1234

@arqe

It depends on the game, don't be silly. For a game like Skyrim, 30fps means you can have way more eye candy turned on, which is what the game is all about. Then you have games like WoW and Diablo 3, which SHOULD be at 60fps, and there is no reason why they shouldn't be.

amar1234

@demrocks

You're missing the point... it ain't about 60fps not being better, of course it is; it's the fact that sometimes 30fps is good enough. It's never a case of just 30fps vs 60fps; it's nearly always: do you want great graphics but only 30fps, or can you live without the eye candy and get 60fps?

holtrocks

@klonoa53 Because we need to be pushing the industry forward to get better and better things.

tipsyfreelancer

@jecomans From a filmmaking perspective, high frame rates look cheaper, like local news or a used car ad. There's a reason why movies and most TV shows are shot at 24fps: it looks more filmic.

hellknight40000

@georgekaplin197 @dani3po Games are not movies. 60 fps is preferable all the time, assuming the visual fidelity can remain the same. 

jecomans

@georgekaplin197 The issue with The Hobbit was the frame rate, not the resolution. It's confusion about the difference between frames in movies and frames in games that The Order dev is exploiting in his BS explanation of why his game won't run at 60fps.

klefth

@lonesamurai1 @Martyr77 The real problem is that the devs are focusing more on visuals than performance. Most of the time they're thinking about cramming more and more effects that we all know will be too taxing for the consoles. They're cutting corners in the wrong areas, imo.

Instead of focusing so much on delivering more and shinier VFX, they should focus first on a 1080p/60fps experience and THEN on whatever extras they can deliver in terms of graphics. That is, unless the extra visual effects are actually crucial to the game experience, but that rarely ever happens anyway...

arqe

@amar1234 @arqe How does a lower fps make your game "better looking"? You should go see an eye doctor.

Let's see, Skyrim is all about looks.

Hmm, which one do I prefer:

1 - 30fps, choppy animations and camera movement/looks

2 - 60fps, constantly fluid, smooth animations with better camera movement/looks

Any given day, a decent person with a decent brain selects 2, probably.

arqe

@amar1234 @demrocks You are missing the point. Did you even read the article properly? Consoles are using old hardware and calling it next gen. PCs have been doing 1080p/60fps for almost a decade now, and we don't call it next-gen, this-gen, that-gen.

Consoles are holding the industry back from evolving.

Yeah, what happened with next gen? Nothing. The same old sh.it with a 10% graphical increase.

EXile1a

@tipsyfreelancer @jecomans
Do you know why movies were shot at 24fps? Because that was the minimum needed to fool the human brain into thinking the motion is continuous.
Why the minimum? Because film stock cost a lot in the '20s.

So the "it looks more filmic" excuse is utter BS. It has been almost 100 years; it's time for you to catch up to the next century.

jecomans

@tipsyfreelancer I am well aware of the effect frame rate has on film. Frames in games are entirely different from frames in films. The dev is intentionally using confusion about this to try to excuse the game not running at 60fps.

You can really easily jump on a PC and discover that no game has its aesthetic changed by varying the frame rate. 

vassago_

@arqe @amar1234 60fps means you're rendering and writing the screen buffer twice as often, i.e., you're rendering MORE. That takes away your options for adding more geometry, textures, particles, lighting, and post-processing.

vassago_

@arqe @amar1234 @demrocks 1080p/60fps on PC for a decade? Sure, if you're buying bleeding-edge tech non-stop. If you want to buy a $1000 game console, have fun with that. Consoles never have been, and never will be, top-of-the-line. Having 60fps is a privilege, not a right. Developers do the best they can to make the game look great and still get as high a framerate as possible.

tipsyfreelancer

@jecomans 24fps isn't the minimum to fool the brain; it's closer to 10-12 (a lot of GIFs are 12).

The advent of digital cinema and HD has resulted in an INCREASE in the number of productions shooting at 24fps. 20-30 years ago sitcoms were shot at 29.97; now most, if not all of them, are 23.976 (24). Why? It looks more filmic.

One of the big complaints about Peter Jackson's foray into HFR/48fps was that it made The Hobbit feel cheap, like a soap opera.


I'm not saying it's better for games, but 24fps simply looks better for storytelling and probably will for another 100 years.

arqe

@vassago_ @arqe @amar1234 And still, those "extra" things you think they use (which they don't care about and don't use) will look choppy, crappy, etc. And again, no game developer adds extra "visuals" because they're stuck at 30fps thanks to a console that can't handle 60fps in the year 2014...

arqe

@vassago_ @arqe @amar1234 @demrocks No, they don't. The only reason you need a beast PC for 1080p/60fps is that those "precious" developers don't give a sh.it about optimizing their games properly. The hardware is more than enough.

jecomans

@vassago_ Devs right now are sacrificing playability for slightly prettier graphics because they need to showcase the power of the new consoles.

jecomans

@tipsyfreelancer I didn't say the 24fps minimum-brain thing; that was another poster. Side note: I didn't mind the 48fps at the cinema (a little dizzying at times), but watching the slower version back home was noticeably nicer. I'm interested to see film at 120fps, which apparently gets rid of the 48fps sitcom issue and looks amazing.

Whilst we are back here... In film, when recording something on either a digital or film camera, you are capturing a snapshot of movement. Depending on your 'shutter speed', you are going to catch varying amounts of motion blur. This is the key thing that shapes what we have become acclimatized to seeing as 'film-like' at 24fps.

When you are playing a game, each frame is an individually rendered still. There is no movement recorded or played back. You can add shitty motion blur as a post-effect, but that can occur at any frame rate. So saying that 30 vs 60 is going to give a more filmic look, or do anything else to change the aesthetic like it does in 'filmed' media, is BS.


This explanation wasn't necessarily for you, tipsy, but it seems people still read this thread, so I may as well explain what I had posted in a little more detail. 


EXile1a

@tipsyfreelancer @jecomans
Actually, you are wrong.
The reason for the 29.97 fps of the old sitcoms was that it was the standard for the NTSC encoding used in the US. But because a lot of series got sold to other countries, and they wanted to reduce the cost (because they had to convert it), they went to the PAL standard of 24 fps.

As for The Hobbit in 48fps, perhaps people were thrown off because they were used to the lower rate from TV. Then again, more and more directors are leaning towards that format; see Cameron and the next Avatar. Perhaps people will resist, but with the restrictions of analog filming falling away, they will try to get those frames up. (I'd be surprised if 24fps, as a standard, lasts until 2020.)


tipsyfreelancer

@EXile1a PAL is 25, not 24; the footage still has to be converted.

Almost all national TV commercials are also shot at 24 (those don't go overseas). The reason people shoot 24 and not 60 (59.94) today is that it looks better. It has nothing to do with conversions or overseas audiences. Converting is not that expensive these days.

Sitcom frame rates had more to do with budget than with NTSC standards. SD video cameras only shot 29.97. Film cameras only shot 24. Shows like Seinfeld and Cheers were shot on 16mm film. Bigger budgets could afford better-looking footage. Now everyone shoots 24 because it looks better.

Not saying that it applies to gameplay, but there's not really any debate about it. 24 is the frame rate of choice for cinema/drama. Productions have a choice and they always choose 24. I doubt we'll see widespread adoption of 48fps, except as a novelty, like 3D.

EXile1a

@tipsyfreelancer Actually, yes, it has a cost to it. Fewer frames mean less money spent on after-effects, and trust me, there is a lot of that in commercials. Also, the companies that make commercials usually rent their equipment from a facilities company that also rents equipment to networks for news crews' outside shots and the like.
So instead of having two sets of cameras that do either 29 or 25 fps (and these puppies can range from €35,000 to "OMG, I can buy a house with this thing!"), it is easier for the facilities to just have one set that does 25, as they would love to get a deal for a sitcom.

(I worked in that business for a while ;) That's also why I don't like Macs; the words our editor hurled at those machines would make a sailor cry. XD)

For the rest, I will refer to jecomans' reply.
Though perhaps The Hobbit was a bit weird because projectors have the same issue as CRT monitors. I'm going to be a prick and say that you're too young to remember those heavy SOBs, but you wanted them to hit a minimum of 60Hz to look good.