Hey, can anyone please, please let me know what the song/instrumental at 5:16 is? It really was beautiful...
Consoles are so fucked anyway, what with 4K monitors, 120Hz, and G-Sync.
You see, back then Blu-rays were new and HD was relatively new.
Today's movies and shows are starting to be filmed in 4K and 5K.
4K is to HD what HD is to standard definition.
Now, if your "next gen" console can't even run at the current gen's resolution, wtf is it going to look like in 1-2 years?
In conclusion, asking for 1080p at 60 fps is reasonable. It's not asking too much.
Do we "need" it? No, is it expected or wanted? Yes. The expectations however, is largely little more than wishful thinking and arbitrary reasoning about what constitutes "next-gen," which is largely just a buzzword itself, with no real, definitive meaning.
That being said, 60fps has its advantages over 30fps in certain circumstances, specifically fast-paced, action-oriented games like shooters, as was mentioned in the vid. I'm really glad he addressed the differences between film and games, because this is one of the big things many people never mention or don't even understand.
These new consoles should be powerful enough to run their games at 1080p @ 60fps. Especially when you consider they might be the hardware you'll be using for the next seven or eight years. Imagine if you were still gaming in 720p @ 30fps in 2021!! That'd be ridiculous.
Great job Cam. That's one of the best frame rate pros/cons explanations for the layman that I've seen.
Who has two thumbs and freely admits to being a layman?
"This guy!" (thumb-points to self)
More fps doesn't "look better" so much as it "feels better". With high fps, the latency between your controller and your monitor is lower.
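To put rough numbers on that (just back-of-the-envelope math, not anything from the vid): fresh on-screen feedback for your input can only arrive once per frame, so the frame time sets a floor on how stale what you see can be.

```python
# Worst-case extra wait for on-screen feedback is about one frame time.
for fps in (30, 60, 120):
    frame_time_ms = 1000.0 / fps
    print(f"{fps} fps -> a new frame every {frame_time_ms:.1f} ms")

# 30 fps -> a new frame every 33.3 ms
# 60 fps -> a new frame every 16.7 ms
# 120 fps -> a new frame every 8.3 ms
```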
30 fps sucks compared to 60 fps and I can clearly tell the difference between the two. Next-gen has no business calling itself next gen if it can't even do 60 fps.
Just to give my opinion on the topic, I can't really tell a huge amount of difference between 30 FPS and 60 FPS. I've played Assassin's Creed III on my PC and when it dips to 15-18 FPS I can really tell the difference and the game becomes a lot less enjoyable. My optimal range seems to be 25-35 FPS, when I play Skyrim and get those numbers it feels fine.
It depends a lot on player preference; I think console gamers are more likely to be used to a lower frame rate. I would expect the XOne and PS4 to aim for 60, or at least hit something like 40-50.
60 fps should be standard for next gen. It's just smoother; the game feels better and is more enjoyable to play. Going from 30fps to 60fps is like going from SD resolution to HD resolution. Once you go HD, it's difficult to go back to SD.
The key to a really smooth gaming experience is having the FPS match the Hz (refresh rate) of the monitor. I was playing the old PC game Rune on my ViewSonic P225f, which is a high-end CRT. My PC was able to run the game at a steady 75fps while the monitor was refreshing 1600x1200 @ 75Hz. Having the fps and refresh rate in sync like that is as good as it gets.
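If anyone wants to see why the sync matters, here's a tiny sketch I wrote (my own function names, nothing official): with v-sync, each frame is held until the next refresh, so a mismatched rate makes some frames persist longer than others.

```python
def refreshes_per_frame(fps, hz, frames=9):
    # Refresh on which frame i becomes visible: ceil(i * hz / fps).
    shown = [(i * hz + fps - 1) // fps for i in range(frames + 1)]
    # How many refreshes each frame stays on screen.
    return [b - a for a, b in zip(shown, shown[1:])]

print(refreshes_per_frame(45, 60))  # [2, 1, 1, 2, 1, 1, ...] -> uneven pacing (judder)
print(refreshes_per_frame(75, 75))  # [1, 1, 1, 1, ...] -> perfectly even, like Rune on that CRT
```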
I definitely prefer 60fps. Since my PC isn't quite the greatest, I can usually only get 20-30 fps in newer games, but on the occasions when the game puts me in a small room without much to render, that 60fps I'm suddenly getting is MUCH smoother. The other thing to remember is that the game won't be at 60fps all the time; it will rise and fall depending on what's happening on screen, and it's much better to drop from, say, 60 to 40 than from 30 to 15. That's not even mentioning stutter, which can have a much more jarring effect on perceived frame rate.
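The 60-to-40 versus 30-to-15 point is easier to see in frame times than in fps. Quick math (my own illustration):

```python
# A drop at the low end of the fps scale costs far more time per frame.
def added_delay_ms(fps_from, fps_to):
    return 1000.0 / fps_to - 1000.0 / fps_from

print(f"60 -> 40 fps adds {added_delay_ms(60, 40):.1f} ms per frame")  # ~8.3 ms
print(f"30 -> 15 fps adds {added_delay_ms(30, 15):.1f} ms per frame")  # ~33.3 ms
```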
Great reality check this week, Cam. I mean, they're always very informative, but this one was especially well presented. Love it.
The first question that comes to my mind is: why don't games employ motion blur like films do, to make lower frame rates less jerky?
There was a game that I was following years ago being made by a very small team with the working title "Project Offset" that used this exact idea very successfully as I recall.
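For anyone curious how games can fake film-style blur, one common approach is accumulation blur: average several renders spread across a simulated shutter interval. A minimal sketch below; render_at() is a made-up stand-in for a real renderer, not how Project Offset actually did it.

```python
import numpy as np

def render_at(t, size=64):
    # Stand-in renderer: a bright square sweeping across a dark frame.
    frame = np.zeros((size, size), dtype=np.float32)
    x = int(t * 200) % (size - 8)
    frame[28:36, x:x + 8] = 1.0
    return frame

def motion_blurred_frame(t, shutter=1 / 60, samples=4):
    # Average renders taken across the "shutter" interval, roughly what
    # a film camera captures while its shutter is open.
    times = [t + i * shutter / samples for i in range(samples)]
    return np.mean([render_at(ti) for ti in times], axis=0)

blurred = motion_blurred_frame(t=0.5)
print((render_at(0.5) > 0).sum(), (blurred > 0).sum())  # the blurred object smears over more pixels
```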
Excellent video Cam. The first thing I said to myself was, "I think this video is in 60 fps." It really makes a difference and I hope this becomes the standard for all video/games. Although, I've seen 120 and it is amazing, but as you mentioned, not many people have the hardware capable of that. Furthermore, I would rather see 4K (2160p) resolutions more abundant at this point than 120 fps/Hz.
As Cam pointed out in the video, each frame in a game is a perfect render. We all hype up fps, but do the games have that many frames to display, and is the hardware capable of displaying them every second? It's not like in movies, where the camera has to try to catch as many frames as it possibly can (hence high-speed cameras) to display later. Of course, the cost of 60 fps in a game with MORE detail leaves less profit for the game company. So they can choose to do 30 fps on a 60-fps-capable system and it hurts no one, as long as the game still runs fine and plays great. Even with the worst game, who's going to complain enough about the frame rate to actually hurt the game makers? We all know the answer.
Same argument since the dawn of computers. 30 years ago I was asking if I needed 64k RAM, or a CGA graphics card. You don't need any of it. You buy it to play newer and better games. Faster is always better.
You can tell the difference between 30 FPS and 60 FPS, but most people are used to 30 FPS anyway. The only time I really noticed a slowdown in gameplay was in the ZOE HD Collection on the Xbox 360, which runs at 30 FPS in 720p. I was so used to ZOE 2nd Runner on the PS2, which runs at 60 FPS in 480p, that the HD Collection felt slower, especially the first fight with the hundred mosquitos onscreen! Other than that, I can't recall having a problem with a steady 30 FPS on consoles, really.
Games running at 60fps look like TV game shows. Games running at 30 fps look like movies. So I prefer 30 myself. Use the extra juice for other stuff.
As a former Guilty Gear XX online player, I can notice a difference of as little as 10 fps just by playing. The point is, going from 30 to 60 fps is way more noticeable than going from 100 to 130 fps, you feeling me?
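That asymmetry falls straight out of frame times, by the way. Quick numbers (my own math):

```python
# The per-frame time saved shrinks fast as the base frame rate climbs.
for lo, hi in ((30, 60), (100, 130)):
    saved_ms = 1000.0 / lo - 1000.0 / hi
    print(f"{lo} -> {hi} fps shaves {saved_ms:.1f} ms off every frame")

# 30 -> 60 fps shaves 16.7 ms off every frame
# 100 -> 130 fps shaves 2.3 ms off every frame
```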
What people don't get is that when they promise 60fps, what they mean is UP TO 60fps. It's for certain games, like sports games etc., plus the fact that our screens can show so many more frames now, so they need to keep up, at least in theory. However, in reality, trust me, most games will be 30fps. The consoles simply are not powerful enough to give us next-gen graphics at 60 fps in most games.
30 FPS for single-player, where graphical fidelity matters to me.
For multiplayer I don't care as much about graphical fidelity, so 60 FPS sounds about right.
Interesting piece, but I had a related question which I was hoping you could provide some insight into. It seems that for at least some people (myself included), 24 or 30 fps looks "more real" than 60 fps, despite 60 or 100 fps being smoother in theory. I was just wondering whether it's because we have been conditioned to watch movies at 24 fps that we find it hard to adapt to higher frame rates there, while having no issues with them in games.
The human eye can see almost 120fps; each eye may capture smoothly moving action at around 59-60fps, it just depends on your eyes. But with an input system (controller or keyboard+mouse), 30-35fps is not enough; you can simply feel the input lagging somewhere in the game, even more so if you blink a lot. Anything below about 35fps just won't feel very comfortable.
Think about it this way: with 30 fps, you have 30 chances a second to respond to the action. With 60 fps, you have 60, etc. On paper (or LED, as is the case here) the difference may sound minimal, but in the heat of the action it does make a difference.
@AboAlwe I completely disagree with that. 30FPS is perfectly playable in my eyes and I'm a PC gamer so obviously I would like at least 60 but that is definitely not the minimum requirement for any gamer.
Perhaps, but gaming at 720p/60fps would not be.
@Razor10000 Yea after switching to PC playing on a console is a miserable experience.
@Falzonn The ideal situation is to set your v-sync or fps cap to match your monitor's refresh rate and then adjust your graphics settings to never let the game drop below that cap. To do this with ultra settings on new AAA games, you need a really fast and expensive computer.
Most of us PC gamers try to keep the frame rate above 40 and hovering around 60 most of the time.
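For anyone who hasn't set this up before, the idea behind capping to the refresh rate looks roughly like this (a toy sketch; real games use v-sync or driver-level limiters, and all the names here are mine):

```python
import time

REFRESH_HZ = 60
FRAME_BUDGET = 1.0 / REFRESH_HZ  # ~16.7 ms per frame at 60 Hz

def run_capped(update_and_render, seconds=1.0):
    # Crude software frame cap: do the frame's work, then sleep off
    # whatever is left of the frame budget.
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        start = time.monotonic()
        update_and_render()
        leftover = FRAME_BUDGET - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)  # fast frames wait; slow frames just run late

frames = []
run_capped(lambda: frames.append(1))
print(f"~{len(frames)} frames in 1 s")  # close to 60 when the work fits the budget
```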
@garrybubba It takes a lot of extra work to create motion blur, and in some cases might actually be more computationally expensive than just rendering more frames.
Regardless of how good it looks, motion blur wouldn't reduce the lag between action and onscreen feedback. My own reflexes are meh and I can't see any visual difference past ~40fps, but even I notice how much easier games are to control when running at 60fps.
@salmon71 I'm certain we'll see X1 & PS4 games reach 60fps in multiplayer consistently in 2014. It may take a bit longer for devs to become familiar enough with the hardware to get native 1080p @ 60fps in all single player games as well, but I'm confident it will happen soon.
@wildamnesia Someone who also prefers this. I like to keep between 35-45 to strike a nice balance.
@wildamnesia Why not both?
@Xentarim Look up "Uncanny Valley" on Google/Wikipedia. That should help explain.
@jcknapier711 That doesn't make any sense at all.
Even if you have 1 FPS, you still have that whole second to respond with, say, a keyboard or controller. You just won't see it happening until the next second.
I sure hope that this is not what you're talking about, but it sure sounds like it.
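To make the disagreement concrete: whether that's true depends on whether input is tied to the render loop. A lot of engines use a fixed-timestep loop where input and simulation tick at a fixed rate even when rendering is slow, so low fps delays what you see, not when your press registers. A rough sketch of that pattern (my own names; not how any particular console game works):

```python
import time

TICK = 1.0 / 120  # sample input / step simulation 120 times per second

def game_loop(poll_input, simulate, render, seconds=1.0):
    # Fixed-timestep loop: simulation consumes accumulated time in TICK
    # chunks, independent of how long render() takes.
    acc, prev = 0.0, time.monotonic()
    end = prev + seconds
    while time.monotonic() < end:
        now = time.monotonic()
        acc += now - prev
        prev = now
        while acc >= TICK:
            simulate(poll_input())  # input registered every TICK, regardless of fps
            acc -= TICK
        render()  # might only manage ~30 of these per second

ticks = []
game_loop(lambda: None, lambda _: ticks.append(1), lambda: time.sleep(1 / 30))
print(f"{len(ticks)} input samples vs ~30 rendered frames")  # ~120
```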
@mischiefmeerkat Why are you shocked? Development of games on consoles has held us back from taking advantage of the latest hardware/power available to us. I hate to sound like a PC elitist, but we have been capable of so much more (as you seem to know) for years and not able to use it because the money has always been in outdated console hardware.
Something to look more forward to though, is the 4K revolution. I would be much happier with a 4K game at 60 fps than a 1080p game at 120/240 fps.
@mischiefmeerkat Because the graphics get more and more demanding. Go back and play some Quake on a GTX 780 and you'll probably get over 1000fps. There was a time when I first played the first STALKER PC game and my computer struggled to get over 25fps with the nice lighting enabled. A few years later, when the sequels came out and I had new hardware, I went back and played the first one on a computer that could run "Call of Pripyat" (the third one) okay. While my monitor was of course limited to 60Hz, the game's own frame counter reported many scenes at over 200fps -- but of course action, explosions, and other stuff would bring it back down to more realistic levels.
With the new generation of games coming out, I'll personally need to look into probably my first SLI setup, due to my 2560x1440 27" monitor. While I can play BF3 and BF4 (beta) at fairly high settings at around 45-55fps (fluctuating, of course) on my GTX 670, I'm going to need more power if I want a better frame rate in games like The Witcher 3. Heck, I'm used to 20fps in Total War: Shogun 2 with everything maxed out.
So, I still think it's going to be all game-to-game as it always was. But maybe more teams will be pushing for it?
The progress is seen in other aspects, like the scope of the games we're playing and the features and functionality of the consoles we play them on. For instance, BF4 is not 1080p on either next-gen console, but it does support 64-player servers. Dedicated servers are another aspect that is becoming increasingly popular on console, an idea that was little more than a pipe dream in 2005.
Then you have features like Game DVR and uploading/downloading content via the cloud, apps like Netflix and Twitch.tv, features like the marketplace that allow players to download full games straight to their console, TV integration, motion controls and voice commands, among others.
Consoles are a closed system, so the amount of progress you are talking about - graphics and performance - is essentially dictated by the companies that own them (MS, Sony, Nintendo). This is one of the reasons why so much progress is seen on PCs, since users are not forced to simply make do with what they have, and can upgrade and swap parts as they see fit.
@revlux88 @d00hicky @Razor10000 This is simply not true. The next-gen consoles are not using tri-cores or Cell processors; they're using the x86 PC architecture, which, as evidenced by the whole freaking PC market (games and otherwise), is not a mystery at all. There are very few optimizations, if any, that can be made to get better performance out of the next-gen consoles, and we certainly will not see the kind of optimizations that made a huge difference last gen. Saying that there are is simply wishful thinking/buyer's remorse (though you may be unaware of it).