What's a good frame rate? What should it be?

#1 Posted by hg199 (78 posts) -

 

The card I am looking at is the Inno3D Nvidia GeForce 8800 GTS Overclocked: 640 MB, 320-bit GDDR3, 570 MHz core clock, 1800 MHz memory clock.

The reported frame rates are, for example:

 

Battlefield 2142 - 1600x1200x32, 4xAA, 8xAF, Max Quality: 45.2 FPS

Doom 3 - 1600x1200x32, 4xAA, 8xAF, Ultra Quality: 78.7 FPS

 

Is this good? Acceptable? Crap?

I am a TV editor, so I just work with video at 25 FPS!

 

For example, does any game that runs below 40 FPS look bad?

 

Thank you 

#2 Posted by coolmonkeykid (3276 posts) -

40 FPS and higher is good (for me at least). Some people demand at least 60 FPS. It depends on the game: with shooters you want a high frame rate, while in an RTS or an MMO 30 FPS is playable.

It's a very acceptable card.

#3 Posted by crazymonkey092 (974 posts) -

40 FPS and higher is good (for me at least). Some people demand at least 60 FPS. It depends on the game: with shooters you want a high frame rate, while in an RTS or an MMO 30 FPS is playable.

It's a very acceptable card.

coolmonkeykid

 

Pretty much what he said; I really like 60 the most.

#4 Posted by X360PS3AMD05 (36303 posts) -
Since I use vsync, I want a solid 60.
#5 Posted by theragu40 (3332 posts) -
Since I use vsync, I want a solid 60.
X360PS3AMD05
...of course you could always turn vsync off. For most people, 30 is going to be the 'playable' threshold, and 25 or so will be the cutoff where things start to become noticeably choppy during gameplay. For faster-paced shooters (think Unreal Tournament), higher is better. Single-player shooters can often be playable at only 30, however. Basically, the higher the better.
#6 Posted by X360PS3AMD05 (36303 posts) -
[QUOTE="X360PS3AMD05"]Since I use vsync, I want a solid 60.
theragu40
...of course you could always turn vsync off.

Then I notice tearing...
#7 Posted by theragu40 (3332 posts) -
[QUOTE="theragu40"][QUOTE="X360PS3AMD05"]Since I use vsync, I want a solid 60.
X360PS3AMD05
...of course you could always turn vsync off.

Then I notice tearing...

Yeah that's true. I guess I kind of ignore it. Maybe if my computer was a little faster, I would be a little pickier :).
#9 Posted by Zeke129 (11176 posts) -

Frame rate perception is subjective, and the ability to notice differences between frame rates is relative, not absolute: the gap you need to see a change grows as the numbers climb.

Some people are happy with 30, extremely happy with 60, ecstatic at 100, but then they stop noticing differences. Others DEMAND 60, become happy at 100, and are extremely happy at 250.

While I'm on the subject I'll clear up another myth: "The eyes can't see frame rates over xxx!"

False! You can notice any frame rate, even 2000 and above. It's simply relative. You'll notice a difference between 10 and 20, 40 and 60, 100 and 500, even 1000 and 10000. But not 1000 and 1100.
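Here's a quick illustrative sketch (Python, just to make the "relative" part concrete; my own numbers, nothing official): convert FPS into frame times, and equal-looking FPS gaps shrink to tiny differences in how long each frame actually sits on screen.

    # Equal FPS gaps matter less as FPS climbs: compare frame times.
    def frame_time_ms(fps):
        # Milliseconds each frame stays on screen at a given frame rate.
        return 1000.0 / fps

    for low, high in [(10, 20), (40, 60), (100, 500), (1000, 1100)]:
        delta = frame_time_ms(low) - frame_time_ms(high)
        print(f"{low} -> {high} FPS: each frame gets {delta:.2f} ms shorter")

    # Output:
    # 10 -> 20 FPS: each frame gets 50.00 ms shorter
    # 40 -> 60 FPS: each frame gets 8.33 ms shorter
    # 100 -> 500 FPS: each frame gets 8.00 ms shorter
    # 1000 -> 1100 FPS: each frame gets 0.09 ms shorter

Note the middle two rows: 40 -> 60 and 100 -> 500 are roughly the same frame-time change, which is exactly the "relative" point.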

 

Just find something you're happy with. I'll stop typing now.

#10 Posted by Runningflame570 (10388 posts) -
30 FPS is acceptable, 45 FPS is kind of odd IMO, and well, 78 FPS... I doubt your monitor can even display that (mine only goes up to 75 Hz), so it's very good.
#11 Posted by hg199 (78 posts) -

 

Thanks so much for all your help. You have answered my questions! 

#12 Posted by nohnaimer (513 posts) -
Hell, I even find 20 FPS playable; 30, 40, 50, 60, and 70 all look and feel identical to me.
#13 Posted by FPS_Addict (817 posts) -
Drop your screen res in-game to 1280x1024. It will still look amazing and will most likely give you about a 10-20 FPS increase.
#14 Posted by Masterfulfish (230 posts) -

Since I use vsync, I want a solid 60.
X360PS3AMD05

What is vsync?

#15 Posted by FPS_Addict (817 posts) -

Vsync synchronizes the GPU's frame output with the monitor's refresh cycle, preventing tearing and other artifacts on screen.

The catch is that with double buffering it can roughly halve your frame rate whenever the card can't keep up with the monitor's refresh rate.

You don't need it, but if you can use it, it is quite ideal.
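To picture the halving, here's an idealized Python sketch of double-buffered vsync on a 60 Hz monitor (a simplification, not how any particular driver actually works): the displayed rate snaps down to the refresh rate divided by a whole number.

    # Idealized double-buffered vsync: a frame that misses a refresh
    # waits for the next one, so the displayed rate is refresh_hz / n.
    def vsynced_fps(render_fps, refresh_hz=60):
        n = 1
        while refresh_hz / n > render_fps:  # not ready in time; wait another refresh
            n += 1
        return refresh_hz / n

    for fps in (90, 60, 59, 45, 29):
        print(f"renders at {fps} FPS -> displays at {vsynced_fps(fps):.0f} FPS")

    # renders at 90 FPS -> displays at 60 FPS
    # renders at 60 FPS -> displays at 60 FPS
    # renders at 59 FPS -> displays at 30 FPS
    # renders at 45 FPS -> displays at 30 FPS
    # renders at 29 FPS -> displays at 20 FPS

Triple buffering relaxes this snapping, which is why some people enable it alongside vsync.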

#16 Posted by dbowman (6836 posts) -
25+ is very playable. People who demand 60+ are just being stupid. You can't tell the difference between 40 and 60.
#17 Posted by Large_Soda (8658 posts) -

 

The card I am looking at is the Inno3D Nvidia GeForce 8800 GTS Overclocked: 640 MB, 320-bit GDDR3, 570 MHz core clock, 1800 MHz memory clock.

The reported frame rates are, for example:

Battlefield 2142 - 1600x1200x32, 4xAA, 8xAF, Max Quality: 45.2 FPS

Doom 3 - 1600x1200x32, 4xAA, 8xAF, Ultra Quality: 78.7 FPS

Is this good? Acceptable? Crap?

I am a TV editor, so I just work with video at 25 FPS!

For example, does any game that runs below 40 FPS look bad?

 

Thank you

hg199

You are a TV editor? Shouldn't you be working at 29.97 FPS (NTSC)?

#18 Posted by BounceDK (7381 posts) -
40+ and nothing less.
#19 Posted by coolmonkeykid (3276 posts) -

25+ is very playable. People who demand 60+ are just being stupid. You can't tell the difference between 40 and 60.
dbowman

I can tell the difference between 50 and 60 sometimes...

Anyway, what is tearing?

#20 Posted by filmography (3202 posts) -
A good frame rate is 40; higher than that is overkill, really.
#21 Posted by Flame_Co (620 posts) -
How do you measure frame rates without using Fraps? Fraps always SUPER SLOWS my computer...
#22 Posted by coolmonkeykid (3276 posts) -
There are certain commands you can use. In Steam games, type cl_showfps 2 in the console.
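The measurement itself is trivial, by the way. Here's an illustrative Python sketch of what any FPS overlay is basically doing (the class is hypothetical, not Fraps's actual code): count frames and report the average once a second.

    import time

    class FPSCounter:
        # Counts rendered frames and reports the average once per second.
        def __init__(self):
            self.frames = 0
            self.window_start = time.perf_counter()

        def tick(self):
            # Call once per rendered frame; returns FPS once a second, else None.
            self.frames += 1
            elapsed = time.perf_counter() - self.window_start
            if elapsed >= 1.0:
                fps = self.frames / elapsed
                self.frames = 0
                self.window_start = time.perf_counter()
                return fps
            return None

    # In the render loop:
    #     fps = counter.tick()
    #     if fps is not None:
    #         print(f"{fps:.1f} FPS")

Counting frames costs next to nothing; it's video capture that slows things down.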
#23 Posted by Masterfulfish (230 posts) -

Or (in CS:S anyway, haven't tried others) net_graph 3.

 

#24 Posted by coolmonkeykid (3276 posts) -
Most console games run at a locked frame rate of 30 FPS.
#25 Posted by crazymonkey092 (974 posts) -

[QUOTE="X360PS3AMD05"]Since I use vsync, I want a solid 60.
Masterfulfish

What is vsync?

 

Vsync (vertical sync) caps your frame rate because every frame has to stay in sync with the monitor's refresh rate. The downside is that the game has to wait on the monitor, which can add input lag; your mouse can feel sluggish and other things can get messed up. Honestly, though, you probably won't notice unless you're playing a fast-paced game. I never use vsync; I just use a refresh-rate forcing program that keeps the in-game refresh rate the same as on the desktop, and I never get tearing anymore.

#26 Posted by doomsdaydave11 (1159 posts) -

Dude, I'm playing modern games with my cheapo 7900 GS and I'm getting:

AoE3 (1440x900, maxed, no AA): 65+ FPS

CoH (1440x900, low, no AA, 2xAF): 30 FPS

CoD2 (1440x900, maxed, no AA, 4xAF): 35 FPS

They all look beautiful and run smoothly. Go for it.

 

#27 Posted by doomsdaydave11 (1159 posts) -

Most console games run at a locked frame rate of 30 FPS.
coolmonkeykid

Yeah. I notice much, much smoother gameplay on a PC. Even if you're playing an Xbox game on an Xbox 360, it still looks and runs the same as it would on the old-school Xbox.

#28 Posted by Flame_Co (620 posts) -

There are certain commands you can use. In Steam games, type cl_showfps 2 in the console.
coolmonkeykid

But what about other games? I'd like to know my FPS in S.T.A.L.K.E.R., Supreme Commander, and Half-Life 2.

#29 Posted by WhOOmpa260 (600 posts) -

Frame rate perception is subjective, and the ability to notice differences between frame rates is relative, not absolute: the gap you need to see a change grows as the numbers climb.

Some people are happy with 30, extremely happy with 60, ecstatic at 100, but then they stop noticing differences. Others DEMAND 60, become happy at 100, and are extremely happy at 250.

While I'm on the subject I'll clear up another myth: "The eyes can't see frame rates over xxx!"

False! You can notice any frame rate, even 2000 and above. It's simply relative. You'll notice a difference between 10 and 20, 40 and 60, 100 and 500, even 1000 and 10000. But not 1000 and 1100.

Just find something you're happy with. I'll stop typing now.

Zeke129

Totally 100% false. The human eye cannot distinguish between frames going higher than 30-35 FPS; if you don't know anything about physics or biology, then don't state something you don't know as fact. For instance, if you look at a light bulb, it is alternating at 50 cycles per second, which (by your logic) you would see as individual cycles happening very quickly, i.e. rapid changes in light intensity. This is obviously not true: you only see a continuous stream of light at a particular intensity, and yet the light is alternating 50 times a second (between brighter and dimmer, because the light intensity is modeled by a sine curve).

Anyway, back to the topic: yes, those FPS are fine.

#30 Posted by doomsdaydave11 (1159 posts) -
[QUOTE="Zeke129"]

Frame rate perception is subjective, and the ability to notice differences between frame rates is relative, not absolute: the gap you need to see a change grows as the numbers climb.

Some people are happy with 30, extremely happy with 60, ecstatic at 100, but then they stop noticing differences. Others DEMAND 60, become happy at 100, and are extremely happy at 250.

While I'm on the subject I'll clear up another myth: "The eyes can't see frame rates over xxx!"

False! You can notice any frame rate, even 2000 and above. It's simply relative. You'll notice a difference between 10 and 20, 40 and 60, 100 and 500, even 1000 and 10000. But not 1000 and 1100.

Just find something you're happy with. I'll stop typing now.

WhOOmpa260

Totally 100% false. The human eye cannot distinguish between frames going higher than 30-35 FPS; if you don't know anything about physics or biology, then don't state something you don't know as fact. For instance, if you look at a light bulb, it is alternating at 50 cycles per second, which (by your logic) you would see as individual cycles happening very quickly, i.e. rapid changes in light intensity. This is obviously not true: you only see a continuous stream of light at a particular intensity, and yet the light is alternating 50 times a second (between brighter and dimmer, because the light intensity is modeled by a sine curve).

Anyway, back to the topic: yes, those FPS are fine.

I don't agree with either of you. I notice a huge difference between 35 FPS and 65 FPS, but no difference in anything over 75 FPS.

#31 Posted by WhOOmpa260 (600 posts) -
[QUOTE="WhOOmpa260"][QUOTE="Zeke129"]

Frame rate perception is subjective, and the ability to notice differences between frame rates is relative, not absolute: the gap you need to see a change grows as the numbers climb.

Some people are happy with 30, extremely happy with 60, ecstatic at 100, but then they stop noticing differences. Others DEMAND 60, become happy at 100, and are extremely happy at 250.

While I'm on the subject I'll clear up another myth: "The eyes can't see frame rates over xxx!"

False! You can notice any frame rate, even 2000 and above. It's simply relative. You'll notice a difference between 10 and 20, 40 and 60, 100 and 500, even 1000 and 10000. But not 1000 and 1100.

Just find something you're happy with. I'll stop typing now.

doomsdaydave11

Totally 100% false. The human eye cannot distinguish between frames going higher than 30-35 FPS; if you don't know anything about physics or biology, then don't state something you don't know as fact. For instance, if you look at a light bulb, it is alternating at 50 cycles per second, which (by your logic) you would see as individual cycles happening very quickly, i.e. rapid changes in light intensity. This is obviously not true: you only see a continuous stream of light at a particular intensity, and yet the light is alternating 50 times a second (between brighter and dimmer, because the light intensity is modeled by a sine curve).

Anyway, back to the topic: yes, those FPS are fine.

I don't agree with either of you. I notice a huge difference between 35 FPS and 65 FPS, but no difference in anything over 75 FPS.

Well, like I said, that's impossible. Maybe because you see the FPS number increase, you think you can see a difference, but I'm telling you that you can't. The only explanation I have for you is that you're not human :p, you must be an alien with super eyesight... the next Superman???

#32 Posted by Skullheart (2054 posts) -
[QUOTE="coolmonkeykid"]

40 FPS and higher is good (for me at least). Some people demand at least 60 FPS. It depends on the game: with shooters you want a high frame rate, while in an RTS or an MMO 30 FPS is playable.

It's a very acceptable card.

crazymonkey092

 

Pretty much what he said; I really like 60 the most.

Yeah, but in games like Call of Duty 2, 60 FPS won't do crap online. Oh, you can still be good, but not nearly as good as the people getting 150+.

#33 Posted by Targzissian (1228 posts) -

I'm pretty sure a lot of people can tell the difference between, say, 40 FPS and 60 FPS. But 60 FPS versus 100 FPS is more of a challenge, although some people can still tell the difference. Now how about the difference between 100 FPS and 200 FPS? No way. The human eye and nervous system do have their limitations, in spite of individual variation.

Oh, and I played S.T.A.L.K.E.R.: Shadow of Chernobyl all the way through (to one of the endings at the Power Plant) at 20 to 25 FPS, and I was quite happy and immersed in it. Sure, it would have felt a little smoother at 45 FPS, but I would have had to sacrifice visual quality or resolution, and I am not willing to do that. Maybe if I were playing an online shooter against human opponents I would make that sacrifice, but not when playing by myself.
#34 Posted by JSDempsey (1803 posts) -
[QUOTE="crazymonkey092"][QUOTE="coolmonkeykid"]

40 FPS and higher is good (for me at least). Some people demand at least 60 FPS. It depends on the game: with shooters you want a high frame rate, while in an RTS or an MMO 30 FPS is playable.

It's a very acceptable card.

Skullheart

 

Pretty much what he said; I really like 60 the most.

Yeah, but in games like Call of Duty 2, 60 FPS won't do crap online. Oh, you can still be good, but not nearly as good as the people getting 150+.

A difference that unnoticeable will not make you a better player, if that's what you mean.

#35 Posted by Targzissian (1228 posts) -
One correction about S.T.A.L.K.E.R.: specifically, I was at 20 to 25 FPS in the outdoor areas. Underground and in buildings, where the draw distances are much shorter, it got up to over 35 FPS or thereabouts. This is a common thing. I found significant differences in the frame rate in The Elder Scrolls IV: Oblivion when roaming the forests and countryside, as opposed to being in towns or dungeons where you don't have all the trees and foliage to render.
#36 Posted by WhOOmpa260 (600 posts) -

I'm pretty sure a lot of people can tell the difference between, say, 40 FPS and 60 FPS. But 60 FPS versus 100 FPS is more of a challenge, although some people can still tell the difference. Now how about the difference between 100 FPS and 200 FPS? No way. The human eye and nervous system do have their limitations, in spite of individual variation.

Oh, and I played S.T.A.L.K.E.R.: Shadow of Chernobyl all the way through (to one of the endings at the Power Plant) at 20 to 25 FPS, and I was quite happy and immersed in it. Sure, it would have felt a little smoother at 45 FPS, but I would have had to sacrifice visual quality or resolution, and I am not willing to do that. Maybe if I were playing an online shooter against human opponents I would make that sacrifice, but not when playing by myself.
Targzissian

I keep telling you guys, it's physically impossible to see a difference at anything above 35 FPS. I'm betting the only difference you're seeing in games is that you never drop below 35 FPS, so you don't notice any lag during play, compared with dipping just below 35 FPS in open areas or heavy shooting, etc.

My proof is the humble light bulb, operating at 50 Hz in Europe/Australia and 60 Hz in America; this means the filament is alternating its light intensity from dim to bright 50 or 60 times per second. Everyone who says they can tell the difference between 35 FPS and 100 FPS must also be able to notice the fluctuations in intensity of a 50 Hz or 60 Hz light. So if you think you can tell the difference between 40 FPS and 100 FPS, then please, have a look at a light bulb. If you see not a continuous stream of constant-intensity light but rapid fluctuations of bright and dim, then congratulations, because you just made a giant leap in the evolution of the human eye and nervous system!

#37 Posted by Chris_53 (5436 posts) -
For me, 30 is the minimum, although I can tolerate it if it goes down to 25, even though it looks choppy. I like good graphics, so it's something I have to tolerate at times. I've been using a 7600 GT and I'm upgrading to a 7950 GT soon, as I find an 8800 GTS too expensive and I would need a new case and PSU. Plus I'm not getting Vista yet. Anyway, when I get the 7950 GT I can have good graphics and good frame rates, yay.
#38 Posted by Large_Soda (8658 posts) -
[QUOTE="Targzissian"]

I'm pretty sure a lot of people can tell the difference between, say, 40 FPS and 60 FPS. But 60 FPS versus 100 FPS is more of a challenge, although some people can still tell the difference. Now how about the difference between 100 FPS and 200 FPS? No way. The human eye and nervous system do have their limitations, in spite of individual variation.

Oh, and I played S.T.A.L.K.E.R.: Shadow of Chernobyl all the way through (to one of the endings at the Power Plant) at 20 to 25 FPS, and I was quite happy and immersed in it. Sure, it would have felt a little smoother at 45 FPS, but I would have had to sacrifice visual quality or resolution, and I am not willing to do that. Maybe if I were playing an online shooter against human opponents I would make that sacrifice, but not when playing by myself.
WhOOmpa260

I keep telling you guys, it's physically impossible to see a difference at anything above 35 FPS. I'm betting the only difference you're seeing in games is that you never drop below 35 FPS, so you don't notice any lag during play, compared with dipping just below 35 FPS in open areas or heavy shooting, etc.

My proof is the humble light bulb, operating at 50 Hz in Europe/Australia and 60 Hz in America; this means the filament is alternating its light intensity from dim to bright 50 or 60 times per second. Everyone who says they can tell the difference between 35 FPS and 100 FPS must also be able to notice the fluctuations in intensity of a 50 Hz or 60 Hz light. So if you think you can tell the difference between 40 FPS and 100 FPS, then please, have a look at a light bulb. If you see not a continuous stream of constant-intensity light but rapid fluctuations of bright and dim, then congratulations, because you just made a giant leap in the evolution of the human eye and nervous system!

Well, since you are an internet dweller, I don't believe you; nay, CAN'T believe you. I don't buy the light bulb theory: you are essentially stating that the alternating filament is the same as the light switching off and on so fast we cannot see it, just like we can't see the effect of interlacing on television; we aren't noticing the change from odd to even fields. But that isn't applicable to frame rates. I can see the difference between film (24 FPS) and video (30 FPS). I may not be able to tell you what specific frame rate something is running at, but I can distinguish between the two based on the fluidity of the motion.

A light bulb never changes; it is always bright and motionless. A video game is not static in terms of motion or content, so being able to see a difference in frame rate is totally possible. Again, not the ability to say "this is 45 FPS and that is 93 FPS", but being able to tell a difference.

If you have a 360, download the DiRT demo, which runs at 30 FPS, and the Forza demo, which runs at 60 FPS, and tell me there is no difference in the motion.

Or at least give a link to some genius who knows all and has this figured out. This debate can be settled about as easily as "Does God exist?".