You'll be getting 75 FPS if your monitor is set to a 75 Hz refresh rate.
To check what your monitor can do, right-click the desktop, hit Settings, and go into your monitor's advanced settings.
[QUOTE="out0v0rder"]60 Hz for LCDs, I believe (including HDTVs, not just computer monitors).[/QUOTE]My LCD is at 75 Hz, and most decent LCDs can do 75 as well.
[QUOTE="out0v0rder"]60 Hz for LCDs, I believe (including HDTVs, not just computer monitors).[/QUOTE]
Actually, it depends on the monitor. Yours only does a 60 Hz refresh, while mine does 75 Hz.
Anything above your monitor's refresh rate is redundant! Samsung has a TV that can display 120 fps; it's 120 Hz. I saw it at Best Buy and it was $4500. We can't see above 100 frames of motion anyways.
[QUOTE="ThePlothole"]You can't see 100 FPS anyways. It's far more than the eye/brain can perceive.[/QUOTE]br0kenrabbit: Wrong.
If you are using a CRT you can see more than 100, but with an LCD it's 60-75 tops. This is one of the reasons I never understood why anyone uses SLI. 30 fps is all you really need.
[QUOTE="ThePlothole"]You can't see 100 FPS anyways. It's far more than the eye/brain can perceive.[/QUOTE][QUOTE="br0kenrabbit"]Wrong.[/QUOTE]Deihmos:
If you are using a CRT you can see more than 100, but with an LCD it's 60-75 tops. This is one of the reasons I never understood why anyone uses SLI. 30 fps is all you really need.
[QUOTE="ThePlothole"]You can't see 100 FPS anyways. It's far more than the eye/brain can perceive.[/QUOTE][QUOTE="br0kenrabbit"]Wrong.[/QUOTE][QUOTE="Deihmos"]If you are using a CRT you can see more than 100, but with an LCD it's 60-75 tops. This is one of the reasons I never understood why anyone uses SLI. 30 fps is all you really need.[/QUOTE]br0kenrabbit: I was referring to the 'human eye' comment; that's why it's quoted.
I was responding to the one that said you cannot see it. If you play a game that registers 100 fps on a CRT and then move to an LCD, it won't look the same.
[QUOTE="ThePlothole"]You can't see 100 FPS anyways. It's far more than the eye/brain can perceive.[/QUOTE][QUOTE="br0kenrabbit"]Wrong.[/QUOTE]
HA! I've been there before. That explains a lot. When I'm playing Crysis, some parts report 999.9 FPS, so apparently we could "see" unlimited FPS and not even be able to tell the difference.
[QUOTE="IndianaJosh"]LCDs do not have refresh rates. They have response times.[/QUOTE]GTSaiyanjin2:
They have both.
Yes, it's true for all systems, as HDTVs usually refresh at 60 Hz. Only the newest (and most expensive) models out there refresh at 120 Hz.
My old CRT monitor could manage 85 Hz, while my newer LCD refreshes at 75 Hz.
[QUOTE="ThePlothole"]You can't see 100 FPS anyways. It's far more than the eye/brain can perceive.[/QUOTE][QUOTE="br0kenrabbit"]Wrong.[/QUOTE][QUOTE="Pro_wrestler"]HA! I've been there before. That explains a lot. When I'm playing Crysis, some parts report 999.9 FPS, so apparently we could "see" unlimited FPS and not even be able to tell the difference.[/QUOTE]
Here's a fun little program I found a long time ago that lets you visually distinguish different FPS rates in the same scene:
http://novicee.com/edu/fps_compare.zip
[QUOTE="ThePlothole"]You can't see 100 FPS anyways. It's far more than the eye/brain can perceive.[/QUOTE]Cali3350:
False; that limit is around 340 fps, according to the Air Force.
Can you provide a reputable link to back that up?
[QUOTE="ThePlothole"]You can't see 100 FPS anyways. It's far more than the eye/brain can perceive.[/QUOTE][QUOTE="br0kenrabbit"]Wrong.[/QUOTE][QUOTE="Deihmos"]If you are using a CRT you can see more than 100, but with an LCD it's 60-75 tops. This is one of the reasons I never understood why anyone uses SLI. 30 fps is all you really need.[/QUOTE]
But Crysis won't go above 30 fps without SLI'd GTXes/Ultras, lol (a GT might manage it).
60 fps is enough for genuinely smooth and fluid movement throughout gameplay. There is absolutely no reason for a game dev to sacrifice quality to shoot for anything higher than this. It is 100% justifiable for a dev to shoot for 60 fps over 30, however, as there is a huge difference there. In this day and age, ANY game running at 30 fps is ridiculous. Do games like Gears of War play and look fine at 30 fps? Yes. Do they play and look a hell of a lot better at 60 fps? Oh yes.
MS and Sony should make 60 fps required, IMO, but I know not many agree with me.
[QUOTE="ThePlothole"]You can't see 100 FPS anyways. It's far more than the eye/brain can perceive.[/QUOTE][QUOTE="Cali3350"]False; that limit is around 340 fps, according to the Air Force.[/QUOTE][QUOTE="ThePlothole"]Can you provide a reputable link to back that up?[/QUOTE]
Can you provide a link to back up your claim about 100fps?
[QUOTE="ThePlothole"]You can't see 100 FPS anyways. It's far more than the eye/brain can perceive.[/QUOTE][QUOTE="Cali3350"]False; that limit is around 340 fps, according to the Air Force.[/QUOTE][QUOTE="ThePlothole"]Can you provide a reputable link to back that up?[/QUOTE][QUOTE="mjarantilla"]Can you provide a link to back up your claim about 100 fps?[/QUOTE]
He said it according to his perceptions, you said something you should be able to prove.
[QUOTE="GTSaiyanjin2"]Will I still be able to see 100 FPS in a game if my monitor only has a 75 Hz refresh rate? And what's the max refresh rate LCDs can do?[/QUOTE]The game can still render 100 fps even if your monitor isn't refreshing at 100 Hz; you just won't see more frames than the monitor can display. The other problem is that whenever the framerate doesn't match the refresh rate, you get screen tearing.
The short answer is: you can only see what the monitor can display, and the monitor can display a new image usually 60 times a second.
Pushing extra frames from the GPU to the monitor will actually result in image tearing, because the rate at which the GPU draws frames does not correspond to the rate at which the monitor draws its pixels.
For example, consider a monitor at 60 Hz (60 frames a second), and a GPU that can output 90 FPS for a particular scene.
At 1/60th of a second, your monitor has drawn one frame and starts on frame 2. Your GPU has already drawn one and a half frames at this point, so the monitor, trying to display its 2nd frame, will use what the GPU currently has of GPU frame 2.
At 3/120ths of a second (i.e., halfway through the monitor's second refresh interval), the GPU has finished the 2nd frame and is now working on the 3rd. The monitor, which only cares about what the GPU is currently sending it, will draw the rest of its second frame (it's on its second frame up until 2/60ths of a second) using the GPU's 3rd frame.
Hence on its frame 2, your monitor gets half its image from GPU frame 2 and half from GPU frame 3. If the camera is moving quickly, the screen "tears" because the two images are different.
The way to solve this? Vertical sync makes the GPU hold each finished frame until the monitor is ready for it, so frames are always in sync. That eliminates the tearing, but locks your frame rate to the refresh rate of the monitor.
Triple buffering also does the trick, at the expense of GPU memory: the GPU renders into extra buffers in the background, so the monitor always receives a fully-rendered frame instead of half of each.
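The walkthrough above can be sketched in a few lines. This is purely illustrative arithmetic (not a real graphics API): it assumes the GPU starts frame 0 at t=0 and renders at a steady 90 fps, and asks which GPU frame the monitor is sampling at the top and halfway point of each 60 Hz refresh.

```python
# Which GPU frame feeds each half of the screen when a 90 FPS GPU
# drives a 60 Hz monitor with v-sync off? (Toy model, steady rates.)
REFRESH_HZ = 60
GPU_FPS = 90

def gpu_frame_at(t):
    """Index of the frame the GPU is producing at time t (frame 0 starts at t=0)."""
    return int(t * GPU_FPS)

for refresh in range(4):
    top_time = refresh / REFRESH_HZ              # monitor starts scanning out
    bottom_time = (refresh + 0.5) / REFRESH_HZ   # halfway down the screen
    top, bottom = gpu_frame_at(top_time), gpu_frame_at(bottom_time)
    tear = " <-- tear" if top != bottom else ""
    print(f"refresh {refresh}: top = GPU frame {top}, bottom = GPU frame {bottom}{tear}")
```

Running it shows refresh 1 sampling GPU frame 1 at the top and GPU frame 2 at the bottom, which is exactly the torn frame described above; with v-sync on, top and bottom would always come from the same frame.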
[QUOTE="Cali3350"]60 fps is enough for genuinely smooth and fluid movement throughout gameplay. There is absolutely no reason for a game dev to sacrifice quality to shoot for anything higher than this. It is 100% justifiable for a dev to shoot for 60 fps over 30, however, as there is a huge difference there. In this day and age, ANY game running at 30 fps is ridiculous. Do games like Gears of War play and look fine at 30 fps? Yes. Do they play and look a hell of a lot better at 60 fps? Oh yes. MS and Sony should make 60 fps required, IMO, but I know not many agree with me.[/QUOTE]
A minimum of 60 fps should be in effect. If Infinity Ward can pump out COD4 at 60 fps with those graphics and effects on both consoles, I'm sure first parties could come out with similar results.
When I found out I could unlock the framerate in BioShock and play at 60+ fps, I was surprised. I don't play it any other way now.
Of course, but the comfort of an LCD is just fantastic.
The monitor is even more important than the best card, because if your screen can't display your card's video signal well, you are going to get bad image quality.
Also, the native resolution is the only resolution you should use for games. A native resolution that's too high for your card to drive is simply not an option for gaming.
The best choice for now is 1440x900 or 1280x1024, both of which are higher than the Xbox 360's 720p.
[QUOTE="out0v0rder"]60 Hz for LCDs, I believe (including HDTVs, not just computer monitors).[/QUOTE][QUOTE="mikemil828"]Actually, it depends on the monitor. Yours only does a 60 Hz refresh, while mine does 75 Hz.[/QUOTE]
Yeah, most monitors do 75 Hz...
[QUOTE="PS2_ROCKS"]Refresh rate DOESN'T EVEN APPLY TO LCD MONITORS.[/QUOTE]Yes it does. It's the rate at which the panel's grid electrically refreshes. Each pixel remains lit, but it receives updates at the refresh rate. The response time is how long a pixel takes to change after it receives an update.
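The distinction is easy to see in the numbers. A small sketch (just arithmetic, using the example panel specs from this thread) showing that refresh interval and response time measure two different things:

```python
# Refresh interval: how often a pixel receives a new value.
# Response time: how fast the pixel can transition to that value.
def refresh_interval_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 75, 120):
    print(f"{hz} Hz panel: new update every {refresh_interval_ms(hz):.1f} ms")

# A "2 ms" LCD switches pixel state in ~2 ms -- comfortably within the
# ~16.7 ms a 60 Hz panel waits between updates. Fast response prevents
# ghosting; it does not add extra refreshes.
response_ms = 2.0
print(f"{response_ms} ms response << {refresh_interval_ms(60):.1f} ms refresh interval")
```

So a 2 ms panel at 60 Hz still shows at most 60 distinct images per second; the low response time just means each transition between those images is sharp.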
The spec that you should be concerned with is your *minimum* FPS. Ideally, you want your minimum FPS to be at least 40, especially for fast-paced games, if you are to consider yourself a competitive player. Once your minimum FPS is above 30-40, everything else just adds additional layers of smoothness, until you hit the 60-75 fps limit of most LCD monitors.
When people talk about getting 100 fps, they're usually not talking about what is displayed on their screen, but about what is rendered by their GPU. For example, if your GPU can *average* 120 fps in a game, then you should turn up the graphics options, anti-aliasing, et cetera, since you have "room" to improve the image quality while still keeping your minimum framerate high.
That's all. The game isn't particularly "limited" by your monitor; it's far more limited by your GPU, as achieving 60+ fps as a minimum in modern games takes some doing.
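The "minimum FPS headroom" idea above can be sketched with a few lines of arithmetic. The frame times here are made-up sample data, just to show why the minimum, not the average, is the number that decides whether you can raise settings:

```python
# Judge a settings choice by the *minimum* FPS it sustains, not the average.
# Frame times (ms) are invented sample data for illustration.
frame_times_ms = [9.0, 8.5, 10.2, 8.8, 22.0, 9.5, 8.9, 19.5, 9.1, 8.7]

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = min(fps_per_frame)

print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.0f} fps")

# The average looks healthy (~88 fps), but the worst frame dips to ~45 fps.
# That minimum is what you actually feel in play, and it's what tells you
# whether there is headroom to turn quality up.
if min_fps >= 40:
    print("minimum holds above 40 fps: room to turn up settings")
```

With the sample data the average is around 88 fps while the minimum is about 45 fps; a 120 fps *average* with deep dips would feel worse than these numbers suggest.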
[QUOTE="danielsmith2020"]Once you get above 60 fps you don't get any real benefit from higher frame rates; even if the human eye can tell the difference up to around 100 fps, you don't get any benefit from going higher than that. The only benefit of going above 100 fps is for your e-penis.[/QUOTE]
And for competitive PC gaming. PC gamers playing online will often turn their graphical settings down to achieve maximum FPS, especially in faster-paced shooters like the Quakes and UTs of the world.
[QUOTE="coolviper2003"]You can play games at a higher framerate than the refresh rate of the monitor. It's called disabling v-sync. But the tradeoff for getting higher fps is screen tearing.[/QUOTE]DMWhiteDragon:
It's impossible to display any faster than your hardware can support in real terms. I could write a small app with 4 polygons and get perhaps 1450 FPS, yet it won't go faster than this 2 ms LCD can handle, no matter what the application reports as its rendering speed.
Turning off v-sync only makes the application *report* a faster FPS; that's how fast it can *process* frames without sync limits, not how fast it will display them. So it's really quite pointless, and as you say, it can cause screen tearing when left off (though v-sync uses a little extra processing, so some like it off).
So you won't be playing games at a higher framerate at all. You will be playing games with a higher rendering speed and the same maximum refresh rate of your monitor, which caps the frames per second you can actually see. And the FPS you can see is the only thing that matters.
[QUOTE="coolviper2003"]You can play games at a higher framerate then the refresh rate of the monitor. It's called disabling v-sync. But the tradeoff for getting higher fps is screen tearing.DMWhiteDragonIts impossible to display any faster than your hardware can support "in real terms", i can go make a small app that has 4 polygons and get perhaps 1450 FPS .... yet it wont be going faster than this 2ms LCD can handle no matter what the application reports as its rendering speed. Turning off v-sync only makes the application *report* faster FPS, as thats how fast it can *process* without syncing limits not how fast it will display. So its really quite pointless and as you say can have screen tearing if left off (tho it uses a little extra processing so some like it off) So you wont be playing games with a higher framerate at all. You will be playing games with a higher rendering speed and the same maximum refresh rate of your monitor which keeps your frames per second that you can see...no higher at all and FPS you can see is the only thing that matters. It doesn't change the fact that the hardware can render the game at a higher FPS then the refresh of the monitor. The FPS of a game is not limited by the monitor. The FPS WE see is.
[QUOTE="coolviper2003"]It doesn't change the fact that the hardware can render the game at a higher FPS then the refresh of the monitor. The FPS of a game is not limited by the monitor. The FPS WE see is.[/QUOTE]DMWhiteDragon:
I still find "you can play games at a higher framerate than the refresh rate of the monitor" to be a false statement, as you don't PLAY the game any faster at all. Playing is about the interaction between you and the game, and that is limited by your display. So yes, the real FPS is always limited by the monitor. Playing a game at 120 fps without v-sync and playing at 75 fps with v-sync (assuming the fps never changes) gives you the same viewable FPS that you will PLAY with, except there might be tearing without v-sync, so it's potentially worse.
It's similar to a car that could go 350 mph with no friction, suspended in mid-air, yet can only go 145 mph on the road. Would you then go around saying the speed of your car is 350 mph? That's all I'm saying here: disabling v-sync doesn't increase your fps. A rendered number higher than your monitor's refresh rate means you are WASTING your PC's power for no reason at all. As for playing like that? Just plain crazy, heh.
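The whole back-and-forth boils down to one line of arithmetic. A minimal sketch of that claim (toy model: it ignores tearing and assumes steady rates), using the numbers from this exchange:

```python
# The frames that reach your eyes are capped by the monitor's refresh rate,
# regardless of how fast the GPU renders.
def visible_fps(render_fps, refresh_hz):
    """Frames per second that can actually be displayed."""
    return min(render_fps, refresh_hz)

print(visible_fps(1450, 75))  # 4-polygon toy app on a 75 Hz LCD
print(visible_fps(120, 75))   # v-sync off, 120 fps rendered
print(visible_fps(75, 75))    # v-sync on at the refresh rate
print(visible_fps(45, 75))    # GPU-bound scene: here the GPU is the limit
```

The first three cases all display 75 fps, which is DMWhiteDragon's point; only in the last case, where the GPU renders slower than the refresh rate, does the GPU become the visible limit.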