Why don't devs ever try to go for a steady 45 FPS?

This topic is locked from further discussion.

#1 Posted by IcyFlamez96 (1355 posts) -

Just curious

#2 Posted by stereointegrity (10919 posts) -
good question
#3 Posted by Rocker6 (13358 posts) -

I'd like a technical explanation as well... I think it has something to do with the TV refresh rate.

#4 Posted by seanmcloughlin (38219 posts) -

I actually don't know why there's no middle ground. I feel like I should know, though, because there's a noticeable difference between 30 and 45, and I don't know why it's always 30 or 60. It probably has something to do with refresh rates and interlacing, as most TVs are 60Hz. 

#5 Posted by XVision84 (13807 posts) -

Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30. 

#6 Posted by NFJSupreme (5501 posts) -
30 FPS is the middle ground.
#7 Posted by silversix_ (16882 posts) -
we need a true master warlock nerd to answer this question, because 45fps on a controller would've been absolute perfection when it comes to the fps/visuals balance.
#8 Posted by seanmcloughlin (38219 posts) -

Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30. 

XVision84

AFAIK American TVs and Japanese TVs have refresh rates of 30fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.

But European TVs have different refresh rates, like 25, 50 and 100.

But I could be totally wrong

#9 Posted by hartsickdiscipl (14787 posts) -

[QUOTE="XVision84"]

Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30. 

seanmcloughlin

AFAIK American TVs and Japanese TVs have refresh rates of 30fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.

But European TVs have different refresh rates, like 25, 50 and 100.

But I could be totally wrong

 

I've played some games that run at a constant 40-50 fps on a 60hz screen before.  It looked fine.  

#10 Posted by seanmcloughlin (38219 posts) -

[QUOTE="seanmcloughlin"]

[QUOTE="XVision84"]

Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30. 

hartsickdiscipl

AFAIK American TVs and Japanese TVs have refresh rates of 30fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.

But European TVs have different refresh rates, like 25, 50 and 100.

But I could be totally wrong

 

I've played some games that run at a constant 40-50 fps on a 60hz screen before.  It looked fine.  

Was it Vsynced?

#11 Posted by Kinthalis (5340 posts) -

In order to prevent screen tearing you need to use vsync (a rendering technique that syncs the frame rate to the refresh rate of your display). The typical double-buffered version means you need to run at either the display's refresh rate (60 for most TVs) or half that (30). If the frame rate dips below 60 it'll go straight to 30, which is a noticeable drop.

 

For this reason most devs on consoles optimize for 30 fps. There's no real reason to go for 45 fps, because then they would have to drop double-buffered vsync, and those 15 fps can instead mean adding shadows/detail to objects here and there, improving the graphics over performance.

 

There is another option: triple-buffered vsync. That would allow smooth frame rates anywhere between 30 and 60. But it takes up more memory, so it's not something the current gen consoles can muster.

 

Next gen consoles might. Which is good, because that means more PC games will feature triple-buffered vsync out of the box! Currently on PC only some games support triple-buffered vsync natively. If you want to use it with other titles you need to use third party utilities.
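The frame-rate quantization from double-buffered vsync can be sketched in a few lines of Python. The constants assume a 60 Hz display; this is an illustration, not any engine's actual code:

```python
import math

# Double-buffered vsync on a 60 Hz display: a frame that misses a vblank
# must wait for the next one, so the displayed rate snaps to 60/1, 60/2,
# 60/3... (60, 30, 20 fps) -- there is no stable 45.
REFRESH_HZ = 60
VBLANK_MS = 1000 / REFRESH_HZ  # ~16.67 ms between refreshes

def effective_fps(render_ms: float) -> float:
    """Displayed fps under double-buffered vsync for a given render time."""
    refreshes_waited = math.ceil(render_ms / VBLANK_MS)  # whole vblanks consumed per frame
    return REFRESH_HZ / refreshes_waited

print(effective_fps(10))  # 60.0 -> fast enough to hit every refresh
print(effective_fps(22))  # 30.0 -> a ~45 fps frame time quantizes down to 30
print(effective_fps(40))  # 20.0
```

This is why a renderer averaging 45 fps internally still presents at 30 once double-buffered vsync is on.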

#12 Posted by hartsickdiscipl (14787 posts) -

[QUOTE="hartsickdiscipl"]

[QUOTE="seanmcloughlin"]

AFAIK American TVs and Japanese TVs have refresh rates of 30fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.

But European TVs have different refresh rates, like 25, 50 and 100.

But I could be totally wrong

seanmcloughlin

 

I've played some games that run at a constant 40-50 fps on a 60hz screen before.  It looked fine.  

Was it Vsynced?

 

No.

#13 Posted by seanmcloughlin (38219 posts) -

No.

hartsickdiscipl

Strange cos I've done the same and saw lots of tearing

#14 Posted by ShadowriverUB (5515 posts) -
HDMI only supports 24Hz, 30Hz and 60Hz, plus analog 50Hz for PAL and 60Hz for NTSC, both of which can be safely halved into 25Hz and 30Hz. If you're missing a frame per refresh, you're going to get a duplicate frame or tearing. That's why developers aim for a specific refresh rate, and they pick 30Hz as the safer option.

One of the main reasons games became region locked in the PAL/NTSC days is the refresh rate, which required optimized code for each region. DDR, for example, is buggy when you use PAL <-> NTSC hacks.

You don't see that on PC for simple reasons: 1. VGA and DVI support more refresh rate options (not to mention resolutions too); 2. developers have no idea which one you use, so they make the game universal and let you set it, or set it for you.

Higher FPS is a useless waste of power if your refresh rate is lower than your fps; the display simply isn't able to show all the frames the device generates.
#15 Posted by Darth_Kane (2966 posts) -

[QUOTE="hartsickdiscipl"]

[QUOTE="seanmcloughlin"]

AFAIK American TVs and Japanese TVs have refresh rates of 30fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.

But European TVs have different refresh rates, like 25, 50 and 100.

But I could be totally wrong

seanmcloughlin

 

I've played some games that run at a constant 40-50 fps on a 60hz screen before.  It looked fine.  

Was it Vsynced?

A game only needs to be Vsynced if it goes above 60fps. And even then the only difference is some screen-tearing

Anyway, current gen consoles can barely run 30fps

#16 Posted by seanmcloughlin (38219 posts) -

[QUOTE="seanmcloughlin"]

[QUOTE="hartsickdiscipl"]

 

I've played some games that run at a constant 40-50 fps on a 60hz screen before.  It looked fine.  

Darth_Kane

Was it Vsynced?

A game only needs to be Vsynced if it goes above 60fps. And even then the only difference is some screen-tearing

Anyway, current gen consoles can barely run 30fps

I was always under the impression vsync was needed when the frames weren't near the refresh rate of your TV. When I played Crysis 1 on my TV through my PC, it ran at 35 fps and had screen tearing like crazy

#17 Posted by clyde46 (47872 posts) -

[QUOTE="XVision84"]

Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30. 

seanmcloughlin

AFAIK American TVs and Japanese TVs have refresh rates of 30fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.

But European TVs have different refresh rates, like 25, 50 and 100.

But I could be totally wrong

Taken from Wikipedia. "NTSC color encoding is used with the System M television signal, which consists of 29.97 interlaced frames of video per second. Each frame is composed of two fields, each consisting of 262.5 scan lines, for a total of 525 scan lines. 483 scan lines make up the visible raster. The remainder (the vertical blanking interval) are used for synchronization and vertical retrace. This blanking interval was originally designed to simply blank the receiver's CRT to allow for the simple analog circuits and slow vertical retrace of early TV receivers. However, some of these lines may now contain other data such as closed captioning and vertical interval timecode (VITC). In the complete raster (disregarding half lines due to interlacing) the even-numbered scan lines (every other line that would be even if counted in the video signal, e.g. {2,4,6,...,524}) are drawn in the first field, and the odd-numbered (every other line that would be odd if counted in the video signal, e.g. {1,3,5,...,525}) are drawn in the second field, to yield a flicker-free image at the field refresh frequency of approximately 59.94 Hertz (actually 60 Hz/1.001). For comparison, 576i systems such as PAL-B/G and SECAM use 625 lines (576 visible), and so have a higher vertical resolution, but a lower temporal resolution of 25 frames or 50 fields per second."
#18 Posted by seanmcloughlin (38219 posts) -

[QUOTE="seanmcloughlin"]

[QUOTE="XVision84"]

Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30. 

clyde46

AFAIK American TVs and Japanese TVs have refresh rates of 30fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.

But European TVs have different refresh rates, like 25, 50 and 100.

But I could be totally wrong

Taken from Wikipedia. (long NTSC quote snipped; see post #17)

I ain't readin all that :P Am I right or wrong? 

#19 Posted by the_bi99man (11062 posts) -

Meh. Just game on PC. Where the framerate is (at least somewhat) under  your  control.

#20 Posted by Kinthalis (5340 posts) -

[QUOTE="seanmcloughlin"]

[QUOTE="hartsickdiscipl"]

 

I've played some games that run at a constant 40-50 fps on a 60hz screen before.  It looked fine.  

Darth_Kane

Was it Vsynced?

A game only needs to be Vsynced if it goes above 60fps. And even then the only difference is some screen-tearing

Anyway, current gen consoles can barely run 30fps

 

This is not true.

 

Screen tearing is an artifact that occurs whenever the GPU and display are not in sync, regardless of frame rate. You can be running at EXACTLY 60 FPS or 30 FPS, but if the display and GPU aren't synced, you are still going to get screen tearing artifacts.
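A rough Python sketch of why an unsynced swap tears regardless of frame rate: the swap lands partway through scanout, and everything below that scanline comes from the new frame. All numbers here are illustrative assumptions:

```python
# No vsync: the front buffer can be swapped while the display is mid-scanout.
# The scanline being drawn at the moment of the swap is where the tear
# appears, whatever the frame rate is.
REFRESH_HZ = 60
LINES = 1080  # visible scanlines on an assumed 1080p display

def tear_line(swap_time_ms: float) -> int:
    """Scanline being scanned out when the buffer swap lands (0 = top)."""
    refresh_ms = 1000 / REFRESH_HZ
    progress = (swap_time_ms % refresh_ms) / refresh_ms  # fraction of scanout done
    return int(progress * LINES)

print(tear_line(8.0))  # 518 -> a swap ~8 ms into scanout tears mid-screen
print(tear_line(0.0))  # 0   -> a swap landing right on the vblank doesn't tear visibly
```

Hitting exactly 60 fps doesn't help by itself: unless the swap is timed to the vblank (which is what vsync does), it can land at any point of the scanout.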

#21 Posted by ShadowriverUB (5515 posts) -
[QUOTE="seanmcloughlin"]

[QUOTE="XVision84"]

Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30. 

clyde46

AFAIK American TVs and Japanese TVs have refresh rates of 30fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.

But European TVs have different refresh rates, like 25, 50 and 100.

But I could be totally wrong

Taken from Wikipedia. (long NTSC quote snipped; see post #17)

This is only for analog signal transmission; current consoles work with digital output, where the same standards are used worldwide, based on MPEG (for TV station transmissions) and the HDMI specification. So in general this is obsolete for this gen and the next
#22 Posted by DrTrafalgarLaw (4487 posts) -

[QUOTE="Darth_Kane"]

[QUOTE="seanmcloughlin"]

Was it Vsynced?

Kinthalis

A game only needs to be Vsynced if it goes above 60fps. And even then the only difference is some screen-tearing

Anyway, current gen consoles can barely run 30fps

 

This is not true.

 

Screen tearing is an artifact that occurs whenever the GPU and display are not in sync, regardless of frame rate. You can be running at EXACTLY 60 FPS or 30 FPS, but if the display and GPU aren't synced, you are still going to get screen tearing artifacts.

Yup. Vsync in the options has a built-in 60fps lock. Don't tell me hermits don't even know their stuff? :?

#23 Posted by clyde46 (47872 posts) -
[QUOTE="clyde46"][QUOTE="seanmcloughlin"]

AFAIK American TVs and Japanese TVs have refresh rates of 30fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.

But European TVs have different refresh rates, like 25, 50 and 100.

But I could be totally wrong

ShadowriverUB
Taken from Wikipedia. (long NTSC quote snipped; see post #17)

This is only for analog signal transmission; current consoles work with digital output, where the same standards are used worldwide, based on MPEG (for TV station transmissions) and the HDMI specification. So in general this is obsolete for this gen and the next

Yea, it's a lot easier now that the world has moved to digital.
#24 Posted by R4gn4r0k (17620 posts) -

I don't think every console game out there is either 30FPS or 60FPS. It's just that those two framerates get talked about the most, for example in FPS discussions.

#25 Posted by PAL360 (27428 posts) -

I think it has something to do with screen tearing. It would be cool, though. At least it would guarantee no more sub30fps.

#26 Posted by clyde46 (47872 posts) -

[QUOTE="clyde46"][QUOTE="seanmcloughlin"]

AFAIK American TVs and Japanese TVs have refresh rates of 30fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.

But European TVs have different refresh rates, like 25, 50 and 100.

But I could be totally wrong

seanmcloughlin

Taken from Wikipedia. (long NTSC quote snipped; see post #17)

I ain't readin all that :P Am I right or wrong? 

Yes, yes you are correct.
#27 Posted by Riverwolf007 (24428 posts) -

who needs it?

all i do on pc is crank to max then reduce the settings until i hit 30 fps anyway.

i devote my assets to where they do the most good and to me extra fps is just not worth it.

#28 Posted by R4gn4r0k (17620 posts) -

I think it has something to do with screen tearing. It would be cool, though. At least it would guarantee no more sub30fps.

PAL360

a steady 30FPS would also guarantee no sub 30fps :P

#29 Posted by seanmcloughlin (38219 posts) -

[QUOTE="seanmcloughlin"]

[QUOTE="clyde46"] Taken from Wikipedia. "NTSC color encoding is used with the System M television signal, which consists of 29.97 interlaced frames of video per second. Each frame is composed of two fields, each consisting of 262.5 scan lines, for a total of 525 scan lines. 483 scan lines make up the visible raster. The remainder (the vertical blanking interval) are used for synchronization and vertical retrace. This blanking interval was originally designed to simply blank the receiver's CRT to allow for the simple analog circuits and slow vertical retrace of early TV receivers. However, some of these lines may now contain other data such as closed captioning and vertical interval timecode (VITC). In the complete raster (disregarding half lines due to interlacing) the even-numbered scan lines (every other line that would be even if counted in the video signal, e.g. {2,4,6,...,524}) are drawn in the first field, and the odd-numbered (every other line that would be odd if counted in the video signal, e.g. {1,3,5,...,525}) are drawn in the second field, to yield a flicker-free image at the field refresh frequency of approximately 59.94 Hertz (actually 60 Hz/1.001). For comparison, 576i systems such as PAL-B/G and SECAM use 625 lines (576 visible), and so have a higher vertical resolution, but a lower temporal resolution of 25 frames or 50 fields per second."clyde46

I ain't readin all that :P Am I right or wrong? 

Yes, yes you are correct.

Nailed it :cool:

#30 Posted by BeardMaster (1686 posts) -

It will cause screen tearing. And even with vsync, if a TV is running at 60hz but only receiving 45fps, 15 of those frames will have to be shown twice to meet the 60hz refresh rate. This results in judder.
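The pull-down cadence behind that judder can be illustrated with a short Python sketch (the frame-index mapping and constants are illustrative assumptions):

```python
# 45 source frames/sec shown on a 60 Hz vsynced display: each refresh shows
# the newest completed frame, so every third source frame is held for two
# refreshes (a 2:1:1 cadence) -- 15 duplicated frames per second.
SOURCE_FPS = 45
REFRESH_HZ = 60

def displayed_frames(n_refreshes: int) -> list[int]:
    """Source-frame index shown on each of the first n display refreshes."""
    return [refresh * SOURCE_FPS // REFRESH_HZ for refresh in range(n_refreshes)]

shown = displayed_frames(8)
print(shown)  # [0, 0, 1, 2, 3, 3, 4, 5] -> frames 0 and 3 each held twice
```

The uneven hold times are what reads as judder: at 30 fps every frame is held for exactly two refreshes, so motion is slower but even.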

#31 Posted by ManatuBeard (1121 posts) -

Variable framerates have been used successfully. Both Metal Gear Rising: Revengeance and God of War: Ascension have a max of 60fps and an average framerate of 40-50, and neither of them has screen tearing, because triple buffering is used.

I think it would be a great solution for nextgen.

#32 Posted by PCgameruk (1512 posts) -

Framerate does not sell games

Graphics sell games

30fps is fine for devs and not fine for PC gamers

 

Simple Facts...

#33 Posted by Wasdie (50925 posts) -

Not much of a point. At 45 fps you're still going to have noticeable input lag and general sluggishness. Not as bad as 30 fps, but not as good as 60fps.

The reason devs just cap at 30 if they can't maintain 60fps is that it gives them more headroom to fill those 30 fps with graphical features and whatnot. 30 is pretty much the minimum you can go before it becomes noticeably choppy. There is also the issue of screen tearing; it's easier to work in halves. At 45 fps, 15 of those frames could potentially be split between two cycles of the display.

If you can't hit 60 steady there is no point for settling in between. Might as well drop it to 30 fps and increase the graphical fidelity or decrease graphical fidelity and run at 60 fps. 

It probably also has a lot to do with game timers and how the game is synced. 30 is half of 60, and 60 is generally considered the game running at 100% speed; it's easier to work in halves than in other fractions. That's more of a guess though, as I don't program video games. I know oldschool games were locked at 60 fps because the game logic was synced to run at that speed; any faster or slower would physically speed the game up or slow it down. This is why emulators have to lock framerates, and you can't increase the FPS of an emulated game without really weird things happening.
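The fixed-timestep behaviour described above can be sketched in Python; this is a toy illustration, not any real engine's loop:

```python
# A game whose simulation advances a fixed step per rendered frame runs
# faster when the frame rate rises -- the reason emulators must cap fps.
TICK = 1 / 60  # seconds of game time per frame, assuming a 60 fps lock

def simulate(frames: int, velocity: float) -> float:
    """Distance travelled after `frames` rendered frames at `velocity` units/sec."""
    position = 0.0
    for _ in range(frames):
        position += velocity * TICK  # one fixed step per frame, no real clock
    return position

# One real second at the intended 60 fps vs. the same game forced to 120 fps:
print(simulate(60, 10.0))   # ~10 units: intended speed
print(simulate(120, 10.0))  # ~20 units: the game runs twice as fast
```

Modern engines avoid this by scaling each update by the measured frame time (or decoupling simulation from rendering), which is also what makes arbitrary framerates like 45 viable on the simulation side.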

#34 Posted by PAL360 (27428 posts) -

[QUOTE="PAL360"]

I think it has something to do with screen tearing. It would be cool, though. At least it would guarantee no more sub30fps.

R4gn4r0k

a steady 30FPS would also guarantee no sub 30fps :P

But there is no such thing as steady 30fps :twisted:

#35 Posted by Hseptic (1566 posts) -
I wish more developers went for 60fps. It just looks so much more fluid to me.
#36 Posted by ShadowriverUB (5515 posts) -
I wish more developers went for 60fps. It just looks so much more fluid to me.Hseptic
Next gen is gonna go 30fps either way, to squeeze the juices out of the device to run newer games. People don't mind 30fps (except a minority in places like SW ;p) and they like nicer looking graphics more, so developers will prefer to make graphics nicer rather than washing them out for 60fps
#37 Posted by ZoomZoom2490 (3943 posts) -

TEARING HAPPENS WHEN GPU RENDERS FRAMES ABOVE THE REFRESH RATES, THIS IS WHY VSYNC IS USED, TO LOCK THE FPS.

YOU CAN USE VSYNC ONLY FOR 30FPS AND 60FPS, DISPLAYS DONT SUPPORT 45FPS.

GAMES CAN BE PROGRAMMED TO RUN AT 45FPS BUT WITH VSYNC OFF, BUT DEVS HAVE TO CODE THE GAME VERY CAREFULLY NOT TO GO ABOVE 45FPS.

#38 Posted by ManatuBeard (1121 posts) -

TEARING HAPPENS WHEN GPU RENDERS FRAMES ABOVE THE REFRESH RATES, THIS IS WHY VSYNC IS USED, TO LOCK THE FPS.

 

YOU CAN USE VSYNC ONLY FOR 30FPS AND 60FPS, DISPLAYS DONT SUPPORT 45FPS.

 

GAMES CAN BE PROGRAMMED TO RUN AT 45FPS BUT WITH VSYNC OFF, BUT DEVS HAVE TO CODE THE GAME VERY CAREFULLY NOT TO GO ABOVE 45FPS.

 

 

ZoomZoom2490

Why are you shouting?

#39 Posted by 04dcarraher (20863 posts) -

(full quote of Wasdie's post #33 snipped)

Wasdie
Ya, I remember a handful of games running at crazy fps, like the games were in fast-forward mode :P Enabling vsync fixed that problem.