Just curious
I actually don't know why there's no middle ground. I feel like I should, though, because there's a noticeable difference between 30 and 45 and I don't know why it's 30 or 60. It probably has something to do with refresh rates and interlacing, as most TVs are 60Hz.
Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30.
Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30.
XVision84
AFAIK American TVs and Japanese TVs have refresh rates of 30 fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.
But European TVs have different refresh rates, like 25, 50 and 100.
But I could be totally wrong.
[QUOTE="XVision84"]
Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30.
seanmcloughlin
AFAIK American TVs and Japanese TVs have refresh rates of 30 fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.
But European TVs have different refresh rates, like 25, 50 and 100.
But I could be totally wrong.

I've played some games that run at a constant 40-50 fps on a 60Hz screen before. It looked fine.
[QUOTE="seanmcloughlin"]
[QUOTE="XVision84"]
Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30.
hartsickdiscipl
AFAIK American TVs and Japanese TVs have refresh rates of 30 fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.
But European TVs have different refresh rates, like 25, 50 and 100.
But I could be totally wrong.

I've played some games that run at a constant 40-50 fps on a 60Hz screen before. It looked fine.
Was it Vsynced?
In order to prevent screen tearing you need to use Vsync (a rendering technique that syncs the frame rate to the refresh rate of your display). The typical double-buffered version means you need to run at either the display's refresh rate (60 for most TVs) or half that (30). If the frame rate dips below 60 it'll go straight to 30, which is a noticeable drop.

For this reason most devs on consoles optimize for 30 fps. There's no real reason to go for 45 fps, because then they would have to avoid double-buffered v-sync, and those 15 fps can instead go toward adding shadows/detail to objects here and there, favoring graphics over performance.

There is another option: triple-buffered Vsync. That would allow a smooth FPS gradient from 30 to 60 or whatever. But that takes up more memory, so it's not something the current-gen consoles can muster.

Next-gen consoles might. Which is good, because that means more PC games will feature triple-buffered Vsync out of the box! Currently on PC only some games support triple-buffered v-sync natively. If you want to use it with other titles you need to use third-party utilities.
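A minimal sketch of the quantization described above, assuming a 60 Hz display, double-buffered v-sync (the swap can only happen on a vblank), and made-up per-frame render times; none of this code comes from the thread:

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between vblanks

def effective_fps(render_ms: float) -> float:
    """Frame rate actually delivered under double-buffered v-sync.

    The swap waits for the next vblank after rendering finishes,
    so every frame is held for a whole number of refresh intervals.
    """
    intervals = max(1, math.ceil(render_ms / VBLANK_MS))
    return REFRESH_HZ / intervals

for ms in (10.0, 16.0, 17.0, 22.0, 33.0, 34.0):
    print(f"{ms:5.1f} ms/frame -> {effective_fps(ms):.1f} fps")
# 16 ms -> 60 fps, but 17 ms -> 30 fps, and 34 ms -> 20 fps:
# the output snaps to 60/1, 60/2, 60/3... with nothing at 45.
```

The point of the sketch: going one millisecond over budget doesn't cost one frame per second, it costs thirty, which is exactly why devs target a steady 30 rather than something in between.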
[QUOTE="hartsickdiscipl"]
[QUOTE="seanmcloughlin"]
AFAIK American TVs and Japanese TVs have refresh rates of 30 fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.
But European TVs have different refresh rates, like 25, 50 and 100.
But I could be totally wrong.
seanmcloughlin

I've played some games that run at a constant 40-50 fps on a 60Hz screen before. It looked fine.
Was it Vsynced?

No.
[QUOTE="hartsickdiscipl"]
[QUOTE="seanmcloughlin"]
AFAIK American TVs and Japanese TVs have refresh rates of 30 fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.
But European TVs have different refresh rates, like 25, 50 and 100.
But I could be totally wrong.
seanmcloughlin

I've played some games that run at a constant 40-50 fps on a 60Hz screen before. It looked fine.
Was it Vsynced?
A game only needs to be Vsynced if it goes above 60fps. And even then the only difference is some screen tearing.
Anyway, current gen consoles can barely run 30fps.
[QUOTE="seanmcloughlin"]
[QUOTE="hartsickdiscipl"]
I've played some games that run at a constant 40-50 fps on a 60Hz screen before. It looked fine.
Darth_Kane
Was it Vsynced?
A game only needs to be Vsynced if it goes above 60fps. And even then the only difference is some screen tearing.
Anyway, current gen consoles can barely run 30fps.
I was always under the impression Vsync was needed when the frames weren't near the refresh rate of your TV. When I played Crysis 1 on my TV through my PC before, it ran at 35 fps and it had screen tearing like crazy.
[QUOTE="XVision84"]
Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30.
seanmcloughlin
AFAIK American TVs and Japanese TVs have refresh rates of 30 fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.
But European TVs have different refresh rates, like 25, 50 and 100.
But I could be totally wrong.
Taken from Wikipedia. "NTSC color encoding is used with the System M television signal, which consists of 29.97 interlaced frames of video per second. Each frame is composed of two fields, each consisting of 262.5 scan lines, for a total of 525 scan lines. 483 scan lines make up the visible raster. The remainder (the vertical blanking interval) are used for synchronization and vertical retrace. This blanking interval was originally designed to simply blank the receiver's CRT to allow for the simple analog circuits and slow vertical retrace of early TV receivers. However, some of these lines may now contain other data such as closed captioning and vertical interval timecode (VITC). In the complete raster (disregarding half lines due to interlacing) the even-numbered scan lines (every other line that would be even if counted in the video signal, e.g. {2,4,6,...,524}) are drawn in the first field, and the odd-numbered (every other line that would be odd if counted in the video signal, e.g. {1,3,5,...,525}) are drawn in the second field, to yield a flicker-free image at the field refresh frequency of approximately 59.94 Hertz (actually 60 Hz/1.001). For comparison, 576i systems such as PAL-B/G and SECAM use 625 lines (576 visible), and so have a higher vertical resolution, but a lower temporal resolution of 25 frames or 50 fields per second."
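The key figures in that quote reduce to a little arithmetic; a quick sanity check, written as an illustration rather than anything from the post:

```python
# NTSC timing, derived from the numbers quoted above.
FIELD_HZ = 60 / 1.001      # field refresh frequency: "60 Hz/1.001"
FRAME_HZ = FIELD_HZ / 2    # two interlaced fields make one frame
LINES_PER_FRAME = 525

print(f"fields per second: {FIELD_HZ:.2f}")               # ~59.94
print(f"frames per second: {FRAME_HZ:.2f}")               # ~29.97
print(f"line rate: {FRAME_HZ * LINES_PER_FRAME:.0f} Hz")  # ~15734
# PAL-B/G and SECAM instead use 625 lines at 25 frames (50 fields) per second.
```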
[QUOTE="seanmcloughlin"][QUOTE="XVision84"]
Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30.
clyde46
AFAIK American TVs and Japanese TVs have refresh rates of 30 fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.
But European TVs have different refresh rates, like 25, 50 and 100.
But I could be totally wrong.
Taken from Wikipedia. "NTSC color encoding is used with the System M television signal, which consists of 29.97 interlaced frames of video per second. ..."
I ain't readin all that :P Am I right or wrong?
[QUOTE="seanmcloughlin"]
[QUOTE="hartsickdiscipl"]
I've played some games that run at a constant 40-50 fps on a 60Hz screen before. It looked fine.
Darth_Kane
Was it Vsynced?
A game only needs to be Vsynced if it goes above 60fps. And even then the only difference is some screen tearing.
Anyway, current gen consoles can barely run 30fps.

This is not true.

Screen tearing is an artifact that occurs whenever the GPU and display are not in sync, regardless of frame rate. You can be running at 60 FPS or 30 FPS EXACTLY, but if the display and GPU aren't synced, you are still going to get screen tearing artifacts.
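A toy model of that claim, assuming a 60 Hz panel that scans out top to bottom over roughly 16.7 ms; note that the frame rate never appears in the condition for tearing, only where the buffer swap lands relative to the scan:

```python
SCANOUT_MS = 1000.0 / 60   # time the display spends drawing one frame
ROWS = 1080

def tear_row(swap_time_ms):
    """Return the row where a tear appears, or None if the swap hit the vblank."""
    phase = swap_time_ms % SCANOUT_MS      # where in the scan the swap landed
    row = int(ROWS * phase / SCANOUT_MS)
    return None if row == 0 else row

# A perfectly steady 60 fps source still tears if its swaps are phase-shifted:
print(tear_row(2 * SCANOUT_MS))        # None (swap aligned with the vblank)
print(tear_row(2 * SCANOUT_MS + 8.0))  # ~518 (same 60 fps cadence, but torn)
```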
[QUOTE="seanmcloughlin"][QUOTE="XVision84"]
Yeah it's got to be refresh rates, otherwise if devs wanted to go for maximum graphics on old hardware, they'd try to stick with 24fps gameplay instead of 30.
clyde46
AFAIK American TVs and Japanese TVs have refresh rates of 30 fps, or 60 interlaced. So it's basically multiples of that. 45 wouldn't really look right.
But European TVs have different refresh rates, like 25, 50 and 100.
But I could be totally wrong.
Taken from Wikipedia. "NTSC color encoding is used with the System M television signal, which consists of 29.97 interlaced frames of video per second. ..."
This is only for analog signal transmission; current consoles work with digital output, which uses the same standards worldwide, based on MPEG (for TV station transmissions) and the HDMI specification. So in general this is obsolete for this gen and next gen.
[QUOTE="Darth_Kane"]
[QUOTE="seanmcloughlin"]
Was it Vsynced?
Kinthalis
A game only needs to be Vsynced if it goes above 60fps. And even then the only difference is some screen tearing.
Anyway, current gen consoles can barely run 30fps.

This is not true.

Screen tearing is an artifact that occurs whenever the GPU and display are not in sync, regardless of frame rate. You can be running at 60 FPS or 30 FPS EXACTLY, but if the display and GPU aren't synced, you are still going to get screen tearing artifacts.
Yup. V-sync in options has built-in 60fps-lock. Don't tell me hermits don't even know their stuff? :?

This is only for analog signal transmission; current consoles work with digital output, which uses the same standards worldwide, based on MPEG (for TV station transmissions) and the HDMI specification. So in general this is obsolete for this gen and next gen.
ShadowriverUB
Yea, it's a lot easier now that the world has moved to digital.

I ain't readin all that :P Am I right or wrong?
seanmcloughlin
Yes, yes you are correct.

Who needs it? All I do on PC is crank to max, then reduce the settings until I hit 30 fps anyway. I devote my assets to where they do the most good, and to me extra fps is just not worth it.

Yes, yes you are correct.
clyde46
Nailed it :cool:
Anything that isn't synced to the refresh rate will cause screen tearing, but even with vsync, if a TV is running at 60Hz but only receiving 45fps, 15 of those frames will be shown twice in order to meet the 60Hz refresh rate. This results in judder.
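A quick sketch of that cadence, assuming plain frame repetition on a fixed 60 Hz display (no motion interpolation):

```python
from collections import Counter

REFRESH_HZ, SOURCE_FPS = 60, 45

# Which source frame is on screen at each of the 60 refreshes in one second?
shown = [r * SOURCE_FPS // REFRESH_HZ for r in range(REFRESH_HZ)]
print(shown[:8])    # [0, 0, 1, 2, 3, 3, 4, 5]: an uneven 2-1-1 hold pattern

holds = Counter(Counter(shown).values())
print(dict(holds))  # {2: 15, 1: 30}: 15 of the 45 frames held for two refreshes
```

That 2-1-1 hold pattern is the judder: motion advances unevenly even though no frames are ever dropped.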
Variable framerates have been used successfully. Both Metal Gear Rising: Revengeance and God of War: Ascension have a max of 60fps and an average framerate of 40-50, and neither of them has screen tearing because triple buffering is used.
I think it would be a great solution for next gen.
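A rough way to see why triple buffering lets the average float, extending the earlier double-buffering sketch; this is an approximation (each displayed frame is still held a whole number of refreshes, but the mix of holds lets the average track render time):

```python
REFRESH_HZ = 60

def avg_fps_triple_buffered(render_ms: float) -> float:
    # The GPU keeps rendering into a third buffer instead of stalling,
    # so the average delivery rate tracks the render rate, capped at 60.
    return min(REFRESH_HZ, 1000.0 / render_ms)

for ms in (17.0, 22.0, 33.0):
    print(f"{ms:.0f} ms/frame -> ~{avg_fps_triple_buffered(ms):.0f} fps average")
# 22 ms/frame -> ~45 fps average, instead of collapsing to 30 as with double buffering.
```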
Not much of a point. At 45 fps you're still going to have noticeable input lag and general sluggishness. Not as bad as 30 fps, but not as good as 60fps.
The reason devs just cap at 30 if they can't maintain 60 fps is that it gives them more headroom to fill that 30 fps with graphical features and whatnot. 30 is pretty much the minimum you can go before it becomes noticeably choppy. There is also the issue of screen tearing; it's easier to work in halves again. At 45 fps, 15 of those frames could potentially be split between two cycles of the display.
If you can't hit 60 steady, there's no point in settling in between. Might as well drop it to 30 fps and increase the graphical fidelity, or decrease graphical fidelity and run at 60 fps.
There is probably a lot to do with game timers and how the game is synced. 30 is half of 60, and 60 is generally considered game speed running at 100%. It's easier to work in halves than in other fractions. That's more of a guess though, as I don't program video games. I know oldschool games were locked at 60 fps, as the game was synced to run at that speed. Any faster or slower would result in the game physically speeding up or slowing down. This is why emulators have to lock framerates, and you can't increase the FPS of an emulated game without really weird things happening.
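That last point is easy to demonstrate with two hypothetical update functions (my own illustration): a console-era frame-locked update, where the simulation advances a fixed amount per rendered frame, against a delta-time update that advances by elapsed wall time:

```python
def frame_locked_update(pos: float) -> float:
    # Classic approach: "move 1 unit per frame". Game speed is welded to
    # the frame rate, which is why speeding up an emulator fast-forwards
    # the whole game.
    return pos + 1.0

def delta_time_update(pos: float, dt: float) -> float:
    # Frame-rate independent: "move 60 units per second".
    return pos + 60.0 * dt

# Simulate one second of wall time at two frame rates:
for fps in (60, 120):
    locked = scaled = 0.0
    for _ in range(fps):
        locked = frame_locked_update(locked)
        scaled = delta_time_update(scaled, 1.0 / fps)
    print(f"{fps} fps: frame-locked moved {locked:.0f}, delta-time moved {scaled:.0f}")
# At 60 fps: 60 vs 60. At 120 fps: 120 vs 60, i.e. the frame-locked game runs at 2x speed.
```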
I wish more developers went for 60fps. It just looks so much more fluid to me.
Hseptic
Next gen is gonna go 30fps either way, to squeeze the juices out of the device to run newer games. Since people don't mind 30fps (except a minority in places like SW ;p) and prefer nicer-looking graphics, developers will choose to make graphics nicer rather than washing them out for 60fps.
TEARING HAPPENS WHEN THE GPU RENDERS FRAMES ABOVE THE REFRESH RATE. THIS IS WHY VSYNC IS USED, TO LOCK THE FPS.
YOU CAN USE VSYNC ONLY FOR 30FPS AND 60FPS; DISPLAYS DON'T SUPPORT 45FPS.
GAMES CAN BE PROGRAMMED TO RUN AT 45FPS WITH VSYNC OFF, BUT DEVS HAVE TO CODE THE GAME VERY CAREFULLY NOT TO GO ABOVE 45FPS.
TEARING HAPPENS WHEN THE GPU RENDERS FRAMES ABOVE THE REFRESH RATE. THIS IS WHY VSYNC IS USED, TO LOCK THE FPS. ...
ZoomZoom2490
Why are you shouting?
Not much of a point. At 45 fps you're still going to have noticeable input lag and general sluggishness. ... This is why emulators have to lock framerates, and you can't increase the FPS of an emulated game without really weird things happening.
Wasdie
Ya, I remember a handful of games being at crazy fps and the games were in like fast-forward mode :P Enabling V-sync fixed that problem.