Some say the human eye can't see the difference: 30fps vs 60fps

#101 Posted by PS4hasNOgames (1771 posts) -

What I want to know is if 1080p vs 1080i makes a difference, because I sure as hell can tell the diff between 30 and 60 fps... it's obvious.

#102 Edited by clyde46 (46634 posts) -

@ps4hasnogames said:

What I want to know is if 1080p vs 1080i makes a difference, because I sure as hell can tell the diff between 30 and 60 fps... it's obvious.

1080i is different from 1080p.

Within the designation "1080i", the i stands for interlaced scan. A frame of 1080i video consists of two sequential fields of 1920 horizontal and 540 vertical pixels. The first field consists of all odd-numbered TV lines and the second all even-numbered lines. Consequently the horizontal lines of pixels in each field are captured and displayed with a one-line vertical gap between them, so the lines of the next field can be interlaced between them, resulting in 1080 total lines. 1080i differs from 1080p, where the p stands for progressive scan, where all lines in a frame are captured at the same time. In native or pure 1080i, the two fields of a frame correspond to different instants (points in time), so motion portrayal is good (50 or 60 motion phases/second). This is true for interlaced video in general and can be easily observed in still images taken from fast motion scenes without appropriate deinterlacing. However, when 1080p material is captured at 25 or 30 frames/second it is converted to 1080i at 50 or 60 fields/second, respectively, for processing or broadcasting. In this situation both fields in a frame do correspond to the same instant. The field-to-instant relation is somewhat more complex for the case of 1080p at 24 frames/second converted to 1080i at 60 fields/second, as explained in the telecine article.

http://en.wikipedia.org/wiki/1080i
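
A rough way to picture the interlacing described above: each 1080i frame is carried as two 1920x540 fields that a display (or deinterlacer) weaves back together line by line. The sketch below is only a minimal illustration of that weave step; the array names and the use of random data are made up for demonstration and aren't from any particular video pipeline.

```python
import numpy as np

# Hypothetical example: the two fields of a single 1080i frame,
# each 540 lines x 1920 pixels (grayscale for simplicity).
field_odd = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)   # odd-numbered TV lines
field_even = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)  # even-numbered TV lines

# "Weave" deinterlacing: interleave the two fields into one 1080-line frame.
frame = np.empty((1080, 1920), dtype=np.uint8)
frame[0::2] = field_odd    # odd TV lines land on array rows 0, 2, 4, ...
frame[1::2] = field_even   # even TV lines land on array rows 1, 3, 5, ...

print(frame.shape)  # (1080, 1920)
```

If the two fields were captured at different instants (native interlaced video), fast motion shows up as the combing artifacts the quoted article mentions; if both fields came from the same progressive frame, the weave reconstructs it exactly.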

#103 Posted by cainetao11 (17771 posts) -

@tymeservesfate: dd214 and records are public record. But I'm not putting my full name on this board.

#104 Edited by PS4hasNOgames (1771 posts) -

@clyde46 said:

@ps4hasnogames said:

What I want to know is if 1080p vs 1080i makes a difference, because I sure as hell can tell the diff between 30 and 60 fps... it's obvious.

1080i is different from 1080p.


http://en.wikipedia.org/wiki/1080i

Thanks, I read that... and still don't get it lol, but thanks anyway. I ask because one of my TVs is 1080i and another is 1080p....

#105 Edited by cainetao11 (17771 posts) -

@clyde46: wow. For the 1000th time, there is a difference. I don't believe it's so huge. 480 to 1080 is huge.

#106 Posted by clyde46 (46634 posts) -

@ps4hasnogames said:

@clyde46 said:

@ps4hasnogames said:

What I want to know is if 1080p vs 1080i makes a difference, because I sure as hell can tell the diff between 30 and 60 fps... it's obvious.

1080i is different from 1080p.


http://en.wikipedia.org/wiki/1080i

Thanks, I read that... and still don't get it lol, but thanks anyway. I ask because one of my TVs is 1080i and another is 1080p....

720p=1080i

1080p >> 1080i

Better?

#107 Edited by tymeservesfate (1975 posts) -

@wis3boi said:

@Wasdie said:

Those GIFs are both at lower than 30fps on my PC, and I'll bet on most others. You can and always will be able to see the difference between 30fps and 60fps. Claiming that there is little or no difference is just staying ignorant at this point.

It's sad that in 2014 there are still people denying there is a difference all because of console fanboyism.

you have to view them on the actual linked site, where they are hosted as HTML5, not gifs

BINGO... the link is there for a reason. I even stated in the OP that the site has better video displays.

@JangoWuzHere said:

Why are your gifs so shit? Seriously, there are like a hundred other comparisons that are better than what you have.

Here's a better question... why did you just look at the gifs and not read anything in the opening post at all, or check the link?

These aren't really my gifs... that isn't my site... I just brought the debate to the board.

#108 Posted by clyde46 (46634 posts) -

@cainetao11 said:

@clyde46: wow. For the 1000th time, there is a difference. I don't believe it's so huge. 480 to 1080 is huge.

Calm down bro.... I was addressing the TC.

#109 Edited by cainetao11 (17771 posts) -

@clyde46: you're right....... I step away from the ledge

#110 Edited by tymeservesfate (1975 posts) -

@cainetao11 said:

@tymeservesfate: dd214 and records are public record. But I'm not putting my full name on this board.

Agreed, you shouldn't.

And I see your point completely... I don't see how anyone can speak against you after that answer lol. You shot some of these guns for real... while we're talking video games lol. Especially if those are your shooting range numbers.

#111 Posted by tymeservesfate (1975 posts) -

@clyde46 said:

@tymeservesfate said:

@cainetao11 said:

@wis3boi said:

@cainetao11 said:

@JangoWuzHere:

Play PC all the time. I guess it's relative to what one considers huge. I don't consider the difference huge. The difference from 480 to 1080p is huge, imo. But to each their own assessment.

It's double the pixel density...your eyes must be pretty terrible

Oh well, guess so. Still shot 35/40 and 4/4 at 300m, no scope, in the US Army. But if people really see differently then ok.

LMAO...that answer should shut anybody up in this thread XD.

Umm no, just because you can't see the difference doesn't negate all the studies and reports that say there is a difference.

I'm just saying... if those are his shooting range numbers, then his eyes are fine. I'm not picking one side or the other openly. But that does give him a little more weight behind his words.

#112 Posted by cainetao11 (17771 posts) -

@tymeservesfate: Sucks because I needed 36/40 to qualify expert. I was sharpshooter because I missed one. The thing I miss most now is going to the range. Especially being a NYer, where gun laws suck.

On topic, I do know there are differences in res and fps. Been playing through the Arkham games on my new PC, and they are beautiful. But I still loved them when I originally played them on PS3/360. I can't bring my PC to my brother's this weekend, but I'll still play Arkham on my old 360 that he has. I guess if people are into all the nuances and stuff, that's cool. I just have fun playing.

#113 Posted by jun_aka_pekto (16422 posts) -

@clyde46 said:

@jun_aka_pekto said:

@clyde46 said:
However, I could be very wrong and have mixed up Hertz and FPS altogether.

I'm also skeptical of the Hz and FPS being mixed together. Perhaps we should go back to basics, when framerates merely indicated how much reserve horsepower a system has to run a game fluidly.

Anyway, I grew up gaming when PC games rarely reached 24fps. So, 30fps is playable enough to me. Unfortunately, YT drops a lot of frames during uploading. So, this video seems jerkier than the original.

Dropping frames isn't really a YouTube problem, that's a recording and encoding problem. I always try to record at 60FPS whenever possible because if you do drop a frame or two, it's fixable.

It's probably a re-encoding problem when uploading a video to YT.

#114 Edited by jun_aka_pekto (16422 posts) -

I see subtle differences between 30fps and 60fps, both in motion and in the feel of the mouse. They go hand in hand with the principle that the more reserves a PC has, the smoother the game will be.

But, I don't really see the correlation with the refresh rates apart from tearing. I haven't had an LCD monitor that went past 75 Hz. But, I've had 21" and 22" CRT/aperture grill monitors that could do 120 Hz a little over a decade ago. The framerates didn't feel any different whether I was playing the game at 1024x768 @ 85 Hz or 1024x768 @ 120 Hz. I certainly never noticed any image quality differences like what some here have posted before.

#115 Edited by -Rhett81- (3569 posts) -

This is a better comparison. I can see the difference, personally:


#116 Posted by melonfarmerz (1189 posts) -

Why do people keep posting these? You realize there's a huge difference between watching a gif/video and what you actually see and feel when you're immersed in a game...

#117 Edited by harry_james_pot (10947 posts) -

First of all, both those gifs are running at what seems like 20 fps. >__>

And second, there's even a clear difference between 60 and 120 fps. If someone can't see the difference between 30 and 60, then there's something wrong with their eyes.. or monitor.

#118 Edited by melonfarmerz (1189 posts) -

@-Rhett81- said:

This is a better comparison. I can see the difference, personally:

This is a pretty good illustration actually. When you look at 30fps first, then 60, you don't see much of a difference but going back, you realize how smooth 60 was. Probably explains why a lot of console gamers say they can't see the difference while I can't play GTA 5 without getting dizzy.

#119 Edited by the_bi99man (11047 posts) -

@hoosier7 said:

Who cares about seeing the difference? I can't see much of a difference between the two, but I can sure as hell feel it in anything with high-sensitivity input. M&K in an FPS at 30fps is very noticeable.

This is a big part of it. Especially when the debate gets into 60+ territory. I can see the difference between 30 and 60 like night and day, but above 60, the improvement gets less visible to me. However, I can feel the difference between 60 and 80 or 100, and it's glorious.
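
One way to make the "feel it more than see it" point concrete is to look at frame times instead of frame rates. The toy calculation below is only a back-of-the-envelope sketch; it ignores engine pipelining, display latency, and V-Sync, and the "two frame times" worst case is a simplifying assumption, not a measurement.

```python
# Frame time and a rough worst-case delay for an input that just misses a frame.
for fps in (30, 60, 80, 100, 120, 144):
    frame_time_ms = 1000.0 / fps
    # In this toy model, an input arriving right after a frame starts isn't
    # reflected until the *next* frame finishes: roughly two frame times.
    worst_case_ms = 2 * frame_time_ms
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms/frame, "
          f"~{worst_case_ms:5.1f} ms worst-case input-to-display")
```

At 30fps each frame occupies about 33 ms versus about 17 ms at 60fps, which is a plausible reason the difference is easier to feel on a mouse than to spot in a short clip.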

#120 Edited by hrt_rulz01 (6244 posts) -

I can tell the difference, but it's not a deal breaker for me.

#121 Posted by MK-Professor (3829 posts) -

When you are actually playing the game you can see/feel the difference between 60fps and 120fps.

As for 30fps vs 60fps the difference is just massive, in fact is not even enjoyable to play games at 30fps.

#122 Edited by wis3boi (31472 posts) -

@the_bi99man said:

@hoosier7 said:

Who cares about seeing the difference? I can't see much of a difference between the two, but I can sure as hell feel it in anything with high-sensitivity input. M&K in an FPS at 30fps is very noticeable.

This is a big part of it. Especially when the debate gets into 60+ territory. I can see the difference between 30 and 60 like night and day, but above 60, the improvement gets less visible to me. However, I can feel the difference between 60 and 80 or 100, and it's glorious.

I just got two 144Hz monitors. Peasants be jelly of my framerates over 60

#123 Edited by kipsta77 (1016 posts) -

Oh, the denial... AHAH!

#124 Posted by geniobastardo (1294 posts) -

they say bullshit.

#125 Posted by AznbkdX (3226 posts) -

Honestly I noticed it when comparing after a few takes, but it was hardly anything to freak out about imo.

Weirdly enough, I came into this thread thinking there would be a greater noticeable difference (still a small one, though) than what I just saw, since I've never seen in-game tests before, but I guess not. Kind of strange since I've been 60fps gaming for a while now, but I guess I never really compared them, and only saw it as better than 30 just because. I think it's blown out of proportion tbh, which was also what I thought before coming in here.

Now below 30 fps is very noticeable to me, especially around the 20's area.

#126 Posted by tymeservesfate (1975 posts) -

@harry_james_pot said:

First of all, both those gifs are running at what seems like 20 fps. >__>

And second, there's even a clear difference between 60 and 120 fps. If someone can't see the difference between 30 and 60, then there's something wrong with their eyes.. or monitor.

There's a link to videos in the OP... I even said they are better quality than the gifs -_-

#127 Posted by FoxbatAlpha (7428 posts) -

The only thing I see is the OJ. He be all crouching down and looking scared.

#128 Edited by StormyJoe (5497 posts) -

@cainetao11 said:

@StormyJoe said:

@ribstaylor1 said:

@StormyJoe: When playing on my PS3 or even my computer on a 60-inch plasma TV, I can notice a difference between 720 and 1080p. I must have amazing eyes, since so many people can't seem to see with clarity. 720 on any game on my 60-inch is a blurry, jagged mess unless I crank the anti-aliasing up or do supersampling. 1080p is far better, clearing up the image and making it far more crisp. I can even take it a step higher on my 1440p monitor and get an even clearer and crisper image. Even on it I can notice a change sitting back a few feet between the three resolutions. There is a difference, and a noticeable one, for both resolution and frame rate. There is no real debate on this, just people on forums who don't know too much about anything or have horrible eyes talking out their asses about resolutions and frame rates.

Interestingly enough, the only people who talk about 1080p being such an incredible difference, are cows.

Look, I am a technophile - I have thousands and thousands of dollars tied to my home theater, and I can admit there isn't a big difference. I am not saying there is "no difference", but it's negligible. You people talk as though it is a difference between VHS and Blu Ray. So yes, your eyes defy medical science - perhaps you are an X-Man and don't know it.

Agreed. I am not a technopredator............... guy............ thing. But I guess my eyes are bad. I don't see this MASSIVE difference from 1080p to 720p. And some claim it's huge from 900p to 1080p. It's just favoritism that makes people exaggerate.

It's a colloquialism for someone who only buys high end electronics.

Here's the definition

#129 Edited by StormyJoe (5497 posts) -

@JangoWuzHere said:

@cainetao11 said:

@StormyJoe said:

@ribstaylor1 said:

@StormyJoe: When playing on my PS3 or even my computer on a 60-inch plasma TV, I can notice a difference between 720 and 1080p. I must have amazing eyes, since so many people can't seem to see with clarity. 720 on any game on my 60-inch is a blurry, jagged mess unless I crank the anti-aliasing up or do supersampling. 1080p is far better, clearing up the image and making it far more crisp. I can even take it a step higher on my 1440p monitor and get an even clearer and crisper image. Even on it I can notice a change sitting back a few feet between the three resolutions. There is a difference, and a noticeable one, for both resolution and frame rate. There is no real debate on this, just people on forums who don't know too much about anything or have horrible eyes talking out their asses about resolutions and frame rates.

Interestingly enough, the only people who talk about 1080p being such an incredible difference, are cows.

Look, I am a technophile - I have thousands and thousands of dollars tied to my home theater, and I can admit there isn't a big difference. I am not saying there is "no difference", but it's negligible. You people talk as though it is a difference between VHS and Blu Ray. So yes, your eyes defy medical science - perhaps you are an X-Man and don't know it.

Agreed. I am not a technopredator............... guy............ thing. But I guess my eyes are bad. I don't see this MASSIVE difference from 1080p to 720p. And some claim it's huge from 900p to 1080p. It's just favoritism that makes people exaggerate.

You don't have to be a technowhatever to see that there are a lot more pixels being rendered on screen. Go play a PC game and switch from 720p to 1080p. The difference is large.

Yeah, because your eyes are 12 inches from the screen...

#130 Posted by Motokid6 (5729 posts) -

@StormyJoe: That's not true at all, even with a desk. My PC's hooked up to my 50" plasma, however. I sit about 8 feet away and the difference between 720 and 1080 is night and day.

#131 Posted by intotheminx (703 posts) -

30fps is playable, but of course people want more.

#132 Posted by HalcyonScarlet (4520 posts) -
@Heil68 said:

so everyone should look at buying a PS4 instead of Xbone? Bloody brilliant mate!

Most current-gen-looking games seem to be struggling to get above 30fps.

#133 Posted by StormyJoe (5497 posts) -

@Motokid6 said:

@StormyJoe: That's not true at all, even with a desk. My PC's hooked up to my 50" plasma, however. I sit about 8 feet away and the difference between 720 and 1080 is night and day.

Whatever, man. You are full of crap, dude; there is no way it is "night and day". At that distance, the human eye can barely tell the difference - even if you have better than 20/20 vision.

#134 Posted by SapSacPrime (8795 posts) -

This is a pretty good example of the difference; I fail to see how anybody could not see it.

#135 Posted by g0ddyX (3914 posts) -

On graphics-intensive games, it would be hard to get over 30fps. That has always been the case.
I'd rather have great visuals at 30fps than 60fps with poor, fuzzy graphics.

#136 Edited by Epak_ (6791 posts) -

@MBirdy88: It was 30fps on the Wii? :O

#137 Edited by BeardMaster (1580 posts) -

@ps4hasnogames said:

@clyde46 said:

@ps4hasnogames said:

What I want to know is if 1080p vs 1080i makes a difference, because I sure as hell can tell the diff between 30 and 60 fps... it's obvious.

1080i is different from 1080p.


http://en.wikipedia.org/wiki/1080i

Thanks, I read that... and still don't get it lol, but thanks anyway. I ask because one of my TVs is 1080i and another is 1080p....

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60Hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the two.

#138 Edited by PS4hasNOgames (1771 posts) -

@BeardMaster said:

@ps4hasnogames said:

@clyde46 said:

@ps4hasnogames said:

What I want to know is if 1080p vs 1080i makes a difference, because I sure as hell can tell the diff between 30 and 60 fps... it's obvious.

1080i is different from 1080p.


http://en.wikipedia.org/wiki/1080i

Thanks, I read that... and still don't get it lol, but thanks anyway. I ask because one of my TVs is 1080i and another is 1080p....

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60Hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the two.

So if I'm playing a game that runs 60 fps on a 1080i tv, you're telling me that it will go from 60fps to 30fps?

#139 Edited by BeardMaster (1580 posts) -

@ps4hasnogames said:

@BeardMaster said:

@ps4hasnogames said:

@clyde46 said:

@ps4hasnogames said:

What I want to know is if 1080p vs 1080i makes a difference, because I sure as hell can tell the diff between 30 and 60 fps... it's obvious.

1080i is different from 1080p.


http://en.wikipedia.org/wiki/1080i

Thanks, I read that... and still don't get it lol, but thanks anyway. I ask because one of my TVs is 1080i and another is 1080p....

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60Hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the two.

So if I'm playing a game that runs 60 fps on a 1080i tv, you're telling me that it will go from 60fps to 30fps?

Pretty much, yes. The 1080i TV receives half the image, then receives the next half... then combines both half-images together to show you a single full image. It's receiving half-frames 60 times a second, but combines them to show you full frames 30 times a second. Whereas the 1080p TV is receiving full frames 60 times a second and is displaying 60fps. Basically, TVs with 1080i input cannot display more than 30fps.

A 30fps game on 1080i is the same as on 1080p, though. You aren't lowering the framerate or performance in any way, I want to make that clear. 1080p is only better, and never worse.
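
To put numbers on the weave model described above: 1080i nominally delivers 60 half-height fields per second, and pairing consecutive fields into full frames halves the displayed full-frame rate. The snippet below is just that arithmetic, under the weave-only assumption the poster is using.

```python
# Toy model of the claim above: weave-only deinterlacing of a 1080i60 signal.
FIELDS_PER_SECOND = 60          # nominal 1080i60 field rate
FIELDS_PER_WOVEN_FRAME = 2      # one odd field + one even field

woven_frames_per_second = FIELDS_PER_SECOND / FIELDS_PER_WOVEN_FRAME
print(woven_frames_per_second)  # 30.0 full frames per second under this model
```

Note that weave is only one deinterlacing strategy; "bob" deinterlacing shows each field on its own (stretched vertically), which is why the Wikipedia passage quoted earlier says native interlaced video still carries 50 or 60 motion phases per second.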

#140 Posted by jun_aka_pekto (16422 posts) -

@BeardMaster said:

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60Hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the two.

I'm under the impression 1080i is usually found only on 720p TVs. I still have an old 720p TV and it does 1080i. It shouldn't be a concern to those who own full HD TV sets.

#141 Edited by Motokid6 (5729 posts) -

@StormyJoe: No.. no crap in this guy. 720p does not look nearly as good as 1080p from the 8-foot distance I sit from the TV. It's very obvious. Why can't people accept that?

#142 Posted by BeardMaster (1580 posts) -

@jun_aka_pekto said:

@BeardMaster said:

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60Hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the two.

I'm under the impression 1080i is usually found only on 720p TVs. I still have an old 720p TV and it does 1080i. It shouldn't be a concern to those who own full HD TV sets.

1080 is the number of lines of resolution. It's impossible for a 720p display to be 1080i.

#143 Posted by StormyJoe (5497 posts) -

@Motokid6 said:

@StormyJoe: No.. no crap in this guy. 720p does not look nearly as good as 1080p from the 8-foot distance I sit from the TV. It's very obvious. Why can't people accept that?

IDK, because I read about such things? Humans cannot discern 1080p from 720p with any significant meaningfulness from that distance. It's medical science. Can you tell the difference? Maybe. Is it as big of a difference as you are proposing? No - and it really can't be, because you are not an eagle, hawk, or other raptor.
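
For anyone who wants to check the viewing-distance argument themselves, the back-of-the-envelope below estimates pixels per degree of visual angle for a 50-inch 16:9 screen at 8 feet (the setup described in the exchange above) and compares the result with the common ~60 pixels-per-degree rule of thumb for 20/20 acuity. The function name and the acuity figure are illustrative assumptions, not something either poster cited.

```python
import math

def pixels_per_degree(diagonal_in, horiz_px, distance_in, aspect=(16, 9)):
    """Approximate horizontal pixels per degree of visual angle."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)   # physical screen width
    px_size_in = width_in / horiz_px                   # width of one pixel
    deg_per_px = math.degrees(2 * math.atan(px_size_in / (2 * distance_in)))
    return 1.0 / deg_per_px

DISTANCE_IN = 8 * 12  # 8 feet, as in the posts above
for label, horiz in (("720p", 1280), ("1080p", 1920)):
    ppd = pixels_per_degree(50, horiz, DISTANCE_IN)
    print(f'{label}: ~{ppd:.0f} pixels per degree on a 50" screen at 8 ft')
# ~60 px/deg is a common rule-of-thumb limit for 20/20 vision.
```

The two resolutions land on opposite sides of that 60 px/deg threshold at this size and distance, which may be why reasonable people end up on both sides of the "night and day" versus "barely visible" argument.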

#144 Edited by StormyJoe (5497 posts) -

@BeardMaster said:

@jun_aka_pekto said:

@BeardMaster said:

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60Hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the two.

I'm under the impression 1080i is usually found only on 720p TVs. I still have an old 720p TV and it does 1080i. It shouldn't be a concern to those who own full HD TV sets.

1080 is the number of lines of resolution. It's impossible for a 720p display to be 1080i.

Umm... most 720p TVs can display 1080i.

#145 Posted by BeardMaster (1580 posts) -

@StormyJoe said:

@BeardMaster said:

@jun_aka_pekto said:

@BeardMaster said:

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60Hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the two.

I'm under the impression 1080i is usually found only on 720p TVs. I still have an old 720p TV and it does 1080i. It shouldn't be a concern to those who own full HD TV sets.

1080 is the number of lines of resolution. It's impossible for a 720p display to be 1080i.

Umm... most 720p TVs can display 1080i.

No they can't; every 1080i input ends up as progressive output. HDTVs don't output interlaced content. A 720p display, regardless of input, can only display 720 lines of resolution; the display doesn't have the pixels to display 1080.

#147 Edited by D4RKL1NG (236 posts) -

30 fps versus 60 fps is a bit like walking from point A to point B when you're semi-drunk: you get to B, only it feels as though you missed half of the steps along the way.

#148 Posted by StormyJoe (5497 posts) -

@BeardMaster said:

@StormyJoe said:

@BeardMaster said:

@jun_aka_pekto said:

@BeardMaster said:

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60Hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the two.

I'm under the impression 1080i is usually found only on 720p TVs. I still have an old 720p TV and it does 1080i. It shouldn't be a concern to those who own full HD TV sets.

1080 is the number of lines of resolution. It's impossible for a 720p display to be 1080i.

Umm... most 720p TVs can display 1080i.

No they can't; every 1080i input ends up as progressive output. HDTVs don't output interlaced content. A 720p display, regardless of input, can only display 720 lines of resolution; the display doesn't have the pixels to display 1080.

Any 720p TV can accept a 1080i signal. My 10-year-old LG LCD could do it.

#149 Posted by NFJSupreme (5379 posts) -

You feel the difference more than you see it if you know what I mean.

#150 Edited by Nagidar (6231 posts) -

It depends on how fast the images are moving: watching slow-moving fog at 10 FPS will look fluid; playing a racing game at that same FPS will not.
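
A quick way to see why content speed matters is to look at how far something moves between consecutive frames. The sketch below uses a made-up speed of 960 pixels per second (roughly an object crossing a 1920-pixel-wide frame in two seconds); slow-moving fog shifts only a few pixels per frame at any frame rate, which is why low FPS is far less noticeable there.

```python
# How far a fast-moving object jumps between frames at different frame rates.
PIXELS_PER_SECOND = 960  # hypothetical on-screen speed

for fps in (10, 30, 60):
    step_px = PIXELS_PER_SECOND / fps
    print(f"{fps:>2} fps -> object jumps {step_px:.0f} px between consecutive frames")
```

Bigger per-frame jumps read as judder, which matches the racing-game example: the same 10 FPS that looks fine on drifting fog looks choppy when the scenery is streaking past.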