Some say the human eye can't see -30fps VS 60FPS-

This topic is locked from further discussion.


#101 PS4hasNOgames
Member since 2014 • 2620 Posts

what I want to know is if 1080p vs 1080i makes a difference, because i sure as hell can tell the diff between 30 and 60 fps...its obvious.


#102  Edited By clyde46
Member since 2005 • 49061 Posts

@ps4hasnogames said:

what I want to know is if 1080p vs 1080i makes a difference, because i sure as hell can tell the diff between 30 and 60 fps...its obvious.

1080i is different from 1080p.

Within the designation "1080i", the i stands for interlaced scan. A frame of 1080i video consists of two sequential fields of 1920 horizontal and 540 vertical pixels. The first field consists of all odd-numbered TV lines and the second all even numbered lines. Consequently the horizontal lines of pixels in each field are captured and displayed with a one-line vertical gap between them, so the lines of the next field can be interlaced between them, resulting in 1080 total lines. 1080i differs from 1080p, where the p stands for progressive scan, where all lines in a frame are captured at the same time. In native or pure 1080i, the two fields of a frame correspond to different instants (points in time), so motion portrayal is good (50 or 60 motion phases/second). This is true for interlaced video in general and can be easily observed in still images taken from fast motion scenes, as shown in the figure on the right (without appropriate deinterlacing). However when 1080p material is captured at 25 or 30 frames/second it is converted to 1080i at 50 or 60 fields/second, respectively, for processing or broadcasting. In this situation both fields in a frame do correspond to the same instant. The field-to-instant relation is somewhat more complex for the case of 1080p at 24 frames/second converted to 1080i at 60 fields/second, as explained in the telecine article.

http://en.wikipedia.org/wiki/1080i
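
In plainer terms, an interlaced frame is rebuilt ("woven") from two half-height fields: one carrying the odd-numbered lines, one the even-numbered lines. A minimal sketch of that idea, using a made-up six-line "image" rather than real video data:

```python
# Minimal sketch of interlacing: a progressive frame carries every line at once,
# while an interlaced frame is woven from two half-height fields sent in sequence.
# The tiny 6-line "image" below is made up purely for illustration.

def split_into_fields(frame):
    """Return (odd_field, even_field): lines 1,3,5,... and lines 2,4,6,... (1-based)."""
    odd_field = frame[0::2]   # 1st, 3rd, 5th line...
    even_field = frame[1::2]  # 2nd, 4th, 6th line...
    return odd_field, even_field

def weave(odd_field, even_field):
    """Interleave the two fields back into a full-height frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

progressive_frame = [f"line {n}" for n in range(1, 7)]   # stand-in for 1080 lines
odd, even = split_into_fields(progressive_frame)

print(odd)    # ['line 1', 'line 3', 'line 5']  -> first field (half the lines)
print(even)   # ['line 2', 'line 4', 'line 6']  -> second field
print(weave(odd, even) == progressive_frame)     # True: weaving restores the frame

# The catch described above: in native interlaced video the two fields are captured
# at different instants, so weaving fast motion like this produces "combing" artifacts
# unless the TV deinterlaces properly.
```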


#103 cainetao11
Member since 2006 • 38035 Posts

@tymeservesfate: dd214 and records are public record. But I'm not putting my full name on this board.


#104  Edited By PS4hasNOgames
Member since 2014 • 2620 Posts

@clyde46 said:

@ps4hasnogames said:

what I want to know is if 1080p vs 1080i makes a difference, because i sure as hell can tell the diff between 30 and 60 fps...its obvious.

1080i is different from 1080p.

[Wikipedia explanation of 1080i vs 1080p snipped; see post #102.]

thanks, i read that...and still don't get it lol, but thanks anyways. I ask because one of my tv's is 1080i, and another is 1080p....


#105  Edited By cainetao11
Member since 2006 • 38035 Posts

@clyde46: wow. For the 1000th time, there is a difference. I don't believe it's so huge. 480 to 1080 is huge.


#106 clyde46
Member since 2005 • 49061 Posts

@ps4hasnogames said:

@clyde46 said:

@ps4hasnogames said:

what I want to know is if 1080p vs 1080i makes a difference, because i sure as hell can tell the diff between 30 and 60 fps...its obvious.

1080i is different from 1080p.

[Wikipedia explanation of 1080i vs 1080p snipped; see post #102.]

thanks, i read that...and still don't get it lol, but thanks anyways. I ask because one of my tv's is 1080i, and another is 1080p....

720p=1080i

1080p >> 1080i

Better?


#107  Edited By tymeservesfate
Member since 2003 • 2230 Posts

@wis3boi said:

@Wasdie said:

Those GIFs are both at lower than 30fps on my PC, and I'll bet on most others. You can and always will be able to see the difference between 30fps and 60fps. Claiming that there is little or no difference is just staying ignorant at this point.

It's sad that in 2014 there are still people denying there is a difference all because of console fanboyism.

you have to view them on the actual linked site, where they are hosted as HTML5, not gifs

BINGO...the link is there for a reason. i even stated that the site has better video displays in the OP.

@JangoWuzHere said:

Why are your gifs so shit? Seriously, there are like a hundred other comparisons that are better than what you have.

Here's a better question... why did you just look at the gifs and not read anything in the opening post at all, or check the link?

These aren't really my gifs... that isn't my site... I just brought the debate to the board.


#108 clyde46
Member since 2005 • 49061 Posts

@cainetao11 said:

@clyde46: wow. For the 1000th time, there is a difference. I don't believe it's so huge. 480 to 1080 is huge.

Calm down bro.... I was addressing the TC.


#109  Edited By cainetao11
Member since 2006 • 38035 Posts

@clyde46: you're right....... I'll step away from the ledge.


#110  Edited By tymeservesfate
Member since 2003 • 2230 Posts

@cainetao11 said:

@tymeservesfate: dd214 and records are public record. But I'm not putting my full name on this board.

Agreed, you shouldn't.

And I see your point completely... I don't see how anyone can speak against you after that answer lol. You shot some of these guns for real... while we're talking video games lol. Especially if those are your shooting range numbers.


#111 tymeservesfate
Member since 2003 • 2230 Posts

@clyde46 said:

@tymeservesfate said:

@cainetao11 said:

@wis3boi said:

@cainetao11 said:

@JangoWuzHere:

Play PC all the time. I guess its relative to what one considers huge. I don't consider the difference huge. The difference from 480-1080p is huge, imo. But to each their own assessment

It's double the pixel density...your eyes must be pretty terrible

Oh well, guess so. Still shot 35/40, and 4/4 at 300m with no scope, in the US Army. But if people really see differently, then ok.

LMAO...that answer should shut anybody up in this thread XD.

Umm no, just because you can't see the difference doesn't negate all the studies and reports that say there is a difference.

i'm just saying...if those are his shooting range numbers, then his eyes are fine. i'm not picking one side or the other openly. but that does give him a little more weight behind his words.


#112 cainetao11
Member since 2006 • 38035 Posts

@tymeservesfate: sucks because I needed 36/40 to qualify expert. I was sharpshooter because I missed one. The thing I miss most now is going to ranges. Especially being a NYer, where the gun laws suck.

On topic, I do know there are differences in res and fps. Been playing through the Arkham games on my new PC, and they are beautiful. But I still loved them when I originally played them on PS3/360. I can't bring my PC to my brother's this weekend, but I'll still play Arkham on my old 360 that he has. I guess if people are into all the nuances and stuff, that's cool. I just have fun playing.


#113 jun_aka_pekto
Member since 2010 • 25255 Posts

@clyde46 said:

@jun_aka_pekto said:

@clyde46 said:
However, I could be very wrong and have mixed up Hertz and FPS altogether.

I'm also skeptical of Hz and FPS being mixed together. Perhaps we should go back to basics, when framerates merely indicated how much reserve horsepower a system has to run a game fluidly.

Anyway, I grew up gaming when PC games rarely reached 24fps. So, 30fps is playable enough for me. Unfortunately, YT drops a lot of frames during uploading. So, this video seems jerkier than the original.

Dropping frames isn't really a YouTube problem, that's a recording and encoding problem. I always try to record at 60FPS whenever possible because if you drop a frame or two, it's fixable.

It's probably a re-encoding problem when uploading a video to YT.


#114  Edited By jun_aka_pekto
Member since 2010 • 25255 Posts

I see subtle differences between 30fps and 60fps in motion and in how the mouse feels, which goes hand in hand with the rule that the more reserves a PC has, the smoother the game will be.

But, I don't really see the correlation with the refresh rates apart from tearing. I haven't had an LCD monitor that went past 75 Hz. But, I've had 21" and 22" CRT/aperture grill monitors that could do 120 Hz a little over a decade ago. The framerates didn't feel any different whether I was playing the game at 1024x768 @ 85 Hz or 1024x768 @ 120 Hz. I certainly never noticed any image quality differences like what some here have posted before.
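
That observation lines up with simple frame counting: the panel can only change the image on a refresh tick, so raising the refresh rate on its own doesn't add new frames to a fixed-fps source. A rough sketch of that idea (it ignores vsync buffering and tearing details, and isn't modelled on any particular game):

```python
# Rough sketch of how a fixed frame rate maps onto different monitor refresh rates:
# the monitor only swaps images on a refresh tick, so a 60fps stream on an 85Hz or
# 120Hz panel is shown with slightly uneven timing, but the number of distinct
# frames you see per second doesn't change.

def frames_shown(game_fps, refresh_hz, seconds=1.0):
    """For each refresh tick, record which rendered frame is the newest one available."""
    ticks = int(refresh_hz * seconds)
    shown = []
    for t in range(ticks):
        tick_time = t / refresh_hz
        newest_frame = int(tick_time * game_fps)   # index of last frame finished by now
        shown.append(newest_frame)
    return shown

for hz in (60, 85, 120):
    shown = frames_shown(game_fps=60, refresh_hz=hz)
    distinct = len(set(shown))
    print(f"{hz:>3} Hz panel: {distinct} distinct frames shown per second")

# All three print 60 distinct frames: a higher refresh rate by itself doesn't add
# smoothness to a 60fps source; it mainly changes tearing and frame-pacing behaviour.
```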


#115  Edited By -Rhett81-
Member since 2002 • 3569 Posts

This is a better comparison. I can see the difference, personally:



#116 melonfarmerz
Member since 2014 • 1294 Posts

Why do people keep posting these. You realize there's a huge difference between watching a gif/ video, and what you actually see and feel when you're immersed in a game...


#117  Edited By harry_james_pot  Moderator
Member since 2006 • 11414 Posts

First of all, both those gifs are running at what seems like 20 fps. >__>

And second, there's even a clear difference between 60 and 120 fps. If someone can't see the difference between 30 and 60, then there's something wrong with their eyes.. or monitor.


#118  Edited By melonfarmerz
Member since 2014 • 1294 Posts

@-Rhett81- said:

This is a better comparison. I can see the difference, personally:

This is a pretty good illustration actually. When you look at 30fps first, then 60, you don't see much of a difference but going back, you realize how smooth 60 was. Probably explains why a lot of console gamers say they can't see the difference while I can't play GTA 5 without getting dizzy.


#119  Edited By the_bi99man
Member since 2004 • 11465 Posts

@hoosier7 said:

Who cares about seeing the difference? I can't see much of a difference between the two, but I can sure as hell feel it in anything with high-sensitivity input. M&K in an FPS at 30fps is very noticeable.

This is a big part of it. Especially when the debate gets into 60+ territory. I can see the difference between 30 and 60 like night and day, but above 60, the improvement gets less visible to me. However, I can feel the difference between 60 and 80 or 100, and it's glorious.
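
Some back-of-the-envelope frame times make the "feel" argument concrete; the numbers below are plain arithmetic, not measurements of any particular game or input pipeline:

```python
# Frame-time arithmetic: why a higher frame rate is often *felt* through the mouse
# before it is *seen*. Each frame is one opportunity to sample input and show the
# result, so the worst-case wait between moving the mouse and the screen reflecting
# it shrinks as fps rises. (Illustrative only; real pipelines add display lag,
# driver buffering, etc.)

for fps in (30, 60, 80, 100, 120, 144):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame")

# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms, 144 fps -> 6.9 ms:
# going 30 -> 60 cuts the per-frame wait in half, and 60 -> 120 halves it again,
# which is easier to feel in fast mouse aiming than to spot in a video clip.
```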


#120  Edited By hrt_rulz01
Member since 2006 • 22374 Posts

I can tell the difference, but it's not a deal breaker for me.


#121 MK-Professor
Member since 2009 • 4214 Posts

When you are actually playing the game you can see/feel the difference between 60fps and 120fps.

As for 30fps vs 60fps, the difference is just massive; in fact, it is not even enjoyable to play games at 30fps.


#122  Edited By wis3boi
Member since 2005 • 32507 Posts

@the_bi99man said:

@hoosier7 said:

Who cares about seeing the difference? I can't see much of a difference between the two, but I can sure as hell feel it in anything with high-sensitivity input. M&K in an FPS at 30fps is very noticeable.

This is a big part of it. Especially when the debate gets into 60+ territory. I can see the difference between 30 and 60 like night and day, but above 60, the improvement gets less visible to me. However, I can feel the difference between 60 and 80 or 100, and it's glorious.

I just got two 144hz monitors. Peasants be jelly of my framerates over 60


#123  Edited By kipsta77
Member since 2012 • 1119 Posts

Oh, the denial... AHAH!


#125 AznbkdX
Member since 2012 • 4284 Posts

Honestly I noticed it when comparing after a few takes, but it was hardly anything to freak out about imo.

Weirdly enough, I came into this thread expecting a more noticeable difference (though still a small one) than what I just saw, since I've never seen in-game tests before, but I guess not. Kind of strange, since I've been 60fps gaming for a while now, but I guess I never really compared them and only saw it as better than 30 just because. I think it's blown out of proportion tbh, which was also what I thought before coming in here.

Now below 30 fps is very noticeable to me, especially around the 20's area.


#126 tymeservesfate
Member since 2003 • 2230 Posts

@harry_james_pot said:

First of all, both those gifs are running on what seems like 20 fps. >__>

And second, there's even a clear difference between 60 and 120 fps. If someone can't see the difference between 30 and 60, then there's something wrong with their eyes.. or monitor.

There's a link to videos in the OP... I even said they are better quality than the gifs -_-


#127 FoxbatAlpha
Member since 2009 • 10669 Posts

The only thing I see is the OJ. He be all crouching down and looking scared.


#128  Edited By StormyJoe
Member since 2011 • 7806 Posts

@cainetao11 said:

@StormyJoe said:

@ribstaylor1 said:

@StormyJoe: When on my ps3 or even my computer playing on a 60inch plasma tv, I can notice a difference between 720 and 1080p. I must have amazing eyes, since so many people can't seem to see with clarity. 720 on any game on my 60inch is a blurry jagged mess unless I crank the aliasing up or do super sampling. 1080p is far better clearing up the image and making it far more crisp. I can even take it a step higher on my 1440p monitor, and get an even clearer and crisper image. Even on it I can notice a change sitting back a few feet between the three resolutions. There is a difference and noticeable one for both resolution and frame rate. There is no real debate on this just people on forums who don't know too much about anything or have horrible eyes talking out their asses about resolutions and frame rates.

Interestingly enough, the only people who talk about 1080p being such an incredible difference, are cows.

Look, I am a technophile - I have thousands and thousands of dollars tied to my home theater, and I can admit there isn't a big difference. I am not saying there is "no difference", but it's negligible. You people talk as though it is a difference between VHS and Blu Ray. So yes, your eyes defy medical science - perhaps you are an X-Man and don't know it.

Agreed. I am not a technopredator...............guy............thing. But I guess my eyes are bad. I don't see this MASSIVE difference from 1080p to 720p. And some claim it's huge from 900p to 1080p. It's just favoritism that makes people exaggerate.

It's a colloquialism for someone who only buys high end electronics.

Here's the definition


#129  Edited By StormyJoe
Member since 2011 • 7806 Posts

@JangoWuzHere said:

@cainetao11 said:

@StormyJoe said:

@ribstaylor1 said:

[ribstaylor1's quote snipped; see post #128.]

Interestingly enough, the only people who talk about 1080p being such an incredible difference, are cows.

Look, I am a technophile - I have thousands and thousands of dollars tied to my home theater, and I can admit there isn't a big difference. I am not saying there is "no difference", but it's negligible. You people talk as though it is a difference between VHS and Blu Ray. So yes, your eyes defy medical science - perhaps you are an X-Man and don't know it.

Agreed. I am not a technopredator...............guy............thing. But I guess my eyes are bad. I don't see this MASSIVE difference from 1080p to 720p. And some claim it's huge from 900p to 1080p. It's just favoritism that makes people exaggerate.

You don't have to be a technowhatever to see that there are a lot more pixels being rendered on screen. Go play a PC game and switch from 720p to 1080p. The difference is large.

Yeah, because your eyes are 12 inches from the screen...


#131 intotheminx
Member since 2014 • 2608 Posts

30fps is playable, but of course people want more.


#132 HalcyonScarlet
Member since 2011 • 13664 Posts
@Heil68 said:

so everyone should look at buying a PS4 instead of Xbone? Bloody brilliant mate!

Most current gen looking games seem to be struggling to get above 30fps.


#133 StormyJoe
Member since 2011 • 7806 Posts

@Motokid6 said:

@StormyJoe: That's not true at all, even with a desk. My PC's hooked up to my 50" plasma, however. I sit about 8 feet away, and the difference between 720 and 1080 is night and day.

Whatever, man. You are full of crap, dude; there is no way it is "night and day". At that distance, the human eye can barely tell the difference - even if you have better than 20/20 vision.
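
The usual basis for that claim is the rule of thumb that 20/20 vision resolves roughly one arcminute of detail. A rough calculation for a 50-inch 16:9 screen viewed from about 8 feet, the setup mentioned in this exchange; the acuity threshold is a textbook approximation, not a measured value:

```python
import math

# Rough check of the "can you resolve 720p vs 1080p from the couch" claim, using the
# common rule of thumb that 20/20 vision resolves about 1 arcminute (1/60 degree).
# Screen size and distance match the 50" plasma at ~8 feet discussed above.

def pixel_arcminutes(diagonal_in, vertical_pixels, distance_in, aspect=(16, 9)):
    """Angular height of one pixel, in arcminutes, for a viewer at distance_in."""
    aw, ah = aspect
    screen_height_in = diagonal_in * ah / math.hypot(aw, ah)
    pixel_height_in = screen_height_in / vertical_pixels
    angle_rad = 2 * math.atan(pixel_height_in / (2 * distance_in))
    return math.degrees(angle_rad) * 60

distance = 8 * 12  # 8 feet in inches
for rows in (720, 1080):
    print(f"{rows}p on a 50\" screen at 8 ft: "
          f"{pixel_arcminutes(50, rows, distance):.2f} arcmin per pixel")

# ~1.22 arcmin per pixel for 720p vs ~0.81 for 1080p: the extra detail 1080p adds
# sits right around the ~1 arcmin acuity limit at that distance, so the difference
# is there but subtle rather than "night and day".
```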


#134 SapSacPrime
Member since 2004 • 8925 Posts

This is a pretty good example of the difference; I fail to see how anybody could not see it.


#135 g0ddyX
Member since 2005 • 3914 Posts

On graphics-intensive games, it would be hard to get over 30fps. That has always been the case.
I'd rather have great visuals at 30fps than 60fps with poor, fuzzy graphics.


#136  Edited By Epak_
Member since 2004 • 11911 Posts

@MBirdy88: It was 30fps on the Wii? :O


#137  Edited By BeardMaster
Member since 2012 • 1686 Posts

@ps4hasnogames said:

@clyde46 said:

@ps4hasnogames said:

what I want to know is if 1080p vs 1080i makes a difference, because i sure as hell can tell the diff between 30 and 60 fps...its obvious.

1080i is different from 1080p.

[Wikipedia explanation of 1080i vs 1080p snipped; see post #102.]

thanks, i read that...and still don't get it lol, but thanks anyways. I ask because one of my tv's is 1080i, and another is 1080p....

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the 2.
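
The arithmetic behind that claim is simple (and a simplification; it ignores native interlaced content, where the two fields come from different instants in time):

```python
# Quick sanity check of the claim above: over an interlaced 60Hz link each
# transmission is a half-height field, so two transmissions are needed per complete
# picture; over a progressive 60Hz link each transmission is already a full frame.

LINK_RATE_HZ = 60            # transmissions per second the display accepts
LINES_PER_FRAME = 1080

transmissions = {
    "1080i": LINES_PER_FRAME // 2,   # 540 lines per field
    "1080p": LINES_PER_FRAME,        # 1080 lines per frame
}

for mode, lines_per_transmission in transmissions.items():
    full_frames_per_sec = LINK_RATE_HZ * lines_per_transmission / LINES_PER_FRAME
    print(f"{mode}: {lines_per_transmission} lines x {LINK_RATE_HZ}/s "
          f"-> {full_frames_per_sec:.0f} complete images per second")

# 1080i: 540 lines x 60/s -> 30 complete images per second
# 1080p: 1080 lines x 60/s -> 60 complete images per second
```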


#138  Edited By PS4hasNOgames
Member since 2014 • 2620 Posts

@BeardMaster said:

@ps4hasnogames said:

@clyde46 said:

@ps4hasnogames said:

what I want to know is if 1080p vs 1080i makes a difference, because i sure as hell can tell the diff between 30 and 60 fps...its obvious.

1080i is different from 1080p.

[Wikipedia explanation of 1080i vs 1080p snipped; see post #102.]

thanks, i read that...and still don't get it lol, but thanks anyways. I ask because one of my tv's is 1080i, and another is 1080p....

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the 2.

So if I'm playing a game that runs 60 fps on a 1080i tv, you're telling me that it will go from 60fps to 30fps?


#139  Edited By BeardMaster
Member since 2012 • 1686 Posts

@ps4hasnogames said:

@BeardMaster said:

@ps4hasnogames said:

@clyde46 said:

@ps4hasnogames said:

what I want to know is if 1080p vs 1080i makes a difference, because i sure as hell can tell the diff between 30 and 60 fps...its obvious.

1080i is different from 1080p.

[Wikipedia explanation of 1080i vs 1080p snipped; see post #102.]

thanks, i read that...and still don't get it lol, but thanks anyways. I ask because one of my tv's is 1080i, and another is 1080p....

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the 2.

So if I'm playing a game that runs 60 fps on a 1080i tv, you're telling me that it will go from 60fps to 30fps?

Pretty much, yes. The 1080i TV receives half the image, then receives the next half... then combines both half-images together to show you a single full image. It's receiving half-frames 60 times a second, but combines them to show you full frames 30 times a second. Whereas the 1080p TV is receiving full frames 60 times a second and is displaying 60fps. Basically, TVs limited to 1080i input cannot display more than 30fps.

Although a 30fps game on 1080i is the same as on 1080p. You aren't lowering the framerate or performance in any way, I want to make that clear. 1080p is only better, and never worse.


#140 jun_aka_pekto
Member since 2010 • 25255 Posts

@BeardMaster said:

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the 2.

I'm under the impression 1080i is usually found only on 720p TVs. I still have an old 720p TV and it does 1080i. It shouldn't be a concern to those who own full HD TV sets.


#142 BeardMaster
Member since 2012 • 1686 Posts

@jun_aka_pekto said:

@BeardMaster said:

The only difference between 1080i and 1080p is that 1080i sends half an image per frame and 1080p sends the full image. Since most displays accept video input at a max of 60hz, what this effectively means is 1080i will cap out at 30fps while 1080p can achieve up to 60fps. If you are viewing content at 30fps or lower, there is effectively no difference between the 2.

I'm under the impression 1080i is usually found only on 720p TVs. I still have an old 720p TV and it does 1080i. It shouldn't be a concern to those who own full HD TV sets.

1080 is the number of lines of resolution. It's impossible for a 720p display to be 1080i.


#143 StormyJoe
Member since 2011 • 7806 Posts

@Motokid6 said:

@StormyJoe: No..no crap in this guy. 720p does not look nearly as good as 1080 from the 8 foot distance I sit from the tv. It's very obvious. Why can't people accept that?

IDK, because I read about such things? Humans cannot discern 1080p from 720p to any meaningful degree at that distance. It's medical science. Can you tell the difference? Maybe. Is it as big of a difference as you are proposing? No - and it really can't be, because you are not an eagle, hawk, or other raptor.


#144  Edited By StormyJoe
Member since 2011 • 7806 Posts

@BeardMaster said:

@jun_aka_pekto said:

@BeardMaster said:

[BeardMaster's 1080i vs 1080p explanation snipped; see post #137.]

I'm under the impression 1080i is usually found only on 720p TVs. I still have an old 720p TV and it does 1080i. It shouldn't be a concern to those who own full HD TV sets.

1080 is the number of lines of resolution. It's impossible for a 720p display to be 1080i.

Umm.., most 720p TVs can display 1080i.


#145 BeardMaster
Member since 2012 • 1686 Posts

@StormyJoe said:

@BeardMaster said:

@jun_aka_pekto said:

@BeardMaster said:

[BeardMaster's 1080i vs 1080p explanation snipped; see post #137.]

I'm under the impression 1080i is usually found only on 720p TVs. I still have an old 720p TV and it does 1080i. It shouldn't be a concern to those who own full HD TV sets.

1080 is the number of lines of resolution. It's impossible for a 720p display to be 1080i.

Umm.., most 720p TVs can display 1080i.

No they can't; every 1080i input ends up as a progressive output. HDTVs don't output interlaced content. A 720p display, regardless of input, can only display 720 lines of resolution; the display doesn't have the pixels to show 1080.


#147  Edited By D4RKL1NG
Member since 2012 • 290 Posts

30 fps versus 60 fps is a bit like walking from point A to point B when you're semi-drunk: you get to B, only it feels as though you missed half of the steps along the way.


#148 StormyJoe
Member since 2011 • 7806 Posts

@BeardMaster said:

@StormyJoe said:

@BeardMaster said:

@jun_aka_pekto said:

@BeardMaster said:

[BeardMaster's 1080i vs 1080p explanation snipped; see post #137.]

I'm under the impression 1080i is usually found only on 720p TVs. I still have an old 720p TV and it does 1080i. It shouldn't be a concern to those who own full HD TV sets.

1080 is the number of lines of resolution. It's impossible for a 720p display to be 1080i.

Umm.., most 720p TVs can display 1080i.

no they cant, every 1080i input is 1080p output. HDTVs dont output interlaced content. A 720p display regardless of input only can display 720 lines of resolution, the display doesnt have the pixels to display 1080.

Any 720p TV can accept a 1080i signal. My 10-year-old LG LCD could do it.


#149 NFJSupreme
Member since 2005 • 6605 Posts

You feel the difference more than you see it if you know what I mean.


#150  Edited By Nagidar
Member since 2006 • 6231 Posts

It depends on how fast the images are moving: watching slow-moving fog at 10 FPS will look fluid; playing a racing game at that same FPS will not.
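
A quick way to put numbers on that point: what matters for perceived choppiness is how far the image content moves per frame, which depends on both the frame rate and the speed of the motion. The pixel speeds below are made-up examples, not measurements:

```python
# Per-frame displacement: slow content barely moves between frames even at low fps,
# while fast content makes large jumps per frame unless the frame rate is high.

def pixels_per_frame(speed_px_per_sec, fps):
    return speed_px_per_sec / fps

examples = [
    ("slow drifting fog", 20),       # pixels per second across the screen (made up)
    ("racing game scenery", 1500),   # made up as well
]

for name, speed in examples:
    for fps in (10, 30, 60):
        step = pixels_per_frame(speed, fps)
        print(f"{name:>20} at {fps:>2} fps: jumps {step:6.1f} px per frame")

# The fog moves ~2 px per frame even at 10 fps (looks fluid); the fast scenery jumps
# 150 px per frame at 10 fps (obviously steppy) and still 25 px per frame at 60 fps.
```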