Can VGA/D-Sub do Full 1920x1080p (FULL HD)?

This topic is locked from further discussion.

#1 killa4lyfe
Member since 2008 • 3849 Posts

Read the topic title ^^^^

I have a video card that supports that resolution, and I run my games (e.g. COD4, GRID) at 1920 x 1080, but is that Full HD? I'm also using a Full HD monitor connected with a VGA cable (the monitor calls it D-Sub), but when I try to set my desktop to 1920 x 1080 it says "out of range", while the same resolution works fine in games. I've also tried forcing 1080p at 30 or 60 Hz in ATI Catalyst, but that doesn't work either and says "out of range". What's the problem? The cable?

Please give some explanation with your answers, because I don't want two-word answers :P

#2 Jamiemydearx3
Member since 2008 • 4062 Posts

So you can play games at 1920x1080, but when you go back to your desktop you can't set the resolution to 1920x1080? That's really odd. What video card and monitor do you own?

#3 killa4lyfe
Member since 2008 • 3849 Posts

[QUOTE="Jamiemydearx3"]So you can play games at 1920x1080, but when you go back to your desktop you can't set the resolution to 1920x1080? That's really odd. What video card and monitor do you own?[/QUOTE]

No, I can set the 1920 x 1080 resolution, but I don't know whether it actually is 1080p or not. I have a BenQ HD2200 monitor (brand new) and an HD4850.
#4 killa4lyfe
Member since 2008 • 3849 Posts

Odd, now if I go to the ATI Catalyst Control Center, it works perfectly at 1920 x 1080 at 60 Hz...

#5 killa4lyfe
Member since 2008 • 3849 Posts
Okay, now it seems to be working (yeah, I am confused as well), but how do I know if it is 1080p or 1080i?
#6 powerslide67
Member since 2006 • 266 Posts

[QUOTE="killa4lyfe"]Okay, now it seems to be working (yeah, I am confused as well), but how do I know if it is 1080p or 1080i?[/QUOTE]

I have the same card as you, and as far as I know it only outputs 1080i when attached over HDMI.

#7 Hekynn
Member since 2003 • 2164 Posts
Nope. If your LCD has an HDMI port, then use the HDMI adapter and buy an HDMI cable from Amazon, or buy a DVI-to-HDMI cable. They're great; I'm using one now with my 8800GT and Samsung T220HD LCD and the picture is AWESOME!
#8 killa4lyfe
Member since 2008 • 3849 Posts
[QUOTE="Hekynn"]Nope. If your LCD has an HDMI port, then use the HDMI adapter and buy an HDMI cable from Amazon, or buy a DVI-to-HDMI cable. They're great; I'm using one now with my 8800GT and Samsung T220HD LCD and the picture is AWESOME![/QUOTE]

So there is a big difference between VGA 1920 x 1080 and HDMI/DVI 1920 x 1080? And I really need to know: how will I figure out whether I am running 1080i or 1080p?
#9 killa4lyfe
Member since 2008 • 3849 Posts

[QUOTE="killa4lyfe"]Okay now it seems to be working, (yeah i am confused as well) but how do i know if it is 1080p or 1080ipowerslide67

I have the same card as you and it only outputs 1080i when attached to a hdmi as far as i know

Hey but i thought that HD4850 could support full 1080p? right?:? maybe its ur monitor?:?
#10 S0H0
Member since 2009 • 298 Posts
1920 x 1080 is Full HD. 1080i and 1080p contain the same data, the difference being the way the source signal is sent to the output monitor.

1080i = Interlaced
1080p = Progressive Scan
VGA (D-Sub) = Analogue output
DVI / HDMI = Digital output

Relax mate, it all sounds fine. :)
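If you would rather check than guess, here is a minimal sketch that asks Windows which mode the desktop is actually running and whether it is interlaced. It's only an illustration (it assumes a Windows machine with Python installed; the struct layout follows the documented Win32 DEVMODE, and none of it comes from Catalyst itself):

[code]
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1   # ask for the mode currently in use
DM_INTERLACED = 0x00000002   # bit set in dmDisplayFlags for interlaced modes

class DEVMODEW(ctypes.Structure):
    # Win32 DEVMODE structure (display variant), fields in declaration order.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))

scan = "interlaced (1080i-style)" if dm.dmDisplayFlags & DM_INTERLACED else "progressive (1080p-style)"
print(f"{dm.dmPelsWidth} x {dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz, {scan}")
[/code]

If the interlaced bit is not set in dmDisplayFlags, the desktop is running a progressive (1080p) mode.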
#11 powerslide67
Member since 2006 • 266 Posts

[QUOTE="powerslide67"]

[QUOTE="killa4lyfe"]Okay now it seems to be working, (yeah i am confused as well) but how do i know if it is 1080p or 1080ikilla4lyfe

I have the same card as you and it only outputs 1080i when attached to a hdmi as far as i know

Hey but i thought that HD4850 could support full 1080p? right?:? maybe its ur monitor?:?

i probably should have said that it always outputs progressive resolutions like 1050p, 720p or 1080p and so on, if you want 1080i you need to attach your monitor with hdmi cable and enable 1080i in the CCC

#12 Marfoo
Member since 2004 • 6002 Posts
[QUOTE="S0H0"]1920 x 1080 is Full HD. 1080i and 1080p contain the same data, the difference being the way the source signal is sent to the output monitor. 1080i = Interlaced 1080p = Progressive Scan VGA (D-Sub) = Analogue output DVI / HDMI = Digital output Relax mate, it all sounds fine. :)

This. The only difference you will see between VGA and DVI/HDMI is there will be less interference and scaling issues with DVI/HDMI. If you can get away with just DVI, just use that. DVI and HDMI use the same digital video format and you don't need an adapter for it. Computers default to progressive signal by default unless you specifically tell them to output an interlaced signal. So yeah, sounds like it's good, you're in 1080p. If you're not getting interference sticking with VGA is fine.
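For the thread's title question, a quick back-of-the-envelope pixel-clock check also shows why VGA can carry 1920 x 1080 at 60 Hz in the first place (the total timing figures below assume the common CEA-861 1080p60 timing, not anything quoted in this thread):

[code]
# Rough pixel-clock estimate for 1920x1080 @ 60 Hz.
# 2200 x 1125 is the total frame size (active pixels plus blanking) in the
# common CEA-861 1080p60 timing; a CVT-derived timing gives a similar figure.
active_w, active_h, refresh_hz = 1920, 1080, 60
total_w, total_h = 2200, 1125

pixel_clock_hz = total_w * total_h * refresh_hz
print(f"Pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")   # ~148.5 MHz

# Single-link DVI tops out around a 165 MHz pixel clock, and the analogue (VGA)
# DAC on cards of this era is rated well above that, so 1080p60 fits on either.
[/code]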
#13 S0H0
Member since 2009 • 298 Posts
[QUOTE="Marfoo"][QUOTE="S0H0"]1920 x 1080 is Full HD. 1080i and 1080p contain the same data, the difference being the way the source signal is sent to the output monitor. 1080i = Interlaced 1080p = Progressive Scan VGA (D-Sub) = Analogue output DVI / HDMI = Digital output Relax mate, it all sounds fine. :)

This. The only difference you will see between VGA and DVI/HDMI is there will be less interference and scaling issues with DVI/HDMI. If you can get away with just DVI, just use that. DVI and HDMI use the same digital video format and you don't need an adapter for it. Computers default to progressive signal by default unless you specifically tell them to output an interlaced signal. So yeah, sounds like it's good, you're in 1080p. If you're not getting interference sticking with VGA is fine.

I also advocate using DVI instead of VGA. VGA converts the digital signal to analogue back to digital, thus there is some loss. DVI directly outputs digital, thus with no loss is generally sharper. However, some people can't discern much difference. ie., it's a debatable topic. It's not essential, but still preferable in my opinion.
#14 joshuahaveron
Member since 2004 • 2165 Posts

[QUOTE="Marfoo"][QUOTE="S0H0"]1920 x 1080 is Full HD. 1080i and 1080p contain the same data, the difference being the way the source signal is sent to the output monitor. 1080i = Interlaced 1080p = Progressive Scan VGA (D-Sub) = Analogue output DVI / HDMI = Digital output Relax mate, it all sounds fine. :)S0H0
This. The only difference you will see between VGA and DVI/HDMI is there will be less interference and scaling issues with DVI/HDMI. If you can get away with just DVI, just use that. DVI and HDMI use the same digital video format and you don't need an adapter for it. Computers default to progressive signal by default unless you specifically tell them to output an interlaced signal. So yeah, sounds like it's good, you're in 1080p. If you're not getting interference sticking with VGA is fine.

I also advocate using DVI instead of VGA. VGA converts the digital signal to analogue back to digital, thus there is some loss. DVI directly outputs digital, thus with no loss is generally sharper. However, some people can't discern much difference. ie., it's a debatable topic. It's not essential, but still preferable in my opinion.

Ye, I would advise a DVI too. BTW "full HD" is something that TV manufacturers came up with, aswell as "HD" 1080p has been around for a long time, just not in TVs. Also 1920 x 1080 is classed as 1080p or 1080i, so yes. And you will be getting 1080p aslong as you have not forced 1080i under CCC.

#15 killa4lyfe
Member since 2008 • 3849 Posts
Okay, thanks to everybody who replied. So now I don't have to worry about being in 1080p or 1080i, since I know I haven't forced it to 1080i. I will also probably be getting a DVI cable in a week or two, since they are really cheap. So once again, thanks :D (I guess the mods can close this thread now if they want.)