DVI cable vs VGA cable w/ DVI convertors

This may be a silly question but will a DVI cable perform the same as a VGA cable with a VGA to DVI convertor on each end?
 
I think he's trying to say if there's a difference between a normal dvi cable vs a vga cable which has 2 dvi connectors on the end (if anyone understands that!) :rolleyes:
 
cobxx said:
I think he's trying to say if there's a difference between a normal dvi cable vs a vga cable which has 2 dvi connectors on the end (if anyone understands that!) :rolleyes:

That's the gist of it.
 
I can't answer the OP's question, but I have another basic question myself.
If I've got a graphics card with both VGA and DVI outputs, and a monitor with both VGA and DVI inputs, is it always preferable to only use the DVI or are there instances where it is beneficial to use the VGA?
 
emailiscrap said:
I can't answer the OP's question, but I have another basic question myself.
If I've got a graphics card with both VGA and DVI outputs, and a monitor with both VGA and DVI inputs, is it always preferable to only use the DVI or are there instances where it is beneficial to use the VGA?


I can't get into my BIOS when I run a DVI cable; I see that as a benefit :D

With VGA I see a lot of noise on the screen during the black-screen part of the XP boot, but with DVI I see nothing but black until the logon screen.
 
No.

Basically, nearly all modern graphics cards have a DVI-I output. These outputs support both digital (DVI-D) and analogue (VGA) signals.

DVI-I - Digital and Analogue (VGA).
DVI-D - Digital Only.
DVI-A - Analogue only, basically the same as VGA.

That is why you can use a simple adaptor to change your DVI port on your GPU to VGA - it's not 'converting' anything, it's just using the analogue support, which is identical to VGA.

If you use a PROPER DVI-to-DVI cable to a DVI monitor, you will get a true digital DVI connection.
If you use a VGA cable with a DVI adaptor, it will ONLY output an analogue signal, and probably won't work on your DVI monitor (because they usually have Digital-only DVI-D ports). Even if it did work, it's still only an analogue connection, so you get none of the benefits of DVI.
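To make the point above concrete, here's a toy sketch: think of each segment of the chain (card port, cable, monitor port) as supporting a set of signal types, and only signals supported by every segment survive. The port names and capability sets here are just illustrative shorthand for the explanation above, not taken from any spec.

```python
# Each connector/cable type and the signal types it can carry.
PORT_SIGNALS = {
    "DVI-I": {"digital", "analogue"},  # carries both
    "DVI-D": {"digital"},              # digital only
    "DVI-A": {"analogue"},             # analogue only, electrically like VGA
    "VGA":   {"analogue"},
}

def link_signals(*segments):
    """Signals surviving the whole chain = intersection of every segment's support."""
    result = {"digital", "analogue"}
    for seg in segments:
        result &= PORT_SIGNALS[seg]
    return result

# Proper DVI cable from a DVI-I card output to a DVI-D monitor input:
print(link_signals("DVI-I", "DVI-D", "DVI-D"))  # {'digital'}

# VGA cable with DVI adapters on each end into a DVI-D monitor input:
# only analogue survives the cable, and the digital-only input can't accept it.
print(link_signals("DVI-I", "VGA", "DVI-D"))    # set()
```

The empty set in the second case is exactly the "probably won't work" scenario: the VGA cable strips the chain down to analogue, which a DVI-D port has no pins for.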

There is no advantage to VGA; it's legacy. It must be noted, though, that DVI-D and VGA are both encoded as 8-bit RGBHV, so there's a whole debate right there.
Contrary to popular belief, HDMI is not the same as DVI: HDMI supports RGBHV all the way up to 48-bit (on the newest HDMI 1.3) as well as YCbCr (YUV).
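The bit-depth figures being compared here are totals across the three RGB channels; a quick calculation shows what they mean in per-channel depth and colour count:

```python
# Split a total bits-per-pixel figure across the three RGB channels
# and compute how many distinct colours that allows.
def colour_depth(total_bits_per_pixel):
    per_channel = total_bits_per_pixel // 3
    return per_channel, 2 ** total_bits_per_pixel

print(colour_depth(24))  # (8, 16777216) - the standard 8-bit-per-channel case
print(colour_depth(48))  # (16, 281474976710656) - HDMI 1.3's deep-colour ceiling
```

So "8-bit" DVI-D/VGA is really 24 bits per pixel (~16.7 million colours), while HDMI 1.3's 48-bit mode is 16 bits per channel.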
 
Shakey_Jake33 said:
There is no advantage to VGA; it's legacy. It must be noted, though, that DVI-D and VGA are both encoded as 8-bit RGBHV, so there's a whole debate right there.
Contrary to popular belief, HDMI is not the same as DVI: HDMI supports RGBHV all the way up to 48-bit (on the newest HDMI 1.3) as well as YCbCr (YUV).

Yup, but how many 48-bit sources do you find? :p Point taken though.
 