Adaptor conundrum.

Associate · Joined: 2 Oct 2010 · Posts: 659 · Location: Bedfordshire
I have a 5850 which I am trying to use with two monitors. However, one monitor is DVI and the other is VGA. The 5850 has one DVI, one HDMI and one DisplayPort. Do I convert the DVI to HDMI and the VGA to DVI, or do I convert the VGA to DVI and then to HDMI (is that even possible)?

The ports are very close together, so I am having trouble plugging both monitors in.
 
DVI is electrically the same as HDMI (HDMI just carries sound as well), so no conversion is required. I am surprised you cannot plug them both in; usually there is enough room.
 
As mentioned above, to drive the VGA monitor (an analogue video signal) you will want to use the DVI output of the graphics card with a DVI to VGA adapter like the one Surveyor linked to. On most graphics cards the DVI port is DVI-I, which means it can output both an analogue and a digital video signal, so a basic DVI-I to VGA adapter like this will work fine without any awkward (and expensive) digital-to-analogue conversion. HDMI, however, is a digital-only connection, so you can't use a passive VGA adapter on that port; it would need active digital-to-analogue conversion to work.

Hence your best option (as mentioned above) is to connect the VGA monitor to the DVI-I output using an adapter like the one linked to, and to use a passive HDMI to DVI-D adapter or cable to connect the DVI monitor. HDMI carries the same video signal as DVI, so connecting a DVI monitor through the HDMI port will not reduce the quality at all. If space at the outputs on the graphics card is tight, I would suggest using an HDMI to DVI cable like this.
 