DVI to VGA analog??

I have a monitor and an old graphics card that both have DVI connections on them. Apparently the monitor only supports up to 80Hz with a VGA analog signal. If I have a DVI to VGA cable and then plug a VGA to DVI adaptor onto the VGA end, would this qualify as an analog cable rather than a digital one, and thus allow me to get 80Hz out of the monitor?
 
(Not sure if the monitor we're talking about is the one in your sig (2405FPW) or not, but the answer below is given on that basis.)

The 80Hz is most probably limited to the lower resolutions only, which means you would be running the monitor at a non-native resolution. That will make the image blurry on an LCD monitor, as LCDs have fixed pixels and need to interpolate non-native resolutions. CRT technology is widely different and doesn't suffer from this shortcoming.

Also, is there a specific need for 80Hz? Most LCD monitors normally run at 60Hz. On a CRT monitor that would cause noticeable flicker, so higher refresh rates were preferable, but an LCD is a different technology: the backlight unit is basically the sole source of flicker (and it doesn't care about the panel's refresh rate), and manufacturers usually don't publish this figure. Natively it would be over 2kHz (2000Hz), I think, but PWM control can bring it down to around 200Hz (for brightness control and power-saving reasons). Even that should still be unnoticeable to the naked eye for most people. There are a few PWM threads on this forum if you want more information. Furthermore, the flicker is less noticeable with CCFL backlights, which your monitor most probably has given its age. Modern monitors mainly use LED backlights, which are more energy efficient, but they have drawbacks of their own.
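To put those figures in perspective, here's a quick bit of arithmetic (the 2kHz and 200Hz values are the rough ones mentioned above, not measurements from any specific monitor):

```python
# Length of one backlight on/off cycle at different PWM frequencies:
# the shorter the cycle, the harder the flicker is to perceive.
for pwm_hz in (2000, 200, 60):
    period_ms = 1000 / pwm_hz
    print(f"{pwm_hz:>4} Hz PWM -> one cycle every {period_ms:.1f} ms")
```

Even at 200Hz one cycle lasts only 5ms; a CRT refreshing at 60Hz flickers with a 16.7ms cycle, which is why many people did notice that.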

The most common reason I can see for wanting 80Hz on an LCD would be the possibility of higher FPS in games. But as I said earlier, you would at the same time lose resolution, as the 80Hz most probably applies only to the lower resolutions.

As for DVI and VGA (though I'm not sure you need this anymore, given the above, so I won't go very deep into it):
DVI comes in digital and analog forms; VGA is always analog. For analog DVI, you need to check whether the connectors (on the monitor AND the graphics card) are DVI-A or DVI-I. If they are, you are good to go. But do note: although some monitors give the option of an analog signal, it is recommended to always use a digital signal with LCD monitors if at all possible (because of the way an LCD is designed). You can get more info from Wikipedia regarding DVI-A and DVI-I.

If you still want to go forward, here's a link for more advanced settings (modelines). I know how to deal with those in the Linux world, but I'm not sure how to use them in Windows...
http://en.community.dell.com/support-forums/desktop/f/3515/p/17163667/17286653.aspx

But as you can see, the highest combo seems to be 1280x1024@75Hz
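For a feel of how a modeline's numbers turn into refresh rates, here's a small sketch using the standard VESA timing for 1280x1024@75 (dot clock 135MHz; 1688x1066 total pixels per frame once blanking is included — these are published VESA figures, not values read from this particular monitor):

```python
# A modeline boils down to a dot clock plus total frame geometry.
pixel_clock_hz = 135_000_000  # dot clock for VESA 1280x1024@75
h_total = 1688                # pixels per scanline, blanking included
v_total = 1066                # scanlines per frame, blanking included

h_sync_hz = pixel_clock_hz / h_total   # horizontal sync frequency
refresh_hz = h_sync_hz / v_total       # vertical refresh rate

print(f"h-sync:  {h_sync_hz / 1000:.2f} kHz")  # ~79.98 kHz
print(f"refresh: {refresh_hz:.2f} Hz")         # ~75.02 Hz
```

That's also where the kHz and MHz limits in monitor manuals come from: every mode you feed the monitor has to keep both of those figures inside the monitor's stated ranges.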

Here's an even more advanced link, where you can make your own modelines (I've used it a few times for CRTs myself, but I'm not sure how applicable it is to LCDs):
http://xtiming.sourceforge.net/cgi-bin/xtiming.pl

Just make sure not to use modes that go above your monitor's own dot clock frequency (if there is one for LCDs...?). You will need to check the manual for the numbers; they're in the first section.
 
OK, the monitor in question is an HP LP2065 (4:3). It apparently supports 1600x1200 @ 75Hz in analog mode (DVI-VGA)...

"1600 x 1200 @ 60 Hz (native), all VESA modes up to 1600 x 1200 @ 75 Hz (analog), 1600 x 1200 @ 60 Hz (digital)"

The manual does state that it can do analog over the monitor's dual DVI connections. The problem is my graphics card doesn't have a VGA socket on it, only DVI and HDMI. Now, I have a DVI-VGA cable, so I have connected the DVI end to the monitor, added a VGA-to-DVI adapter on the VGA end, and then connected the adapter's DVI end to my graphics card. But the monitor detects there is a cable, says it's not active and goes into sleep mode, and Windows doesn't detect that a monitor is connected.

Can a DVI-VGA (analog) cable with an added VGA-to-DVI adapter on the VGA end be used in this way to give an analog signal? It is an analog cable, but it now has DVI connectors on both ends (because of the adapter).
 
(Hmm, I don't know how to solve this without going technical, so apologies in advance...)

As I see it, if you already have a DVI-to-VGA cable (it should have come with the monitor), you shouldn't need any additional adapters: just plug the VGA end into the monitor and the DVI end into the GPU. The problem arises if the GPU's DVI port doesn't support analog signals (or high enough resolution/frequency combinations), in which case no amount or combination of passive adapters will help.

ALSO IMPORTANT:
Standard VESA 1600x1200@75Hz requires a horizontal sync frequency of 93.75 kHz and a pixel clock of 202.5 MHz. Your monitor's DVI input allows only 92 kHz and 162 MHz, respectively; the VGA input allows 94 kHz and 202 MHz (which is 0.5 MHz too little, but there could be some leeway at either end). So you can only use the VGA input for the higher modes. The problem arises if those same restrictions apply on the GPU side, because your GPU doesn't offer VGA at all.
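Those comparisons can be written out as a small check. The limits are the ones quoted from the monitor specs above, and the frame totals (2160x1250 pixels including blanking) are the standard VESA figures for 1600x1200@75:

```python
# Standard VESA 1600x1200@75 timing, totals including blanking.
h_total, v_total, refresh = 2160, 1250, 75
pixel_clock_hz = h_total * v_total * refresh  # 202,500,000 -> 202.5 MHz
h_sync_hz = v_total * refresh                 # 93,750 -> 93.75 kHz

# Monitor input limits quoted above: (max h-sync in Hz, max pixel clock in Hz)
limits = {"DVI": (92_000, 162_000_000), "VGA": (94_000, 202_000_000)}

for name, (h_max, clk_max) in limits.items():
    print(f"{name}: h-sync fits: {h_sync_hz <= h_max}, "
          f"clock fits: {pixel_clock_hz <= clk_max}")
# DVI fails both limits; VGA passes h-sync but misses the clock by 0.5 MHz.
```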

And make sure your monitor is trying to find and use the right input signal. There's probably DVI-I #1, DVI-I #2 and VGA to choose from. You'll want to use the VGA.

What worries me, though, is that the monitor's manual gives an example of using DVI-to-VGA in one direction only (from the GPU's VGA to the monitor's DVI-I). I may not be THAT familiar with the matter, but I don't see why it shouldn't work the other way around, too. Or does the GPU not realize which type of signal it should output?

But as I asked previously: Is there any specific reason or need to have 75Hz? Or anything higher than 60Hz? Like I said earlier, you should always use digital signals with LCD (preferably the native resolution), if at all possible.

In any case, bottom line:
You can't get 1600x1200@75Hz through the monitor's DVI port. Not sure if the GPU's DVI is the same, in which case you're out of luck.

Even MORE technical stuff, if interested:
Here's another Pixel Clock calculator:
http://www.epanorama.net/faq/vga2rgb/calc.html

What's good about this one is that it also offers standard VESA modes (modes your monitor should offer automatically, if compatible). It does need "front porch" and "back porch" values for custom modes, though, which makes the earlier link more convenient. Still, it's good for checking standard VESA mode requirements.

PS. The earlier mentioned disparity between 202 and 202.5 MHz can also be a case of slightly different back and front porch values...
PS2. Actually, it's most probably a case of "reduced blanking" for LCD monitors:
http://en.wikipedia.org/wiki/Coordinated_Video_Timings
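As a rough illustration of what reduced blanking saves, here's a comparison of the native 1600x1200@60 mode with standard (full) blanking versus CVT reduced blanking; the frame totals are the published figures for those two timing standards:

```python
# Pixel clock = total pixels per frame (incl. blanking) * refresh rate.
modes = {
    "standard blanking (DMT)":   (2160, 1250),  # -> 162.0 MHz
    "reduced blanking (CVT-RB)": (1760, 1235),  # -> ~130.4 MHz raw
}
for name, (h_total, v_total) in modes.items():
    clock_mhz = h_total * v_total * 60 / 1e6
    print(f"{name}: {clock_mhz:.2f} MHz")
# The CVT spec then rounds the clock down to a 0.25 MHz step (130.25 MHz),
# so the actual refresh ends up fractionally under 60 Hz.
```

The lower clock is the whole point for LCDs: with reduced blanking the native mode fits comfortably inside limits like your monitor's 162 MHz DVI input.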
 
Ah-haa, the manual speaks about a "VGA input" and a "DVI input", so I took those to be physical VGA and DVI connectors as well, but apparently not... Well, that's a bummer. But it would explain why the manual gives the odd example of the cable arrangement. The other possibility is that there are actually two different versions of that monitor, and one of them has a VGA connector too.

Hmm, can you check from the manual if your GPU supports analog signals through the DVI?

Can you get the native 1600x1200@60Hz through the current cable settings?

The DVI-VGA cable most probably has pins only for the analog signal requirements, so it should stay analog all the time; the adapter shouldn't change this in any way. (You could check this from the wiki as well.)

PS. And you STILL haven't answered the question: Why do you need or want the 75Hz?

PS2. Btw, does the monitor have (in the OSD) different options for something like "DVI (analog)" and "DVI (digital)"? And you have tried going through all the input modes, right?
 
Not sure if my GPU supports an analog signal or not; it's an ASUS ATI 6970 2GB card.

Yeah, I'm able to get 1600x1200 via the supplied DVI to DVI cable.

I want to be able to use an analog cable with the monitor, as it does really low refresh rates (31Hz-75Hz), which is very rare for an LCD. I want to use this monitor on a MAME PC I'm building, as it will be able to run the old arcade games at their native refresh rates.

I can't see any options for selecting whether the DVI port is analog or digital.
 
Hmm, just now noticed you've replied (I gave up after waiting for two weeks :D )...

These are the only ASUS 6970 cards I could find:
http://uk.asus.com/Graphics_Cards/AMD_Series/EAH69702DI2S2GD5
http://uk.asus.com/Graphics_Cards/AMD_Series/EAH6970_DCII2DI4S2GD5

Is either one of them yours? I'm a little worried, because you said your GPU only had HDMI and DVI, but these also have DisplayPorts. Could you also check the GPU manual and see if there's ANY mention of DVI-I? That would really clear things up. The rest of this post is made on the assumption that your GPU is indeed one of those.

----------------------------------

Well, as it seems (if the above is right), your GPU has one DVI-I port and one DVI-D port. DVI-D is only capable of digital signals, so that one won't work, but the DVI-I should be OK for VGA (for an analog signal).

First things first, just to make sure:
Can you get the native resolution and refresh rate 1600x1200@60Hz (or anything at all, for that matter) with the "DVI-VGA-cable + VGA-DVI adapter" combination? DVI-DVI cable is probably with pins for digital signals, so we're not interested in that. Actually, you can put the DVI-DVI cable aside for now. Only use the "DVI-VGA + VGA-DVI". Try both ports on the GPU (actually, there should be a mark for "VGA" on the other port, try that first) and both ports on the monitor, too. You might have to restart in between the changes. It's at least recommendable.

And just to clear something up: 75Hz is not a LOW refresh rate for LCD technology. 60Hz is the standard refresh rate, and it's what basically every LCD monitor uses by default. The highest refresh rates for consumer monitors are around 144Hz at the moment.

Regarding MAME (arcade games, right?) and lower refresh rates:
I don't have that much knowledge about arcade machines, but I'd reckon they use either a 50 or 60 Hz refresh rate? Neither should be a problem for LCDs, I think. And even if they use 30Hz, that's exactly half of 60Hz, so you would be better off using 60Hz (there shouldn't be any disadvantage to running at 60Hz; it just shows each frame twice).
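The frame-doubling point can be sketched as follows (the 30-on-60 case is the one discussed above; the other rates are just illustrative):

```python
# On a fixed 60 Hz panel, a source whose rate divides 60 evenly
# just gets each of its frames repeated a whole number of times,
# with no judder introduced.
panel_hz = 60
for source_hz in (60, 30, 20):
    repeats = panel_hz // source_hz
    print(f"{source_hz} Hz source: each frame shown {repeats} time(s)")
```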

Well, that's a secondary issue, in any case. Let's first try to figure out whether the analog signal is possible or not...

Oh, and by the way, the MHz and kHz figures in the earlier posts are different kinds of clock rates; they're not the refresh rate of the monitor itself. Like I said, they're more technical figures relating to the electronics inside, so don't worry about them for now.
 