Why is dithering worse over VGA?

Soldato · Joined 17 Mar 2007 · Posts 5,506 · Location Plymouth
There's nothing wrong with VGA per se, but it isn't well suited to modern flatscreens. It can carry very high bandwidth, e.g. 100 Hz @ 1600x1200. CRTs were analogue devices, so they could use a direct pass-through of the VGA signal, which is why CRTs have zero input lag. Flatscreens, on the other hand, are always digital, so the VGA signal has to be converted from analogue to digital with an ADC; DVI skips that step because the signal is already digital. The problem is that most flatscreens have a poor-quality ADC, so the VGA image always looks poor. With a high-quality ADC and cable the difference is less noticeable, but VGA is still inferior.
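Here's a rough toy model of why that ADC step hits dithering particularly hard (my own sketch, nothing from a real monitor; the numbers like NOISE_MV and PHASE_ERROR are just illustrative assumptions). A scanline of an alternating 0/255 dither pattern gets "transmitted" as an analogue waveform, sampled back with a bit of noise and a clock-phase error, and re-quantised:

```python
# Toy model: a 1-D scanline of an alternating 0/255 dither pattern is sent
# through a VGA-style analogue channel. The monitor's ADC samples it with a
# small clock-phase error plus some noise, then re-quantises to 8 bits.
# The sharp one-pixel transitions of the dither pattern smear together,
# which DVI avoids by never leaving the digital domain.
import numpy as np

OVERSAMPLE = 16     # sub-pixel resolution of the simulated analogue waveform
PIXELS = 64         # length of the test scanline
NOISE_MV = 8.0      # assumed analogue noise in millivolts (0.7 V full swing)
PHASE_ERROR = 0.3   # assumed ADC sampling offset, in pixel widths

rng = np.random.default_rng(0)

# 1. Source: alternating 0/255 dither pattern approximating 50% grey.
src = np.tile([0, 255], PIXELS // 2).astype(float)

# 2. DAC: hold each pixel value for one pixel period (0..0.7 V swing),
#    then low-pass it slightly to mimic limited cable bandwidth.
analog = np.repeat(src / 255.0 * 0.7, OVERSAMPLE)
kernel = np.ones(OVERSAMPLE // 2) / (OVERSAMPLE // 2)
analog = np.convolve(analog, kernel, mode="same")

# 3. Channel noise.
analog += rng.normal(0.0, NOISE_MV / 1000.0, analog.size)

# 4. Monitor ADC: sample once per pixel, but with a phase error,
#    then re-quantise to 8 bits.
idx = (np.arange(PIXELS) + 0.5 + PHASE_ERROR) * OVERSAMPLE
samples = analog[np.clip(idx.astype(int), 0, analog.size - 1)]
recovered = np.clip(np.round(samples / 0.7 * 255.0), 0, 255)

err = np.abs(recovered - src)
print(f"mean per-pixel error: {err.mean():.1f}/255, worst: {err.max():.0f}/255")
```

Set PHASE_ERROR and NOISE_MV to zero and the pattern comes back intact, which is roughly the DVI case; with even these modest values, every pixel of the dither pattern picks up error, because each one sits right next to a full-swing transition.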
 
Soldato · Joined 11 May 2006 · Posts 5,769
Depends on the graphics card as well as the monitor. Pretty much every onboard-graphics VGA output I've used has been poor, with lots of noise and strange 'wave' artifacts constantly moving across the screen. Even some early dedicated cards had the problem. The first card I had that was free of it was my 9800 Pro, which was pretty much perfect with no detectable noise; every high- and mid-range ATI card I've had since has also been noise-free.

Also, I found that monitors with only VGA (no digital connections) gave a better image. My old 17" LG was VGA-only and the picture was pixel-perfect with no noise - you couldn't tell it was VGA at all. Plus it was capable of 75 Hz.
 
Soldato · Joined 6 Feb 2004 · Posts 20,652 · Location England
I've never had a problem with VGA. I had a 22" ViewSonic (1680x1050) which supported both VGA and DVI, and I couldn't tell the difference between the two. Since then I've had two completely different VGA-only LGs at 1920x1080, again with no problems whatsoever. Whatever issues people have, I don't think there's anything inherently wrong with VGA; it's more likely poor implementation in certain kit.
 