VGA is analogue, which means two conversions happen along the signal path (or three if you count the pixels actually lighting up on the screen):
Digital-to-analogue (DAC) on your graphics card
Analogue-to-digital (ADC) at the monitor's input
This matters because good picture quality depends on both the DAC in your graphics card and the ADC in your monitor being good. The extra steps could also add a little lag, although that should be negligible.
With a DVI cable there are no conversions; the signal stays purely digital all the way to the monitor.
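As a rough illustration of why the analogue round-trip can degrade the picture while a digital link cannot, here is a minimal sketch. The voltage range (0 to 0.7 V, as used by VGA) is real, but the noise figure and function names are made up for the example; real converters behave far more subtly.

```python
import random

def dac_adc_roundtrip(pixel, noise_mv=3.0):
    """Simulate the VGA path: 8-bit value -> analogue voltage (0-0.7 V)
    -> back to 8-bit at the monitor. A small random voltage error models
    converter and cable imperfections (the noise figure is illustrative)."""
    voltage = pixel / 255 * 0.7                          # DAC on the graphics card
    voltage += random.uniform(-noise_mv, noise_mv) / 1000  # analogue noise, in mV
    voltage = max(0.0, min(0.7, voltage))                # clamp to the valid range
    return round(voltage / 0.7 * 255)                    # ADC at the monitor

def digital_link(pixel):
    """Simulate the DVI path: the 8-bit value is carried unchanged."""
    return pixel

original = 200
print(dac_adc_roundtrip(original))  # may come back slightly off, e.g. 199 or 201
print(digital_link(original))       # always exactly 200
```

The point is that on the analogue path every stage can nudge the value, and the errors accumulate per pixel per frame, whereas the digital link either delivers the exact value or fails outright.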