DVI vs VGA

VGA is analogue, which means you have 2 conversions going on (or 3, if you count the lighting up of pixels on the screen):

  1. Digital to Analogue on your graphics card
  2. Analogue to Digital on the monitor input

This becomes a problem because good picture quality depends on both the digital-to-analogue converter (DAC) in your graphics card and the analogue-to-digital converter (ADC) in your monitor being good. The extra steps could also add a little lag, although that should be negligible.
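A minimal sketch of why the round trip matters: VGA carries each colour channel as a roughly 0 to 0.7 V analogue level, so the pixel value only survives the DAC -> cable -> ADC trip if both converters (and the cable) are accurate. The noise figure below is made up purely for illustration; real values depend on the hardware.

```python
import random

def dac(pixel, vmax=0.7):
    # Ideal DAC: map an 8-bit pixel value to a VGA-style 0..0.7 V level
    return pixel / 255 * vmax

def analogue_link(volts, noise=0.005):
    # The cable and circuitry add a little noise (assumed +/-5 mV here)
    return volts + random.uniform(-noise, noise)

def adc(volts, vmax=0.7):
    # Ideal ADC in the monitor: quantise the voltage back to 8 bits
    return max(0, min(255, round(volts / vmax * 255)))

errors = 0
for pixel in range(256):
    recovered = adc(analogue_link(dac(pixel)))
    if recovered != pixel:
        errors += 1

print(f"{errors}/256 pixel values changed on the VGA round trip")
```

Even this small amount of simulated noise (about 2 steps out of 255) flips most values, which is exactly the degradation a digital link avoids.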

With a DVI cable there are no conversions; the signal stays purely digital all the way to the monitor, so it arrives bit-exact.
 