Adapter vs monitor scaling?

Just connected my Yuraku 19 inch monitor to my Nvidia 8600GT via a DVI cable. If I select adapter scaling everything looks a bit blurred, but if I select monitor scaling everything looks fine.
To be honest I can't see any difference in display quality between the DVI cable and the regular (VGA) monitor cable.
What difference should there be?
 
You should always use monitor scaling.

You should always use DVI where possible too. Your video card outputs a digital signal. If you use D-Sub (analogue), that digital signal has to be converted to analogue by your card's DAC, sent down the cable, then converted back to digital at your monitor. Over DVI, no conversion takes place.
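Here's a rough Python sketch of why that round trip can matter, purely illustrative (the noise level and 8-bit pixel values are made-up assumptions, not measurements of any real cable):

```python
import numpy as np

# Toy model: 8-bit pixel values -> "DAC" -> noisy analogue cable -> "ADC".
rng = np.random.default_rng(0)

digital = np.arange(256, dtype=float)              # original 8-bit pixel values
analogue = digital / 255.0                          # "DAC": map to a 0..1 voltage
analogue += rng.normal(0, 0.002, analogue.shape)    # noise picked up along the cable
redigitised = np.clip(np.round(analogue * 255), 0, 255)  # "ADC" at the monitor

changed = int((redigitised != digital).sum())
print(f"pixel values altered by the round trip: {changed} of {digital.size}")
# Over DVI the 8-bit values travel as-is, so there is nothing to alter.
```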

On a 19" monitor, you won't notice much difference, agreed, but try watching a movie then compare ;) Less interference from external sources too.
 
You should always use monitor scaling.

Personally I'm not so sure about that. IMO you should use whichever scaling gives you the best image, which in the OP's case does seem to be monitor scaling as you say. That said, more budget-level monitors such as Yurakus tend to scale things terribly. For example, my 24-inch Yuraku can't scale anything that isn't 16:10 properly, ending up with bits cut off or stretched, whereas using the Nvidia scaler with it, things scale perfectly.
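To show what a good scaler has to do, here's a small sketch of aspect-preserving scaling. The resolutions are just example numbers (a 5:4 desktop on a 16:10 panel), not anything specific to the Yuraku or the Nvidia driver:

```python
def fit_to_panel(src_w, src_h, panel_w, panel_h):
    """Aspect-preserving scale: fit the source inside the panel and
    fill the leftover area with black bars instead of stretching."""
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return out_w, out_h, (panel_w - out_w) // 2, (panel_h - out_h) // 2

# A 5:4 source (1280x1024) on a 16:10 1920x1200 panel:
w, h, bar_x, bar_y = fit_to_panel(1280, 1024, 1920, 1200)
print(f"{w}x{h} image with {bar_x}px bars each side")  # 1500x1200 with 210px bars

# A bad monitor scaler skips this and just stretches to 1920x1200
# (distorting everything) or crops (cutting bits off at the edges).
```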
 