Improving Screen Clarity - Monitors or GFX Card?

Hello,

So I was wondering if someone could help me. I currently do a lot of graphic design on both my PC and my MacBook Pro. The MacBook Pro has a Retina display, so when I switch over to my PC the drop in clarity is quite noticeable.

At the moment I have 2 Samsung S24B300B monitors
http://www.samsung.com/uk/consumer/pc-peripherals/monitors/essential/LS24B300BS/EN

And a Palit GTX 560 Ti 2GB card
http://www.palit.biz/palit/vgapro.php?id=1485

What would I need to do to increase the graphics clarity I get on my PC? Is it possible to get a PC setup that would, for example, show the benefit of @2x.png images when used in web design?
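
(In case it helps, my understanding is that @2x assets only get used when the display reports a device pixel ratio of 2 or more; a standard 1080p desktop monitor at normal scaling reports 1, so the browser just serves the normal image. A rough sketch of how a page might pick the asset in script, with made-up filenames; in practice srcset or CSS media queries do the same job:)

Code:
// Rough sketch: serve the @2x asset when the display reports a high
// device pixel ratio (the filenames here are hypothetical).
function assetFor(basePath: string): string {
  const ratio = window.devicePixelRatio || 1;
  return ratio >= 2 ? basePath.replace(/\.png$/, "@2x.png") : basePath;
}

const img = document.createElement("img");
img.src = assetFor("logo.png"); // "logo@2x.png" on a Retina-class display
document.body.appendChild(img);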

Is it the graphics card at limit or the monitors or both?
 
The monitors will be limiting you.
The 13" Retina MacBook Pro is 2560x1600 and the 15" is 2880x1800, whereas your monitors are only 1920x1080.

The next step up would be a 2560x1440 monitor, which your card should be able to handle for desktop work, but (if you game) it won't be great as the card will struggle a bit at that resolution.
If gaming isn't a concern then go for it.
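
To put some rough numbers on the density difference (quick back-of-the-envelope; panel sizes are nominal):

Code:
// Quick pixel-density comparison (diagonal sizes are nominal).
// ppi = sqrt(width^2 + height^2) / diagonal in inches
function ppi(w: number, h: number, inches: number): number {
  return Math.sqrt(w * w + h * h) / inches;
}

console.log(ppi(1920, 1080, 24).toFixed(0));   // ~92 ppi  - S24B300B
console.log(ppi(2560, 1440, 27).toFixed(0));   // ~109 ppi - typical 27" 1440p screen
console.log(ppi(2560, 1600, 13.3).toFixed(0)); // ~227 ppi - 13" Retina MacBook Pro

So even a 1440p screen is nowhere near Retina density, but it's a noticeable step up from what you have.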
 
Are you using DVI to connect your monitors? That'll be better than VGA.

How would it be better?
1920x1080 is 1920x1080 no matter whether it's transferred by DVI, HDMI, VGA or rats' tails.

Over distance I appreciate the signal can degrade on VGA, but over a short distance (a metre or so) with a decent cable there should be no difference.

Not arguing, just trying to start a conversation is all...
 
Don't quote me on this, but it's my understanding that when using VGA the graphics card converts the digital signal to analogue and then the screen converts it from analogue back to digital (I think), which can cause a loss in quality. Not by much though.

At work we tested this a while ago on multiple screens side by side to see if there was any difference between VGA and DVI. At the same resolution, text did seem slightly sharper over DVI.
 
The DVI / VGA implementation is not the same across every brand and every model; some models implement those standards poorly, so you may find you receive a better signal off one port than another.

Not with all monitors, but it can be the case *sometimes*...

For the cost of a cable (or 5 mins looking if you have spares), it's worth checking whether there is any noticeable difference between one input and another, as this can vary from screen to screen.
 
I do notice flickering more when using VGA than DVI, but not everyone will. As said, it depends on the brand, model, and condition of the cable and socket as to whether you'll get a worse picture.

Most people think sticking with VGA is a crime and a step backwards. :p
 
That could be down to your cable, Ryan, unless you're using the same VGA cable in both cases with a converter... (in which case it could be the converter).
 
Yeah, VGA can be fine Steveocee, you're quite right. However, VGA won't be better than DVI* and may potentially be worse, so there is no reason not to use DVI if you've got the choice. VGA involves a double conversion (digital to analogue and back) that could, but may not, degrade quality and increase effective input lag. You also need a higher-quality cable to minimise the degradation over distance, which is worse than with DVI.

VGA is mostly still fine, but when both ends have DVI it seems a bit odd to use VGA. Not a disaster or anything, but DVI is always worth a shot, especially when the problem is 'screen clarity', which is exactly the area that can suffer over VGA.

*unless you've got a knackered DVI cable :p

Both DisplayPort and HDMI have potential drawbacks, such as issues with colour range, but DVI has none that I can think of (at lower resolutions). Unless you're on a laptop, in which case it probably has one big drawback: no port! :p
 