AMD Vs. Nvidia Image Quality - Old man yells at cloud

Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
It's that debate again, only this time with a bit of a different approach to the testing.

It would seem each vendor has better image quality depending on the game! Forza was the clear winner here among the titles tested, followed by Far Cry 5 for Nvidia.
I know there are a few guys on here who are convinced AMD does have better image quality, but I honestly believe that comes down to the default GPU colour output.
Nvidia really need to address that tbh


P.S. Keep this clean please
 
Associate
Joined
9 Sep 2018
Posts
84
Location
Barcelona
Pair an Nvidia GPU with a TN monitor and you'll get horrible colour banding, that's a fact. AMD has a hardware-level dithering implementation AFAIK; Nvidia offers it via software, but only in the Linux drivers.
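For anyone wondering what dithering actually buys you, here's a toy Python sketch (nothing to do with either vendor's actual driver code) of why it hides banding when 8-bit content lands on a 6-bit TN panel:

```python
# Toy illustration (not actual driver code) of why dithering hides
# banding when 8-bit content is shown on a 6-bit TN panel.

def quantize_6bit(v):
    """Truncate an 8-bit value (0-255) to the nearest lower 6-bit level."""
    return (v >> 2) << 2

BAYER_2x2 = [[0, 2], [3, 1]]  # ordered-dither thresholds, one per pixel

def dither_quantize_6bit(v, x, y):
    """Add a position-dependent offset (0-3) before 6-bit truncation."""
    return quantize_6bit(min(255, v + BAYER_2x2[y % 2][x % 2]))

# A flat 8-bit shade that falls between two 6-bit levels:
row = [103] * 8
plain = [quantize_6bit(v) for v in row]
dithered = [dither_quantize_6bit(v, x, 0) for x, v in enumerate(row)]

print(plain)     # [100, 100, 100, 100, 100, 100, 100, 100]
print(dithered)  # [100, 104, 100, 104, 100, 104, 100, 104]
# Without dither, every pixel snaps to 100, so neighbouring shades in a
# gradient form visible bands. With dither, pixels alternate 100/104
# and the eye averages them back towards the original 103.
```

The hardware versions do the same thing at full resolution with better patterns; AFAIK on Nvidia under Linux it's exposed as a per-display dithering option in nvidia-settings.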
 
Associate
Joined
15 Oct 2018
Posts
1,293
Will have to give that nVidia dynamic range adjustment a whirl. Can't see any difference just on the desktop, though.
 
Associate
Joined
28 Aug 2014
Posts
2,228
Stock to stock I noticed AMD is quite a bit better. More vibrant, more colours. That was 3 years ago when I changed, though: GTX 970 to RX 480.

I know this is off topic, but if we can't mention how AMD cards are not power hungry when tweaked, we shouldn't be able to mention how Nvidia picture quality is similar when tweaked.
 
Associate
Joined
19 May 2012
Posts
1,297
I changed one of my setups from Nvidia to AMD this week and noticed the better colours on AMD as usual in both HDR and SDR.

Don't even touch that digital vibrance rubbish. It increases banding.
 
Soldato
Joined
6 Feb 2010
Posts
14,594
Thought it was just my imagination, since it's not a topic that gets discussed or brought up frequently, but I guess that would explain why, back when I was playing Guild Wars 2, the colours looked more "dull/washed-out" on the GTX 560 Ti compared to the HD 5850.
 
Soldato
Joined
26 Sep 2013
Posts
10,713
Location
West End, Southampton
I’ve said this a few times of late. I’m in a perfect position, having two separate gaming rigs using the same monitor, and gaming on the Vega 64 rig is visually better than on the 1080 Ti rig. The detail and image look cleaner, and details pop more with the Vega 64. It’s not just down to colour settings; I’ve increased them via the Nvidia control panel to boost colour/vibrancy, but detail still looks nicer on the Vega 64, for me anyway.

I use DisplayPort for the Vega 64 for FreeSync, and HDMI for the 1080 Ti; not sure that would make any difference though.
 

V F

Soldato
Joined
13 Aug 2003
Posts
21,184
Location
UK
Ahh, I remember this from the time I had an ATi 9800 Pro after switching from an Nvidia card. I've seen it mentioned from time to time, so I guess there must be something in it for those who want to find a difference. My eyes are only getting worse as I age, so you're better off asking a teenager!

I couldn't really see the difference that so many claim, considering I was on ATi/AMD from 2003 until 2017. Only the Omega drivers were a little better.

Marketing is powerful in shaping what people want to see and hear. Just like the 4K thread.
 
Soldato
Joined
13 Jun 2009
Posts
6,847
I have no idea but I would definitely leave all image adjustment options at their default settings if you want the best image quality, as long as your monitor has a correct colour profile.
 
Soldato
Joined
22 Nov 2006
Posts
23,390
Having tried this back to back, it's more than simply colour settings. As mentioned above, if you increase vibrance it will cause other image-quality issues and doesn't really "fix" it.

It's like Nvidia are adding a white layer to everything (maybe some workaround to increase FPS?), which causes a slightly washed-out look. The texture sharpness also appears better on my AMD card, and there's less colour banding.

It's not a HUGE difference and you won't notice it until you switch from Nvidia to AMD, but there is something there.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,168
Will have to give that nVidia dynamic range adjustment a whirl. Can't see any difference just on the desktop, though.

It is only an issue on certain setups; most people will be using full RGB anyhow.
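For anyone wondering what the limited/full RGB mismatch actually does to the picture, a quick sketch (illustrative arithmetic only, not driver code):

```python
# Why a limited-range signal looks washed out on a full-range display:
# limited RGB puts black at 16 and white at 235, so if the monitor
# still treats 0-255 as black-to-white, blacks turn dark grey and
# whites turn light grey - the classic washed-out look.

def full_to_limited(v):
    """Map a full-range 8-bit value (0-255) into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

print(full_to_limited(0), full_to_limited(255))  # 16 235
# On a full-range display, 16/255 is roughly 6% grey instead of true
# black, so the whole image loses contrast until either the GPU output
# range or the monitor's black-level setting is corrected.
```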

Don't even touch that digital vibrance rubbish. It increases banding.

That is what it does. It isn't rubbish really; it's just what happens when you adjust saturation in that manner, and for some people it might be preferable. Personally I won't use it.
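To see why that kind of saturation boost introduces banding, here's a rough sketch (the real vibrance control works in a different colour space, but the 8-bit arithmetic problem is the same):

```python
# Why boosting saturation/vibrance in 8-bit can add banding: stretching
# channel values away from grey leaves gaps between output levels and
# clips at the ends, so smooth gradients turn into visible steps.

def boost_saturation(v, grey=128, k=1.5):
    """Push an 8-bit channel value away from grey by factor k, clipped."""
    return max(0, min(255, round(grey + k * (v - grey))))

ramp = list(range(120, 137))                  # 17 smooth steps near grey
boosted = sorted({boost_saturation(v) for v in ramp})
print(boosted)
# The 17 consecutive input levels now span 116-140 with gaps, so steps
# in a gradient become up to twice as visible: banding.

highlights = sorted({boost_saturation(v) for v in range(240, 256)})
print(highlights)  # [255] : 16 distinct bright shades clipped into one
```

Same reason the desktop can look "punchier" while gradients (skies, fog) get ugly.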

I’ve said this a few times of late. I’m in a perfect position, having two separate gaming rigs using the same monitor, and gaming on the Vega 64 rig is visually better than on the 1080 Ti rig. The detail and image look cleaner, and details pop more with the Vega 64. It’s not just down to colour settings; I’ve increased them via the Nvidia control panel to boost colour/vibrancy, but detail still looks nicer on the Vega 64, for me anyway.

I use DisplayPort for the Vega 64 for FreeSync, and HDMI for the 1080 Ti; not sure that would make any difference though.

Something is wrong if there is a massive difference. I've run many setups side by side in my time, and as I've said before, in some cases the AMD setup looks slightly more vibrant, but it is literally like nudging the digital vibrance setting in the nVidia control panel by 2%; if it is more than that, something else is in play.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
Stock to stock I noticed AMD is quite a bit better. More vibrant, more colours. That was 3 years ago when I changed, though: GTX 970 to RX 480.

I know this is off topic, but if we can't mention how AMD cards are not power hungry when tweaked, we shouldn't be able to mention how Nvidia picture quality is similar when tweaked.


That is just default color profiles though. Nvidia use more realistic color tones by default; AMD jack up saturation and vibrancy, which is how you sell cheap TVs to consumers.


I believe Nvidia keep the color profiles of the GeForce cards the same as for the professional Quadro cards, while AMD's FirePro cards aim for more realistic colors and vibrancy and look closer to Nvidia.
 
Associate
Joined
26 Apr 2017
Posts
1,255
That is just default color profiles though. Nvidia use more realistic color tones by default; AMD jack up saturation and vibrancy, which is how you sell cheap TVs to consumers.


I believe Nvidia keep the color profiles of the GeForce cards the same as for the professional Quadro cards, while AMD's FirePro cards aim for more realistic colors and vibrancy and look closer to Nvidia.

The consensus seems to be that Nvidia is washed out and AMD has better colors.
Can't argue with that.

Subjective views do count.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
The consensus seems to be that Nvidia is washed out and AMD has better colors.
Can't argue with that.

Subjective views do count.


But subjective views are often wrong, hence cheap TVs are sold with jacked-up saturation. They have terrible image quality, but those who don't know better seem to lap it up.
 
Soldato
Joined
13 Jun 2009
Posts
6,847
consensus seems to be that Nvidia is washed out and AMD has better colors.
Cant argue with that
Well if we can't argue with that I guess we should close the thread. :rolleyes:

When people describe image quality they generally use nebulous terms like "colours" and "vibrant". There are multiple ways in which the image quality between cards (in reality drivers) can differ. For example, they might have different default settings for colour saturation, brightness, etc. The only objectively "correct" settings are those closest to "neutral" such that what is shown on your monitor is as close as possible to the source image. This will depend on your monitor too though. "More vibrant" is in no way objectively "better".

Separately from that, each driver/card could render objects differently, e.g. texture compression. Typically the user can change these settings either in-game or using the control panels but not everything is exposed to the user. With default settings AMD and nVidia probably handle certain things slightly differently. This can be objectively compared with screenshots but whether it's ever noticeable during gameplay is another question and I haven't seen any evidence of that. Given most people wouldn't notice the difference between medium and high quality textures in a particular game, I really doubt it makes a difference except in extreme cases (e.g. I vaguely remember a game from earlier this year in which nVidia cards weren't showing some shadows?).

But subjective views are often wrong, hence cheap TVs are sold with jacked-up saturation. They have terrible image quality, but those who don't know better seem to lap it up.
Yep, walk into any shop with TV displays and they'll all have unnatural colour saturation, whacked up motion interpolation, and maxed "sharpness" which just gives everything a nasty halo effect.
 
Soldato
Joined
22 Nov 2006
Posts
23,390
But subjective views are often wrong, hence cheap TVs are sold with jacked-up saturation. They have terrible image quality, but those who don't know better seem to lap it up.

But in this case it isn't :D

AMD aren't just raising the vibrancy etc as the side-effects you get from doing that aren't present. Like I said earlier it's more than just colours.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
I’ve said this a few times of late. I’m in a perfect position, having two separate gaming rigs using the same monitor, and gaming on the Vega 64 rig is visually better than on the 1080 Ti rig. The detail and image look cleaner, and details pop more with the Vega 64. It’s not just down to colour settings; I’ve increased them via the Nvidia control panel to boost colour/vibrancy, but detail still looks nicer on the Vega 64, for me anyway.

I use DisplayPort for the Vega 64 for FreeSync, and HDMI for the 1080 Ti; not sure that would make any difference though.

DisplayPort is the better interface.

Pair an Nvidia gpu with a TN monitor and you'll get an horrible colour banding, that's a fact. AMD has dithering implementation on hardware level afaik, Nvidia offers it via software but only on Linux drivers.

It has always been the case. For 20 years, since the age of the Matrox graphics cards, nVidia has been famous for its lack of understanding of image quality.
 