AMD Vs. Nvidia Image Quality - Old man yells at cloud

Soldato
Joined
16 Jun 2004
Posts
3,215
This argument has been going on for so long.

I remember similar ramblings back in 2004, when I had an Nvidia 6800 GT up against ATI's X850 cards.

A lot of it came down to Nvidia or ATI (can't remember which) messing about with anisotropic filtering levels to get higher scores in the benchmarks of the day.

Having said that, as someone who has owned both brands over the years, AMD cards have always come with more vibrant default colours than Nvidia's, which I've always had to tweak in the NV control panel.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
28,376
Location
Greater London
You just took 'AMD's colour output straight out the box is superior' as

So imo, Nv having the overall better experience isn't enough for you, they can't be runner up at anything?

I'll get ma coat...
Lol. Well said mate.

I agree with what you said anyway. I have switched between AMD and Nvidia graphics cards quite a bit over the past few years and, as you say, out of the box AMD's colours are more vibrant and better. But it is not something that stops me from going back to Nvidia. All it takes is a few hours and your eyes adjust and you forget about the difference.

It's a bit like going from something bright to something a bit dimmer: at first the dimmer image looks horrible, but after a while it looks perfectly fine. This happens to me a lot, as I have two profiles on the actual monitor (not software) that I use. The first is a gaming mode where I have upped the brightness and set overdrive to medium; the other is for desktop use, with lower brightness and overdrive set to low. The gaming profile's higher brightness really brings out the colours in games and it just looks much better, but it's not something I would use for browsing, as there's no need and it's not great on the eyes at night. When I initially change from the gaming to the desktop profile, the desktop one does not look very good and lacks the punchy colours, but after a while your eyes adjust and it's fine for browsing.
 

V F

Soldato
Joined
13 Aug 2003
Posts
21,184
Location
UK
You just took 'AMD's colour output straight out the box is superior' as

So imo, Nv having the overall better experience isn't enough for you, they can't be runner up at anything?

I'll get ma coat...

What... No wonder Gregster said what he did about this section. LOL!
 
Associate
Joined
23 Dec 2018
Posts
1,129
I legitimately noticed this almost 20 years ago, when my 9800 Pro gave up the ghost and I had to get an emergency Nvidia 520 or something that didn't even have a fan, just a small heatsink.

I thought it was because I went from a nicer ATI card to a bottom-of-the-line Nvidia one, but I subsequently noticed ATI had richer colours in general, even against more expensive Nvidia cards. However, I upped digital vibrance a bit and changed a black-level setting in the Nvidia settings to deepen them, and I was fine with it and have been ever since.

I haven't actually had an ATI/AMD card since then, as Nvidia has suited me better, but I definitely agree ATI/AMD had richer colours with more 'pop' straight out of the box.
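As an aside, "digital vibrance" is essentially a saturation boost: each channel gets pushed away from the pixel's own luma. A minimal sketch of the idea in Python - the function name and the `amount` knob are illustrative, not an actual driver API:

```python
def boost_vibrance(pixel, amount=0.2):
    """Push each channel of an (r, g, b) pixel (0-255) away from the
    pixel's own luma; amount=0 leaves the pixel unchanged."""
    r, g, b = pixel
    # Rec. 709 luma weights
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(
        max(0, min(255, round(luma + (c - luma) * (1 + amount))))
        for c in (r, g, b)
    )

# Greys are unchanged; coloured pixels move further away from grey.
print(boost_vibrance((100, 150, 200)))   # a 20% boost on a blue-ish pixel
print(boost_vibrance((128, 128, 128)))   # pure grey stays put
```

The point is just that a small saturation bump like this, not extra detail, is what the "richer colours" knob changes.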
 
Soldato
Joined
30 Mar 2010
Posts
13,096
Location
Under The Stairs!
Lol. Well said mate.

I agree with what you said anyway. I have switched between AMD and Nvidia graphics cards quite a bit over the past few years and, as you say, out of the box AMD's colours are more vibrant and better. But it is not something that stops me from going back to Nvidia. All it takes is a few hours and your eyes adjust and you forget about the difference.

It's a bit like going from something bright to something a bit dimmer: at first the dimmer image looks horrible, but after a while it looks perfectly fine. This happens to me a lot, as I have two profiles on the actual monitor (not software) that I use. The first is a gaming mode where I have upped the brightness and set overdrive to medium; the other is for desktop use, with lower brightness and overdrive set to low. The gaming profile's higher brightness really brings out the colours in games and it just looks much better, but it's not something I would use for browsing, as there's no need and it's not great on the eyes at night. When I initially change from the gaming to the desktop profile, the desktop one does not look very good and lacks the punchy colours, but after a while your eyes adjust and it's fine for browsing.

Reading his reply, I'm sitting here in stitches now!!!:p

But yes, you pretty much nailed it too - it's nothing much tbh, but it gets blown clean out of proportion by some people's mindsets.:D
 

V F

Soldato
Joined
13 Aug 2003
Posts
21,184
Location
UK
Reading his reply, I'm sitting here in stitches now!!!:p

But yes, you pretty much nailed it too - it's nothing much tbh, but it gets blown clean out of proportion by some people's mindsets.:D

I'm not the one trying to cause a storm over nothing. This thread died a long time ago and then was revived for what? Opinions, as there was nothing more to discuss.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
This thread died a long time ago and then was revived for what? Opinions, as there was nothing more to discuss.

Why would you think so? Nvidia can decide to cheat at any moment, AMD can decide to do the same in favour of some FPS, and Intel will join soon.
But I think the order for now is AMD > Intel > Nvidia.
 
Soldato
Joined
28 May 2007
Posts
10,102
Which Predator was it? The ones with the high-refresh AUO panel have washed-out colour out of the box (a little better on later revisions), and you need to calibrate/adjust the gamma in software, which kind of sucks. But at the time there weren't many options for a high-resolution, high-refresh panel, so it was easier to accept the limitations. I think it is intentional, to mask the colour banding you get in certain situations with those panels.

As I've said over many, many years of posting here, I've personally noticed AMD cards having slightly better colour saturation/vibrancy when all else is equal, but I've never seen it as the massive difference some are making it out to be, unless in some way people are not comparing like for like.

One aspect I've occasionally seen as well is AMD and nVidia having slightly different implementations of the gamma ramp/overbright bits, which can make certain games look different when certain settings are used - mostly relevant to certain older games.

It is a 4K 60Hz one. I can't remember the model number offhand, but it was in the 700s for the monitor before it went on special offer, and 700 for his 1080 Ti. The fps was impressive but the image less so.

I haven't had as many NV cards compared to AMD/ATI, but when I got an 8800 GTX it was so noticeable to me that the colours looked way off. I think those that use Nvidia cards don't really care, as it's the norm for them. I don't really care either, as I get used to it, but initially I prefer AMD's image.
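On the gamma-ramp point quoted above: the ramp is just a per-channel lookup table mapping each 8-bit input level to an output level, and a vendor building it slightly differently is enough to change how some games look. A rough sketch of building one - the 2.2 exponent is an illustrative value, not either vendor's actual curve:

```python
def gamma_ramp(gamma=2.2):
    """Build a 256-entry lookup table (input level -> output level) that
    lifts the midtones while leaving black and white fixed."""
    return [round(255 * (i / 255) ** (1.0 / gamma)) for i in range(256)]

ramp = gamma_ramp(2.2)
# Endpoints stay put (0 -> 0, 255 -> 255); midtones map well above
# their input, which is what makes the curve brighten the image.
```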
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
28,376
Location
Greater London
Why would you think so? nvidia can decide at any moment to cheat, AMD can decide so in favour of some FPS, intel will join soon.
But I think, the order for now is AMD > intel > nvidia.
When you think about it, how is it cheating really? If AMD suddenly found a way of boosting fps by 30% on all their cards through a driver change where the result is 99% the same, and you needed screenshots to point out the small differences, and no one seemed to care, would that be cheating?

Both these companies use different hardware and software to arrive at the end result of what we see. Just because one is different to the other (very likely, as they take very different approaches), does that make it cheating?

At the end of the day we all have a choice, and if the results were night-and-day different then people would not be buying Nvidia cards, but that is not the case now, is it?

Too much bias, guys. We all have it, let's be honest, but some of you have so much that you let it get in the way...
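The "99% the same" idea can actually be made concrete: compare two screenshots pixel by pixel and count how many fall within a small tolerance. A rough sketch, where images are flat lists of (r, g, b) tuples and the tolerance of 2 is an arbitrary choice for "visually identical":

```python
def percent_identical(img_a, img_b, tolerance=2):
    """Percentage of pixel positions whose channels all differ by at
    most `tolerance` between the two images."""
    assert len(img_a) == len(img_b), "images must be the same size"
    same = sum(
        all(abs(ca - cb) <= tolerance for ca, cb in zip(pa, pb))
        for pa, pb in zip(img_a, img_b)
    )
    return 100.0 * same / len(img_a)

# 99 matching pixels plus one clearly different pixel -> 99.0
a = [(10, 10, 10)] * 99 + [(50, 50, 50)]
b = [(10, 10, 10)] * 100
print(percent_identical(a, b))
```

This is the kind of comparison where two renders can score 99%+ identical and still be told apart in side-by-side screenshots.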
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
When you think about it, how is it cheating really? If AMD suddenly found a way of boosting fps by 30% on all their cards through a driver change where the result is 99% the same, and you needed screenshots to point out the small differences, and no one seemed to care, would that be cheating?

Both these companies use different hardware and software to arrive at the end result of what we see. Just because one is different to the other (very likely, as they take very different approaches), does that make it cheating?

At the end of the day we all have a choice, and if the results were night-and-day different then people would not be buying Nvidia cards, but that is not the case now, is it?

Too much bias, guys. We all have it, let's be honest, but some of you have so much that you let it get in the way...

The differences are best seen in motion, not so much in static imagery.
There is no bias - there is reasoning behind this: why do Nvidia fanboys never claim that Nvidia's picture is better than AMD's? Because it never is.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
28,376
Location
Greater London
The differences are best seen in motion, not so much in static imagery.
There is no bias - there is reasoning behind this: why do Nvidia fanboys never claim that Nvidia's picture is better than AMD's? Because it never is.
There is bias, and you just showed it by dismissing all I said and carrying on pushing your opinion ;)
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
There is bias, and you just showed it by dismissing all I said and carrying on pushing your opinion ;)

I didn't dismiss anything. You've just described how Nvidia achieves that 30% uplift, except that the difference in image quality is not 1%, but more likely 20%.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
28,376
Location
Greater London
I didn't dismiss anything. You've just described how Nvidia achieves that 30% uplift, except that the difference in image quality is not 1%, but more likely 20%.
Sigh...

If what you are doing is not showing bias, I don't know what is.

You are clearly dismissing my argument. You said they are cheating; I gave you an explanation as to why that is not true, which you have gone out of your way to ignore. Then on top of that you are now saying the difference is 20%??? You are either poor with numbers/common sense or you are majorly biased and pushing your view. I am 99.9% sure it is the latter ;)
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
Sigh...

If what you are doing is not showing bias, I don't know what is.

You are clearly dismissing my argument. You said they are cheating; I gave you an explanation as to why that is not true, which you have gone out of your way to ignore. Then on top of that you are now saying the difference is 20%??? You are either poor with numbers/common sense or you are majorly biased and pushing your view. I am 99.9% sure it is the latter ;)

First, your premise is that you won't notice any difference between 99% and 100%, which is not the case - and I claim the difference is more like 20%.
Second, cheating like that is a potential one: it happened in the past, and there is no reason to believe or accept it won't happen in the future:

Futuremark confirms nVidia is cheating in benchmark
https://www.geek.com/games/futuremark-confirms-nvidia-is-cheating-in-benchmark-553361/
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
28,376
Location
Greater London
First, your premise is that you won't notice any difference between 99% and 100%, which is not the case - and I claim the difference is more like 20%.
Second, cheating like that is a potential one: it happened in the past, and there is no reason to believe or accept it won't happen in the future:

Futuremark confirms nVidia is cheating in benchmark
https://www.geek.com/games/futuremark-confirms-nvidia-is-cheating-in-benchmark-553361/
Cheating in benchmarks is something completely different than what we are talking about.

Right, 20%... Will leave it at that, as clearly short of something divine occurring no matter what I or anyone says won't change your mind. Carry on with your bias ;)
 
Man of Honour
Joined
13 Oct 2006
Posts
91,772
First, your premise is that you won't notice any difference between 99% and 100%, which is not the case - and I claim the difference is more like 20%.
Second, cheating like that is a potential one: it happened in the past, and there is no reason to believe or accept it won't happen in the future:

Futuremark confirms nVidia is cheating in benchmark
https://www.geek.com/games/futuremark-confirms-nvidia-is-cheating-in-benchmark-553361/

If anything like you are claiming was going on, there are dozens of tech YouTube channels who'd be all over it, and as I've linked to before, when reviewers have done analysis with proper settings they struggled to find anything other than a very minor colour vibrancy difference.
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,359
Location
kent
There is no bias - there is reasoning behind this: why do Nvidia fanboys never claim that Nvidia's picture is better than AMD's? Because it never is.

smilyhammer.gif
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
https://pcmonitors.info/articles/correcting-hdmi-colour-on-nvidia-and-amd-gpus/

You can see that there is greater deviation in colour accuracy between the two signals than there was on the Nvidia GPU. This is true for deep red and certain grey and pastel shades in particular, amongst others. There is little need to critically analyse the accuracy of specific colour values for one signal type vs. the other as this varies between monitors. The take home message here is simply that ‘YCbCr 4:4:4’ and ‘RGB 4:4:4’ (‘Full Range RGB, 0-255’) do differ in their shade representation on AMD GPUs to a greater extent than Nvidia GPUs.

Unlike Nvidia’s ‘Limited Range RGB (16-235)’ signal AMD’s default ‘YCbCr 4:4:4’ signal never causes things to look washed out by dramatically altering gamma or contrast. But it does slightly affect colour values so some shades are presented slightly differently to how they would over a DVI or DisplayPort connection (i.e. ‘correct’).

o_O.
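The washed-out look the quoted article alludes to comes down to a range mismatch: limited-range signals put black at 16 and white at 235, so a display expecting full-range 0-255 has to expand each channel by (value - 16) × 255 / 219 or blacks turn grey. A minimal sketch of that standard expansion:

```python
def limited_to_full(value):
    """Expand one 8-bit limited-range (16-235) channel value to full
    range (0-255); out-of-range inputs clip to the endpoints."""
    return max(0, min(255, round((value - 16) * 255 / 219)))

# Video black (16) becomes 0 and video white (235) becomes 255; skipping
# this expansion is what makes limited-range output look washed out.
print(limited_to_full(16))
print(limited_to_full(235))
```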
 
Soldato
Joined
28 May 2007
Posts
10,102
What I will say is that while playing on an Nvidia setup I have never noticed any detail missing. I might have missed it if there was any; it's been more that the screen has been more pleasing on my eye on ATI/AMD. Most likely just a preference thing, and Nvidia do kick out more performance easily atm, but for a lot more cash. Like I said, within two years my mate paid out 2k on Nvidia graphics cards but missed AMD's image. No bias, just what our eyes could see. Metro on the 2080 Ti at 4K with RTX on still looked the business, and to my eye it's one of the best looking games I have played.
 