AMD Vs. Nvidia Image Quality - Old man yells at cloud

Man of Honour
Joined
13 Oct 2006
Posts
91,058

Interesting:

You can see that there is greater deviation in colour accuracy between the two signals than there was on the Nvidia GPU. This is true for deep red and certain grey and pastel shades in particular, amongst others.

I might have to try to get a chance to go back and re-examine the previous example I posted from Dishonored, with the deep red and brown barrels showing differently on AMD compared to nVidia - though from what I recall I was using RGB on both.

EDIT: I think that link reinforces what I've said before though - if someone is seeing anything other than a very minor difference between the two, whether that is me or anyone else, they are probably misunderstanding something somewhere or have something not configured equally.
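Whether two setups are "configured equally" is actually measurable rather than a matter of eyesight: capture the same frame on both cards at matched settings and compare the pixels. A minimal sketch in Python, assuming Pillow and NumPy are installed and that "amd.png" and "nvidia.png" are hypothetical same-frame captures:

```python
# A rough, hypothetical comparison script - not anything from the thread.
import numpy as np
from PIL import Image

def mean_channel_diff(path_a: str, path_b: str) -> np.ndarray:
    """Mean absolute per-channel (R, G, B) difference on the 0-255 scale."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    if a.shape != b.shape:
        raise ValueError("captures must be the same resolution to compare")
    return np.abs(a - b).mean(axis=(0, 1))

# Hypothetical same-frame captures from the two cards at matched settings:
r, g, b = mean_channel_diff("amd.png", "nvidia.png")
print(f"Mean difference per channel - R: {r:.2f}, G: {g:.2f}, B: {b:.2f}")
# Values near zero suggest the cards render essentially the same image,
# and any difference you "see" is down to settings or the display.
```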
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
What I will say is that while playing on an Nvidia setup I have never noticed any detail missing. I might have missed it if there was; it's more that the screen has been more pleasing to my eye on ATI/AMD. Most likely just a preference thing, and Nvidia do easily kick out more performance at the moment, but for a lot more cash. Like I said, within two years my mate paid out 2k on Nvidia graphics cards but missed AMD's image. No bias, just what our eyes could see. Metro on the 2080 Ti at 4K with RTX on still looked the business and, to my eye, is one of the best looking games I have played.


Radeon IQ
[image]

GeForce IQ
[image]

And more:
Question: Is nvidia cheating on image quality? Can someone confirm this?
https://www.reddit.com/r/Amd/comments/c7yxrb/question_is_nvidia_cheating_on_image_quality_can/


"nvidia is heavily compressing textures"
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
Nvidia owners prefer their image and AMD owners prefer theirs.

Neither is better than the other; they are just different.

The thing is that nvidia gains an unfair performance advantage by rendering less, thus stealing the top performance crown from the Radeon VII, which in reality should be faster than the RTX 2080 Ti.
 
Soldato
Joined
22 Nov 2006
Posts
23,364
They can still compress things like textures though. Of course, the more you compress, the more quality you lose, so it's a trade-off if you want more fps.
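To put numbers on that trade-off, here is a minimal sketch using JPEG quality levels as a stand-in for texture compression. It assumes Pillow and NumPy, "texture.png" is a hypothetical source image, and it illustrates the general compress-more/lose-more principle, not anything a vendor's driver is confirmed to do:

```python
# Illustrative only: JPEG re-encoding as a stand-in for texture compression.
import io

import numpy as np
from PIL import Image

def psnr(reference: np.ndarray, degraded: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB; higher means less quality lost."""
    mse = np.mean((reference.astype(np.float64) - degraded.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0**2 / mse)

source = Image.open("texture.png").convert("RGB")  # hypothetical source image
reference = np.asarray(source)

for quality in (95, 75, 50, 25):  # compress harder each pass
    buf = io.BytesIO()
    source.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    degraded = np.asarray(Image.open(buf).convert("RGB"))
    size_kib = buf.getbuffer().nbytes / 1024
    print(f"quality={quality:>2}: {size_kib:7.1f} KiB, PSNR={psnr(reference, degraded):5.2f} dB")
```

Smaller files come out with lower PSNR; the quality loss is measurable, so a driver quietly compressing harder would show up in exactly this kind of test.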
 
Caporegime
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
The thing is that nvidia gains an unfair performance advantage by rendering less, thus stealing the top performance crown from the Radeon VII, which in reality should be faster than the RTX 2080 Ti.
Rubbish. Where is it shown that NVidia 'render less'? Why should the R VII be faster than a 2080 Ti? On cores alone, the 2080 Ti has 4,352 as opposed to the 3,840 on the Radeon.

Please stop making stuff up and take part with at least a moderate amount of knowledge.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
Where is it shown that NVidia 'render less'?

Here:

Radeon IQ
Radeon-IQ.png

GeForce IQ
nvidia-IQ.png


The GeForce IQ is rubbish.
 
Caporegime
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
Here:

Radeon IQ
Radeon-IQ.png

GeForce IQ
nvidia-IQ.png


The GeForce IQ is rubbish.
You are using a post that is over 10 years old (GTX 285 against the HD 5870) as your proof, and using an image with differing attributes as the reasoning. Look at both shots in motion and you will see they are not the same. I had my 290X in my PC a short while back and, whilst I also felt the IQ was better, that was down to the fact that AMD's default settings are superior to NVidia's (not even I will argue with that); after some calibration (which is needed on both), I couldn't see a difference at all.
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,524
Location
Greater London
You are using a post that is over 10 years old (GTX 285 against the HD 5870) as your proof, and using an image with differing attributes as the reasoning. Look at both shots in motion and you will see they are not the same. I had my 290X in my PC a short while back and, whilst I also felt the IQ was better, that was down to the fact that AMD's default settings are superior to NVidia's (not even I will argue with that); after some calibration (which is needed on both), I couldn't see a difference at all.
You are wasting your time mate, the guy sees a 20% difference between Nvidia and AMD cards' IQ :p;)
 

TNA

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,524
Location
Greater London
Maybe. Should I go back even further and really show my age by saying Quake 3 on ATI was, ummm, questionable? lol

Quake Vs Quack is worth a read.... For comedy and nothing more.
Yeah. Silliness really. If he wants to be taken seriously he needs to compare a Pascal/Turing Nvidia GPU against a Vega/Navi one, demonstrating the results over, say, 5-10 games (a rough sketch of that kind of comparison follows below).

Unfortunately for him, showing an image from ages ago and saying the difference he sees is 20% just kills all his credibility. Worse yet, he ignores any logical argument and carries on saying the same thing over and over again rather than engaging with an open mind.

I think we probably all agree AMD has slightly better IQ out of the box, but it is so damn small that hardly anyone cares. At the end of the day AMD are free to reduce their IQ and claw back the 30% or whatever performance they are behind by, but obviously it's not that simple and it does not work like that...
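For what it's worth, the comparison suggested above is easy to script once the captures exist: take matched-settings screenshots from each card across several titles and score each pair with a similarity metric. A minimal sketch, assuming scikit-image (0.19 or newer), Pillow and NumPy, with hypothetical file names:

```python
# Hypothetical batch comparison - the file names below are placeholders.
import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity

pairs = {
    "Metro Exodus": ("metro_amd.png", "metro_nvidia.png"),
    "Dishonored": ("dishonored_amd.png", "dishonored_nvidia.png"),
    # ... one matched-settings pair per game, ideally 5-10 titles
}

for game, (amd_path, nv_path) in pairs.items():
    amd = np.asarray(Image.open(amd_path).convert("RGB"))
    nv = np.asarray(Image.open(nv_path).convert("RGB"))
    score = structural_similarity(amd, nv, channel_axis=-1)  # 1.0 = identical
    print(f"{game}: SSIM = {score:.4f}")
```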
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
You are using a post that is over 10 years old (GTX 285 against the HD 5870) as your proof, and using an image with differing attributes as the reasoning. Look at both shots in motion and you will see they are not the same. I had my 290X in my PC a short while back and, whilst I also felt the IQ was better, that was down to the fact that AMD's default settings are superior to NVidia's (not even I will argue with that); after some calibration (which is needed on both), I couldn't see a difference at all.

@Panos can show you the same differences with Radeon Vega 64 vs GTX 1060.

I said that nvidia's images are extremely aggressive and unpleasant to the eye - washed-out colours, missing details, too high contrast in some areas, too high brightness in others.
It's like an image produced by amateurs who have no clue about graphics and art, rather than highly paid professionals...
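As an aside, one well-documented configuration cause of a "washed out" picture (on any vendor's card) is the driver outputting limited-range RGB (16-235) while the display expects full range (0-255). A minimal sketch of a sanity check, assuming Pillow and NumPy and a hypothetical "desktop_capture.png" of a scene containing true blacks and whites:

```python
# Quick sanity check for limited-range output - purely illustrative.
import numpy as np
from PIL import Image

# Hypothetical capture of a scene containing true blacks and whites:
img = np.asarray(Image.open("desktop_capture.png").convert("RGB"))
lo, hi = int(img.min()), int(img.max())
print(f"Pixel value range: {lo}-{hi}")
if lo >= 16 and hi <= 235:
    print("Looks like limited-range (16-235) output - check the driver's RGB range setting.")
else:
    print("Full-range values present; a washed-out look is probably something else.")
```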
 
Caporegime
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
But it's quite strange if they pursue performance in desktop 2D mode?
They cheap out on some elements on the circuit boards, hence the lower image quality in 2D... That's been a problem since the year 2000, and perhaps before.

@Panos can show you the same differences with Radeon Vega 64 vs GTX 1060.

I said that nvidia's images are extremely aggressive and unpleasant to the eye - washed-out colours, missing details, too high contrast in some areas, too high brightness in others.
It's like an image produced by amateurs who have no clue about graphics and art, rather than highly paid professionals...

I remember you from early in this thread; you were pausing a YouTube video where it suited your argument and then posting it as proof of NVidia lowering IQ. Comical and pretty lame in truth. It also shows a lack of knowledge, and whilst I was looking to see what Panos was posting I found this and laughed out loud (I need to get a life).

How on earth does "cheaping out on some elements on the circuit boards" even remotely equate to 2D IQ? Seriously, I would love you to show me this and make me eat my words, and if you can prove it, I will never ever buy NVidia again.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
I remember you from early in this thread; you were pausing a YouTube video where it suited your argument and then posting it as proof of NVidia lowering IQ. Comical and pretty lame in truth. It also shows a lack of knowledge, and whilst I was looking to see what Panos was posting I found this and laughed out loud (I need to get a life).

How on earth does "cheaping out on some elements on the circuit boards" even remotely equate to 2D IQ? Seriously, I would love you to show me this and make me eat my words, and if you can prove it, I will never ever buy NVidia again.

Look, assessing image quality in front of a monitor as an ordinary user is not rocket science. You don't need knowledge to see the defects. They are just there and have been there for decades.
 
Caporegime
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
Look, assessing image quality in front of a monitor as an ordinary user is not rocket science. You don't need knowledge to see the defects. They are just there and have been there for decades.
Sorry, but you haven't actually answered any of my questions. I also agree that assessing IQ in front of a monitor isn't rocket science, and I even said, "I feel AMD have a better 'out of the box' IQ than NVidia".

Now please, please, please show me even a tiny piece of evidence of NVidia using 'cheaper elements on the circuit board' to lower the 2D IQ.

Remember, if you do, no more NVidia buying from me :D :D

If not, sadly, I have to put you back down the rabbit hole!

Edit:

@TNA Behave lol
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
@Panos can show you the same differences with Radeon Vega 64 vs GTX 1060.

I said that nvidia's images are extremely aggressive and unpleasant to the eye - washed-out colours, missing details, too high contrast in some areas, too high brightness in others.
It's like an image produced by amateurs who have no clue about graphics and art, rather than highly paid professionals...

Mate don't bother. Don't you see them?
 