AMD Vs. Nvidia Image Quality - Old man yells at cloud

Permabanned
Joined
2 Sep 2017
Posts
10,490
There are metrics for ASIC quality but without exhaustive tests on every single chip individually (which just isn't worth it) it isn't really possible to implement that - it is much cheaper to just assess a range that chips will work within and set the minimum voltage for that.

No. They are taking the worst case and applying it to all the cards, which is very wrong. The exhaustive tests would be run, machine-learning style, by the consumer himself, not in the factory. They would run in the background while the GPU is loaded.
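
For what it's worth, a rough sketch of what that consumer-side testing could look like - a hypothetical background loop that probes how far an individual chip can be undervolted while under load. The hooks set_voltage_offset and run_stress_pass are invented for illustration; no vendor driver exposes exactly this:

[CODE=python]
# Hypothetical consumer-side binning sketch: search for the lowest stable
# voltage offset for THIS chip instead of the factory's worst-case value.
# Both driver hooks below are stand-ins, not a real vendor API.

STEP_MV = 5          # how much further to drop the voltage each round
SETTLE_PASSES = 3    # consecutive clean stress passes required to accept

def set_voltage_offset(mv):
    print(f"applying offset -{mv} mV")   # stand-in for a real driver call

def run_stress_pass():
    return True                          # stand-in: True = no artifacts/crashes

def find_voltage_floor(max_drop_mv=150):
    """Walk the voltage down until the chip stops passing, then back off."""
    last_stable = 0
    drop = STEP_MV
    while drop <= max_drop_mv:
        set_voltage_offset(drop)
        if all(run_stress_pass() for _ in range(SETTLE_PASSES)):
            last_stable = drop           # this chip tolerates the lower voltage
            drop += STEP_MV
        else:
            break                        # first failure: stop probing
    set_voltage_offset(last_stable)      # settle on the last known-good value
    return last_stable
[/CODE]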
 
Man of Honour
Joined
13 Oct 2006
Posts
91,177
No. They are taking the worst case and applying it to all the cards, which is very wrong. The exhaustive tests would be run, machine-learning style, by the consumer himself, not in the factory. They would run in the background while the GPU is loaded.

You would need to do it at factory level for it to be reliable enough for what you are talking about. For the last couple of generations AMD has had its hands tied a bit in terms of voltage - GF production was still having some yield issues, and they jumped on 7nm early as well, which means having to accommodate a larger range of voltage requirements than a more mature node would.
 
Soldato
Joined
22 Nov 2006
Posts
23,400
The industry-leading IQ is produced on Radeons with the following Radeon Settings adjustments:

Texture Filtering Quality - High
Surface Format Optimisation - Off
Anti-aliasing Method - Adaptive multisampling
Pixel Format - YCbCr 4:4:4
Colour Temperature - Automatic

Using adaptive multisampling causes weird issues. E.g. in Skyrim it makes helmets transparent, but only certain ones O_o
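
On the YCbCr 4:4:4 setting in that list: a minimal numpy sketch of why forcing full chroma resolution matters. With 4:2:0 subsampling the two colour channels are stored at quarter resolution, so a hard colour edge that lands mid-block gets averaged away - worst on coloured text and UI lines (illustrative only, not the exact codec maths):

[CODE=python]
import numpy as np

def subsample_420(chroma):
    """Average each 2x2 block, then repeat it back up - i.e. 4:2:0."""
    h, w = chroma.shape
    small = chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

# A hard vertical edge in the Cr (red-difference) channel that lands mid-block:
cr = np.zeros((4, 8))
cr[:, :3] = 240.0   # strongly red on the left
cr[:, 3:] = 16.0    # strongly not-red on the right

print(subsample_420(cr)[0])   # [240. 240. 128. 128. 16. 16. ...] - edge smeared
[/CODE]

With 4:4:4 the chroma array passes through untouched and the edge stays sharp.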
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
Panos said:
Not actually. To boot, there is a difference in colour spectrum, because by default Nvidia ships all their cards with culled RGB to improve bandwidth and performance.
Full RGB has to be manually activated.

Man this argument is still going on.

OK, citation please. You say it's to improve performance, so presumably you've seen benchmarks. Link please?

Earth to @Panos, where have you gone?
 
Man of Honour
Joined
13 Oct 2006
Posts
91,177
Well that's complete nonsense. Either back your claim up or admit you made it up. Simple enough ;)

I'm not even sure what he is claiming - there are a couple of known considerations in terms of limited colour output. One is that HDMI output on nVidia tends to default to limited range RGB on a range of displays, which often needs adjusting, but that has no real impact on performance. The other is an edge case that doesn't apply to most users.
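
For anyone wondering what limited range actually means in numbers: 8-bit limited (TV) range maps black to 16 and white to 235 instead of 0 and 255, so a mismatch between what the GPU sends and what the display expects either crushes the image or washes it out. The conversion is standard video-levels maths, nothing vendor-specific - a quick sketch:

[CODE=python]
# Standard 8-bit video-levels maths: limited (TV) range spans 16-235,
# full (PC) range spans 0-255.

def full_to_limited(v):
    """Compress a full-range 0-255 value into limited-range 16-235."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Expand a limited-range 16-235 value back to full-range 0-255."""
    return min(255, max(0, round((v - 16) * 255 / 219)))

print(full_to_limited(0), full_to_limited(255))    # 16 235
print(limited_to_full(16), limited_to_full(235))   # 0 255

# The washed-out case: if the GPU sends limited range to a display expecting
# full range, black arrives as level 16 and is shown as dark grey, not black.
[/CODE]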
 
Soldato
Joined
8 Jun 2018
Posts
2,827
AMD's Radeon default settings vs Nvidia default settings:
AMD does provide better IQ at default.

For Nvidia (and I'm going from memory) you have to make a few changes in the control panel to get the image quality back.

Here are some YouTube video snippets from a quick search showing a tutorial on what to change.

 
Man of Honour
Joined
13 Oct 2006
Posts
91,177
AMD's Radeon default settings vs Nvidia default settings:
AMD does provide better IQ at default.

For Nvidia (and I'm going from memory) you have to make a few changes in the control panel to get the image quality back.

Here are some YouTube video snippets from a quick search showing a tutorial on what to change.

The RGB thing only applies to certain displays over HDMI where nVidia uses a blanket TV settings approach - if I hook up my Dell S2716DG, for instance, it correctly uses full range out of the box without touching settings.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
The RGB thing only applies to certain displays over HDMI where nVidia uses a blanket TV settings approach - if I hook up my Dell S2716DG, for instance, it correctly uses full range out of the box without touching settings.

It is something that those with the card need to know if they want to improve IQ.
If there is a list of monitors that behave as you describe, I'm sure the community at large would benefit from knowing it.
 
Soldato
Joined
3 Jan 2006
Posts
24,955
Location
Chadderton, Oldham
I have always noticed the difference in image quality between AMD and Nvidia, and wondered how anyone with good vision could not notice it.

The image on my 1070 looked as if the depth, colour and contrast had been taken out of it, compared to the AMD card before it and the Sapphire Nitro+ 5700 XT I have now.

While raising the dynamic range and digital vibrance on the 1070 did make it look less dull and washed out, the default image of the 5700 XT is still noticeably better. Nvidia's image quality would be hard to go back to.

My laptop with a 1070 has such vibrant colours and great contrast, I would therefore expect an AMD card to look oversaturated with the contrast way out.
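
As an aside on the digital vibrance tweak mentioned above: it is essentially a saturation gain applied in the driver's output pipeline. The real implementation isn't public, so this is just a rough sketch of the character of the adjustment:

[CODE=python]
import colorsys

def boost_vibrance(rgb, gain=1.3):
    """rgb: (r, g, b) in 0-1 range; returns a saturation-boosted triple."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    s = min(1.0, s * gain)    # push saturation up, clamped; hue/value untouched
    return colorsys.hsv_to_rgb(h, s, v)

dull = (0.55, 0.45, 0.40)     # a slightly washed-out tone
print(boost_vibrance(dull))   # same hue and brightness, more colour

# A real vibrance control is smarter than a flat gain - it boosts muted
# colours more than already-saturated ones - but the effect is the same kind.
[/CODE]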
 
Soldato
Joined
22 Nov 2006
Posts
23,400
AMD's Radeon default settings vs Nvidia default settings:
AMD does provide better IQ at default.

For Nvidia (and I'm going from memory) you have to make a few changes in the control panel to get the image quality back.

Here are some YouTube video snippets from a quick search showing a tutorial on what to change.


It's about time Nvidia modernised their control panel. AMD's is so much better now, with built-in overclocking tools etc. :/
 
Associate
Joined
26 Jun 2015
Posts
669
My laptop with a 1070 has such vibrant colours and great contrast, I would therefore expect an AMD card to look oversaturated with the contrast way out.

I like how people think image quality is only colour hue/saturation and contrast; there are plenty of other things that make up the image quality being discussed here.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,177
My laptop with a 1070 has such vibrant colours and great contrast, I would therefore expect an AMD card to look oversaturated with the contrast way out.

I can pretty much guarantee, as I've said and even shown, that if someone is seeing anything more than a very minor difference between them, then there is some other issue going on, such as misconfiguration.

I have more than two decades of experience with this and have built and used an endless number of different systems over the years. While I would agree that AMD tends to have a slight advantage in perceived colourfulness when everything is configured correctly, it is a very slight difference that you would need identical setups side by side to appreciate. Several YouTubers have done side-by-side comparisons (such as https://www.youtube.com/watch?v=WmvPwuHRiNc ) and were unable to capture a difference with photography gear, though by eye they did tend to notice a very, very slight colour saturation advantage for AMD.
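
If anyone wants to try capturing that difference objectively rather than by eye, one option is the Hasler-Süsstrunk colourfulness metric, which reduces an image to a single number you can compare across two matched screenshots. A sketch assuming numpy and two same-scene captures:

[CODE=python]
import numpy as np

def colourfulness(img):
    """Hasler & Suesstrunk (2003) metric. img: HxWx3 uint8 RGB array."""
    r, g, b = (img[..., i].astype(np.float64) for i in range(3))
    rg = r - g                      # red-green opponent channel
    yb = 0.5 * (r + g) - b          # yellow-blue opponent channel
    std = np.hypot(rg.std(), yb.std())
    mean = np.hypot(rg.mean(), yb.mean())
    return std + 0.3 * mean         # higher = more colourful

# Usage: load two matched captures (e.g. with imageio) and compare:
#   print(colourfulness(amd_shot), colourfulness(nvidia_shot))
[/CODE]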
 