
AMD Vs. Nvidia Image Quality - Old man yells at cloud

My god, the amount of pro-AMD crap in this thread about better colours ("it just pops, where Nvidia is washed out") is unreal. I included a link in my earlier post which busts this exact myth; funnily enough, I included the same link in the other thread and it was ignored there too.

There is barely any difference when the Nvidia RGB or YCbCr output is set to full range. I own both.
Now show me a difference in rendering IQ in modern games at the exact same camera angles and I'll gladly comment, contribute and test.
 
Why modern games, when we play what we want? And second, how do you get the exact same camera angle, and why does it matter so much? A different camera angle doesn't change the quality of the scene; only the lighting can differ, and that isn't the point.
 
Why would you use just one ancient game to come to a conclusion? When you are myth-busting you have to account for all variables, and that includes a wider variety of data (more games, old and modern).

Why are there numerous posts in this thread stating AMD has better colours than Nvidia, despite this having been busted years ago?

You could get identical camera angles (although this depends on the game) by loading the same save slot, which drops you into the game with no mouse or keyboard input.
 
Why modern games, when we play what we want? And second, how do you get the exact same camera angle, and why does it matter so much? A different camera angle doesn't change the quality of the scene; only the lighting can differ, and that isn't the point.

Texture filtering, etc. is angle and distance dependent - many game engines use mipmaps or similar techniques that keep pre-built variants of a texture for different distances for optimal filtering, so even slight changes in the scene can massively change the detail of the rendered textures.
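To illustrate (a toy sketch of my own, not how any particular engine or GPU actually computes it - real hardware derives the level from screen-space texture-coordinate derivatives): roughly, each doubling of how many texels get squeezed into a screen pixel drops the sampler to the next, half-resolution mip, so a small change in distance or viewing angle can push a surface onto a coarser mip.

```python
import math

def mip_level(texels_per_pixel):
    """Toy mip selection: every doubling of texels covered per screen
    pixel drops to the next (half-resolution) mip level."""
    return max(0, int(math.log2(max(texels_per_pixel, 1.0))))

# Moving the camera back squeezes more texels into each pixel and changes the mip:
for ratio in (1.0, 1.9, 2.1, 4.0, 8.0):
    print(ratio, "-> mip", mip_level(ratio))
# 1.0 -> mip 0, 1.9 -> mip 0, 2.1 -> mip 1, 4.0 -> mip 2, 8.0 -> mip 3
```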

If there was as much difference as you and certain others are making out, the tech sites would generally have been on top of it years ago.
 
The tech sites are corrupt, and they don't care that much. Like someone already said, it is about knowing or not knowing what you are missing. If you have stayed with GeForces all your life, of course you would have no idea how things are in the other camp.

But we know what is right.


 
Texture filtering, etc. is angle and distance dependent - many game engines use mipmaps or similar techniques that keep pre-built variants of a texture for different distances for optimal filtering, so even slight changes in the scene can massively change the detail of the rendered textures.

If there was as much difference as you and certain others are making out, the tech sites would generally have been on top of it years ago.
Whilst I agree with you about the texture filtering etc., I disagree about the tech sites angle.
 
The tech sites are corrupt.

There is a vast array of tech sites with all kinds of allegiances/agendas - if there was such a big difference in general, some of them would be shouting it from the rooftops to make a name for themselves.

I have extensive experience of GPUs in general and have often run identical monitor setups and/or multiple systems through a KVM side by side with different GPUs. Though I have seen some difference in saturation (and hence vibrancy) between nVidia and AMD, it is not consistently the case and it is far more minor than most people are making out. Far more often these reports come down to user error, especially when people have switched vendors and still have residual settings from the previous GPU in play, or are inexperienced with what the settings actually do.
 
http://www.overclock.net/t/1589554/amd-vs-nvidia-image-quality/10#post_24856355

From one of the Ashes Developers

"No. There should be minimal difference between the video cards. The D3D and OpenGl specification specify precision. For D3D, it's within a few ULP of 32 bit. Thus, while the images won't be identical they should be below a threshold which is visible. Additionally, sRGB curve is well defined so the output to the display should also have minimal variance. In short, color standards and API precision standards are not only well established, but actively verified by companies like Microsoft."

We've actually used both NV and AMD cards simultaneously alternating frames and the delta between the images is so small as to not be noticeable."

Perhaps it's simply a case of some people being more susceptible to the small delta, and/or not changing the limited-range switch. Sure, most people cannot perceive a difference; however, a difference is still there.
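To put the developer's "within a few ULP of 32 bit" claim into numbers, here is a rough illustrative sketch (the values are picked arbitrarily; nothing here comes from an actual game or driver). A difference of a few float32 ULPs disappears entirely once the frame is quantised to the 8-bit value the monitor receives:

```python
import numpy as np

x = np.float32(0.4)              # a shader output in [0, 1], stored as float32
ulp = np.spacing(x)              # size of one ULP at this value (~3e-8 for float32)
y = x + np.float32(4) * ulp      # a "few ULP" of divergence between two vendors

# Quantise both to the 8-bit value the display actually receives.
print(int(round(float(x) * 255)), int(round(float(y) * 255)))  # 102 102 - identical on screen
```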
 
What are we talking about, 0.3 dE vs 0.8 dE? Nobody would be able to spot that difference except on reference monitors, and even then it would be very hard. The way some go on, you would think it was 0.8 dE vs 6 dE; anything above 3 dE shows visible errors, and 2 dE already starts to become hard to notice, since professional calibrators class anything below 2 dE as reference.
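For anyone wondering what those dE (delta E) figures actually measure: it is simply the distance between two colours in L*a*b* space. A minimal sketch using the basic CIE76 formula, with made-up patch readings purely for illustration:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical measurements of the same grey patch rendered on two different GPUs.
patch_gpu_a = (50.0, 0.1, -0.2)   # (L*, a*, b*)
patch_gpu_b = (50.3, 0.4, 0.1)

print(round(delta_e_76(patch_gpu_a, patch_gpu_b), 2))  # ~0.52 - well under the ~2 dE visibility threshold
```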

I was on ATI/AMD from 2003 to 2017 and I haven't noticed any of the dullness some are making NVIDIA out to have, nor does anything 'pop' the way others have claimed - I also have an AMD machine on the Mac side, sitting right next to it. Nothing is screaming out the way people used to talk about Matrox cards.
 
The purpose of the exercise is that you stand in the same position with your Nvidia graphics card and post here what you see.
The two screenshots are from the same machine, just different places in the tunnel on Dust 2.
It really needs to be the same machine and the exact same place, with the same settings; it is no good using different machines, as the colour settings might well differ. I still have my 290X that I might use to do this at some point, but I'm not sure I can be bothered, as it has been covered by people who do this for a living.
 
In my opinion and experience it's just the colours that are a bit different, not the actual graphics. I prefer the default AMD colour profile, but I am perfectly happy with Nvidia's too, as I never bother to change it in the Nvidia control panel.

What I do, however, is calibrate my monitor and keep two profiles in the monitor's own settings, switched with the buttons on it: one for the desktop, where overdrive is off and brightness is lower, and another for gaming, where overdrive is set to medium and brightness is higher, which makes the colours pop a lot more and games just look better :)
 
I know one big problem for some users with Nvidia cards is the colour settings not actually loading with Windows: you need to open the Nvidia settings and change them before they take effect, and any changes you make reset every time you reboot.

When I switched from my 280X to a GTX 1080, the first thing I noticed was that everything was washed out. I could go into the Nvidia settings and change things so the picture looked good, but every time I rebooted it would reset to the washed-out default.

Nvidia needs to fix this issue; it's been going on for years. I tried the suggested fixes and couldn't find one that worked, so I had to change all the settings on my TV to compensate.
 
The tech sites are corrupt, and they don't care that much. Like someone already said, it is about knowing or not knowing what you are missing. If you have stayed with GeForces all your life, of course you would have no idea how things are in the other camp.

But we know what is right.


That comment from Andres Palmo is just pure nonsense without any grounding in reality.

As has been said a dozen times already, the compression is absolutely lossless, and AMD uses the exact same kind of compression algorithms.

"loss of detail on the modelling"., that is impossible otherwise Nvidia's drivers would fail to be certified by Microsoft. What is actually happening is some LoD system of the game engine and people comparing images that do not have the exact same position and camera angles.



And getting the exact same camera position is extremely hard in most games. In things like cut scenes and demos the camera position is interpolated, so the exact position on any given frame depends on factors such as FPS.
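As a trivial illustration of that (my own toy example, not taken from any real engine): sample the same one-second scripted camera move at two slightly different frame rates and almost no frame lands on exactly the same position, so per-frame screenshots from two runs will rarely match:

```python
def camera_x(t, speed=5.0):
    """Toy scripted camera: position is interpolated purely from elapsed time."""
    return speed * t

# The same 1-second fly-by captured at 60 fps and at 59 fps:
positions_60 = {round(camera_x(i / 60), 4) for i in range(60)}
positions_59 = {round(camera_x(i / 59), 4) for i in range(59)}

print(len(positions_60 & positions_59))  # 1 - only the very first frame (t = 0) coincides
```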
 
Perhaps it's simply a case of some people being more susceptible to the small delta

Some people have a gene (or something like that) that allows them to see a bigger range of colours (stuff like tetrachromacy). Personally, I seem to see the world in more vivid colour than a lot of people, who struggle to see the difference between grey and blue-grey, etc. where it is obvious to me. The difference here is really minimal - like less than +2% digital vibrancy. Anyone claiming anything else is either super sensitive to it and perceives a bigger contrast, or isn't comparing like for like for some reason.

EDIT: I can't even say for sure it is a general thing - but I've seen it reproduced more than once over the years when I have had identical monitors and different GPUs to play with.
 
I understand how it works, and actually it's not a gene - our eyes all see differently, and it's well documented: http://www.bbc.co.uk/guides/zgh34j6 or https://qz.com/1254316/why-do-humans-see-different-colors-it-depends-what-language-you-speak/

Edit: I think the main issue is actually the limited-to-full range switch - most people forget, or don't realise they need to switch it over, and that has been blown out of proportion.
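For anyone unsure what that switch actually does, here is a small sketch of the maths behind it (my own illustration of the standard 16-235 'limited' video levels versus 0-255 'full' PC levels, not anything pulled from the drivers):

```python
def limited_to_full(level):
    """Expand an 8-bit limited-range (16-235) video level to full range (0-255).
    If the GPU sends limited range to a monitor expecting full range, black is
    shown at level 16 and white at 235 - the classic 'washed out' look."""
    return max(0, min(255, round((level - 16) * 255 / 219)))

print(limited_to_full(16), limited_to_full(128), limited_to_full(235))  # 0 130 255
```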
 
What are we talking about, 0.3 dE vs 0.8 dE? Nobody would be able to spot that difference except on reference monitors, and even then it would be very hard. The way some go on, you would think it was 0.8 dE vs 6 dE; anything above 3 dE shows visible errors, and 2 dE already starts to become hard to notice, since professional calibrators class anything below 2 dE as reference.

I was on ATI/AMD from 2003 to 2017 and I haven't noticed any of the dullness some are making NVIDIA out to have, nor does anything 'pop' the way others have claimed - I also have an AMD machine on the Mac side, sitting right next to it. Nothing is screaming out the way people used to talk about Matrox cards.

And that's the point: you might not notice a difference, but others may, regardless of how big or small it is. Anyway, I was simply trying to offer an explanation as to why people might say there is a difference. :)

Edit: see above - I forgot to multi-quote, by which time it was too late.
 

That comment from Andres Palmo is just pure nonsense without any grounding in reality.

As has been said a dozen times already, the compression is absolutely lossless, and AMD uses the exact same kind of compression algorithms.

Look back at the video. There is something going on in ROTR where the leaves on the tree and the waterfall look sharper and more detailed on the left-hand side of the image.
Also, he is right about the statue - on the right-hand side I see a fog/increase in brightness, and details on the statue become invisible.
 
And that's the point: you might not notice a difference, but others may, regardless of how big or small it is. Anyway, I was simply trying to offer an explanation as to why people might say there is a difference. :)
True. Some people assume that just because they cannot see/notice something, no one else can either.
 
Look back at the video. There is something going on in ROTR where the leaves on the tree and the waterfall look sharper and more detailed on the left-hand side of the image.
Also, he is right about the statue - on the right-hand side I see a fog/increase in brightness, and details on the statue become invisible.

That might be how the engine handles the hardware; it doesn't necessarily indicate the cards process the image so differently that one is better than the other.
 