AMD Vs. Nvidia Image Quality - Old man yells at cloud

The difference in image quality between Radeon and GeForce.
In a game from 15 years ago?

Can you or anyone prove this is true in current games? I'm not talking about the colours (I agree Radeon have better colours by default); I mean actual image quality. Can anyone clearly show AMD have better image quality across, say, 10 games released in the past couple of years?

No good picking one or two games and one or two shots. It needs to be shown to hold consistently. No one has done that so far.
 
That's the catch: if it were so easy to prove with amateur equipment, it would have been proved already.
Colour is relative anyway; there's little point arguing over how the colours look, or even their accuracy.
For example, how would you compare image quality between a GeForce on a 6-bit TN monitor and a Radeon on a 10-bit VA or IPS monitor? :confused:

I don't understand why AMD is completely quiet about this instead of using its marketing to show the differences.

Also, I agree there should be a unified standard across all vendors and graphics cards, one that guarantees users see exactly what the game's creators intended and leaves no freedom for the game to be rendered differently.
 
Someone posted comparisons from GTA V a while back and the difference in texture sharpness was still there.

I also personally noticed how washed out the GeForce image looked when I switched from Nvidia. Both times I've switched, in fact.
 
It's more pronounced when you've stayed with Radeon, got used to its image quality, and then for some reason switch to a GeForce; you immediately see the loss of detail.
 
As I've pointed out and linked to before, people have actually sat down with high-end monitors, professional-grade photography equipment, and proper image-difference analysis software. In the vast majority of cases there were no image quality differences at all, only some very slight colour differences, which were probably explained by the link James Miller posted earlier in the thread. If there were the kind of general differences you're making out, people like Hardware Unboxed and Gamers Nexus would have been all over it.

You keep pulling headlines from searches that back up what you want to be the case, without bothering to read the details, especially the comments that go into greater depth about what is being seen and why, and that often completely and conclusively counter the bit you've cherry-picked from the article.

Almost every time bigger differences are posted, it comes down to user error and/or non-equivalent settings. There have been exceptions: both nVidia and AMD have in the past had bugs, or intentional performance optimisations in their filtering algorithms and settings, which are effectively cheats, and some games have had better compatibility with one brand or the other. But by and large there isn't a substantial difference between them.
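For anyone who wants to do this properly rather than eyeballing screenshots, the kind of per-pixel difference analysis described above is easy to sketch. This is a minimal illustrative example, not what any particular reviewer used; the function name and thresholds are my own, and real comparisons would load lossless captures taken at identical in-game settings.

```python
# Sketch of an objective per-pixel frame comparison.
# Assumption: frames are uint8 RGB arrays of identical resolution,
# e.g. lossless screenshots captured at equivalent settings.
import numpy as np

def frame_difference(a: np.ndarray, b: np.ndarray) -> dict:
    """Return simple objective difference stats for two RGB frames."""
    if a.shape != b.shape:
        raise ValueError("frames must share resolution and channel count")
    diff = np.abs(a.astype(np.int16) - b.astype(np.int16))
    return {
        "mean_abs_diff": float(diff.mean()),            # average per-channel error
        "max_abs_diff": int(diff.max()),                # worst single value
        "pct_changed": float((diff > 0).mean() * 100),  # share of values that differ
    }

# Demo with synthetic 4x4 frames: identical except one brightened pixel.
base = np.zeros((4, 4, 3), dtype=np.uint8)
tweak = base.copy()
tweak[0, 0] = 30  # one pixel lifted by 30 on all three channels
stats = frame_difference(base, tweak)
print(stats["max_abs_diff"])   # 30
print(stats["pct_changed"])    # 6.25 (3 of 48 values changed)
```

If two vendors really rendered different detail, a script like this run over matched screenshots would show it immediately; near-zero stats mean the frames are effectively identical.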
 
At the end of the day, until we see objective comparisons showing this, it's all subjective and not good enough.

If it was such a big deal then believe me AMD's marketing department would be making use of it.

Again, it's not good enough to show us old examples. Show it consistently in the latest games. No one can do that, though, can they?

Why are you still talking absolute nonsense? You have consistently shown in the GPU and CPU sections of this forum that you know nothing, and that the rubbish you spout as fact is entirely based on whichever vendor you happen to be in love with at the time.

Stop posting screenshots of games from 15 years ago to make an argument about the state of things today. @TNA has given you something to do: get screenshots of a range of current games and prove once and for all that there is a difference. If what you say is actually true, it should be very easy to prove. But most common-sense people know it isn't true, because if it were it would be huge news; tech websites would be all over it, especially the pro-AMD ones.
Listen, and understand. That 4K8K is out there. It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop, ever, until you are brain dead.
 
Your TV has its own processor that adjusts the picture: it changes colours, compensates for missing pixels, and so on.

I agree with everything you've said thus far. I recently upgraded from an Nvidia TNT2 paired with an early Dell 1024x768 LCD monitor to a spanking new RX 5700 and an Asus 3440x1440 ultrawide, and the difference is night and day. I couldn't believe my eyes; it was like some sort of technical magic happening before my very eyes. Image quality was better, games ran faster, just incredible.

Don't let people troll you, man!
 
:D:D:D
 
The in-game colour differences between Radeon (on the right) and GeForce (on the left) vary and can be large, even dramatic.

In the following screenshot, you see three main differences:
- loss of detail in the water pool on the GeForce;
- different, washed-out colours on the GeForce;
- lower texture quality on the GeForce, which points to heavier texture compression.

But it would be better if Nvidia opened up the relevant graphics driver code to the public, so everyone could see what it does underneath. Visual, pixel-by-pixel comparison is subjective, because reviewers often call black white or vice versa.
Alternatively, the game developers could examine it in the code rather than visually.

IQ-Difference-1.png
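On the "pixel-by-pixel comparison is subjective" point: there are standard objective metrics that remove the reviewer's opinion entirely. A common one is PSNR (peak signal-to-noise ratio). This is an illustrative sketch with synthetic frames standing in for real captures; as a rough rule of thumb, values above about 40 dB are usually imperceptible, and identical frames give infinity.

```python
# Sketch: PSNR as an objective alternative to eyeballing screenshots.
# Assumption: 8-bit frames (peak value 255) of identical resolution.
import numpy as np

def psnr(a: np.ndarray, b: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two frames."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # frames are bit-identical
    return 10.0 * np.log10(peak ** 2 / mse)

# Synthetic demo: a flat grey frame vs the same frame with a slight
# red-channel shift, mimicking a small colour-space mismatch.
ref = np.full((8, 8, 3), 128, dtype=np.uint8)
shifted = ref.copy()
shifted[..., 0] += 2  # red channel lifted by 2
print(psnr(ref, ref))                 # inf
print(round(psnr(ref, shifted), 2))   # ~46.88 dB: tiny, near-invisible difference
```

Run over matched screenshots from the two cards, a metric like this settles the argument with a number instead of a judgment call; loss of texture detail would drag the PSNR down in a way nobody can dispute.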


 
I take it you didn't read the comments in the video?

On the AMD side the reflections are turned off. The reason the pool looks different is that on the AMD side it just has a generic water texture, while on the nVidia side it is reflecting the scene, though at that distance you don't see much reflection detail. Play the bit from 0:56 in slow motion and watch the reflection map disappear as the AMD side moves across the cars on the left.

YECjkAS.png


Notice the tower block behind, for instance: on the nVidia side the windows on the right-hand side of the building are reflecting the sky; on the AMD side they aren't.

Yet again the problem is people comparing with non-equivalent settings.
 