AMD Vs. Nvidia Image Quality - Old man yells at cloud

Having run both AMD and Nvidia quite a lot, I genuinely can't see a difference. Maybe AMD have better colour settings from the off that suit some people? No idea personally, but they both look good to me.

I'll say the same as I said in the other thread:
https://forums.overclockers.co.uk/posts/30284931/
https://forums.overclockers.co.uk/posts/30287948/
I run Nvidia on my 4K Sony TV via HDMI 2.0.
I use a GTX 1060, which replaced an RX 470, and this bull crap about Nvidia fog or less vibrant colours is all guff (unless you are unaware of the full/limited RGB or full/limited YCbCr settings).
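
For anyone unsure what a full/limited mismatch actually does, here's a minimal sketch using the standard 8-bit video ranges (limited maps black to 16 and white to 235; full uses 0-255). The Python is purely illustrative: if the GPU sends limited but the display expects full, blacks come out grey and the whole image looks washed out.

```python
# Minimal sketch of the full-vs-limited RGB mismatch.
# Limited ("video") range: black = 16, white = 235. Full ("PC") range: 0-255.

def full_to_limited(v: int) -> int:
    """Compress a full-range 8-bit value (0-255) into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v: int) -> int:
    """Expand a limited-range 8-bit value (16-235) back to full range."""
    return round((v - 16) * 255 / (235 - 16))

# Matched pipeline: GPU encodes limited, display decodes limited -> near-lossless.
print(limited_to_full(full_to_limited(0)), limited_to_full(full_to_limited(255)))  # 0 255

# Mismatch: GPU sends limited but the display treats it as full range, so
# black (0) is shown as 16 (dark grey) and white (255) as 235 (dull white).
print(full_to_limited(0), full_to_limited(255))  # 16 235 -> the washed-out look
```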

Now, on a separate aspect of the debate: if there is a rendering IQ difference between Nvidia and AMD, I am all for seeing real-life examples.
 
Full RGB has been the default for as long as I can remember. I don't ever recall having to change it myself. Am I alone in this?
 
That was something different.

Back in those days the ATi/Nvidia GPU drivers had texture quality settings; IIRC they were "High performance", "Performance", "Quality" and "High quality". To try to gain an advantage over ATi, Nvidia decided to stealthily alter their slider: they added a new, even lower option below "High performance", removed "High quality" and reordered the names. So effectively the "High quality" texture setting in Nvidia's panel was the same as the "Quality" setting in ATi's, which gave the illusion of Nvidia having more speed/performance, when in reality they were sacrificing quality for it.
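
A rough way to picture the renaming trick described above (the numeric quality levels here are my own invention for illustration; only the one-step offset matters):

```python
# Illustrative sketch of the alleged slider trick. The internal quality
# levels (0-4) are assumptions; the point is the one-step label offset.

ATI_SLIDER = {
    "High performance": 1,
    "Performance":      2,
    "Quality":          3,
    "High quality":     4,
}

# Nvidia inserted an extra low step and shifted the names down, so each
# label on their panel pointed one quality level lower than ATi's same label.
NVIDIA_SLIDER = {
    "High performance": 0,  # the new, even lower step
    "Performance":      1,
    "Quality":          2,
    "High quality":     3,  # same label, but only ATi's "Quality" level
}

for label in ATI_SLIDER:
    print(f'"{label}": ATi level {ATI_SLIDER[label]}, Nvidia level {NVIDIA_SLIDER[label]}')
```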

It all got found out, and that's why the Omega drivers (and others) hit the scene: custom third-party drivers offering to fix Nvidia's shadiness and restore full image quality for GeForce/TNT/etc. users. Eventually Nvidia stopped doing it and reverted to the proper settings.

*EDIT*

LMAO, just checked and the option is still in the Nvidia panel, and with the latest drivers it defaults to "Quality" on a GTX 1060 and an RTX 2080 instead of "High quality". Oh Nvidia, you never change xD

:rolleyes:
 
...please tell me you are joking (no seriously please tell me you are :eek:)

Don't tell me that for all those hours I spent on Guild Wars 2 on my 560 Ti, the image could have looked just as good as when I was using my HD 5850, if only I had known about this... :eek:
 
Tested it on my GTX 1060 laptop and my RTX 2080 desktop: the default setting in the NCP is "Quality" instead of "High quality". Quite shocked myself, TBH; I was only looking to double-check the names of the settings.

[attached screenshot: nvid.png]
 
I thought just selecting "Let the 3D application decide" would mean everything was dictated by the game's graphics settings, with no difference regardless of vendor, but obviously not!
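
A toy model of the precedence I had assumed (purely illustrative; the real driver logic is not public, and the resolution rule here is an assumption):

```python
# Toy model of driver-vs-application setting precedence. The mode names
# match the NVIDIA Control Panel labels; the logic itself is an assumption.

def effective_setting(driver_mode: str, driver_value: str, app_value: str) -> str:
    if driver_mode == "Let the 3D application decide":
        return app_value   # the game's own graphics setting wins
    return driver_value    # otherwise the driver override takes precedence

print(effective_setting("Let the 3D application decide", "Quality", "High quality"))
# -> High quality (what I expected to always happen)
print(effective_setting("Use the advanced 3D image settings", "Quality", "High quality"))
# -> Quality (the driver-side default discussed above)
```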

Gonna give this a go when I have time, to see if it results in any performance hit or image quality gain on my laptop's 960M.

One of the things holding me off from getting a newer and better gaming laptop with a faster Nvidia GPU is that I'm simply not 100% happy with the image produced by Nvidia GPUs compared to ATI/AMD's. AMD is still quite a long way from competing with Nvidia on the gaming laptop side, so I honestly hope this is all there is to the visual difference.
 
OK, so once again these forums and their inhabitants don't fail to deliver.

@shankly1985 posts up a video talking about image quality, which concludes that there isn't really any difference; rather, the difference is, well, different, game by game.

Of course, the forum being the way it is, things soon descend into bedlam, with one team supposedly much worse than the other. Now, given that Shankly posted it up and even put "does AMD give a better picture?!" in the title, I am very pleasantly surprised to find that "there isn't really any difference" actually is the conclusion.
So thank you, Shankly, for posting an interesting vid that has sparked a not-all-that-surprising conversation. Cheers.


Anyway, all that aside, on the full vs limited colour range issue, I have found that it all depends what device or screen you are plugging into. Most screens default to "With the video player settings", whereas plugging into my Denon AV amp defaults to "With NVIDIA settings" and "Full (0-255)".

As for the quality slider, for me it defaults to "Let the 3D application decide".
As for previous misdemeanours, both red and green teams have had plenty, so there is no need to revisit them here.

 
Me neither; full RGB has been the default for me too. I think you only have to change it if you are using an HDMI cable.

Regarding the image quality or colour, I have also noticed it and prefer the colours on AMD, but it is one of those things you notice for a short while when switching over. Then you forget.

The Nvidia texture setting being on "Quality" instead of "High quality" is old news; it has been like that for years. My understanding is that the difference in image quality is small, so small that you cannot tell when moving about and playing. Kind of like going from high to ultra settings in many games these days.

Maybe someone is up for doing a new test here and comparing?
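
If anyone does fancy a re-test, here's a minimal sketch for diffing two same-scene captures (assumes Pillow and NumPy are installed and the screenshots are the same resolution; "amd.png" and "nvidia.png" are placeholder filenames):

```python
# Minimal sketch for comparing two same-resolution screenshots pixel by pixel.
from PIL import Image
import numpy as np

a = np.asarray(Image.open("amd.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("nvidia.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)
print("mean per-channel difference:", diff.mean())
print("max difference:", diff.max())
print("pixels that differ at all: %.2f%%" % (100 * (diff.sum(axis=2) > 0).mean()))

# Save an amplified (8x) difference map so any mismatch is visible by eye.
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("diff_x8.png")
```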

@bru, but isn't the default set to "Quality" unless you download your drivers from elsewhere?
 
Read my original post; the title of the thread was pulled directly from the video title.
My view on this isn't any more than what the video says. I haven't used Nvidia in a long time, nor have I ever owned both at the same time.
What I did say, though, is definitely correct: a lot of people will notice that limited RGB vs full RGB makes AMD look better.

Nvidia only added a toggle for full RGB with driver version 347.09. All this time AMD defaulted to full PC-range RGB.
 
The default setting is "Let the 3D application decide".

If you change any of the settings in the "Manage 3D settings" tab, it will change the selection from "Let the 3D application decide" to "Use the advanced 3D image settings".

[attached screenshot: NVidia-control-panel.jpg]
 
Sorry @shankly1985, I was trying to praise you for posting the video, but even that has come across as a dig, so apologies.

Even this shows us that we expect certain views from certain members, seeing as we all have our allegiances.
 
Those settings in the menus won't really help. It's not something you can change like that. There are differences in how AMD and Nvidia cards render things.
 
Omega drivers for NVIDIA? I only recall they were for ATi back in the day. I remember I used to use them on the 9800 XT and X850 XT?

https://www.reddit.com/r/Amd/comments/8kwla0/remember_those_old_school_omega_drivers/

http://forfree-austin.weebly.com/blog/driver-omega-ati-radeon
 
Yes, there were Omega drivers for Nvidia cards too, although they first appeared for AMD cards, so I guess they existed to fix AMD's shadiness in equal measure :shrugs:
 