
AMD and Nvidia image quality

It's like a yearly thing. Every once in a while this topic is recreated and it always ends the same, no one the wiser and everyone bickering :D

Ah, but Nvidia are working on next year's post already, and when we get there they'll retrospectively fill in the forum with missing threads in between the dates to make the forum look like we're talking about Nvidia more... it's not next-gen tech, it's post-gen tech. :D
 
I have no idea, and I don't know if it's still required now or if it was some sort of bug back in the day. You can always play with vibrancy in the driver if that's your thing.
Think of it like this: do photographers and videographers avoid Nvidia because of this, or do they actually use it thanks to CUDA? I don't think I've ever heard of them avoiding it, and they need good colours more than your average Joe does.
This is a bad example, because people doing colour-accurate work aren't using default settings; they'll get their monitor/system colour calibrated.
 
Is there still a noticeable difference between AMD and Nvidia cards when it comes to colour vibrancy, contrast, and sharpness? AMD cards were once considered better (at least by some) for a nicer-looking image; does this still hold true?
Yep, when I went from my RX 5700 to a 3070 Ti, there was a difference.

People here think the compression technology used by Nvidia is truly lossless...

That's one of the big reasons for the image degradation.
 
There definitely used to be a difference years ago, I remember going from a 2900xt to an 8800gtx and noticing that games looked duller on the gtx. These days, they're about the same.
 
Yes, there is a difference. Nvidia looks far better in many games; its amazing DLSS technology really outshines AMD's.
:D

I found your next purchase, a nice body pillow :cry::D

 
This is a bad example, because people doing colour-accurate work aren't using default settings; they'll get their monitor/system colour calibrated.
True, that's what I said in the previous post. But I doubt people will buy a colorimeter and a good enough display for it to be worth the trouble. Heck, I bought two identical displays and they were calibrated differently from the factory, and even now I don't think they match. I only worry about the central display, and I calibrate that by eye. Works well enough. :)
 
You need to change the colour range in the Nvidia Control Panel, I think, to achieve the same picture quality; something in the output range needs changing. You don't need to change it in the AMD settings.

It always makes me think that by leaving those settings at stock, Nvidia are somehow getting more fps for the fps charts.

EDIT - posted before reading the whole thread, but the point still stands. I haven't used Nvidia since the GTX 970. Now, after reading, I remember: it was the colour range being set to limited instead of full.
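For anyone curious what limited vs full range actually does to the picture, here's a rough Python sketch (my own illustration, not anything from the drivers): over HDMI the driver can default to the video-style limited range, where black sits at level 16 and white at 235 rather than 0 and 255, so on a display expecting full range the blacks look grey and the whites look dim.

```python
def limited_to_full(level: int) -> int:
    """Expand a limited-range (16-235) video level to full range (0-255)."""
    level = max(16, min(235, level))        # clamp to the limited range
    return round((level - 16) * 255 / 219)  # 219 usable steps (235 - 16)

# The endpoints map back to true black and true white:
print(limited_to_full(16))   # 0
print(limited_to_full(235))  # 255
print(limited_to_full(128))  # 130 (mid grey shifts slightly)
```

If the driver sends limited range and the display doesn't expand it like this, you only ever see levels 16-235, which is exactly the washed-out look people describe.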
 
I've not looked in a while, but a few years ago, when looking at reviews and their pixel-peeping screenshots, I noticed that AMD cards had a clarity to some textures (often on tree leaves) which the Nvidia screenshots lacked. It was very subtle, but I felt it was often there. I reckoned Nvidia were using some form of compression in the pipeline, but I have zero proof.
 