AMD and Nvidia image quality

I can't find it off the top of my head, but someone did an in-depth analysis of full range RGB for both, similar to the stuff in the middle of this article https://pcmonitors.info/articles/correcting-hdmi-colour-on-nvidia-and-amd-gpus/, and the only differences were two very slight ones, the main one being that in certain deep reds Nvidia preserved more detail but was undersaturated, while AMD was slightly oversaturated and lost detail.
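For anyone unfamiliar with the issue that article covers: a limited-range signal only uses code values 16-235, so if the display expects full range (0-255), blacks come out grey and whites look dull unless the signal is expanded. A minimal Python sketch of that expansion, purely illustrative rather than either vendor's actual pipeline:

```python
import numpy as np

def limited_to_full(frame_8bit: np.ndarray) -> np.ndarray:
    """Expand a limited-range (16-235) 8-bit frame to full range (0-255).

    Illustrative only: GPUs and displays do this (plus dithering) in
    hardware; this just shows the scaling involved.
    """
    f = frame_8bit.astype(np.float32)
    full = (f - 16.0) * (255.0 / (235.0 - 16.0))  # stretch 16..235 onto 0..255
    return np.clip(np.round(full), 0, 255).astype(np.uint8)

# Limited-range "black" (16) and "white" (235) map to 0 and 255.
print(limited_to_full(np.array([16, 128, 235], dtype=np.uint8)))  # [  0 130 255]
```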

But when I did comparisons by eye back in the day, AMD always looked like Nvidia with the digital vibrance turned up a couple of percent, for whatever reason.
 
That would require going into the Nvidia Control Panel and changing the stock settings, wouldn't it?
 
Back in the day I moved from an Nvidia FX5600 to an ATI 9600 Pro, and it was obvious to me that games like Unreal Tournament looked more vibrant on the ATI card. Not sure about now, but in the past ATI/AMD cards had a more saturated colour palette by default via the drivers compared to Nvidia.

I wouldn't say one is inherently better than the other; it's personal choice at the end of the day.
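For context, "digital vibrance" type settings are essentially a saturation boost applied in the driver, so a "couple of percent" difference like the one described above would look something like this (a rough HSV-space approximation, not how either vendor's driver actually implements it):

```python
import colorsys

def boost_saturation(rgb, amount=1.05):
    """Scale the saturation of an RGB triple (0-1 floats) by `amount`.

    A simple HSV-space approximation of a small vibrance bump; the 5%
    figure is just an example, not a measured driver default.
    """
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, min(s * amount, 1.0), v)

# A mid red gets slightly punchier with a couple of percent more saturation.
print(boost_saturation((0.80, 0.40, 0.40)))  # ~(0.80, 0.38, 0.38)
```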
 
This is not true with my 4090; it defaults to 8-bit full RGB on my LG CX OLED, whereas my 7900 XTX sets it to 10-bit YCbCr 4:4:4. :cry:

So I have to change the 4090 to 4:4:4 to get better colours.

So that's interesting... if I switch to 4:4:4 it sets my range to limited and then greys it out; I can't change it.

 
Could just be Nvidia being meh; I thought the same.

But I can't bypass it, I get the same issue, and I don't think AMD does this.
Or it's Nvidia's compression tech being the problem here, so there are limitations.

It's DP 1.4.
According to this I'm using 21.9 Gbps.
DP 1.4 has a bandwidth of 32.4 Gbps.
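For reference on where those numbers come from: the data rate a mode needs is roughly pixel clock (including blanking) times bits per pixel, and DP 1.4 carries 32.4 Gbps raw across four HBR3 lanes, of which about 25.92 Gbps is usable payload after 8b/10b encoding. A rough sketch; the 3440x1440 @ 144 Hz timings here are just an example, not necessarily the monitor in question:

```python
def video_data_rate_gbps(h_total, v_total, refresh_hz, bits_per_channel=8, channels=3):
    """Approximate uncompressed video data rate in Gbps.

    h_total/v_total include blanking; the exact figures depend on the
    monitor's timings, so these example numbers are assumptions.
    """
    pixel_clock = h_total * v_total * refresh_hz  # pixels per second
    return pixel_clock * bits_per_channel * channels / 1e9

# Example: 3440x1440 @ 144 Hz with reduced blanking (~3520 x 1481 total).
print(f"~{video_data_rate_gbps(3520, 1481, 144):.1f} Gbps needed")  # ~18.0 Gbps at 8 bpc
print("DP 1.4 payload: ~25.92 Gbps (32.4 Gbps raw minus 8b/10b overhead)")
```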

 

Why would you use YCbCr anyway? I found the colours to be off in this mode regardless of whether I was using an Nvidia or AMD GPU, at least on all 3 of my OLED TVs.
 

Is YCbCr better than RGB?

YCbCr is better than RGB for storing images and image-type video files, as it takes less storage while the image quality remains almost the same.

RGB is commonly used for gaming and regular computing tasks, whereas YCbCr is mainly used to compress image files to save space, which is done by separating the Y, Cb, and Cr components. Each is better than the other depending on what you want to do.

As mentioned above, in the Full RGB and limited RGB section, full RGB is good for things like gaming and regular tasks whereas limited RGB is good for things like content creation and watching movies on Blu-ray.

YCbCr, also known as YUV, where Y stands for luma (luminance), U for the blue-difference chroma, and V for the red-difference chroma, is another colour system used mainly because it can separate luminance from chrominance more effectively than the RGB colour space.

Is RGB better than YCbCr?

RGB is better for common computing tasks, and YCbCr is better for watching movies.

From 10scopes.com

I always thought YCbCr was better for images and had something to do with separating luminance from chrominance, but I didn't really look into it.
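To make the "separating luminance from chrominance" point concrete, here's a minimal sketch of the full-range RGB to Y'CbCr conversion using the BT.709 luma weights; the storage saving comes from subsampling the Cb/Cr planes afterwards (4:2:2, 4:2:0), not from the conversion itself:

```python
def rgb_to_ycbcr_bt709(r, g, b):
    """Convert full-range RGB (0-1 floats) to Y'CbCr with BT.709 luma weights.

    Y carries the brightness detail the eye is most sensitive to; Cb and Cr
    carry colour differences, which video formats can subsample with little
    visible loss - that is where the space saving comes from.
    """
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556 + 0.5  # blue-difference chroma, centred on 0.5
    cr = (r - y) / 1.5748 + 0.5  # red-difference chroma, centred on 0.5
    return y, cb, cr

# Pure grey carries no chroma: both Cb and Cr sit at the 0.5 midpoint.
print(rgb_to_ycbcr_bt709(0.5, 0.5, 0.5))  # ~(0.5, 0.5, 0.5)
```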
 