
AMD Vs. Nvidia Image Quality - Old man yells at cloud

Sorry Rofflay, we may have got our wires crossed here. I didn't make any video. The illustration I referred to, and my points in general, were from a couple of pages back, where I had shown photos of the desktop as rendered by the nVidia and AMD cards in my machine, and there were differences between them.

It has got me looking at more of the settings in the nVidia control panel though, to see if there are any differences I can make. In particular, I would say that there is a difference within nVidia itself between RGB (full) and YCbCr444, with 444 being a little clearer for the desktop environment. (imho)

Yah, YCbCr444 is your best bet. I have a GTX 1060 and spent time adjusting it and comparing it to my HD 7770, but in the end 444 is closest. Same with my GTX 1050s.
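
For anyone wondering what the 4:4:4 part actually means: the chroma channels keep full resolution, nothing gets subsampled the way it does with 4:2:2 or 4:2:0. Here's a rough Python sketch of a full-range BT.709 RGB to YCbCr conversion, purely for illustration (the function name is made up; the coefficients are the standard BT.709 ones):

```
# Rough illustration only: full-range BT.709 RGB -> YCbCr.
# In 4:4:4 mode the Cb/Cr planes keep the same resolution as Y,
# so no chroma detail is discarded (unlike 4:2:2 / 4:2:0 subsampling).
import numpy as np

def rgb_to_ycbcr444(rgb):
    """rgb: float array in [0, 1], shape (..., 3); returns Y, Cb, Cr."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b   # BT.709 luma weights
    cb = (b - y) / 1.8556                        # blue-difference chroma
    cr = (r - y) / 1.5748                        # red-difference chroma
    return np.stack([y, cb, cr], axis=-1)

print(rgb_to_ycbcr444(np.array([0.5, 0.5, 0.5])))  # neutral grey -> chroma ~0
```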
 
For an objective test you'd set them to the same equivalent, optimal settings. Anything else is silly and doesn't prove anything.

Sure, no issues with doing that either, but that would not mean the images will look the same, and one will still be better than the other.

Perhaps all performance reviews are null in this case then, since settings weren't punched in properly either.
 
The settings you need to use depend on the screen that you plug in, it's that simple. The driver can't adjust itself perfectly for the million different panel models out there, don't be silly.

Even the screens themselves need tweaking - TVs can cost $5000 but you still need to change settings - doesn't matter how much they charge you for the panel.

What I found with the expensive monitors or TVs is that they require fewer adjustments, whereas cheaper TVs/monitors can require huge adjustments for the colour balance/gamma and can even break the quality of the picture, since it can be off greatly.
 
Sure, no issues with doing that either, but that would not mean the images will look the same, and one will still be better than the other.

Perhaps all performance reviews are null in this case then, since settings weren't punched in properly either.


Changing gamma, saturation, vibrance or temperature doesn't affect performance.
 
It will, depending on what one considers to be better IQ.

The industry-leading IQ is produced by Radeons with the following Radeon Settings adjustments:

Texture Filtering Quality - High
Surface Format Optimisation - Off
Anti-aliasing Method - Adaptive multisampling
Pixel Format - YCbCr 4:4:4 Pixel Format
Colour Temperature - Automatic
 
There's more to image processing than these; colour depth, filtering and compression are all part of this.

There are programs to check anisotropic filtering and anti-aliasing quality; compression is a more difficult one, because GPUs use a mixture of techniques internally which are for the most part lossless, but it would require analysis of the final output (a rough sketch of that is below).

I guarantee though that in cases where people are seeing much of a difference using equivalent settings, it is almost always down to the user either not understanding the settings or not having things set up as they thought.
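
If anyone wants to try that final-output analysis themselves, a crude but honest way is to capture the same frame on both cards (same resolution, same colour settings) and diff the screenshots. A rough Python sketch using Pillow and NumPy; the file names are just placeholders:

```
# Crude final-output comparison: diff two captures of the same frame.
# File names are placeholders; captures must match in resolution and settings.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("capture_card_a.png").convert("RGB"), dtype=np.float64)
b = np.asarray(Image.open("capture_card_b.png").convert("RGB"), dtype=np.float64)

mse = np.mean((a - b) ** 2)
psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

print(f"max per-channel difference: {np.abs(a - b).max():.0f}")
print(f"mean squared error: {mse:.4f}")
print(f"PSNR: {psnr:.2f} dB")  # identical output -> infinite PSNR
```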
 
They won't actually improve image quality though.

They definitely do. The wrong gamma can screw with shadow detail or crush it, wash it out, or hurt the midtones. Especially if there are big spikes.

Saturation/vibrance can destroy a lot of things. Colour temperature is the worst. You could have tinted blacks, a screwed-up midrange, or even the high end of the greyscale looking magenta, green or blue, and overall you've got a pretty messed-up picture. Or even a sunburnt-looking picture throwing off all skin tones. There are so many scenarios.
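
Just to put rough numbers on the gamma point, here's a quick Python sketch comparing what a display gamma of 2.2 versus 2.6 does to a shadow tone and a midtone (the values are arbitrary examples, not measurements from any particular card or panel):

```
# Arbitrary example: the same 8-bit shadow tone and midtone pushed through
# a display gamma of 2.2 (roughly "correct") versus 2.6 (too dark).
shadow, midtone = 40 / 255, 128 / 255

for gamma in (2.2, 2.6):
    s = round((shadow ** gamma) * 255)
    m = round((midtone ** gamma) * 255)
    print(f"gamma {gamma}: shadow 40 -> {s}, midtone 128 -> {m}")

# The shadow tone loses about half of its remaining separation from black,
# while the midtone drops by a noticeably smaller fraction - that's the
# "crushed shadow detail" effect in miniature.
```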
 
Not actually. On top of that there is a difference in colour spectrum, because by default Nvidia ships all their cards with culled RGB to improve bandwidth and performance.
Full RGB has to be manually activated.

Man this argument is still going on.

OK, citation please. You say it's to improve performance, so presumably you've seen benchmarks. Link please?
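
For what it's worth, if "culled RGB" means the limited-range (16-235) RGB output that Nvidia cards have historically defaulted to over HDMI, the visible side of it is just a range remapping; whether it has any bandwidth or performance benefit is exactly the part being asked for a citation above, and nothing below shows that. A quick Python sketch of what expanding limited range to full range does:

```
# Sketch of limited-range (16-235) vs full-range (0-255) 8-bit RGB, assuming
# "culled RGB" refers to the limited-range signalling used for HDMI/TV output.
# This illustrates the visible remapping only, not any performance claim.
import numpy as np

def limited_to_full(x):
    """Expand 8-bit limited-range values (16-235) to full range (0-255)."""
    scaled = (x.astype(np.float64) - 16) * 255.0 / (235 - 16)
    return np.clip(np.round(scaled), 0, 255).astype(np.uint8)

levels = np.array([0, 16, 128, 235, 255], dtype=np.uint8)
print(limited_to_full(levels))  # [  0   0 130 255 255]

# If a full-range monitor is fed a limited-range signal without this expansion,
# black sits at 16 instead of 0 and white at 235 instead of 255, which is the
# washed-out look people usually notice.
```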
 
AMD gonna stop overvolting their gpus?

They should stop by implementing some smart AI utility which detects the particular chip's characteristics and automatically adjusts it to the lowest possible stable values.
This is a much less irritating problem than the image quality, because the first thing you actually notice in front of your screen is exactly the image quality.

It's funny this conversation died on February 25, then kicked off again on October 12.

Err... sounds like a special desire for censorship? Is the topic not convenient for someone?
 
They should stop by implementing some smart AI utility which detects the particular chip's characteristics and automatically adjusts it to the lowest possible stable values.

There are metrics for ASIC quality, but without exhaustive tests on every single chip individually (which just isn't worth it) it isn't really possible to implement that - it is much cheaper to just assess a range that chips will work within and set the minimum voltage for that.
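
Just to illustrate why per-chip tuning is expensive: finding each chip's minimum stable voltage means actually stress-testing every candidate step. Here's a rough Python sketch of that search; the per-chip threshold stands in for hours of real stability testing, and none of this corresponds to any real driver API:

```
# Rough sketch of per-chip minimum-voltage hunting. The "stress test" is
# simulated with a made-up per-chip threshold; on real silicon every step
# would be a long stability run, which is why vendors don't bother and just
# ship one conservative voltage that covers the whole bin.

def find_min_stable_voltage(chip_min_stable_mv: int,
                            stock_mv: int = 1150,
                            floor_mv: int = 850,
                            step_mv: int = 25) -> int:
    """Step down from stock voltage until the (simulated) stress test fails,
    then keep the last voltage that passed."""
    stable = stock_mv
    for mv in range(stock_mv, floor_mv - 1, -step_mv):
        passes_stress_test = mv >= chip_min_stable_mv  # stand-in for real testing
        if not passes_stress_test:
            break
        stable = mv
    return stable

# Two chips from the same bin can land in very different places:
print(find_min_stable_voltage(chip_min_stable_mv=905))  # -> 925
print(find_min_stable_voltage(chip_min_stable_mv=980))  # -> 1000
```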
 
They should stop by implementing some smart AI utility which detects the particular chip's characteristics and automatically adjusts it to the lowest possible stable values.
This is a much less irritating problem than the image quality, because the first thing you actually notice in front of your screen is exactly the image quality.



Err... sounds like a special desire for censorship? Is the topic not convenient for someone?

Those kinds of things never work properly though. Intel do it with their CPUs and it's the first thing most people turn off because it's all over the place.
 