AMD Vs. Nvidia Image Quality - Old man yells at cloud

This sounds more like a black level issue on Nvidia cards. If the black level is wrong the picture will look washed out.

AMD cards probably run the opposite black level to Nvidia, so I recommend messing around with the limited and full settings if you're using RGB (also bear in mind you may have to change the setting twice and reboot for the change to stick).

Also change the black level settings on the TV or monitor if the driver settings fail to work.
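To illustrate why a black level / RGB range mismatch washes the picture out, here's a minimal sketch of the arithmetic (assuming the standard 16-235 limited video range; the function names are just for illustration, not anything from the drivers):

```python
# Illustrative sketch of the black level / RGB range mismatch discussed above.
# Limited ("video") range uses 16-235; full ("PC") range uses 0-255.

def full_to_limited(v: float) -> float:
    """Compress a full-range 0-255 value into the limited 16-235 range."""
    return 16 + (v / 255) * 219

def limited_to_full(v: float) -> float:
    """Expand a limited-range 16-235 value back to full 0-255."""
    return (v - 16) / 219 * 255

# Case 1: the GPU outputs limited range but the display interprets it as full.
# Black (0) is sent as 16 and displayed as a dark grey, so the image looks washed out.
print(full_to_limited(0))              # 16.0 -> shown as grey instead of black

# Case 2: the GPU outputs full range but the display expects limited.
# Anything that maps below 0 gets crushed, so shadow detail disappears.
print(round(limited_to_full(10), 1))   # -7.0 -> crushed to black by the display
```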

I personally use 4:2:2 10-bit - everything looks vibrant and crisp on my OLED TV with an Nvidia card. If someone told me otherwise then they've got poor eyes.

Just to add, I don't use digital vibrance, and the TV has been calibrated too. I would say Forza Horizon 4 in HDR has probably the widest and most natural-looking colours of any game. Good game to use as a test for screen settings.
 
Image quality depends on the desired framerate in games, e.g. Vega 64 vs 1080 Ti: the Vega 64 will have worse image quality since it has to run at medium settings to achieve a similar framerate to the 1080 Ti running ultra settings.
 
That is just default color profiles though. Nvidia use more realistic color tones by default; AMD jack up saturation and vibrancy, which is how you sell cheap TVs to consumers.


I believe Nvidia keep the color profiles of the GeForce cards the same as for professional Quadro cards, while AMD's FirePro cards aim for more realistic colors and less vibrancy and look closer to Nvidia.

AMD settings are all at their defaults, the normal settings.
The colour difference comes from the full PC-range RGB that AMD have on by default.
Nvidia use limited RGB, giving a washed-out colour.
So how can Nvidia have a more "realistic" tone when by default it's limited? Lol
 
It’s not just down to colour settings

This is what I've tended to think as well; there seems to be a better sense of depth to AMD's image, heavier contrasts/deeper blacks and so on (FO4 looked quite different on my 7970 compared to the 980 Ti). That said, for a little while now I've started to think there's a bit less of a difference between the two than there used to be (e.g. compared to the ATi days). That was the time when, as someone else said on another thread a while back, upgrading brought a very noticeable difference in IQ.

Both companies deliver great IQ, IMO, but I've always preferred team red in this regard.
 
Image quality depends on the desired framerate in games, e.g. Vega 64 vs 1080 Ti: the Vega 64 will have worse image quality since it has to run at medium settings to achieve a similar framerate to the 1080 Ti running ultra settings.

Ay?
Ultra at 30fps vs ultra at 60fps will be the same lol

The flow of motion is the only thing that changes.
 
The first thing I noticed when I changed from a GTX 770 to a Vega 64 was not how much faster or smoother it was at gaming - it was how my desktop background picture suddenly looked a lot better.
So yeah, AMD deffo has the visual IQ over Nvidia.
 
Four months have passed and I still haven't posted the images :(

However, I can only speak for WoT, as the other games all look the same. WoT's new game client was designed with Nvidia "optimizations", as per Wargaming's statement.

With the GTX 1080 Ti the FPS was 150-160ish; with the Vega 64 it is around 110-120ish. However, with the 1080 Ti (and the GTX 1060 6GB on the laptop) on maxed-out settings there is less foliage on the trees and bushes, while smoke and fire (like those found on the Overlord map) and the water and mirror reflections (Paris & Lakeville maps) look dull and uninteresting. The same applies to grass.
With the Vega 64 the trees and bushes have more foliage, and especially with the latter (bushes) I can no longer see and aim through a single bush like I used to, let alone when double and triple bushes are lined up.
In Paris you can also see the reflections of the Eiffel Tower in the windows, while on winter maps more snow particles get lifted by the tank tracks.

In addition, without an FPS limiter in the Nvidia drivers, the game is horrible to play on a 120Hz 2560x1440 monitor (since Sept 2016 the XL2730Z has been one of those FreeSync monitors on which Nvidia broke the 144Hz refresh rate and sound), as it feels like a very fast slideshow at 160fps (especially obvious on city maps). With the Vega having FreeSync (144Hz) the game looks smooth even at 95fps (power save mode with a 176W cap).
 
I found AMD are better at default, as they use more saturation; Nvidia look washed out and dull, but once you put their digital vibrance (their equivalent of saturation) up a bit, they are identical.
 
AMD settings are all at their defaults, the normal settings.
The colour difference comes from the full PC-range RGB that AMD have on by default.
Nvidia use limited RGB, giving a washed-out colour.
So how can Nvidia have a more "realistic" tone when by default it's limited? Lol


That isn't the default behavior on Nvidia cards though.

AMD's default color profile doesn't have to be the same as Nvidia's default color profile; there is no international standard. The only real consensus is that content creation professionals (movie buffs, photographers, video editors, graphic designers etc.) all prefer a more neutral color tone. A large chunk of the public that has no real knowledge of image quality and color accuracy prefers massive levels of saturation and vibrancy, giving things that plastic, neon-glowing look. Spend 5 minutes on Instagram/500px to see truly awful photo editing.


AMD defaults to settings that suit people who mistakenly think high saturation is a sign of good image quality. From a business perspective this seems reasonable, as seen in threads like this where AMD users make stupid comments about the washed-out colors of Nvidia graphics cards.
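As a rough illustration of what a saturation/vibrance boost actually does to pixel values, here's a minimal sketch using a simple HSV scaling model (the drivers' actual "digital vibrance" implementations aren't public, so this is only an approximation, not the real algorithm):

```python
import colorsys

def boost_saturation(rgb, factor):
    """Scale the HSV saturation of an (r, g, b) tuple in 0-255 by `factor`."""
    r, g, b = (c / 255 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    s = min(1.0, s * factor)          # clamp so the colour stays in gamut
    r, g, b = colorsys.hsv_to_rgb(h, s, v)
    return tuple(round(c * 255) for c in (r, g, b))

# A muted colour pushed towards the "vivid" look:
print(boost_saturation((200, 150, 130), 1.0))   # unchanged at factor 1.0
print(boost_saturation((200, 150, 130), 1.5))   # noticeably more saturated/orange
```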
 
Lol!

Limited RGB is NOT more natural or the industry standard. For PC usage, you absolutely NEED full RGB in order for colours to be displayed correctly, and you also need to make sure that your black level is set correctly (if using an HDMI connection).

I would advise people who don't have hardware calibrators to look at the likes of Lagom to see just how messed up their displays look when they start faffing about with limited RGB and raising digital vibrancy/saturation etc. :o

http://www.lagom.nl/lcd-test/black.php
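If you'd rather generate a quick test pattern locally than rely on a website, here's a minimal sketch in the same spirit as the Lagom black-level test (it uses Pillow and made-up patch values; it is not the actual Lagom image):

```python
from PIL import Image, ImageDraw  # pip install pillow

# Draw a strip of near-black patches (levels 0-24). On a correctly set up
# full-range display you should be able to tell most of them apart from
# pure black; if the black level / RGB range is wrong they blend together
# or all look grey.
levels = list(range(0, 25, 2))          # 0, 2, 4, ... 24
patch_w, patch_h = 80, 200
img = Image.new("RGB", (patch_w * len(levels), patch_h), (0, 0, 0))
draw = ImageDraw.Draw(img)

for i, v in enumerate(levels):
    x0 = i * patch_w
    draw.rectangle([x0, 0, x0 + patch_w - 1, patch_h - 1], fill=(v, v, v))

img.save("black_level_test.png")        # view full-screen with no scaling
```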
 
I had a fair amount of trouble forcing my GT 430 to output Full RGB over HDMI but I'd be surprised if other connection types weren't Full RGB by default.
 
Image quality depends on the desired framerate in games, e.g. Vega 64 vs 1080 Ti: the Vega 64 will have worse image quality since it has to run at medium settings to achieve a similar framerate to the 1080 Ti running ultra settings.
You are clearly missing the point: it is about the difference in image quality when the exact same in-game graphics settings are used.

The frame rate/performance a card can run at is not the subject of the discussion here.
 
I like the look of the Intel images as well. They look even richer than the Radeon ones (for the most part).

It's difficult to tell without actually seeing it yourself. If you're looking at images online you're still only looking at them via your own GPU, so the same limitations apply :p
 