AMD Vs. Nvidia Image Quality - Old man yells at cloud

Interesting video. What I could see from the comparisons was not so much the detail level, but that the nVidia image clearly showed signs of compression artefacting across the whole image. Which makes me ask myself:

If nVidia's compression allows for higher effective bandwidth, how much does that account for the (very roughly, in general) higher frame rates that nVidia appears to pull out compared to AMD at similar card tiers?

Whilst not gaming as such, I have a Ryzen 2400G which I ran a 4K TV from for a while through the HDMI port on the motherboard's backplate. I've also plugged a 1080 Ti into the same machine and run the same TV over HDMI from the card, using the same cable.

Without doubt, the Ryzen GPU appears to give a better quality of signal to the TV... for example, when starting up, the nVidia output regularly appears mis-timed, with the pixels slightly offset from left to right. Forcing the TV to switch inputs back and forth gets the timing right again. That didn't happen with the Ryzen chip. Secondly, with the nVidia card, the TV frequently pops up its on-screen message showing which input and resolution it's on... it's as if it has just detected or switched to a new source/resolution and is showing the user the pop-up message, yet there definitely hasn't been such a change on my part. I would be using Word or something like that, and the messages pop up. I just don't remember that happening with the Ryzen before.
 

I can confirm that there are problems when connecting an nVidia-based configuration to an external screen.
Connecting to a 4K TV results in sudden and constant black screens / loss of signal to the TV, while when connecting to a 4K monitor it is difficult to get a signal to the monitor at all.

If nVidia's compression allows for higher effective bandwidth, how much does that account for the (very roughly, in general) higher frame rates that nVidia appears to pull out compared to AMD at similar card tiers?

I guess 5% performance gain?
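For a rough sense of scale, here is a minimal back-of-envelope sketch in Python, with every number assumed rather than measured: an Amdahl-style model where only the bandwidth-limited fraction of the frame benefits from the compression's effective-bandwidth gain. With the illustrative figures below it lands in the same few-percent ballpark as the guess above.

```python
# Back-of-envelope only: every number below is an assumption, not a measurement.
# Amdahl-style model: only the fraction of frame time that is memory-bandwidth
# bound benefits when compression raises effective bandwidth.

def fps_speedup(bw_bound_fraction: float, bw_gain: float) -> float:
    """Estimated FPS multiplier from an effective-bandwidth improvement."""
    return 1.0 / ((1.0 - bw_bound_fraction) + bw_bound_fraction / (1.0 + bw_gain))

# Illustrative: 25% of the frame bandwidth-bound, compression worth ~20%
# extra effective bandwidth -> roughly a 4% overall frame-rate gain.
print(f"{(fps_speedup(0.25, 0.20) - 1.0) * 100:.1f}% faster")
```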
 
Yep, I don't think Nvidia is using quite the same standards as everyone else. They have messed with the ports etc.

On my main monitor it will automatically find the active port my AMD card is connected to, but I have to select it manually for Nvidia ones.
 

Like Apple: closed ecosystem and super-premium prices.
 
Interesting video. What I could see from the comparisons was not so much the detail level, but that the nVidia image clearly showed signs of compression artefacting across the whole image. Which makes me ask myself:

If nVidia's compression allows for higher effective bandwidth, how much does that account for the (very roughly, in general) higher frame rates that nVidia appears to pull out compared to AMD at similar card tiers?

The compression is absolutely lossless in a mathematical sense, not merely "visually" lossless; the compression doesn't change the output data at all. It is like a zip file.

AMD also uses the same kind of compression technology, it's just that AMD is typically about a generation behind: Vega's compression is similar to Pascal's, and I'm sure Navi's will be similar to Turing's.
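To illustrate the "like a zip file" point, here is a minimal sketch in Python using zlib as a stand-in; the delta colour compression in the GPUs is a different, hardware algorithm, but the key property is the same: the decompressed output is bit-for-bit identical to the input, so nothing about the rendered image changes.

```python
import zlib

# Stand-in "framebuffer": a horizontal gradient repeated for every row.
# Neighbouring pixels are similar, which is the kind of redundancy delta
# colour compression exploits (zlib here is only an analogy, not the actual
# hardware algorithm nVidia/AMD use).
width, height = 1920, 1080
row = bytes(i % 256 for i in range(width))     # one gradient row
framebuffer = row * height                     # ~2 MB of pixel-ish data

compressed = zlib.compress(framebuffer)
restored = zlib.decompress(compressed)

assert restored == framebuffer                 # bit-for-bit identical: lossless
print(f"{len(framebuffer)} bytes -> {len(compressed)} bytes "
      f"({len(compressed) / len(framebuffer):.1%} of the original)")
```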
 
LMAO, just checked and the option is still in the Nvidia panel, and with the latest drivers it defaults to "quality" on a GTX 1060 and an RTX 2080 instead of "high quality". Oh Nvidia, you never change xD

I think it has always defaulted to Quality; for most of its history that has been equivalent to ATI/AMD's default setting, though there were times when both got caught playing silly games with what parameters were actually used for a named quality level.
 

It doesn't help that Nvidia is still using a GUI from the mid-'90s. AMD have changed theirs a few times since then, and there are more variables now, so the two can't really be compared directly.
 

You make it sound as if that is the only adjustable setting in the NVidia control panel, which of course it isn't.
The default is 'Let the 3D application decide'; only if you go into the advanced settings do you get all the other options.
 

Assuming all of those options actually still work. I suspect some don't.
 
Something I have noticed a few times over the years switching between AMD and NVidia is that the colours just look better on AMD cards. It varies from game to game; with some games there's not much difference, but others like WoW just seem to look better on AMD than they do on Nvidia. This time I have also noticed (or maybe it is just the placebo effect) that the textures in some games look better as well: in Conan Exiles the armour sets and items like rugs just seem to look more detailed on my Vega than they did on my 980 Ti, and in Hellblade the textures on Senua's clothes just seem to look better, and they are both UE4 games.

If any of you have just switched cards and own either Conan or Hellblade, I would be interested to see what you think.
 

Well, clearly nVidia cards don't render ultra high. Their "ultra high" must be high or medium on AMD cards...
 
Assuming all of those options actually still work. I suspect some don't.

Really?! You think that NVidia provide a driver control panel where half the functions don't do anything? :rolleyes:


Well, clearly nVidia cards don't render ultra high. Their "ultra high" must be high or medium on AMD cards...

This is a lot of nonsense; if NVidia were blatantly cheating in this fashion, all the tech sites would be ripping NVidia a new one.
 
This is starting to feel like the whole HPET and 16-bit/44.1kHz vs 24-bit/48/96kHz audio discussions that run for years.
 
Thought I would follow up on my comments from earlier, where I said the nVidia + TV combo was frequently flashing up the messages. A look around the web this afternoon found others experiencing the same, and the suggestion was that refresh rate timing may be the issue: when running at 4K 60Hz, the timing is being pushed a little too hard and drifting outwith the bounds of what the TV can handle (or considers acceptable), and when it comes back within the limits the TV locks on again, treats it as a new signal and shows the new-input message.

A suggestion was to go into the NVCP and change the refresh rate from 60Hz to 59Hz.

I've done that, and so far it appears to have worked. There definitely seems to be a slight improvement to the sharpness of text on the display, and I've not had the message flash up once so far... when it definitely would have done multiple times in the time it's taken me to type this post.

I'd even go so far as to say it has the same sort of look as the Ryzen GPU output. (I wonder if nVidia's refresh rate settings are out by 1Hz...)

So if anyone else is having bother... it might be worth a try to drop it from 60Hz to 59Hz.

(I know that gamers will scoff at the low refresh rate... but when most of my time is spent in apps, it makes no difference.)
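As a rough sanity check of the "pushed a little too hard" theory, here is a small sketch in Python. It assumes the standard CTA-861 total raster for 3840x2160 (4400 x 2250 including blanking); the timings the driver and TV actually negotiate may differ, so treat the numbers as illustrative only.

```python
# Assumes the standard CTA-861 4K raster: 3840x2160 active, 4400x2250 total
# including blanking. Real driver/TV timings may differ.
H_TOTAL, V_TOTAL = 4400, 2250

def pixel_clock_mhz(refresh_hz: float) -> float:
    """Pixel clock needed for the assumed raster at a given refresh rate."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (60, 59.94, 59):
    print(f"{hz:>6} Hz -> {pixel_clock_mhz(hz):.1f} MHz pixel clock")

# 60 Hz works out to 594 MHz, right at the top of HDMI 2.0's 600 MHz TMDS
# limit; dropping to 59 Hz shaves roughly 10 MHz off, which may be all a
# marginal cable/port/TV combination needs to lock on reliably.
```

On this model a 1Hz drop only buys a percent or two of margin, but with a link already at the edge of the spec that could plausibly be the difference between a stable handshake and the resync pop-ups described above.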
 