Interesting video. What I could see from the comparisons was not so much the detail level, but that the nVidia image clearly showed signs of compression artefacting as a whole. Which makes me ask myself:
if nVidia's compression allows for higher effective bandwidth, how much does that account for the (very roughly, in general) higher frame rates that nVidia appear to pull out compared to AMD for similar tier levels of cards?
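Just to put some entirely made-up, illustrative numbers on that question, here is a rough back-of-envelope sketch of how framebuffer compression savings would translate into effective bandwidth in a bandwidth-bound scene. The 256 GB/s figure and the savings percentages are assumptions, not measurements from either vendor:

```python
# Back-of-envelope sketch: if compression removes a fraction `traffic_saved`
# of the bytes that would otherwise cross the memory bus, the effective
# bandwidth for bandwidth-bound rendering scales by 1 / (1 - traffic_saved).
# All numbers here are illustrative assumptions, not vendor data.

def effective_bandwidth(raw_gb_per_s: float, traffic_saved: float) -> float:
    """Effective bandwidth when a fraction of framebuffer traffic is saved."""
    return raw_gb_per_s / (1.0 - traffic_saved)

raw = 256.0  # assumed raw memory bandwidth in GB/s
for saved in (0.10, 0.20, 0.30):
    eff = effective_bandwidth(raw, saved)
    print(f"{saved:.0%} traffic saved -> {eff:.0f} GB/s effective "
          f"(~{eff / raw - 1:+.0%} headroom)")
```

Of course a game is only bandwidth-bound some of the time, so any real frame-rate gain would be smaller than these headroom numbers suggest.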
Whilst not gaming as such, I have a Ryzen 2400G which I ran a 4K TV from for a while through the HDMI port on the backplate of the motherboard. I've also plugged a 1080ti into the same machine and run the same TV from HDMI plugged into the card, using the same cable.
Without doubt, the Ryzen GPU appears to give a better quality of signal to the TV ... for example, when starting up, the nVidia output regularly appears to be mis-timed, with the pixels slightly out from left to right. Forcing the TV to switch inputs back and forth gets the timing right again. That didn't happen with the Ryzen chip. Secondly, with the nVidia card, the TV frequently pops up its on-screen message showing which input and resolution it's on ... it's as if it has just detected or switched over to a new source/resolution and is showing the user the pop-up message, yet there definitely hasn't been such a change on my part. I would be using Word or something like that, and the messages pop up. I just don't remember that happening with the Ryzen before.
if nVidia's compression allows for higher effective bandwidth, how much does that account for the (very roughly, in general) higher frame rates that nVidia appear to pull out compared to AMD for similar tier levels of cards?
Yep I don't think Nvidia is using quite the same standards as everyone else. They have messed with the ports etc.
On my main monitor it will automatically find the active port my AMD card is connected to, but I have to select it manually for Nvidia ones.
LMAO, just checked and the option is still in the Nvidia panel, and with the latest drivers it defaults to "quality" on a GTX1060 and an RTX2080 instead of "high quality". Oh Nvidia, you never change xD
I think it has always defaulted to Quality; for most of its history that has been equivalent to ATI/AMD's default setting. There were times when both got caught playing silly games with what parameters were actually used for a named quality level.
It doesn't help that Nvidia is still using a GUI from the mid 90s. AMD have changed theirs a few times since then, and there are more variables now, so the two can't really be compared.
You make it sound as if that is the only adjustable setting in the NVidia control panel, which of course it isn't.
The default is 'let the 3D application decide'; only if you go into the advanced settings do you get all the other options.
Something I have noticed a few times over the years switching between AMD and NVidia is that the colours just look better on AMD cards. It varies from game to game; in some games there's not much difference, but others like WoW just seem to look better on AMD than they do on Nvidia. This time around I have also noticed (or maybe it is just the placebo effect) that the textures in some games look better: in Conan Exiles the armour sets and items like rugs just seem to look more detailed on my Vega than they did on my 980ti, and in Hellblade the textures on Senua's clothes just seem to look better. They are both UE4 games.
If any of you have just switched cards and own either Conan or Hellblade, I would be interested to see what you think.
Assuming all of those options actually still work. I suspect some don't.
Well, clearly Nvidia cards don't render ultra high. Their "ultra high" must be high or medium on AMD cards...
Really!!!! You think that NVidia provide a driver control panel where half the functions don't do anything?
Assuming all of those options actually still work. I suspect some don't.
Why do you suspect some don't? What do you think doesn't work?
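One way to take the guesswork out of "do these options actually do anything" is to capture the same scene twice with a setting toggled and diff the screenshots. A minimal sketch, assuming Pillow and NumPy are installed, that both captures are the same resolution, and that the file names are placeholders:

```python
# Compare two screenshots of the same scene taken with a driver setting
# toggled; if the option does nothing, every pixel should match.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("setting_default.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("setting_changed.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)                           # per-channel absolute difference
changed = np.count_nonzero(diff.any(axis=-1))  # pixels where any channel differs
total = a.shape[0] * a.shape[1]

print(f"pixels that differ: {changed} of {total} ({changed / total:.2%})")
print(f"max per-channel difference: {diff.max()}")
```

A small difference could still be dithering or capture noise, so the maximum per-channel difference is worth eyeballing as well as the pixel count.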
This is a lot of nonsense; if NVidia were blatantly cheating in this fashion, all the tech sites would be ripping NVidia a new one.
Yes lol