You can see that there is greater deviation in colour accuracy between the two signals than there was on the Nvidia GPU. This is particularly true for deep reds and certain grey and pastel shades, amongst others.
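For what it's worth, that kind of colour-accuracy deviation can be put into numbers rather than argued by eye. Below is a minimal sketch of one way to do it, assuming you can capture the same frame from each card at identical settings and resolution; the file names are placeholders and nothing here comes from the review being quoted.

```python
# Sketch: quantify the colour difference between two captures of the same
# frame using the CIEDE2000 metric. File names below are hypothetical.
import numpy as np
from PIL import Image
from skimage import color

def load_rgb(path):
    """Load an image as an HxWx3 float array in [0, 1]."""
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float64) / 255.0

a = load_rgb("card_a_frame.png")  # e.g. the AMD capture
b = load_rgb("card_b_frame.png")  # e.g. the Nvidia capture

if a.shape != b.shape:
    raise ValueError("Captures must be the same resolution to compare")

# Convert to CIELAB and compute the per-pixel CIEDE2000 colour difference.
delta_e = color.deltaE_ciede2000(color.rgb2lab(a), color.rgb2lab(b))

# A delta E around 1 is roughly a just-noticeable difference; values of
# 3-5 and up are shifts most people can see side by side.
print(f"mean dE2000:     {delta_e.mean():.2f}")
print(f"95th percentile: {np.percentile(delta_e, 95):.2f}")
print(f"max dE2000:      {delta_e.max():.2f}")
```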
What I will say is that while playing on an Nvidia setup I have never noticed any detail missing. I might have missed it if there was; it's more that the screen has been more pleasing to my eye on ATI/AMD. Most likely just a preference thing, and Nvidia do kick out more performance easily at the moment, but for a lot more cash. Like I said, within two years my mate paid out 2k on Nvidia graphics cards but missed AMD's image. No bias, just what our eyes could see. Metro on the 2080 Ti at 4K with RTX on still looked the business, and to my eye it's one of the best looking games I have played.
Nvidia owners prefer their image and AMD owners prefer their image.
Neither is better than the other; they are just different.
> The thing is that Nvidia gains an unfair performance advantage by rendering less, thus stealing the top performance crown from the Radeon VII, which in reality should be faster than the RTX 2080 Ti.

Rubbish. Where is it shown that Nvidia 'renders less'? Why should the Radeon VII be faster than a 2080 Ti? On cores alone, the 2080 Ti has 4352 as opposed to 3840 on the Radeon.
> Where is it shown that Nvidia 'renders less'?
You are using a post that is over 10 years old (GTX 285 against the HD 5870) as your proof, and using an image with differing attributes as the reasoning. Look at both shots in motion and you will see they are not the same. I had my 290X in my PC a short while back, and whilst I also felt the IQ was better, that was down to AMD's default settings being superior to Nvidia's (not even I will argue with that); after some calibration (which is needed on both), I couldn't see a difference at all. Here:
Radeon IQ: [screenshot]
GeForce IQ: [screenshot]
The GeForce IQ is rubbish.
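As an aside, whether two screenshots like those are actually identical doesn't have to stay a matter of opinion; a straight per-pixel diff settles it. A minimal sketch follows, assuming both shots are saved locally at the same resolution; the file names and the noise threshold are placeholders, not anything taken from this thread.

```python
# Quick check of whether two captures of the same scene actually differ.
# The file names are hypothetical stand-ins for the two shots posted above.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("radeon_iq.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("geforce_iq.png").convert("RGB"), dtype=np.int16)

if a.shape != b.shape:
    raise ValueError("Shots are different resolutions, so not directly comparable")

# Worst-case channel difference per pixel, then the share of pixels that
# differ by more than a small threshold (to ignore compression noise).
diff = np.abs(a - b).max(axis=-1)
changed_pct = (diff > 4).mean() * 100

print(f"max per-pixel difference: {diff.max()} / 255")
print(f"pixels differing by more than 4/255: {changed_pct:.2f}%")

# Save an exaggerated difference map so any changed regions are easy to spot.
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("diff_map.png")
```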
You are wasting your time mate, the guy sees a 20% IQ difference between Nvidia and AMD cards.
Maybe. Should I go back even further and really show my age by saying Quake 3 on ATI was, ummm, questionable? lol
Yeah. Silliness really. If he wants to be taken seriously he needs to put a Pascal/Turing Nvidia GPU up against a Vega/Navi and demonstrate the results over, say, 5-10 games.
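If anyone did want to run that kind of Pascal/Turing versus Vega/Navi comparison, here is a rough sketch of what scoring matched screenshots across a handful of games could look like. It is only illustrative: the game list, directory layout and file names are invented, and it assumes each pair of shots was captured at the same resolution and settings.

```python
# Sketch: score per-game screenshot pairs with SSIM and mean absolute
# difference. Directory layout and game names are made up for illustration.
from pathlib import Path

import numpy as np
from PIL import Image
from skimage.color import rgb2gray
from skimage.metrics import structural_similarity

GAMES = ["metro_exodus", "witcher3", "doom", "forza", "bfv"]  # placeholder list

def load_gray(path):
    """Load a screenshot as a grayscale float array in [0, 1]."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64) / 255.0
    return rgb2gray(rgb)

for game in GAMES:
    nv = load_gray(Path("captures") / game / "nvidia.png")   # hypothetical paths
    amd = load_gray(Path("captures") / game / "amd.png")
    if nv.shape != amd.shape:
        print(f"{game}: resolutions differ, skipping")
        continue
    # SSIM of 1.0 means structurally identical; a large drop would point to
    # genuinely missing detail rather than a colour or calibration preference.
    ssim = structural_similarity(nv, amd, data_range=1.0)
    mad = np.abs(nv - amd).mean()
    print(f"{game}: SSIM={ssim:.4f}, mean abs diff={mad:.4f}")
```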
Quake vs Quack is worth a read... for comedy and nothing more.
But it would be quite strange if they were chasing performance in desktop 2D mode, wouldn't it?
They cheap out on some components on the circuit boards, hence the lower image quality in 2D... That's been a problem since the year 2000, and perhaps before.
@Panos can show you the same differences with Radeon Vega 64 vs GTX 1060.
I said that Nvidia's images are extremely aggressive and unpleasant to the eye: washed-out colours, missing details, too much contrast in some areas, too much brightness in others.
It's like an image produced by amateurs who have no clue about graphics and art, rather than highly paid professionals...
I remember you from earlier in this thread: you were pausing a YouTube video wherever it suited your argument and then posting it as proof of Nvidia lowering IQ. Comical, and pretty lame in truth. It also shows a lack of knowledge, and whilst I was looking to see what Panos was posting I found this and laughed out loud (I need to get a life).
How on earth does "cheaping out on some components on the circuit boards" even remotely equate to 2D IQ? Seriously, I would love you to show me this and make me eat my words, and if you can prove it, I will never ever buy Nvidia again.
> Look, assessing image quality in front of a monitor as an ordinary user is not rocket science. You don't need knowledge to see the defects. They are just there and have been there for decades.

Sorry, but you haven't actually answered any of my questions. I also agree that assessing IQ in front of a monitor isn't rocket science, and I even said, "I feel AMD have a better 'out of the box' IQ over Nvidia".