
AMD IQ Vs Nvidia IQ - shenanigans or something else?

Soldato · Joined 25 Jun 2011 · Posts 16,797 · Location Aberdeen
Single 980 Ti, 1440p, full ultra, nothing touched in drivers, some white lines in Shanghai :rolleyes:

ScreenshotWin32-0024.png
 
Caporegime · Joined 17 Mar 2012 · Posts 48,320 · Location ARC-L1, Stanton System
Both vendors degrade IQ by default.

I always just change it and then force 16x AF (the AF mainly helps older games; it makes a dramatic difference in stuff like HL1).

Then it's a question of how much. I mean, Greg's run looks like Ultra on the Fury-X but High or even Medium on the Titan-X side: missing objects, low multisampling giving it a blurry, washed-out look, almost all the detail and decal texture layers missing. It looks dull.
 
Caporegime · Joined 18 Sep 2009 · Posts 30,142 · Location Dormanstown
Then it's a question of how much. I mean, Greg's run looks like Ultra on the Fury-X but High or even Medium on the Titan-X side: missing objects, low multisampling giving it a blurry, washed-out look, almost all the detail and decal texture layers missing.

But it looks like something's up with Greg's.
But if we're going down this road, I'll end the conversation here. (Although I do agree with the overall sentiment: it depends how much the IQ is degraded, and I'd rather it wasn't degraded by default.)
 
Soldato · Joined 10 Oct 2012 · Posts 4,454 · Location Denmark
The problem is the official review sites, phix.

They "most likely" use the default settings, and here is the issue: they don't mention in their reviews what the GPU driver settings are, or compare image quality. They only adjust in-game settings and look at the performance. As a result, if the image quality is indeed being downgraded by the default Nvidia CP settings, and that is what gives the performance boost, then it isn't a fair test. It makes Nvidia GPUs look quite a bit better, but in fact they aren't really, because the IQ is being downgraded in order to get that extra performance.

I agree it's a user error, or in this case a reviewer's error. It's sad that you can't 100% count on these review sites, but it has been like this for a long time. There are so many variables that are not accounted for.
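If reviewers wanted to catch this, it wouldn't even need to be eyeballed: a crude per-pixel difference between two captures of the same frame would flag a driver quietly lowering IQ. A minimal pure-Python sketch (the tiny frames and pixel values below are made up for illustration; a real comparison would decode actual same-sized screenshots with an image library):

```python
# Hypothetical sketch: quantify IQ drift between two captures of the same frame.
# Assumes both screenshots are already decoded to equal-sized grids of (R, G, B)
# tuples; in practice you would load the PNGs with an image library first.

def mean_abs_diff(img_a, img_b):
    """Mean absolute per-channel difference: 0 = identical, up to 255."""
    total, count = 0, 0
    for row_a, row_b in zip(img_a, img_b):
        for (ra, ga, ba), (rb, gb, bb) in zip(row_a, row_b):
            total += abs(ra - rb) + abs(ga - gb) + abs(ba - bb)
            count += 3
    return total / count

# Two made-up 2x2 "frames": the second has slightly dimmer lighting.
frame_ultra = [[(200, 180, 160), (90, 90, 90)],
               [(30, 40, 50), (255, 255, 255)]]
frame_default = [[(195, 176, 158), (88, 89, 90)],
                 [(30, 38, 49), (250, 252, 251)]]

score = mean_abs_diff(frame_ultra, frame_default)  # ≈ 2.42 on these toy frames
```

A score near 0 means the captures match; a score that grows when only the driver defaults change is exactly the side-by-side evidence the thread is asking review sites to publish.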
 
Caporegime · Joined 22 Nov 2005 · Posts 45,467
I thought everyone instantly sets all the driver settings to max quality whenever they get a new GPU anyway?

The first thing I changed on my 980 Ti was the texture quality setting to highest.
I've always done the same with AMD too; it makes no sense that they don't default to max IQ.

I remember when people used to run 3DMark back in the early 00s with the texture LOD bias set to like -10, or whatever the setting can go to on Nvidia cards, lol
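For anyone who never saw that old trick: texture sampling picks a mip level from roughly log2 of the texel footprint per pixel, plus the driver's LOD bias, clamped to the mip chain, so a negative bias forces sharper (and shimmery, but benchmark-flattering) high-resolution mips. A rough illustrative model only, not how any specific driver implements it:

```python
import math

def mip_level(footprint, lod_bias, max_level):
    """Simplified mip selection: log2(texel footprint per pixel) + LOD bias,
    clamped to the available mip chain [0, max_level]."""
    level = math.log2(footprint) + lod_bias
    return min(max(level, 0.0), float(max_level))

# 4 texels per pixel would normally sample mip 2 (quarter resolution)...
print(mip_level(4.0, 0.0, 10))   # 2.0
# ...but a -2 bias drags sampling back to the full-resolution mip 0.
print(mip_level(4.0, -2.0, 10))  # 0.0
```

That is why drivers later grew a "clamp negative LOD bias" option: the biased result looks sharper in a static screenshot while costing fill rate honesty.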
 
Caporegime · Joined 17 Mar 2012 · Posts 48,320 · Location ARC-L1, Stanton System
It has spread.

Good, it needs to be highlighted and debated. Reviewers should look into it, AMD should look into it, and 'if' Nvidia are using NV Inspector or whatever it is to reduce IQ in order to get an artificial performance gain that doesn't really exist, then they should be challenged on it.

Oh Greg, what have you done? :p
 
Soldato · Joined 4 Feb 2006 · Posts 3,223
Good, it needs to be highlighted and debated. Reviewers should look into it, AMD should look into it, and 'if' Nvidia are using NV Inspector or whatever it is to reduce IQ in order to get an artificial performance gain that doesn't really exist, then they should be challenged on it.

Oh Greg, what have you done? :p

Looks like it's not just Gregster's setup or BF4. Check the following link on HardOCP.

http://hardforum.com/showthread.php?t=1867421&page=5

example4i.jpg
The bottom of the page shows Sleeping Dogs and the lighting difference between AMD and Nvidia. Definitely some cheating going on on Nvidia's part.
 
Caporegime (OP) · Joined 24 Sep 2008 · Posts 38,280 · Location Essex innit!
I am glad I came across it, in fairness, as it can skew results. I just hope nobody thinks I deliberately did something to skew them; I wanted this to be something that review sites don't do, which is show a side-by-side comparison with all settings the same for both AMD and Nvidia.

All future vids will be done with prefer max details set in the NCP, and I will continue to do them. Just been down the beach with the wife and dogs and we both agree it is far too hot, so no more testing today; now time for beer... lots of beer :D
 
Caporegime · Joined 17 Mar 2012 · Posts 48,320 · Location ARC-L1, Stanton System
Looks like it's not just Gregster's setup or BF4. Check the following link on HardOCP.

http://hardforum.com/showthread.php?t=1867421&page=5

example4i.jpg
The bottom of the page shows Sleeping Dogs and the lighting difference between AMD and Nvidia. Definitely some cheating going on on Nvidia's part.

Brent is defensive in that, and I don't know why, as no one is accusing him of any foul play. It's clearly down to the NV default CP settings, not anything he or any other reviewer or Greg did; they just worked with what was given to them by Nvidia.
 
Caporegime · Joined 4 Jun 2009 · Posts 31,309
I would love to see some comparisons for games like GTA 5, battlefront (hide your name first though) etc.

After ~20 months of owning AMD I'm still not that clued up on CCC, but I think...
"standard" texture filtering quality
and
"on" surface format optimisation

applies AMD's game optimisations.

Now what I still don't know is whether "system settings" will apply those settings globally, or whether CCC looks for a custom profile (having located game exe) to not apply their optimisations.

I'm still clueless about the tessellation options (mainly as I have no idea if pCARS uses it), what the info means when you have the mouse cursor over "morphological filtering," the EQAA modes etc.

haha same here except 2+ years :o

I always google best CCC settings but there never seems to be an agreement on settings :p

I just use these settings for everything now:

xbtkbE1.png

As for the tess. option, I will use that if certain games use a stupid setting just to kill fps with no benefit whatsoever to graphics *cough* Witcher 3 HairWorks *cough* :p

I agree it's a user error, or in this case a reviewer's error. It's sad that you can't 100% count on these review sites, but it has been like this for a long time. There are so many variables that are not accounted for.

Yeah, I don't think I am going to go by reviewers' benchmarks for games now; I'll just use the benchmark/official threads on here. Too much can impact the results, and it is hard to trust who is and isn't biased these days.
 
Permabanned · Joined 28 Nov 2006 · Posts 5,750 · Location N Ireland
Do AMD not label their settings? How can you not know after 2 years :D

The last non-Nvidia thing I owned was a 4890, and it would be confusing had Nvidia not made it lemon squeezy to know what goes off and what goes on. Child's play; no excuse for anyone on Nvidia to be affected by anything here at all. That is, unless both at max IQ have a difference. Now that would be a hornet's nest ready to explode.

But max IQ is not setting a slider to max IQ imo. Manual or death.
 