
AMD IQ Vs Nvidia IQ - shenanigans or something else?

Single 980 Ti, 1440p, full ultra, nothing touched in the drivers, some white lines in Shanghai :rolleyes:

ScreenshotWin32-0024.png
 
Both vendors degrade IQ by default.

I always just change it and then force 16x AF (the AF is mainly a help for older games; it makes a dramatic difference in stuff like HL1).
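For reference, here is roughly what the 16x cap controls, as a textbook model of anisotropic filtering (my illustration, not any vendor's actual driver code): the sample count follows how elongated the pixel's texel footprint is, clamped at the driver setting.

```python
# Rough textbook model of anisotropic filtering (illustration only, not
# driver code): sample count tracks the footprint's anisotropy ratio,
# clamped at the configured maximum (e.g. 16x).
def aniso_samples(du, dv, max_aniso=16):
    """du/dv: texel footprint lengths along the two screen axes."""
    ratio = max(du, dv) / max(min(du, dv), 1e-6)
    return min(round(ratio), max_aniso)

# A floor seen at a grazing angle has a very elongated footprint:
for du, dv in ((1.0, 1.0), (6.0, 1.0), (40.0, 1.0)):
    print(f"footprint {du}x{dv}: {aniso_samples(du, dv)} AF samples")
# 1.0x1.0 -> 1 sample, 6.0x1.0 -> 6 samples, 40.0x1.0 -> capped at 16
```

Grazing-angle surfaces (floors, roads) are exactly where older games skimped, which is why forcing 16x makes such a visible difference there.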

Then it's a question of how much. I mean, Greg's run looks like Ultra on the Fury X but High or even Medium on the Titan X side: missing objects, low multisampling giving it a blurry, washed-out look, and almost all the detail and decal texture layers missing. It looks dull.
 

But it looks like something's up with Greg's.
But if we're going down this road, I'll end the conversation here (although I do agree with the overall sentiment; it depends how much the IQ is degraded, and I'd rather it wasn't degraded by default).
 
The problem is the official review sites, Phix.

They "most likely" use the default settings and here is the issue, they don't mention in their review what the gpu driver settings are or compare image quality. They only adjust in game settings and look at the performance, as a result if the image quality is indeed being downgraded due to the default nvidia CP settings and thus giving the performance boost then it isn't a fair test. This leads to nvidia GPU's looking quite a bit better but... in fact they aren't really because the IQ is being downgraded in order to get that extra performance.

I agree it's a user error, or in this case a reviewer's error. I think it's sad that you can't 100% count on these review sites, but it has sadly been like this for a long time. There are so many variables that are not accounted for.
 
I thought everyone instantly sets all the driver settings to max quality whenever they get a new GPU anyway?

First thing I changed on my 980 Ti was the texture quality setting to highest.
I've always done the same with AMD too; it makes no sense that they don't default to max IQ.

I remember when people used to run 3DMark back in the early 00s with the texture LOD bias set to like -10, or whatever the lowest the setting goes to on Nvidia cards, lol.
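For anyone who missed that era: the LOD bias setting shifts which mip level gets sampled, so a big negative bias forces the full-resolution texture everywhere. A rough sketch of the textbook mip-selection formula (my simplification, not any driver's actual behaviour):

```python
# Simplified textbook mip selection with an LOD bias (illustration only).
import math

def mip_level(texel_footprint, lod_bias, num_mips):
    """Level 0 is the sharpest mip; higher levels are progressively blurrier."""
    lod = math.log2(max(texel_footprint, 1e-6)) + lod_bias
    return min(max(round(lod), 0), num_mips - 1)

for bias in (0, -3, -10):
    # A footprint of ~8 texels per pixel would normally land on mip 3.
    print(f"bias {bias:+}: mip {mip_level(8.0, bias, num_mips=12)}")
# bias +0: mip 3
# bias -3: mip 0   (full-res texture everywhere -> sharper but shimmery)
# bias -10: mip 0  (already clamped at the sharpest level)
```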
 
It has spread.

Good, it needs to be highlighted and debated. Reviewers should look into it, AMD should look into it, and if Nvidia are using NV Inspector or whatever it is to reduce IQ in order to get an artificial performance gain that doesn't really exist, then they should be challenged on it.

Oh Greg, what have you done? :p
 

Looks like it's not just Gregster's setup or BF4. Check the following link on HardOCP.

http://hardforum.com/showthread.php?t=1867421&page=5

example4i.jpg
The bottom of the page shows Sleeping Dogs and the lighting difference between AMD and Nvidia. Definitely some cheating going on on Nvidia's part.
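A global lighting change like that should show up in simple luminance statistics even when per-pixel diffs are noisy. A hedged sketch of the idea (filenames are placeholders):

```python
# Sketch: compare overall scene lighting between two captures using their
# luminance statistics. Filenames are placeholders, not real captures.
import numpy as np
from PIL import Image

def luminance(path):
    """Rec. 601 luma of an image, flattened to a 1-D float array."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    return (rgb @ [0.299, 0.587, 0.114]).ravel()

amd = luminance("amd_sleeping_dogs.png")    # placeholder capture
nv = luminance("nv_sleeping_dogs.png")      # placeholder capture

for name, y in (("AMD", amd), ("Nvidia", nv)):
    print(f"{name}: mean luma {y.mean():.1f}, median {np.median(y):.1f}")
# A consistent gap in mean/median luma across many scenes points to a real
# lighting difference rather than frame-to-frame noise.
```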
 
I am glad I came across it, in fairness, as it can skew results. I just hope nobody thinks I deliberately did something to skew results; I wanted to do something that review sites don't, and that is show a side-by-side comparison with the settings all the same for both AMD and Nvidia.

All future vids will be done with 'Prefer max details' set in the NCP, and I will continue to do them. Just been down the beach with the wife and dogs and we both agree it is far too hot, so no more testing today and now time for beer... lots of beer :D
 

Brent is defensive in that thread; I don't know why, as no one is accusing him of any foul play. It's clearly down to the NV default CP settings and nothing to do with anything he, or any reviewer, or Greg did; they just worked with what Nvidia gave them.
 
I would love to see some comparisons for games like GTA 5, Battlefront (hide your name first though), etc.

After ~20 months of owning AMD I'm still not that clued up on CCC, but I think...
"Standard" texture filtering quality
and
"On" surface format optimisation

apply AMD's game optimisations.

Now what I still don't know is whether "system settings" applies those settings globally, or whether CCC looks for a custom profile (having located the game exe) and skips its optimisations there.

I'm still clueless about the tessellation options (mainly as I have no idea if pCARS uses it), what the info means when you hover the mouse cursor over "morphological filtering", the EQAA modes, etc.

Haha, same here, except 2+ years :o

I always Google best CCC settings, but there never seems to be an agreement on settings :p

I just use these settings for everything now:

xbtkbE1.png

As for the tess option, I will use that if certain games use a stupid setting just to kill fps with no benefit whatsoever to graphics *cough* Witcher 3 HairWorks *cough* :p


Yeah, I don't think I am going to go by reviewers' benchmarks for games now; I will just use the benchmark/official threads on here. Too much can impact the results, and it is hard to trust who is and isn't biased these days.
 
Do AMD not label their settings? How can you not know after 2 years :D


The last non-Nvidia card I owned was a 4890, and it would be confusing had Nvidia not made it lemon-squeezy to know what goes off and what goes on. Child's play; no excuse for anyone on Nvidia to be affected by anything here at all. That is, unless both at max IQ show a difference. Now that would be a hornets' nest ready to explode.

But max IQ is not just setting a slider to max IQ, imo. Manual or death.
 