
AMD IQ Vs Nvidia IQ - shenanigans or something else?

I use the "Use the advanced 3D settings" option and change a few settings myself.

I am not 100% convinced that putting it to max IQ is actually the highest image quality setting, because sometimes it bugs out for me and puts one or two settings the wrong way. The Clamp option, for example, is very important: use application-controlled AF, set it to 16x in-game, and use Clamp. The same goes for AA if you want max IQ, but with transparency set to multisample.

There is also the trilinear optimisation setting, which depending on the game will either be greyed out as disabled or sometimes be ON. I disable that manually too if the game allows it, because Nvidia clearly state in the description under it to disable it for maximum IQ. You just have to go from top to bottom and do this for each setting, because as I said this one bugs out and every setting matters at driver level.


For example, here are my first few settings from the top if a game lacks AO but has AF and MSAA options (rough checklist sketch below the list):

AO (Quality)
AF (Application controlled)
OpenGL setting (Off)
OpenGL setting (Off)
AA (Application controlled)
Transparency (Multisample)
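
If it helps, here is a rough way of keeping track of that checklist. This is just my own bookkeeping sketch in Python; the setting names are shorthand of my own choosing, not official NVCP identifiers, and the target values are simply the ones discussed above.

```python
# Hypothetical bookkeeping sketch, not an NVIDIA API: the max-IQ values
# discussed in this thread, with shorthand setting names of my own.
MAX_IQ_TARGET = {
    "Ambient Occlusion": "Quality",
    "Anisotropic filtering": "Application-controlled",      # then force 16x in-game
    "Antialiasing - Mode": "Application-controlled",
    "Antialiasing - Transparency": "Multisample",
    "Texture filtering - Negative LOD bias": "Clamp",
    "Texture filtering - Quality": "High Quality",
    "Texture filtering - Trilinear optimization": "Off",    # disable whenever it isn't greyed out
}

def find_deviations(current: dict) -> dict:
    """Return settings that differ from the max-IQ target."""
    return {name: (value, MAX_IQ_TARGET[name])
            for name, value in current.items()
            if name in MAX_IQ_TARGET and value != MAX_IQ_TARGET[name]}

if __name__ == "__main__":
    # Example: a profile where the driver has left trilinear optimization on.
    snapshot = dict(MAX_IQ_TARGET)
    snapshot["Texture filtering - Trilinear optimization"] = "On"
    for name, (got, want) in find_deviations(snapshot).items():
        print(f"{name}: is '{got}', should be '{want}' for max IQ")
```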
 
Yeah, I'd be in the camp of wanting the default to not do any optimisations. Hopefully this is just a bug at Gregster's end.

It could well be a bug. I would like someone else to clean install the drivers (use DDU first), fire up BF4 and see if it sets the same as mine did. I have been swapping cards and drivers about, so something could be up.
 
I would be interested in seeing what the difference in both performance and IQ is like in a few other games too, e.g. GTA 5.

It could very well be a bug, but there is also a good chance that this is just how Nvidia have set it up.
 
Here is my problem with Nvidia not having it set to Application-controlled.

"If its one game bug then ignore"

From your screenshot there is around a 10fps difference. Benchmarks of the Fury X vs TX/980 Ti are also very close, around 10fps sometimes.

But in reality the Fury X is doing more work and getting made out to be the weaker GPU. :confused:

Review sites are now going to have to list what driver settings they are actually using.

Let's say we look at Tomb Raider 2 with both drivers on default, yet Nvidia has this setting lower than AMD, giving Nvidia the upper hand in the benchmark.
Nvidia 90fps vs AMD 80fps, for example.
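
Just to put a number on that example (these figures are hypothetical, taken from the line above, not measured results):

```python
# Rough back-of-envelope check on the hypothetical 90 vs 80 fps example above.
nvidia_fps, amd_fps = 90.0, 80.0
lead_pct = (nvidia_fps - amd_fps) / amd_fps * 100
print(f"Apparent Nvidia lead: {lead_pct:.1f}%")  # ~12.5%, purely from the driver default in this example
```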

At the best of times I refuse to trust most benchmark sites! I get my understanding of how a GPU is performing from places like here, but how can we now trust benchmark results?
 
Well all my runs since doing the BF4 test have been with "Prefer max quality", so if you want results you can trust, look no further than me :D

(shameless plug)

Can you also see what happens at driver default please? I know it's adding in more runs lol.
That GTA 5 run is very close; hard to tell in the IQ department as a lot of the image is missing, but the frame rate seemed very close.

Can you find a way of putting the games' full images side by side and not have them cut off?
 
"If its one game bug then ignore"

From your screenshot there is around a 10fps difference. Benchmarks of the Fury X vs TX/980 Ti are also very close, around 10fps sometimes.

At the best of times I refuse to trust most benchmark sites! I get my understanding of how a GPU is performing from places like here, but how can we now trust benchmark results?

Also, it's a 10fps difference between a stock Fury X and a Titan X that is overclocked to 1.3+ GHz.

In Greg's GTA V video they are both neck and neck. Many of the differences in fps can be attributed to where he is and what he is looking at, as the two runs are not identical.

But it's all looking good Greg, keep it up.
 
Can you also see what happens at driver default please? I know it's adding in more runs lol.
That GTA 5 run is very close; hard to tell in the IQ department as a lot of the image is missing, but the frame rate seemed very close.

Can you find a way of putting the games' full images side by side and not have them cut off?

I will certainly do that, but not today Shanks. I am off from next Saturday for a week and will do a couple of comparison vids with the ones I have now, or even screenshots.
 
Personally I'd rather have screenshots like the ones you did earlier on; they're easier to compare and then you can't have people blaming it on YouTube compression...

There you go, using the in-game ultra preset:
[screenshot: hT3TfDQ.jpg]

Thanks.

Can't see any issues with the IQ there. Is that using the "prefer max quality" option in the Nvidia control panel?
 
Yeah in your free time :D must be doing your head in by now haaa

Actually I am enjoying it, but swapping about here and there will end up with me making mistakes, so I would rather stick to a set routine to keep me on track. I don't multitask well :D

Also worth noting is Windows 10 - I will be re-doing these when I switch over on the 28th of this month, so things could well swing in favour of the Fury :cool:
 
Personally I'd rather have screenshots like the ones you did earlier on; they're easier to compare and then you can't have people blaming it on YouTube compression...



Thanks.

Can't see any issues with the IQ there. Is that using the "prefer max quality" option in the Nvidia control panel?
You're welcome.

My NVCP is set like this (quick bookkeeping sketch for it below):
Anisotropic Filtering = 16x (set by me)
Antialiasing options = all default
Texture filtering - Anisotropic sample optimization = Off (default)
Texture filtering - Negative LOD bias = Clamp (default)
Texture filtering - Quality = High Quality (set by me)
Texture filtering - Trilinear optimizations = On (default)
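
And since I keep swapping cards and drivers about, here is a rough sketch of how I note these down and diff them after a driver swap or DDU clean install, so a silently changed default stands out. It is just my own Python bookkeeping with shorthand names, not any official NVIDIA tool or identifiers.

```python
# Hypothetical sketch: diff two saved snapshots of the control panel settings
# so any value the driver quietly changed after a reinstall is easy to spot.
def diff_profiles(before: dict, after: dict) -> dict:
    """Return {setting: (old_value, new_value)} for anything that changed."""
    names = set(before) | set(after)
    return {n: (before.get(n), after.get(n))
            for n in names if before.get(n) != after.get(n)}

if __name__ == "__main__":
    # Example values only - shorthand names, not official NVCP identifiers.
    before_swap = {"Texture filtering - Quality": "High Quality",
                   "Trilinear optimizations": "Off"}
    after_swap  = {"Texture filtering - Quality": "Quality",
                   "Trilinear optimizations": "On"}
    for name, (was, now) in diff_profiles(before_swap, after_swap).items():
        print(f"{name}: was {was!r}, now {now!r}")
```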
 
Great thread, thank you very much for all the time and effort you've put in, Greg.

One thing I would say is that with the default setting being "Let the 3D application decide", doesn't that put the onus on the developer of the game rather than Nvidia?
 
Good to see that you are not long back and already back to your old ways... You must surely be on your last strike by now?

Do you mean standing up to the resident AMD mob who like to spread nonsense based on dodgy videos? I bet you all even have a system going where you report my posts in unison; can't have people upsetting the status quo of swooning over AMD, can we? If OCUK want to ban me then that's up to them, but this forum will be all the worse for it, sending it yet further into pro-AMD territory. You only have to read the Anandtech thread where people are openly commenting on being "ashamed of Radeon fans" because of some of the zealotry present here. I'm not going to be bullied/intimidated by the resident AMD mob.

How many times has Gregster admitted to making mistakes in this video now? How many people/reviewers have replicated this anomaly? I'm one of the few people on this forum who has been saying from the beginning that AF is just not enabled for whatever reason, just like others have been finding.
 
Great thread, thank you very much for all the time and effort you've put in, Greg.

One thing I would say is that with the default setting being "Let the 3D application decide", doesn't that put the onus on the developer of the game rather than Nvidia?

Thanks Bru, and I think it comes down to what Nvidia set in the profile, so the onus is on Nvidia. At least I learned something new as well.
 
Anisotropic Filtering = 16x (set by me)
Antialiasing options = all default
Texture filtering - Anisotropic sample optimization = Off (default)
Texture filtering - Negative LOD bias = Clamp (default)
Texture filtering - Quality = High Quality (set by me)
Texture filtering - Trilinear optimizations = On (default)

For what game? These settings seem a tiny bit odd, because this brute-forces 16x AF + Clamp, which is great for max IQ, and so is the High Quality texture filtering setting. But then you have trilinear optimisations on, which is bad.

This is the setting that is sometimes greyed out; others can be on or off, but the correct setting is 100% off. If you edit the profile manually for the game exe, it might become available to turn off, like I said.
 
Do you mean standing up to the resident AMD mob who like to spread nonsense based on dodgy videos?

How many times has Gregster admitted to making mistakes in this video now? How many people/reviewers have replicated this anomaly? I'm one of the few people on this forum who has been saying from the beginning that AF is just not enabled for whatever reason, just like others have been finding.

Nothing wrong with using the video. If it was a recording/compression issue then BOTH halves of the video would have been affected. Recording issues won't cause LOD problems.

Here is the image with the Nvidia control panel set to display max quality:

[screenshot: 6voZccp13.jpg]

Here is the image that Greg posted with all options at default in the Nvidia control panel:

[screenshot: bvOSUNPr1.jpg]

It is something in the drivers.
 