
AMD IQ Vs Nvidia IQ - shenanigans or something else?

I will do some more investigating but it is game dependent. I think it is a good move by Nvidia personally as you would want the highest possible frames in games like BF4 and then for something like Tomb Raider, you would want the better quality.

It's not a good move at all as we have graphics options to turn settings up or down to gain the needed fps. If it turns out these are what most are seeing at default in BF4 then it's not good in my book. Nvidia are meant to be the premium brand :D:D
 
As said, it can be an advantage or disadvantage depending on how you look at it, however, for "performance" reviews, it isn't a fair test.

Any idea which settings should be used for AMD CCC?

[Screenshot: R7oGFTl.png]
 
I should learn not to have an opinion. My bad.

I don't understand this response.
Is Therealdeal not allowed to disagree (Surely that's his opinion)?

I agree with his sentiments, but at driver defaults AMD has performance optimisations enabled in CCC that lower IQ (texture filtering, surface format optimisations and tessellation). I'd rather see both have no IQ degradation at their defaults.
 
Well, I can only show what is what and that is it. Done for me now and no more BF4 runs :D I will be using "prefer max quality" for the rest of my recordings on Nvidia. Thanks for the deep dissecting, guys; I learnt something tonight.

Which setting is this exactly?

I see nothing that refers to "Prefer max quality"

Is it the Texture Filtering setting?
 
I don't understand this response.
Is Therealdeal not allowed to disagree (Surely that's his opinion)?

When his opinion tells me my opinion is wrong, then no point carrying on. He is welcome to his opinion of course but I have no wish to get into a long drawn out debate - especially when I have loads to do today, so I will leave it there. Saves any grief as well.

Which setting is this exactly?

I see nothing that refers to "Prefer max quality"

Is it the Texture Filtering setting?

Open up the Nvidia control panel, "Adjust image settings with preview", and it is there.
 
I should learn not to have an opinion. My bad.
No, I think your opinion is fine... it's just, doesn't Nvidia already have GeForce Experience for providing recommended graphics settings? I don't really see the merit of enforcing that at driver level...

It's kind of like when you install some software and it installs an add-on along with it without asking for your permission.
 
No, I think your opinion is fine... it's just, doesn't Nvidia already have GeForce Experience for providing recommended graphics settings? I don't really see the merit of enforcing that at driver level...

It's kind of like when you install some software and it installs an add-on along with it without asking for your permission.

How is this different from AMD? They have the Gaming Evolved app and it's also at driver level.
 
So in theory, Nvidia and AMD should have everything set to 'use application settings' and let the game do the rest?

It should be the case. But when IQ differs between competitors' cards at default driver settings, then something is wrong in the driver, considering the game is set to max quality and the driver is set to use application settings.

The driver must be enforcing lower-quality texture filtering at default. It could be a BF4 + Titan X issue, it could be a bug and a wider issue, or it could be Nvidia messing with driver settings for performance metrics.

More likely one of the first two, but the last is not a long shot either.
 
So is the consensus that max quality in the NVCP is worth it but gives a performance hit?

The consensus is that NVIDIA reduced the default setting to a notch down so that benchmarks looked better in reviews, knowing that AMD would retain the max setting, and hoped no-one would notice. They noticed now and it's blowing up.

The performance hit is ~10%+. Performance indexes look a lot different vs. the Fury X with -10% for the 980 Ti & Titan X.
 
When his opinion tells me my opinion is wrong, then no point carrying on. He is welcome to his opinion of course but I have no wish to get into a long drawn out debate - especially when I have loads to do today, so I will leave it there. Saves any grief as well.



Open up the Nvidia control panel, "Adjust image settings with preview", and it is there.

Ah right, I always go into detailed settings!
 
I find it funny when people quote bandwidth by multiplying bus speed x bus width, it does not work that way and HBM proves it.

HBM is definitely the way forward but it does have a few weaknesses. It does not clock very high so 1080p performance can be bottlenecked but 2160p performance is better because it needs a wide bus a lot more than high clock speeds.:)

What HBM proves is that current GPUs are not bandwidth limited and GDDR5 can still beat an HBM solution.
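For context, the multiplication being argued about above is the textbook theoretical-peak figure: effective transfer rate × bus width. A minimal sketch, using the commonly quoted numbers for first-gen HBM and a 384-bit GDDR5 card (these figures are assumptions for illustration, not taken from this thread); actual sustained bandwidth is lower, which may be the point the first post is making:

```python
def peak_bandwidth_gbps(effective_mtps: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s:
    (million transfers/s) * (bits per transfer) / 8 bits per byte / 1e3."""
    return effective_mtps * 1e6 * bus_width_bits / 8 / 1e9

# First-gen HBM: 500 MHz base clock, double data rate -> 1000 MT/s,
# on a very wide 4096-bit bus (assumed figures).
hbm = peak_bandwidth_gbps(1000, 4096)    # 512.0 GB/s

# GDDR5 at 7 Gbps effective on a 384-bit bus (assumed figures).
gddr5 = peak_bandwidth_gbps(7000, 384)   # 336.0 GB/s
```

This is why HBM's low clock is offset by bus width: the wide, slow configuration still comes out well ahead on the theoretical peak.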
 
God this subforum is pathetic.

Indeed, it is an absolute joke. There are much better forums out there where you can freely discuss the pros and cons of Nvidia and AMD hardware designs without a load of fanboys making up nonsense about image quality or the magical abilities of DX12.
 
Indeed, it is an absolute joke. There are much better forums out there where you can freely discuss the pros and cons of Nvidia and AMD hardware designs without a load of fanboys making up nonsense about image quality or the magical abilities of DX12.

Yep, the straw clutching in the name of one-upmanship constantly trashes any thread in here worthy of reasonable discussion.

Sad really as this sub used to be brilliant for this kind of discussion.
 
Indeed, it is an absolute joke. There are much better forums out there where you can freely discuss the pros and cons of Nvidia and AMD hardware designs without a load of fanboys making up nonsense about image quality or the magical abilities of DX12.

[Image: k94an7a.jpg]


Emmm, how are people making up nonsense regarding image quality? Gregster has just shown what the default driver setting does to image quality....

Either way, the point of this is that reviewers are most likely using default control panel settings for both Nvidia and AMD, so you are getting worse image quality on the Nvidia side but an extra 10% boost in performance (at least for BF4; anyone fancy testing other games?). However, these reviewers don't mention anything about image quality or driver settings in their reviews...

So hardly nonsense.

And if you don't like the forum, don't post here then? Simples.
 