AMD IQ Vs Nvidia IQ - shenanigans or something else?


Default.

If this is the default settings and the previous was the modified settings, then there is something wrong with the default setting, since the line is visibly blurred from the lamp post onwards in that one.

Yet in your first picture the line was clear into the distance, like on the Fury X in the video.

First picture, if this is with 'High quality' set in the NVCP.


There is a clear FPS hit between the two, but that is understandable given the visual difference: 11 fps.
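
To put an absolute gap like 11 fps in context, here is a minimal Python sketch that converts the fps delta into the percentage a review bar chart would effectively show; the baseline fps values are hypothetical examples, since the thread only reports deltas of roughly 6-15 fps.

```python
# Convert an absolute fps gap between the driver default and
# "prefer max quality" into a percentage cost. The baseline figures
# below are hypothetical; only the deltas come from the thread.

def quality_cost_pct(fps_default: float, fps_max_quality: float) -> float:
    """Percentage of performance given up by forcing max quality."""
    return (fps_default - fps_max_quality) / fps_default * 100

for fps_default, fps_maxq in [(100, 89), (80, 74), (120, 105)]:
    print(f"{fps_default} -> {fps_maxq} fps: "
          f"{quality_cost_pct(fps_default, fps_maxq):.1f}% slower")
```

On baselines like these, the reported 6-15 fps deltas work out to roughly a 7-13% swing, which is in line with the ~10% figure speculated about later in the thread.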
 
Well, I can only show what is what and that is it. Done for me now and no more BF4 runs :D I will be using "prefer max quality" for the rest of my recordings on Nvidia. Thanks for the deep dissecting, guys; I learnt something tonight.
 
It's cool bud, it was interesting to find the problem in the end. But it now leaves questions as to why it is happening at default. Not that you need to stay involved anymore, Greg. :P

Considering the 6-15 fps performance hit in the comparison video between the two settings, it is worth investigating.
 
Glad it's resolved. However, for me the graphics quality becomes irrelevant because I am so bad at the game that I sit about 12 inches from the screen shooting anything that looks likely to move. Which, as you can imagine, results in lots of deaths and strained eyes.
 
For games like BF4 and COD you would want the maximum frames, so I imagine this is why the profile is edging towards performance over quality. I will do some further testing on other games when I am next off work, but I am glad the quality was better with GVR and ShadowPlay :D
 
I can understand that, but the driver should not make the IQ worse than the competitor's to gain performance, especially when the in-game settings are set to max.

Most people would not think beyond the in-game settings, and I can assume many sites that benchmark do not either.
 
I bet most of the reviewers out there will have used the default setting, which will no doubt give about 10% better performance at the expense of texture quality. A seriously unfair advantage which will make the Nvidia cards look faster when in reality it's probably neck and neck.

It might be just this one game, but more investigation is needed and reviewers need to be made aware. Maybe the Fury is not slower than a Ti after all.
 
It could also be the cause of the yo-yo differences in performance between review sites: some could be setting both to best quality while others are leaving the drivers at default.

The above is still speculation, but it should be investigated further.
 
I feel if you lean one way or the other you can swing this to your advantage in an argument, but personally I will be using 'prefer max quality' in my reviews.

I am now going to test both out on DiRT Rally and see what is what.
 
So is the consensus that max quality in the NVCP is worth it but gives a performance hit?
 
What has become of online reviewers, I wonder? Reviewers used to do IQ tests and were more careful with IQ settings. At some point they seem to have mostly stopped doing IQ comparisons. I suppose it's just a sign of the state of affairs.
 
Gosh, I did not know I had HBM on my Titan Xs. :rolleyes:

The difference is because we are talking about a brand-new tech (HBM) compared to the old tech, GDDR5. It is going to take the people who write the drivers time to get the most out of the new tech, whereas with GDDR5 there is not much more to be had.

Ah, so HBM is magic then? I have been waiting to hear confirmation of this!

So, apart from being memory with higher bandwidth, what does it bring?

I'd love to know what devs/AMD need to do to get more out of the GPU due to its memory architecture. I would ask in the Fury thread, but people in there seem to get upset when non-Fury owners post...
 
I find it funny when people quote bandwidth by multiplying bus speed by bus width; it does not work that way and HBM proves it.

HBM is definitely the way forward, but it does have a few weaknesses. It does not clock very high, so 1080p performance can be bottlenecked, but 2160p performance is better because it needs a wide bus a lot more than it needs high clock speeds. :)
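
For reference, the headline peak figures vendors quote are in fact derived from bus width times per-pin data rate (whether real-world effective bandwidth follows suit is another matter). A minimal sketch with the published numbers for the Fury X and a 384-bit GDDR5 card such as the 980 Ti:

```python
# Back-of-envelope peak bandwidth: bus width (bits) x per-pin data
# rate (Gbps) / 8 bits per byte. Inputs are the published specs:
# Fury X HBM1 is 4096-bit at ~1 Gbps/pin; 980 Ti / Titan X GDDR5
# is 384-bit at ~7 Gbps effective.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(peak_bandwidth_gb_s(4096, 1.0))  # Fury X HBM  -> 512.0 GB/s
print(peak_bandwidth_gb_s(384, 7.0))   # GDDR5 card  -> 336.0 GB/s
```

The interesting part is that HBM gets its 512 GB/s from an extremely wide bus at a low clock, while GDDR5 gets 336 GB/s from a narrow bus at a high clock, which is exactly the trade-off described above.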
 
Just read my post back and it sounded a bit sarky. I should be more careful when typing on little sleep.

I agree that HBM is the way forward, but what I am struggling to see is how it will have much impact on overall GPU performance, or what developers or AMD need to do differently that will result in more performance.

It's fast, yes, but it's still just memory, so until such time as the memory interface is the limiting factor, it shouldn't make a huge difference, no?
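
A minimal sketch of that scaling argument, with a deliberately invented traffic-per-pixel constant (the real figure depends entirely on the engine): the only point is that per-frame memory traffic grows roughly fourfold from 1080p to 2160p, so the memory interface is far more likely to become the limiting factor at 4K.

```python
# Rough illustration: per-frame memory traffic scales with pixel
# count, so 2160p moves ~4x the data of 1080p for the same scene.
# The 60 bytes-per-pixel-per-frame constant is invented purely for
# illustration, not a measured number.

def traffic_gb_s(width: int, height: int, fps: int,
                 bytes_per_pixel_per_frame: int = 60) -> float:
    return width * height * fps * bytes_per_pixel_per_frame / 1e9

print(traffic_gb_s(1920, 1080, 60))  # ~7.5 GB/s at 1080p60
print(traffic_gb_s(3840, 2160, 60))  # ~29.9 GB/s at 2160p60
```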
 
Yeah, I wish they would, as it is a graphics card after all. I am interested to see if other users who own both a TX and an FX can replicate these results. We also need to test more games.

It is sad to see users who outright do not think any more testing is warranted and reckon it is just Greg being a noob.

I don't think people get upset at non-Fury owners posting; it's more to do with what and how non-Fury owners post. I would rather not read posts by people who seem to have an agenda to unjustifiably make the FX look as bad as possible at every given opportunity. I am not talking about you, by the way; I thought you had a legitimate reason for switching cards, as you play a lot of Elite and AMD did not address your concern for months.
 
On topic, having read through this now, I think I had better go and stick that setting on prefer max quality.

But I wonder if this affects all games?

Everyone on the internet must be bored now that all the cards are out, if this is causing a stir :)
 
Initially I thought it was a bit of a let-down, but reading more, performance across reviews is somewhat all over the place?

I think I described it as a grapefruit rather than a lemon... which I think works. Hardly anybody would want to eat a lemon, but grapefruit can be rather enjoyable to most :)

As for my switching, I hope AMD can win me back in time, but it will be a few years at least. As ever, things just work for me with Nvidia, which means I game more and think less about my hardware.

Well, I game more when I have the time, that is :)
 
I will do some more investigating, but it is game dependent. Personally I think it is a good move by Nvidia, as you would want the highest possible frames in games like BF4, and then for something like Tomb Raider you would want the better quality.
 