
Exploring ATI Image Quality Optimizations

What counts as 'default' is a very hard question to answer in this case, as the default in both drivers is clearly not the 100% correct image. If it were, there would be no point in having a 'high quality' setting to pick.

You cannot go beyond 100% correct, as that's the master image. Adding extra eye candy or bells and whistles, as it were, would in fact alter the original image, as it would no longer be 100% correct.

So what are we talking about here: Nvidia at default is 99% correct and AMD is 98% correct?

Yet again it comes back to the burning question that nobody here, nor the experts, has been able to answer: does it look any different in a real-world test?

At the moment the answer is no, so it's not a cheat and it's not a reduced image versus brand (a); it's just an optimisation of a setting, which gives a frame boost with no visible difference in the picture.
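For what it's worth, here's a minimal sketch (mine, not from any of the posts or linked articles) of how the "does it look any different" question could be put to numbers rather than eyeballs: capture the same frame with each driver setting and compare the screenshots with a simple PSNR. The file names are hypothetical; anything above roughly 40 dB is generally taken to be visually indistinguishable.

```python
import numpy as np
from PIL import Image

def psnr(path_a: str, path_b: str) -> float:
    """Peak signal-to-noise ratio between two same-sized screenshots, in dB."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float("inf")  # pixel-identical images
    return 10.0 * np.log10(255.0 ** 2 / mse)

# Hypothetical screenshots captured at the same frame with each driver setting:
# print(psnr("cat_ai_quality.png", "cat_ai_high.png"))
```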

Again, yes: if Nvidia can achieve the same results the way AMD did, without the end user seeing any difference in real-world tests, then it's fair, because at that point it's an optimisation. If they do it and it's clear to everyone and their gran that they just made Metro 2033 look like a Speccy game, then no, it's not fair.
 
I think the issue (not really an issue, but not sure what else to call it) is that ATI had one setting as default, but then changed the default to the lower-IQ setting, which gave a performance boost.

Well, that's Nvidia's spin on it. But the reality is that they optimised a process without noticeably lowering IQ, so they can perfectly legitimately continue to call it a 'default' setting: it looks the same as their previous default setting, which looks the same as Nvidia's default setting. No issue (other than Nvidia wishing they'd thought of it first).
 
None of these sites have shown it will be discernible during normal gameplay though...... :confused:
That's all I care about: can I see the difference while I'm gaming? From what I have read so far, the answer is no.
If this is wrong, please show me so I can jump on the 'cheating AMD' bandwagon.
It shouldn't really bother you too much anyway, Raven; you don't even use an AMD GPU. Those of us who still use one will let you know if it's a problem!! :D
 
Major tech site Guru3D has weighed in now; the crap is just starting to hit the fan.


ComputerBase and PC Games Hardware (PCGH) both report that they must use the “High” Catalyst AI texture filtering setting for AMD 6000 series GPUs instead of the default “Quality” setting in order to provide image quality that comes close to NVIDIA’s default texture filtering setting

Them shimmering textures are very noticeable to me.
 
Yet Guru3D claim it's not a cheat! Adding that in all real-world tests they could not see the difference without attacking the image, and that they would not alter the way they test based on their findings, it's not really a fuss, more a standpoint.

I would say the one point Raven jumped on was really nothing more than a request for AMD to change it back, which helps to stop flame wars on the OC forum.
 
Either way, there is no need to have it as the default setting, as there is clearly a drop in IQ that can be very noticeable under certain circumstances, as many tech sites have pointed out. Any loss in IQ, no matter how small, should be left to the user to enable.
 
Still, nobody can show me how this will affect my gaming experience except in a positive way. I'm more than happy to change my opinion if it can be proven.....
Raven, PLEASE show me the 'very noticeable' drop in IQ; I must have missed it somehow.
 
Yet again you prove the point we have been making: only under certain 'extreme' (their words) circumstances has this been spotted, not in any real-world test. I'm not sure the last time I played a game or looked at a Word document while moving at only a certain angle, at a certain speed, zoomed in.

Also, the end user has a choice: they can kick it up to High if they want. AMD have not hidden this; they made a number of slides about it and they have not locked you out. So the user is still in control.
 
Either way, there is no need to have it as the default setting, as there is clearly a drop in IQ that can be very noticeable under certain circumstances, as many tech sites have pointed out. Any loss in IQ, no matter how small, should be left to the user to enable.

It has never been that way; see Catalyst AI, the different AA modes, Nvidia profiles.
 
Bit late for that, methinks..... hence this thread.... :D

You might be right there. :D

Guru3d said:
With that said I would like to plea this, it would be very wise to see the graphics industry move to a gentlemen's agreement where visible optimizations are simply not a default preference.

I completely agree with this; we've been through it before with Nvidia and the G7* series, where IQ hit rock bottom yet some people never noticed it. :eek:

We don't need to go through this again, do we? lol
 
That is conclusive proof that the default IQ setting AMD are now using is inferior to Nvidia's default setting, an example of which can be found at the bottom of the same link, where a 460's IQ is compared. That's enough for me.

Heard of Google Translate?
 
Good for you, not good enough for me..
SHOW ME HOW THIS WILL AFFECT MY GAMING EXPERIENCE!!
Show me a game that looks worse because of this during normal gameplay and I will agree wholeheartedly with you.
Edit:
For some reason I can't use Google Translate in the office?? sorry.
 
Raven, nobody is saying they want flickering textures or will put up with them. What we are saying is that nobody can see them in a real-world test, and as yet we have not been shown any proof that they are there in any game, so it's not an issue.

All these videos you're holding on to clearly state that these are extreme circumstances and that it must be done in this one specific way to show it.

How many times do we have to say it: we agree that if there were a noticeable drop in IQ we would not be happy, but that's not the case. It looks the same, it acts the same, but it's kicking out 7-10% more performance.

The Guru3D link you posted confirms this.
 
I'm going with the tech sites on this, who confirm AMD's IQ is worse than Nvidia's; after all, if they can notice it, I'm sure the end user will as well, depending on the game. That's enough reason to make this a performance setting enabled by the user.
 