
Exploring ATI Image Quality Optimizations

We urge and recommend AMD/ATI to disable the optimization by default in future driver releases and deal with the performance loss, as in the end everything is about objectivity, and when you lose consumer trust, which (as little as it is) has been endangered, that in the end is going to do more harm than good. The drop of 3-4 FPS on average is much more acceptable than getting a reputation of being a company that compromises on image quality. And sure, it raises other questions: does ATI compromise on other things as well? See, the costs already outweigh the benefits.


Far Cry 2 (image quality comparison screenshots)

Dirt 2 (image quality comparison screenshots)

http://www.guru3d.com/article/exploring-ati-image-quality-optimizations/
 
It should be left to the user to choose lower IQ settings in return for extra performance; the only reason AMD has set it as the default is to push FPS up in reviews, which is what all the fuss is about. Nvidia currently doesn't lower the IQ at all at its default settings in the control panel, so it's not a fair comparison with AMD cards in reviews, and that's why you have the likes of Guru3D having a go at AMD.
 
Major tech site Guru3D has weighed in now, the crap is just starting to hit the fan.


ComputerBase and PC Games Hardware (PCGH) both report that they must use the "High" Catalyst AI texture filtering setting for AMD 6000 series GPUs instead of the default "Quality" setting in order to provide image quality that comes close to NVIDIA's default texture filtering setting.

Those shimmering textures are very noticeable to me.
 
Either way, there is no need to have it as the default setting, as there is clearly a drop in IQ that can be very noticeable under certain circumstances, as many tech sites have pointed out. Any loss in IQ, no matter how small, should be left to the user to enable.
 
That is conclusive proof that the default IQ setting AMD are now using is inferior to Nvidia's default setting; an example can be found at the bottom of the same link, where a 460's IQ is compared. That's enough for me.

Heard of Google Translate?
 
I'm going with the tech sites on this, which confirm AMD's IQ is worse than Nvidia's; after all, if they can notice it, I'm sure the end user will as well, depending on the game. That's enough reason to make this a performance setting enabled by the user.
 
This is what it boils down to:
The drop of 3-4 FPS on average is much more acceptable than getting a reputation of being a company that compromises on image quality.

AMD are compromising on IQ with the default settings to gain FPS in benchmarks. IMO it's not done for the end user, as otherwise it would have been left as a performance setting rather than the default.

They all flicker.


Compare them: the vid on the right is how it should look and is the optimum IQ. As you can see, the AMD IQ flickers a lot more than Nvidia's, which matches the optimum IQ just about perfectly.
 
The problem is reviews showing the AMD cards performing quicker with lower IQ. This skews the results when both Nvidia and AMD cards are tested at default settings but Nvidia provides better IQ. All AMD have to do is revert this lower IQ to a performance option, and then all is fine in GFX land.
 