Exploring ATI Image Quality Optimizations

Both available options have defaulted to high quality since launch; I've never needed to change them.

Factory default settings in CCC have always been Balanced and High Quality for me; I'll have to check 10.10e. There are only two settings with a High Quality option for me. Under "Standard Settings" the choices are Optimal Performance, High Performance, Balanced, High Quality, Optimal Quality and Custom Settings. Under "Mipmap Detail Level" they are High Performance, Performance, Quality and High Quality.
 
A 7-10% boost in 6850 performance is enough to take it above GTX 460 performance in quite a few benchmark tests. I applaud ATI for making these optimisations available, but they should not be applied by default. There is no point using benchmarks to determine relative performance unless all of the products tested comply with the same rules. In the past Futuremark banned drivers that made such hidden optimisations.
 

That setting affects a range of options: Balanced is basically no AA/AF, and increasing to High or Optimal Quality simply enables varying levels of AA/AF. The basic texture filtering quality and mipmap level, which is what we are talking about here, stays exactly the same throughout. It's basically an IQ option for dummies.

Of course if you lower it to anything below Balanced then the quality will suffer; I think you've got confused somewhere.
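For anyone unsure what the texture filtering/mipmap quality being discussed actually controls, here's a minimal OpenGL sketch of the kind of trade-off involved. This is an application-side illustration, not what the driver literally does internally; "tex" is a hypothetical texture id and a valid GL context is assumed.

    /* Sketch: the filtering quality a driver's mipmap setting trades.
     * "tex" is a placeholder texture id; a GL context is assumed. */
    #include <GL/gl.h>

    void set_mipmap_quality(GLuint tex, int high_quality)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        if (high_quality) {
            /* Full trilinear: blend between the two nearest mip
             * levels, hiding the transitions between them. */
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                            GL_LINEAR_MIPMAP_LINEAR);
        } else {
            /* Bilinear within a single mip level: cheaper, but the
             * visible banding at mip boundaries is part of what shows
             * up as shimmering in motion. */
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                            GL_LINEAR_MIPMAP_NEAREST);
        }
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    }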
 
Looks like ATI can fix the shimmering in drivers by setting the LOD bias to the same value as Nvidia's, which makes the textures blurrier.

http://www.tomshardware.co.uk/geforce-gtx-570-gf110-performance,review-32062-5.html
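In OpenGL terms, what's being described is effectively a shift in the texture LOD bias. A rough sketch, assuming OpenGL 1.4+ headers (for GL_TEXTURE_LOD_BIAS) and a hypothetical texture id "tex":

    /* Positive LOD bias makes the sampler drop to smaller (blurrier)
     * mip levels sooner; negative bias does the opposite, sharpening
     * the still image at the cost of aliasing/shimmer in motion. */
    #include <GL/gl.h>

    void set_lod_bias(GLuint tex, float bias)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        /* bias > 0.0f -> blurrier; bias < 0.0f -> sharper but shimmery */
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, bias);
    }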

If you don't believe Nvidia's textures are blurrier, just look at the following screenshot: the text on the banner is noticeably worse on Nvidia cards.

http://img16.imageshack.us/img16/1054/atiy0.jpg

And also, from Nvidia's own driver notes, where they added a clamp for LOD bias:

This control lets the user manually set negative LOD bias to "clamp" for applications that automatically enable anisotropic filtering. Applications sometimes use negative LOD bias to sharpen texture filtering. This sharpens the stationary image but introduces aliasing when the scene is in motion.

Because anisotropic filtering provides texture sharpening without unwanted aliasing, it is desirable to clamp LOD bias when anisotropic filtering is enabled. When the user enables anisotropic filtering through the control panel, the control is automatically set to "Clamp".
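In code terms, the clamp they describe amounts to something like the following sketch. This is my own illustration of the documented behaviour, not Nvidia's actual driver code; "tex", "app_bias" and "anisotropy" are placeholders, and the GL_EXT_texture_filter_anisotropic extension is assumed.

    #include <GL/gl.h>
    #include <GL/glext.h>

    void apply_lod_bias_with_clamp(GLuint tex, float app_bias,
                                   float anisotropy)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                        anisotropy);

        float bias = app_bias;
        if (anisotropy > 1.0f && bias < 0.0f)
            bias = 0.0f;  /* the "clamp": AF already sharpens, so a
                           * negative bias would only reintroduce
                           * aliasing */

        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, bias);
    }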
 
Image quality really has very little to do with the brand of GPU, surely. ATI or Nvidia, there may be a fraction of a difference. Personally I have had both and they have looked the same on my monitors, as far as I can tell.

I would say it's more down to the cabling solution you have, and also the monitor/TV the user has.
 

What are you talking about? :confused:
 