Exploring ATI Image Quality Optimizations

We urge and recommend AMD/ATI to disable the optimization by default in future driver releases and accept the performance loss, as in the end everything is about objectivity, and once you lose consumer trust, which (as little as it is) has been endangered, that in the end is going to do more harm than good. A drop of 3-4 FPS on average is much more acceptable than getting a reputation as a company that compromises on image quality. And sure, it raises other questions: does ATI compromise on other things as well? See, the costs already outweigh the benefits.


Far Cry 2: (comparison screenshots not reproduced)

Dirt2: (comparison screenshots not reproduced)
http://www.guru3d.com/article/exploring-ati-image-quality-optimizations/
 
Personally, if the differences are that hard to spot, it's optimized very well to give smoother gameplay.

It really doesn't bother me personally - nVidia can have the bragging rights for better IQ if they want, that's what they gain from this, but the differences are so small that in gaming situations it's a non-event. nVidia is usually more expensive than AMD, so perhaps you get a minute amount of extra IQ for your cash. ;)

AMD should be able to choose what their defaults are - then users vote with their pockets if they got it right.
 
I'm ok with this kind of thing, it's up front and honest.

It's when changing the exe name gets you better results that things get a bit fishy, I think, and both companies have been guilty of that.
 
Hi all,

I don't normally post in this section, as it's akin to full-scale war if your view is different to others', but after much time reviewing who posts what, i.e. which camp they are in, green or red, it's clear some are trying to start wars / keep a torch alive over a bit of a non-issue.

This is always going to come down to what the end user sees when playing a game. From the Mass Effect shots I cannot see what the difference is, and I would also say that if no one had told you, you would never have known. It's so close that you would just assume the pics aren't 100% the same, which clearly they aren't, as John Shepard is not in the same place.

I think an important part of the article was also not highlighted:

We have a hard time spotting differences as much as you do, and while making the screenshots we increased gamma settings to 50% and applied a resolution of 2560x1600 to try to make it more visible.
Do you spot the difference? Probably not. That is the rule we live by here at Guru3D: if you cannot see it without blowing up the image or altering gamma settings and whatnot, it's not a cheat. And sure, we know... this game title is not a perfect example; it is, however, a good real-world example.

So in short, AMD are doing something that could be seen as reducing image quality, but can you see it without major help? I guess not!
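
If anyone wants to try this at home, here's a rough sketch (purely illustrative, assuming Python with Pillow and NumPy installed, and two same-sized screenshots with made-up file names) that amplifies the per-pixel difference between two shots, which is essentially what Guru3D's gamma trick is doing:

import numpy as np
from PIL import Image

# Hypothetical file names: the same scene captured once on the default
# "Quality" setting and once on "High Quality".
a = np.asarray(Image.open("quality.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("high_quality.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)                    # per-pixel difference
boosted = np.clip(diff * 8, 0, 255)     # exaggerate it, same idea as cranking up the gamma
Image.fromarray(boosted.astype(np.uint8)).save("difference.png")

print("max per-channel difference:", int(diff.max()))
print("mean per-channel difference:", round(float(diff.mean()), 3))

If the boosted difference image still comes out basically black, that tells you how close the two settings really are.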

This is nothing like the nvidia of old, who learnt a major lesson about when something goes too far. Maybe AMD will think 'fair play, we'll change the default back' so we have 100% like-for-like rather than the 99% we have now at default, but I don't see the need to, based on the pics given and the major lengths taken to show the difference.

Jock out
 
A 7-9% performance improvement at 1920x1200 for a small loss in quality that is barely noticeable, if at all, in games? I can live with that :)
 
^^Exactly, seems more like an intelligent use of resources to me. I think it's called an 'improvement'. :)

Back when NVidia used to do it, it was called cheating. Why weren't they getting praise for an intelligent use of resources? It just goes to show how far our standards have dropped.
 
Back when NVidia used to do it, it was called cheating. Why weren't they getting praise for an intelligent use of resources? It just goes to show how far our standards have dropped.

Because it was plainly noticeable.
 
Back when NVidia used to do it, it was called cheating. Why weren't they getting praise for an intelligent use of resources? It just goes to show how far our standards have dropped.

Any chance you can show what Nvidia did alongside this from AMD?
It's intelligent because you can't tell the difference in quality and it increases FPS. People seem to be saying Nvidia's attempt showed a noticeable decrease in IQ.
Are you saying this is untrue and that the two cases are the same?
 
It's not identical though - there is a difference, even if generally slight - and you can never be sure a new game or application won't be badly affected by it.

For objective comparisons it does make things difficult, as it changes the boundaries of the "Quality" setting, which previously meant no selective optimisations.
 
Way back then, anybody and their gran could tell there was clearly something not right about the images being produced by the nvidia cards under certain things, i.e. 3DMark. I don't think in this case anyone, let alone their gran, could see it. This is underlined even more by how much work has gone in to show us the difference.

Now I'm not saying it's 100% identical, as the work on this shows it's not, but by god it's so close you would never know in the real world, which is why Guru3D have said it's not a cheat.

The simple point as I see it: AMD are getting 8-10% more speed this way, but is the image 8-10% worse? Clearly not; if anything it's less than 1%, so it's a good thing for the end user. More performance without the user seeing a loss in quality, i.e. optimised. If we were getting 8-10% more performance with a clear reduction in picture quality (back to everyone and their gran) then it's a cheat, as the end user gets something less than expected.

Also, yes, it's not identical, but even if you use the non-optimised option there would still be a difference in the image, as Nvidia and AMD cards don't work 100% the same; even different generations of the same make don't look the same.

I don't want to come across like some crazed fanboy, but there is a clear difference between this and the last time Nvidia and AMD got spotted with their hands in the cookie jar.

Jock over and out
 
It's not identical though - there is a difference, even if generally slight - and you can never be sure a new game or application won't be badly affected by it.

For objective comparisons it does make things difficult, as it changes the boundaries of the "Quality" setting, which previously meant no selective optimisations.

Sorry, but where has it changed the boundaries at all? Most review sites don't use default settings anyway, and secondly, where does it say anywhere that AMD's default has to match Nvidia's default settings?

Most users don't change settings at all, so default should offer the best possible compromise between performance and IQ; that's completely logical, nothing more or less. If you want high quality, you choose high quality; if you want speed, you choose speed; if you want the best compromise, that's default. Please explain to me where AMD promised their default setting would match Nvidia's, or where AMD lock reviewers into using one setting, or where Nvidia recommend default settings, or where Nvidia don't try to provide the best compromise between performance and IQ.

You can also NEVER be sure that EITHER company's highest quality setting won't misbehave in SOME game and offer the worst IQ of any driver option; to use "OK, it really isn't noticeable, but it MIGHT be in the future" as an excuse is laughable.
 
Quality has always been the setting with predictable output that developers can rely on, which is why it's the default setting. While the effects of this optimisation are for the most part slight and generally hard to notice, it does change the rules, and you can no longer rely 100% on it having the intended effect. These sorts of optimisations should be on the performance preset, not on quality, and definitely not on the high quality preset. This is why Guru3D have made a fuss about it, not because they consider it some kind of "cheat" or anything, as some people seem to be thinking.


EDIT: What I'm saying is, if you want to make a game or application that works correctly with the default (quality) driver settings, you don't want to be wasting time getting AMD to make a compatibility profile, or telling your end users to adjust their settings, because your standard filtering implementation isn't compatible with AMD's (non-standard) optimisation and causes noticeably degraded results - which isn't an impossibility, even if it's unlikely on the balance of probability in this specific case.
 
Whether or not you can notice it is beside the point; it shouldn't be the default setting. NV could just do the same and regain the performance advantage. Where does it end? Keep the quality high as the default, IMO.

In benches, reviewers need to bear this in mind as it is potentially misleading.
 