Exploring ATI Image Quality Optimizations

Lowering IQ to gain performance, to make your new products look better than they are, is wrong, and that's what they've done, simple as that.

I'm curious, has this been implemented in drivers for the HD5 series yet?
 
Yet again people are taking the standpoint that it's night and day.
Reduction is not the right word to use, as it implies the result is much worse than the original; 'lowering the IQ' implies much the same thing. It's neither.
You have to ask yourself: side by side, in a real-world test, does it look any different?
I would say not, and I'm sure so would everyone else. This is based only on screenshots from Guru3D, which show no difference to the naked eye; it makes me wonder why no other site has tested it that way. Answer: it looks the same, and if it looks the same then it's a fair representation of the image. So, and this seems to be the hard bit for some, AMD are not giving the user less IQ, or a reduction in IQ to a level that makes the image look worse than its rival's. It still gives a fair representation of the image, just with 7-10% more kick.
If Nvidia can go and do the same thing then it's fair game, as long as the end user does not see any difference before and after the change and the image is as expected, as was the case with the AMD update.
If, however, as I said above, it could be shown that the image is not as good to the eye, then yes, you would have to discount the 7-10%, but nobody has been able to do that so far.
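
The side-by-side test described above can be made objective rather than eyeballed. Below is a minimal sketch (not taken from any of the reviews mentioned; the filenames are placeholders) of how one might quantify the difference between two screenshots of the same scene, e.g. one captured on the old driver default and one on the new, using Pillow and NumPy:

[CODE]
# Minimal sketch: quantify the difference between two screenshots of the same
# scene, e.g. one taken on the old driver default and one on the new.
# The filenames below are placeholders, not actual review captures.
from PIL import Image, ImageChops
import numpy as np

def compare_screenshots(path_a, path_b):
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")
    if a.size != b.size:
        raise ValueError("screenshots must be the same resolution")

    # Per-pixel absolute difference across the RGB channels
    diff = np.asarray(ImageChops.difference(a, b), dtype=np.float64)
    print("Mean absolute difference: %.3f / 255" % diff.mean())
    print("Max per-pixel difference: %d / 255" % diff.max())
    print("Share of pixels changed:  %.2f%%" % (100 * (diff.sum(axis=2) > 0).mean()))

if __name__ == "__main__":
    compare_screenshots("default_old.png", "default_new.png")
[/CODE]

Numbers near zero would back up the 'looks the same' argument; large, localised differences would support the other side. Of course a pixel metric is not the same as what the eye perceives in motion, which is exactly what is being argued over.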
 
Who defines the standards then?


The standard is what they have been using for years. ;)


One thing I did notice when I upgraded my 4870X2 to a GTX480 was better image quality. It is also something I have seen others mention. I hope for AMD's sake they haven't moved the goal posts in the wrong direction to increase this gap.

Reading this thread and looking at the pictures I think it's fair to say that some games will be affected and others won't. Personally I do not like any decrease in image quality at all. One of the main reasons I choose PC gaming over console gaming is so that I can have the very best image quality possible in games.
 
Don't see what all the fuss is about. There's no such thing as 'standard' or 'default' quality - Nvidia and AMD cards do pretty much everything differently - every step of the pipeline is implemented differently and optimised differently, and the final image output is different. This is just another difference. A manufacturer can call whatever they want the 'default' quality, and if it looks as good as the next manufacturer's 'default' quality then the two can be fairly compared in performance terms. AMD have simply optimised their way of doing something better than before, there is no drop in IQ, so they still call it 'default'. Well done AMD. Nvidia are doing their usual spoilt-child tantrum, putting spin on it and whining about it.
 
I think you're missing the point. The default setting should be an unmolested setting. Other settings, like Performance and High Quality, should have the image adjusted in some way.

It's a bit like buying a hi-fi which has bass boost forced on all the time.
 
Everybody is saying there is no drop in image quality... well, obviously there is, however slight, otherwise this entire issue wouldn't have sprung up in the first place.

In my honest opinion it is not a cheat but an optimization; to my knowledge the real issue lies in the way it has been done. Did AMD tell people that the default settings were changing to improve performance, or was it just done under the table? That is what it seems like when I read threads like this one.
 
No, they're completely open about it. As has been mentioned already, it was in the reviews when the 6xxx series cards came out.

The default setting should be an unmolested setting.

I don't think there is such a thing, just one company's idea of what default is.
 
Going by the logic in this thread then, if the 6970 is a little bit faster than the 580, Nvidia can release new drivers that reduce the IQ, claim back the fastest-GPU crown, and that would be acceptable as long as they're open and up front about it.
 
They don't even need a new driver, just make the default setting the 'Quality' setting instead of 'High Quality'.
 
I don't think there is such a thing, just one company's idea of what default is.


From what I have read there is. Performance settings introduce optimisations which may decrease the IQ but boost the speed. The High Quality setting forces extra eye candy which may affect the overall frame rate but does increase the image quality.
The default setting in the past had no optimisations or forced eye candy.
 
I think you're missing the point. The default setting should be an unmolested setting. Other settings, like Performance and High Quality, should have the image adjusted in some way.

It's a bit like buying a hi-fi which has bass boost forced on all the time.

I don't think I am missing the point. Define 'unmolested'. Do you think there is some holy grail 'default' setting that applies to all graphics cards - that if you pass a scene for rendering to any graphics card set to these 'default' settings, the result should be pixel-to-pixel identical to the result from any other card? This default doesn't exist. AMD/Nvidia, and even different cards from one brand, already do everything differently, and always have done. They can call whatever they want 'default'. There's nothing to say AMD's default has to look anything like Nvidia's default, though if the two actually looked different, reviewers/consumers would rightly be peeved at having to adjust settings to fairly compare the cards. However, that is not the case - despite this optimisation, AMD's choice of default settings looks the same as Nvidia's, so the cards can still be fairly compared at default. That's the whole point.
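
The point about pixel-to-pixel identity is easy to illustrate: even a mathematically equivalent calculation can give slightly different results when two implementations evaluate it in a different order, which is one reason no two GPUs (or driver paths) are expected to produce bit-identical frames. A generic Python sketch, nothing to do with any actual driver code:

[CODE]
# Floating-point addition is not associative: the same sum evaluated in a
# different order can differ in its last bits. The values are arbitrary.
a, b, c = 0.1, 0.2, 0.3

left_to_right = (a + b) + c   # 0.6000000000000001
right_to_left = a + (b + c)   # 0.6

print(left_to_right == right_to_left)   # False
[/CODE]

Scale that up across millions of shader operations per frame and 'identical output' stops being a meaningful baseline; 'looks the same' is the only workable standard.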
 
From what I have read there is. Performance settings introduce optimisations which may decrease the IQ but boost the speed. The High Quality setting forces extra eye candy which may affect the overall frame rate but does increase the image quality.
The default setting in the past had no optimisations or forced eye candy.

What I mean is that one person's default is different to another's. I think that's what Liam is saying too.
 
I think the issue (not really an issue, but not sure what else to call it) is that ATI had one setting as default, but then changed the default to the lower-IQ setting, which gave a performance boost.
 
It should be left to the user to use lower IQ settings in return for extra performance; the only reason AMD has set it as the default is to push FPS up in reviews, which is what all the fuss is about. Nvidia currently don't lower the IQ at all at their default settings in the control panel, so it's not a fair comparison with AMD cards in reviews; that's why you have the likes of Guru3D having a go at AMD.
 
What a lot of fuss about nothing!
I also got into PC gaming because of the graphics advantage over the original PS back in the day. That hasn't changed, and I would be shouting if this had caused a drop in discernible IQ.
Fact is, it hasn't, so I couldn't give two hoots if it was Nvidia/AMD/Intel/Matrox that implemented it; it works and is a good thing IMHO.
If ANYONE can show me that this will affect the discernible IQ during a gaming experience then I will agree; until that time I think AMD have done nothing wrong.
 
The standard is what they have been using for years. ;)


One thing I did notice when I upgraded my 4870X2 to a GTX480 was better image quality. It is also something I have seen others mention. I hope for AMD's sake they haven't moved the goal posts in the wrong direction to increase this gap.

Reading this thread and looking at the pictures I think it's fair to say that some games will be affected and others won't. Personally I do not like any decrease in image quality at all. One of the main reasons I choose PC gaming over console gaming is so that I can have the very best image quality possible in games.

The standard is what they've been using for years... Funnily enough, both Nvidia and ATI have changed their methods of implementing AA and AF; the latter in particular changes with each generation. Obviously you don't know about this, yet you have your opinion based on facts that don't exist.

You say you don't like a decrease in image quality. Oh well, what is the CCC for then? Isn't it the High Quality option that allows people like you, who don't like to compromise on IQ, to have all the eye candy on? It will still be different between AMD and Nvidia, as they have different methods of implementing AF.

And a quick note on HD4870X2 vs GTX480 image quality: don't take it for granted. You either have not calibrated your monitor, have different settings in the control panel, use different settings in game, or are seeing the tiny differences between the two vendors. Or, better yet, it's the placebo effect; more consistent framerates, lack of microstutter, anything along these lines could be taken as better or worse image quality. It's been discussed on these forums several times; the differences are there, but they are subjective.

I think you're missing the point. The default setting should be an unmolested setting. Other settings, like Performance and High Quality, should have the image adjusted in some way.

It's a bit like buying a hi-fi which has bass boost forced on all the time.

And yet ATI have always used Catalyst AI in Standard mode as the default setting, which has its own driver optimizations? :rolleyes: What you're complaining about is that they took it further. I doubt many AMD/ATI owners will complain, as the image quality differences are unnoticeable and the performance boost is definitely there. Whether these settings should be used in reviews is another matter.

Everybody is saying there is no drop in image quality... well, obviously there is, however slight, otherwise this entire issue wouldn't have sprung up in the first place.

In my honest opinion it is not a cheat but an optimization; to my knowledge the real issue lies in the way it has been done. Did AMD tell people that the default settings were changing to improve performance, or was it just done under the table? That is what it seems like when I read threads like this one.

There is a drop in IQ compared to High Quality Catalyst AI settings (I guess it turns off the optimizations).

If you'd read the thread before jumping on the "AMD should not optimize their drivers, Nvidia doesn't" bandwagon, you'd know better and wouldn't spread the rumour that AMD didn't tell people. Lol at this statement. The anisotropic filtering optimization was one of the things promoted with the Radeon 6800s (along with morphological AA).

Why people are allowed to post when they clearly don't even bother to read the thread, I don't even...


I think the issue (not really an issue, but not sure what else to call it) is that ATI had one setting as default, but then changed the default to the lower-IQ setting, which gave a performance boost.

You too are wrong. CCC had different settings before the optimization, so they're not the same.
 