
Exploring ATI Image Quality Optimizations

Whether or not you can notice it is beside the point; it shouldn't be the default setting. NV could just do the same and regain the performance advantage. Where does it end? Keep the quality high as the default, IMO.

In benches, reviewers need to bear this in mind as it is potentially misleading.

It is the point though.

They made it so you CAN'T see any difference and gained in performance.

Errm... win-win?
 
I would say the fact that you cannot notice it is the point. If Nvidia can do the same thing without the user noticing the change, it's fair, as the user sees the image as expected.

We are talking about a change so small that nobody would see it without expert help and modifications being made to the image to show it.
 
You're acting like it's night and day, which it's not. It's like saying NV is 9.00am and AMD is 8.59am; to the normal eye it's going to look the same outside (snow).

I will review the info Raven has put out when I get home from work, but I would guess, as in the case with Guru3D, an image had to be made to show the effect, i.e. without expert help, in the real world (the one where we play games etc.) you would never see it.
 
Meh, really no difference to me, fella; I'd rather have the 8% more performance, lol.

Most people would take the extra performance for a difference that small, but it should have its own setting. The problem is having non-standard optimisations on what's supposed to be a standard setting. As per the nVidia blog, how far do you take it? It makes a nightmare for developers if the AMD and nVidia default standard "quality" setting is a hotchpotch of non-standard optimisations for the sake of benchmark scores.
 

As long as it doesn't have a visible impact on the IQ. Sure, this can be subjective, but I doubt anyone would be happy with either AMD or Nvidia reducing default image quality by a margin that is visible in real-world gaming.

For those wanting the best IQ there are several options in the CCC.

Think about it this way: knowing about all the drawbacks of this setting in CCC, but also noting that it improves the framerate by 8% or so, how many people do you think would not use this setting if it wasn't enabled by default? I think only those who own multi-GPU setups and have enough power to get decent performance with whatever settings they want, and those who don't know about CCC ;)
 
Right o,

Just got through the info posted by Raven, and well, it's much as I thought. Please remember that the Google translation is broken English, so you could read some of the info in many ways; looking at the pictures is the best thing to do.

The images you're looking at are extreme images made to show the difference (not sure the last time a game looked like that, but yes, there is a difference); however, yet again this is not a real-world test, so it does not really put card (a) up against card (b) in the way we actually use them.

We all agree it's not 100% the same, but I would love to see a true test of game (x) running side by side on card (a) and card (b), using the different modes highlighted in that article.

The reason, I guess, it's not been done that way is that running a game at normal viewing distance on card (a) vs card (b) shows no difference to the eye, bringing us right back to the start: had some expert not gone to great lengths to show it, would you have ever seen it? Yet again, I guess not.
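To be clear about what that "expert help" involves, here is a rough Python sketch (not from the article; it assumes the Pillow library and the file names are invented) of the sort of processing used to make these comparison shots: diff two screenshots and boost the result until the difference shows up at all.

# Rough sketch, assuming Pillow is installed; screenshot file names are made up.
from PIL import Image, ImageChops, ImageEnhance

card_a = Image.open("card_a_frame.png").convert("RGB")
card_b = Image.open("card_b_frame.png").convert("RGB")

# Per-pixel absolute difference; on near-identical frames this is almost black.
diff = ImageChops.difference(card_a, card_b)

# Boost the brightness heavily so the tiny difference becomes visible at all,
# much like the zoomed-in, gamma-boosted images in the article.
ImageEnhance.Brightness(diff).enhance(10.0).save("amplified_difference.png")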

People will vote with their feet, as with anything, and as free thinkers that's your right, but I feel this needs to be put in the correct context. People saying "it's misleading" are in fact misleading people; it's not night and day, it's not 3DMark all over again. It's a very small difference between two different companies' cards/drivers, almost to the point that you don't need to know about it.

Guess my view is not going to be welcomed by some, but people need to take a step back sometimes; there's no point being zoomed in at 200% checking a blade of grass when you will miss all the glory around it.

Jock out
 
I think many people are missing the point: basically, remove 7-9% (at 1920x1200) from all the benchmark results and you have the cards' actual performance.

Which means the HD6850 is slower than the GTX460 1GB, and the HD6870 falls even further behind the GTX470.

I'm using an HD5870 and don't really like the inflated numbers, as basically all the early benchmarks against their previous products are also BS.
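Just to make the arithmetic concrete, a trivial sketch of the adjustment being suggested (the FPS figure is invented, not a real benchmark result):

# Knock the claimed 7-9% gain off a reported result; 8% is used as a middle value.
def adjusted_fps(reported_fps, optimisation_share=0.08):
    return reported_fps * (1.0 - optimisation_share)

hd6870_reported = 60.0  # hypothetical 1920x1200 benchmark figure
print(adjusted_fps(hd6870_reported))  # 55.2 - the "actual performance" on this argument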
 

A very good point!
 
It is a very good point you make, and yes, you should, if this was a cheat, which it's not!

If it can be shown that the image in a real-world test, i.e. a game or benchmark, hell even Word, is different to the point that it does not look as good as the rival's image, taking into account normal viewing distance (i.e. not zoomed in at some crazy %, with gamma cranked to Hulk levels to show the difference), then yes, take off 7-10%. If you cannot do that, then the score stands.

Yet again it sounds like people are trying to say it's night and day, which it's not; it's a very small change that the user will never see but gains 7-10% performance from. In very short terms: "win, win".
 

Whilst I agree with the point about people not noticing, and I for one would not, what they are saying is that if Nvidia reduced it as standard in their drivers (levelling the playing field, as such), then the benchmark results would be different.

Disclaimer:
I do not know if this was even implemented in the drivers used for the benchmarks.
 
A funny thing about this "cheat" is that it was never hidden in any way by AMD.

http://www.guru3d.com/article/radeon-hd-6850-6870-review/8

The first review to appear before the NDA was lifted tells you about this. There are even slides from AMD showing the difference in AF between the 5800 and 6800 series.

This is the new way of implementing anisotropic filtering in games and isn't an option that existed earlier as a performance setting. AMD didn't just move the bar to the Performance setting; they actually implemented an optimisation that is meant to lower IQ insignificantly in order to boost performance by a fair margin.
 
It's not a cheat, it's a cheeky way to inflate the scores compared to Nvidia cards.

I have to agree with this. However you look at it, it's unfair to compare two graphics cards at different settings. I can see why AMD implements this as a default setting though.

Any reduction from the standard IQ should not become the default setting.

Who defines the standards then?
 