
Exploring ATI Image Quality Optimizations

I wish someone would explain to me how it can reduce IQ and be unnoticeable at the same time. Very confusing. :confused:

I think the problem is that some see it as a reduction in IQ while others see no difference, or only a very, very small one, along with better fps, and you've got two companies and two groups of people fighting over it.
 
I think the problem is that some see it as a reduction in IQ while others see no difference, or only a very, very small one, along with better fps, and you've got two companies and two groups of people fighting over it.

Well, from the videos I've seen you can see the difference given the right circumstances, so this isn't something nobody is going to notice. Of course some people won't notice it, but that was also the case in the past when IQ was lowered to increase performance; just because many people don't notice it doesn't make it right.

I'm using an HD5870 so I can't comment directly, as for some strange reason these optimizations are only for the HD6 series.

So one has to ask, is there something wrong with the HD6 series?
 
As it's been said before, reviewers don't use default settings for benchmarks; well, the good ones don't.

So how does this change anything? It certainly doesn't invalidate the benchmarks.
 
As it's been said before, reviewers don't use default settings for benchmarks; well, the good ones don't.

So how does this change anything? It certainly doesn't invalidate the benchmarks.

Actually most do use default filtering options; the only settings they usually change are AA/AF.

FYI the HD5 series defaults to the highest settings anyway; Nvidia defaults to Quality, which is, or at least was, comparable.
 
With 2 million pixels (1920x1080) being rendered multiple times per second it will be difficult to spot optimisation degradation within a few, a few hundred, or perhaps even a few thousand of those pixels. However, to play on a level playing field both Nvidia and ATI should abide by the same default image quality, without trickery of any kind.
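As a rough, purely illustrative back-of-the-envelope check (the affected-pixel counts below are assumptions, not measurements from any driver or game), a few thousand subtly altered pixels are a tiny share of a 1080p frame:

```python
# Illustrative arithmetic only: the "affected pixel" counts below are
# assumptions, not measurements from any driver or game.
frame_pixels = 1920 * 1080  # ~2.07 million pixels in one 1080p frame

for affected in (10, 100, 1_000, 10_000):
    share = affected / frame_pixels * 100
    print(f"{affected:>6} affected pixels = {share:.3f}% of a single frame")
```

Even 10,000 altered pixels come to under 0.5% of a single frame, which is why a filtering optimisation can be hard to spot by eye.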

It is good that ATI provides solid performance gains for almost no visible loss, but it is wrong to perform this as a default action. If Nvidia counters with their own default optimisations it could start a downward trend, with both sides progressively sacrificing IQ, little by little, in an attempt to outdo the other.

ATI are wrong to implement this as default behaviour and Nvidia would be wrong to copy them.
 
If this was the only setting you could have, and no one had known about it, maybe this would be a big issue, but neither of those is true. I think this is a storm in a teacup and not something the majority of users are going to care about or spend any time worrying about. Are there any 6xxx series owners on here that are worried about this? Have you noticed any difference in real-world game playing or anything else? I would like to hear from the people that actually have the cards rather than a bunch of people that don't but are telling everyone how bad it is and how much of an issue it is.
 
If this was the only setting you could have, and no one had known about it, maybe this would be a big issue, but neither of those is true. I think this is a storm in a teacup and not something the majority of users are going to care about or spend any time worrying about. Are there any 6xxx series owners on here that are worried about this? Have you noticed any difference in real-world game playing or anything else? I would like to hear from the people that actually have the cards rather than a bunch of people that don't but are telling everyone how bad it is and how much of an issue it is.

But you have an HD5870; they are deliberately making the HD6870 look closer to the HD5870 in benchmarks than it really is, or are holding back a 7-10% performance increase on the HD5870 to make the HD6870 look better (rough numbers are sketched below).

Either way you look at it, doesn’t that bother you even a little?
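As a hypothetical illustration only (the 60 fps baseline is made up; only the 7-10% figure comes from the post above), here is roughly what holding that gain back looks like:

```python
# Hypothetical illustration of how a driver optimisation applied to only one
# card can skew a comparison. The 60 fps baseline is an assumption; only the
# 7-10% gain figure comes from the post above.
hd5870_fps = 60.0  # assumed HD5870 score at its current (full-quality) setting

for gain in (0.07, 0.10):
    # If the same optimisation were enabled on the HD5870 it would score higher,
    # so withholding it makes any HD6870 result look relatively better.
    optimised = hd5870_fps * (1 + gain)
    print(f"{gain:.0%} gain held back: HD5870 would score {optimised:.1f} fps "
          f"instead of {hd5870_fps:.1f} fps")
```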
 
If this was the only setting you could have, and no one had known about it, maybe this would be a big issue, but neither of those is true. I think this is a storm in a teacup and not something the majority of users are going to care about or spend any time worrying about. Are there any 6xxx series owners on here that are worried about this? Have you noticed any difference in real-world game playing or anything else? I would like to hear from the people that actually have the cards rather than a bunch of people that don't but are telling everyone how bad it is and how much of an issue it is.

Well, being a 6870 owner I can't say I notice any difference between it and a card I don't own.

If Nvidia can lower their IQ without much noticeable difference and gain a performance increase then they should do it as well. If they end up making Crysis Warhead look like Doom 1 then they will suffer the consequences just as AMD would.
 
The HD5 series defaults to the highest settings anyway; Nvidia defaults to Quality, which is, or at least was, comparable.
Only the Mipmap detail level defaults to High Quality for the 5000 series and CCC 10.11. The Catalyst A.I. Texture Filtering quality and Enable Surface Format Optimization settings are not even in CCC 10.11; these features are only in the 10.10e unsupported driver and control panel.
 
The problem is reviews showing the AMD cards performing quicker with lower IQ. This skews the results when Nvidia cards are tested at default settings, as the AMD cards are, but Nvidia provides better IQ. All AMD have to do is move this lower IQ back to being a performance option, and then all is fine in GFX land.
 
Only the Mipmap detail level defaults to High Quality for the 5000 series and CCC 10.11. The Catalyst A.I. Texture Filtering quality and Enable Surface Format Optimization settings are not even in CCC 10.11; these features are only in the 10.10e unsupported driver and control panel.

Both available options default to high quality and have done since launch; I've never needed to change them.

"Enable Surface Format Optimization" is basically Catalyst A.I. set to Advanced; it usually defaults to Standard and, as you said, is only in 10.10e.

What setting do you think "good" reviewers are changing to improve filtering quality? Excluding AA/AF etc., please be specific, because I really don't know what you're talking about.


Right, so you don't care that they are holding back a performance improvement of around 7-10% to make another product look good in benchmarks... Great.

I won't reply to you again as there is clearly no point.
 
Ebil, why would I care? lol. All the games I play run as smooth as can be and I have no problems. What exactly am I supposed to be getting upset about, the fact AMD made optimisations to a new series of cards? I would be more upset if they didn't. I honestly couldn't give a monkey's about synthetic benchmarks; I go by my real-world game playing, and right now I play everything with no problems on max settings. This seems to me to be something people are throwing at AMD. I mean seriously, is this the best thing to throw at them? Isn't there some info on Bulldozer people could have a go at AMD for in the CPU section, or a bad chipset they could go on about in the mobo section?

Plus, not being funny, but if this is something so bad affecting the users of AMD gfx cards, why aren't Nvidia owners just happy it isn't them? Why are so many of you making such a fuss about this? We have heard from an actual owner of a 6xxx series card that they are not bothered by this, and that is someone directly affected, so if they are happy then, to be honest, those of us who don't have 6xxx series cards should just shut up and leave it to those who do.
 
I wish someone would explain to me how it can reduce IQ and be unnoticeable at the same time. Very confusing. :confused:

Because the graphics vendors are supposed to be selling us hardware that is superior to the previous generation, not cutting corners in terms of the workload rendered in order to artificially inflate performance.

The accuracy and quality of rendering are supposed to get better with each generation, not worse, whether it is easily detected or not.
 
Because the graphics vendors are supposed to be selling us hardware that is superior to the previous generation, not cutting corners in terms of the workload rendered in order to artificially inflate performance.

Hang on, optimizing how efficiently their product works is a bad thing? They should just throw more brute force at the issue with every generation?

Do you feel outraged by using a 3MB MP3 file instead of a 200MB WAV?


The only sensible issue is comparable benchmarks: if a reviewer wants to compare the mid-range cards of the new generation to the high-end cards of the previous generation then they need to ensure the correct settings.
It's the reviewer who must justify their techniques and processes to form a report which is knowledgeable and relevant.
Read the performance reviews and read the IQ reviews, then come up with your own conclusion on how to spend your money.
 
It's not about the owners, it's about dodgy benchmark results from review sites.

To be honest most results from most websites are dodgy half the time anyway, but I cannot say I agree with what AMD did, as I don't want it to become tit for tat; that will just get messy and not help anyone.
 
I don't think there is any need to worry about a tit for tat. I really don't see this as the issue some do, and at the end of the day anyone who spends their money without doing proper research asks for what they get anyway. Nowhere have I seen AMD trying to hide this, or the fact that what is now called the 6870 is not superior to the 5870. Although I will admit to being puzzled as to why they have changed their naming scheme, it's their company and they can do what they want, just as we can with our money.
 
It's not about the owners, it's about dodgy benchmark results from review sites.
This is where I could not disagree with you more. To me, and apparently to ATI/AMD, it is all about the owners.

How many end users ever open the control panel, or have any real knowledge of what the settings there do? The vast majority don't, and that is why making this optimization the default will end up benefiting the gaming experience of 95+% of users. From the general user's perspective it could be seen as a disservice not to turn this on.

There is a mindset that it's somehow unfair because Nvidia's default settings are different. The only people this affects are those to whom image quality is of utmost importance, review sites, and tweakers who like to micromanage for themselves. This group is infinitely more likely to actually be aware that the settings even exist, let alone know how to bump one slider to the right one notch to get things "like before."

So long as there remains an option to bump quality settings back up, demanding companies eschew their own enhancements to copy rivals' settings (which are hardly apples-to-apples regardless) is unreasonable.

It's the reviewer who must justify their techniques and processes to form a report which is knowledgeable and relevant.
Read the performance reviews and read the IQ reviews, then come up with your own conclusion on how to spend your money.
THIS. The onus should be on review sites to examine the technology and settings before critical analysis. If card makers want to provide information that makes this easier, good on them. Transparency benefits everyone.

But my main point, again: default settings should provide the best balance of performance and quality with your average user in mind. Tell the review sites you've changed your default optimizations and let them sort it out (in this case, bump one slider to the right).
 
I really don't think someone who has no real knowledge of the control panel is even going to notice an 8% difference in performance. It's much more important that default settings follow some kind of standard so things just work as intended, as much as possible, under as wide a variety of situations as possible.
 