Exploring ATI Image Quality Optimizations

Let's look at one example: Crysis vs Crysis: Warhead. The image quality was slightly worse in the sequel, even though both looked fabulous, and the latter ran significantly smoother. Image quality should not always be the priority.

IQ is always a priority for me. I spend a lot of time and cash trying to get the best IQ from my games. If it weren't, I would be running a single-card setup with a mid-range card.

If you're such an AV enthusiast, tell me exactly what the differences between the two images are. Sharpness, colours, contrast? Better AA, AF, mipmapping?
As I said in my previous post.

Reading this thread and looking at the pictures, I think it's fair to say that some games will be affected and others won't. Personally I do not like any decrease in image quality at all. One of the main reasons I choose PC gaming over console gaming is so that I can have the very best image quality possible in games.

You are, you aren't, what does it matter? You're discussing the subject without any first-hand experience. I never implied that you are or were not an owner of an ATI/AMD graphics card.
It matters as I own a few ATI cards and will own many over the years to come. It matters as I am a customer and have an opinion just the same as you do.

As you are also discussing this topic, can you expand on your first-hand experience?


What strikes me is that you complain about IQ based on the videos that some German website posted to prove that the IQ is indeed worse. If you had even bothered to read what is on that website, or Guru3D for that matter, you'd know that you have to look very carefully for those flickering textures and anisotropic filtering distortions, as they're not fully noticeable in a still image.

Your main point is that graphics card manufacturers should not optimize their drivers any further because there is a supposed "standard" for IQ.
Where did I mention videos or anything like that? How do you know what I have read?

Seriously, Krugga, you need to take a chill pill and stop being so defensive of ATI/AMD.

What I have said, and will repeat once again for you, is that I do not like any form of IQ reduction being set as the default. I hope this is now clearer to you.

You think it's a great idea to lower the IQ by default and I think it's a bad idea. I personally think that by doing this AMD/ATI have opened themselves up to being perceived as not providing as much quality. IMO, a better move would have been for AMD/ATI to keep the default IQ and then release a new mode where you get 99% of the IQ for an 8% boost in frames. This, to me personally, would have been a better move. However, I suspect AMD/ATI didn't do this so that when a card gets reviewed at stock default settings it gets more frames on the chart. Call it a hunch, but you've got to admit it's a strategic move in the GPU war. Could it potentially harm them in the long run? Time will tell. ;)
 
You implied that AMD have done it "under the table", which is not true at all. Claiming that you've read the thread doesn't help either.

Please do tell me where I suggested that you had mentioned Nvidia.

Well, your earlier post was kind of implying it.

If you'd read the thread before jumping on the "AMD should not optimize their drivers, Nvidia doesn't" bandwagon, you'd know better than to spread the rumour that AMD didn't tell people.


Did AMD tell people that the default settings were changing to improve performance, or was it just done under the table.

Well, that in my book is a question... oh sorry, I forgot the ?

I suppose we have both read this thread and interpreted various posts the way we wanted to.
 
What will be very interesting about this whole affair is to see just how close the 6970 and the GTX 580 come out in the upcoming benchmarks. If the difference is within this optimization's margin of improvement, then maybe that is why AMD decided to go down this route.
 
Can people really not see the issue here? If the default driver settings become a mess of non-standard optimisations, regardless of how small the IQ difference is, things are going to get very messy for developers and end users alike.

Please, elaborate. I don't think I'd considered developers regarding this subject.

IQ is always a priority for me. I spend a lot of time and cash trying to get the best IQ from my games. If it weren't, I would be running a single-card setup with a mid-range card.

Good for you :)


As I said in my previous post.

It matters as I own a few ATI cards and will own many over the years to come. It matters as I am a customer and have an opinion just the same as you do.

As an aware customer, could you please tell me which drivers you are using with your ATI cards atm? I assume you must have a general idea of what has changed recently? Everyone's allowed to have an opinion; anyone can also post rubbish without either first-hand experience or background knowledge. Don't take it personally, you might not be one of those people for all I care.

Where did I mention videos or anything like that? How do you know what I have read?

OK, kind sir, could you please tell me what sources you are using to make your assumptions about reduced IQ in some games, as you claim in this thread?

Seriously, Krugga, you need to take a chill pill and stop being so defensive of ATI/AMD.

You need to stop posting if your arguments are rubbish. I'm not going back to this thread because it is getting too messy and brainless.

I'd love to see where I'm defensive of AMD at all. There's no way you can answer this question though, is there? :rolleyes:
What I have said, and will repeat once again for you, is that I do not like any form of IQ reduction being set as the default. I hope this is now clearer to you.

No, it's not clear at all. Neither CCC nor Nvidia Control Panel has high quality options enabled by default.

You think it's a great idea to lower the IQ by default and I think it's a bad idea.

Hang on a second, where have I said that it's a great idea? I don't personally see anything bad about driver optimization, although I haven't said it should be the default setting in CCC. This method of applying anisotropic filtering is great, as it does not reduce IQ significantly (or even noticeably), but it is not fair for comparisons with other cards that do not have this optimization implemented. On the other hand, for the end user, how does it matter whether it's fair or not? It does matter in the game, though.
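As an aside, for anyone wondering where this kind of optimisation actually sits: the game only ever asks for anisotropic filtering, and the driver decides how faithfully to honour that request. That gap is where default-quality tweaks like this one live. Below is a minimal illustrative sketch (not from anyone in this thread, and nothing to do with AMD's specific implementation) of how an application typically makes that request in OpenGL via the EXT_texture_filter_anisotropic extension; it assumes a GL context is already current and the extension is supported.

```c
/* Illustrative sketch only: how an application typically asks for
 * anisotropic filtering in OpenGL. Assumes a current GL context and
 * support for EXT_texture_filter_anisotropic. The driver is free to
 * approximate the request, which is where "default quality"
 * optimisations happen. */
#include <GL/gl.h>
#include <GL/glext.h>

static void request_max_anisotropy(GLuint texture)
{
    GLfloat max_aniso = 1.0f;

    /* Ask the driver for the highest anisotropy level it exposes
     * (commonly 16.0). */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);

    glBindTexture(GL_TEXTURE_2D, texture);

    /* Standard trilinear filtering plus the anisotropy request. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}
```

Whether the samples actually taken for that request match what another vendor takes is exactly what the reviewers are arguing about.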

I personally think that by doing this AMD/ATI have opened themselves up to being perceived as not providing as much quality. IMO, a better move would have been for AMD/ATI to keep the default IQ and then release a new mode where you get 99% of the IQ for an 8% boost in frames. This, to me personally, would have been a better move. However, I suspect AMD/ATI didn't do this so that when a card gets reviewed at stock default settings it gets more frames on the chart. Call it a hunch, but you've got to admit it's a clever move in the GPU war. ;)

I respect your personal opinion mate.


EDIT: I hadn't noticed that you had edited your post since:

As you are also discussing this topic, can you expand on your first-hand experience?

I haven't experienced any loss in image quality. I used to get texture flickering issues a long time ago, but that hasn't been the case for months now.

http://forums.overclockers.co.uk/showthread.php?p=17924626#post17924626

That was regarding your alleged experience of better IQ with the GTX 480. I'd love to know what the differences are.
 
Well, your earlier post was kind of implying it.

Unintentionally.


Well, that in my book is a question... oh sorry, I forgot the ?

I suppose we have both read this thread and interpreted various posts the way we wanted to.

Well, if you'd bothered to read your own post in full, I'm sure you'd reach the right conclusions ;)

Did AMD tell people that the default settings were changing to improve performance, or was it just done under the table. As that is what it seems like when I read threads like this one.


Seriously guys, I'm out of this thread. It doesn't lead anywhere, and I'm sure everyone feels a little disappointed with AMD, as the new AF implementation will cause debates like this over future reviews. We can only hope for less and less IQ loss along with significant performance improvements from both camps.
 
As an aware customer, could you please tell me which drivers you are using with your ATI cards atm? I assume you must have a general idea of what has changed recently? Everyone's allowed to have an opinion; anyone can also post rubbish without either first-hand experience or background knowledge. Don't take it personally, you might not be one of those people for all I care.

10.11



You need to stop posting if your arguments are rubbish. I'm not going back to this thread because it is getting too messy and brainless.

I'd love to see where I'm defensive of AMD at all. There's no way you can answer this question though, is there? :rolleyes:

There are certain fans on this forum and, I am sorry to say, Krugga, you do stand out as one of them. I would have no other reason to mention your defence of AMD/ATI. If I am wrong I apologize, but that is the way you do come across at times. :)


Hang on a second, where have I said that it's a great idea? I don't personally see anything bad about driver optimization, although I haven't said it should be the default setting in CCC. This method of applying anisotropic filtering is great, as it does not reduce IQ significantly (or even noticeably), but it is not fair for comparisons with other cards that do not have this optimization implemented. On the other hand, for the end user, how does it matter whether it's fair or not? It does matter in the game, though.

So what are you arguing about then?



I respect your personal opinion mate.


Cheers. ;) Seriously, there was just no need for all those posts over a minor difference of opinion and disbelief that I noticed an IQ improvement moving from a 4870X2 to a GTX 480. All in all, a total waste of time really, for both of us. ;)
 
Sorry, but obviously it does not look the same, or there would not be such a big fuss being made over it by more than four tech sites now.

So you're saying that if you were shown 100 random screenshots, some taken on Q and the rest on HQ, without actually being told which is which or how many of each there are, you would correctly guess all of them, since the image quality difference is so obvious, right?

Want to bet a grand on that? You can even have a 10% miss rate.
 
I wish someone would explain to me how it can reduce IQ and be unnoticeable at the same time. Very confusing. :confused:
 

You can't know what the differences are then?
There are certain fans on this forum and, I am sorry to say, Krugga, you do stand out as one of them. I would have no other reason to mention your defence of AMD/ATI. If I am wrong I apologize, but that is the way you do come across at times. :)

Whatever you say; I'd never have thought I come across as a fanboy. I don't care either, I think there are enough of my posts around to prove you wrong.

So what are you arguing about then?

About the misinformation spread throughout this thread by misinformed people? Those who think the issue lies somewhere totally different from where it really is?

Cheers. ;) Seriously, there was just no need for all those posts over a minor difference of opinion and disbelief that I noticed an IQ improvement moving from a 4870X2 to a GTX 480. All in all, a total waste of time really, for both of us. ;)

Yes, it was a waste of time. I'm still waiting for you, an AV enthusiast, to tell me what the differences in IQ between the HD 5870 and GTX 480 are. I stated my opinion on the subject, you said I'm wrong, and there is still no description of your first-hand experience with the IQ differences between the two cards. Not that it matters in this topic, but since you mentioned it...
 
About the misinformation spread throughout this thread by misinformed people? Those who think the issue lies somewhere totally different from where it really is?

You're arguing because I said, and I quote, I do not like any form of IQ reduction being set as the default. :confused:

How is this misinformation? It's merely my feeling on this topic. Judging by your responses, you either have me confused with someone else or my point of view upsets you in some way. :confused:


Yes, it was a waste of time. I'm still waiting for you, an AV enthusiast, to tell me what the differences in IQ between the HD 5870 and GTX 480 are. I stated my opinion on the subject, you said I'm wrong, and there is still no description of your first-hand experience with the IQ differences between the two cards. Not that it matters in this topic, but since you mentioned it...

OK. The difference for me was a sharper image and more pop. The colour chart was also smack on once I deselected RGB in the control panel. My 4870X2 had to be adjusted to get the colour chart smack on.
 
It's probably worth pointing out, in his defence, that Razor is particularly anal about IQ, so anything like this is going to get his back up! :p :D

Still, I didn't realise that any of this stuff was affecting 4xxx series cards?
 
Can people really not see the issue here? If the default driver settings become a mess of non-standard optimisations, regardless of how small the IQ difference is, things are going to get very messy for developers and end users alike.

Honestly, no, I can't see any issue at all unless I can see it in games.
I also think it is VERY strange that the Nvidia owners/supporters seem to have more of an issue with this than the actual 6XXX OWNERS!
If you have an Nvidia card, what do you care if AMD do this? Nvidia would never sink so low as to follow suit after making such a fuss, so why worry?
Don't get me wrong, it is something I will think about when I come to buy a new card, but it certainly wouldn't be the deciding factor.
Again, if I can't see it in game I couldn't give a **** and why on earth would I?!?!?!
 
The problem is that even high quality mode on the 6xxx series is considerably worse than that of the 5xxx. It's rather suspicious that they didn't also degrade 5xxx IQ if this is their new "standard", is it not?

So either the 6xxx has less accurate texture filtering at the hardware level, or AMD are intentionally holding back the 5xxx to make the 6xxx appear faster than it is; either way they are pulling a fast one.

Back in the Quack days I'm sure there were lots of people who didn't notice the degradation either; that doesn't make it right, and we shouldn't be encouraging it, otherwise, as Nvidia stated, it will quickly turn into a race to the gutter.

Yep, it's pretty sad AMD felt the need to do this. Not only are they making the HD6 cards appear faster compared to the GTX 460 but also compared to their own previous products, which makes no sense: HD5 cards get MLAA but not this? In a way I'm glad they didn't, but you have to wonder, why only for the HD6 series?

It appears they are deliberately trying to trick the consumer into thinking the HD6 series is faster than it is. Rebranding I've never had a problem with, but changing IQ settings to manipulate benchmarks for specific cards is highly suspicious.
 
If I had an ATI card and I could see the difference then maybe I would worry, but I do think it's a mess, and I hope either camp thinks about the fallout before doing something like this again. I do think a lot of it is down to politics from review sites and some users.

I am not sure what either camp does in driver optimizations from game to game and driver release to driver release, so I am not sure what to make of it really. Rroff is right that it could be a slippery slope if it were the start of optimization, but both camps have done it in the past as well; the last thing we want is either or both camps playing silly buggers with IQ.

Edit: At times I feel I am back in the playground, and that is a looooong time ago. And I'm not on about the forums, I mean AMD and Nvidia and the review sites who stir up all this crap at times. And yes, I get pulled into it myself at times as well, as I am passionate about PCs.
 
Rroff is right that it could be a slippery slope if it were the start of optimization, but both camps have done it in the past as well; the last thing we want is either or both camps playing silly buggers with IQ.

I'm just wondering how long until the default quality settings on Nvidia cards have similar optimisations to claw back the advantage, and around it will go again.
 
Honestly, no, I can't see any issue at all unless I can see it in games.
I also think it is VERY strange that the Nvidia owners/supporters seem to have more of an issue with this than the actual 6XXX OWNERS!
If you have an Nvidia card, what do you care if AMD do this? Nvidia would never sink so low as to follow suit after making such a fuss, so why worry?
Don't get me wrong, it is something I will think about when I come to buy a new card, but it certainly wouldn't be the deciding factor.
Again, if I can't see it in game I couldn't give a **** and why on earth would I?!?!?!

Just a thought
But do Nvidia not care??

All the competing cards are tested and scored on various review sites.
If AMD are pulling a flanker (albeit a transparent one) and artificially boosting their performance relative to the competition, is this not going to force Nvidia to reduce prices in order to compete with products that will more than likely undercut the Nvidia top-end cards?

It strikes me that AMD want to have their cake and eat it too.

I also believe that this is a sign that AMD have been caught with their pants down. :)

Time will tell
 