Exploring ATI Image Quality Optimizations

Thanks EL JOCK, just what I was trying to say but couldn't articulate.

As I said at the very start, I don't post in here often; in fact, as you can see, I don't post that much at all. But by God, some people need to wake up. I'm just glad someone understands my point and that it's helped to get theirs across too.

At this rate I may get to MM before the end of the decade. :D
 
So you just go with what they say and don't make your own mind up when you see the evidence? I.e. during real gameplay it is either evident or not.
Do you believe that everything you read is irrefutable?
I only believe it when I see it with my own eyes.
 
So you just go with what they say and don't make your own mind up when you see the evidence? I.e. during real gameplay it is either evident or not.
Do you believe that everything you read is irrefutable?
I only believe it when I see it with my own eyes.

Stab in the dark: Raven reads "The Sun".
 
Stab in the dark: Raven reads "The Sun".

Please, EL JOCK, let's not start personal attacks, as it detracts from the thread and we end up WAY off topic. I have to admit I did think the same thing, though. :D
And yes, if someone accused me of reading The Sun, I would take that as a personal attack! :D
 
As I said at the very start, I don't post in here often; in fact, as you can see, I don't post that much at all. But by God, some people need to wake up. I'm just glad someone understands my point and that it's helped to get theirs across too.

At this rate I may get to MM before the end of the decade. :D

You've stated your opinion on the matter and others theirs; there's no right or wrong here.

Trying endlessly to convince others they are wrong for having a different opinion than yours is only going to cause conflict. :rolleyes:
 
This is what it boils down to:
A drop of 3-4 FPS on average is much more acceptable than getting a reputation as a company that compromises on image quality.

AMD are compromising on IQ with the default settings to gain FPS in benchmarks. IMO it's not done for the end user, as otherwise it would have been left as an optional performance setting rather than the default.

They all flicker.


Compare them: the vid on the right is how it should look and is the optimum IQ. As you can see, the AMD output flickers a lot more than Nvidia's, which matches the optimum IQ just about perfectly.
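
For anyone who doesn't want to rely on eyeballing the clips, here's a minimal sketch of how the shimmer could be put into numbers, assuming Python with opencv-python and numpy installed. The filenames are placeholders, and it's only meaningful for clips recorded along an identical camera path:

    import cv2
    import numpy as np

    def mean_flicker(path):
        """Average frame-to-frame pixel change; on a fixed camera path,
        heavier AF shimmer shows up as a higher score."""
        cap = cv2.VideoCapture(path)
        prev, diffs = None, []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
            if prev is not None:
                diffs.append(float(np.mean(np.abs(gray - prev))))
            prev = gray
        cap.release()
        return sum(diffs) / len(diffs) if diffs else 0.0

    # Hypothetical filenames for the two captures being compared.
    print(mean_flicker("amd_af_test.mp4"))
    print(mean_flicker("nvidia_af_test.mp4"))

A higher score for one capture would at least back up "it flickers more" with something other than perception.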
 
The standard is what they've been using for years... Funnily enough, both Nvidia and ATI have changed their methods of implementing AA and AF, the latter in particular changing with each generation. Obviously you don't know about this, yet you have your opinion based on facts that don't exist.

Of course, this is why IQ has progressed over the years. It makes total sense to improve the AA and AF as the generations go by.


And a quick note on HD 4870 X2 vs GTX 480 image quality: don't take it for granted. Either you have not calibrated your monitor, have different settings in the Control Panel, use different settings in game, or are seeing the tiny differences between the two vendors. Or, better yet, you're experiencing a placebo effect: more consistent framerates, lack of microstutter, anything along these lines could be taken as better or worse image quality. It's been discussed on these forums several times; the differences are there, but they are subjective.
My main passion is AV and yes, my display is set up correctly. The IQ differences were noticeable, and I know a few 4870 X2 owners who have changed to a Fermi and have mentioned the same thing. I really do not think it's a placebo effect at all. The IQ just looks a bit better. It was not something I expected to see at all when I changed cards.


And yet ATI have always used Catalyst AI in Standard mode as the default setting, which has its own driver optimizations? :rolleyes: What you're complaining about is that they took it further. I doubt many AMD/ATI owners will complain, as the image quality differences are unnoticeable and the performance boost is definitely there. Whether these settings should be used in reviews is another matter.
But I am an ATI/AMD owner. Why does everyone have to be categorised in this forum as being either on the red or green side?

My main point is that the default setting should not be a reduced-IQ setting, even if for the most part it is not noticeable. :)
 

Because default for me is unmolested. :)

I can see why it bothers some and doesn't bother others.

One thing I would like to know about owners of top-end cards, be it the 5870/5970 or CrossFire setups, i.e. people who have plenty of overhead and do not need the extra frames: will they be changing the quality setting to High Quality now that they know the default has a slightly reduced IQ? :)
 
How can it be reducing IQ if it's not noticeable? Surely that's an oxymoron! :p

One thing I would like to know about owners of top-end cards, be it the 5870/5970 or CrossFire setups, i.e. people who have plenty of overhead and do not need the extra frames: will they be changing the quality setting to High Quality now that they know the default has a slightly reduced IQ? :)

It's always at high quality anyway, isn't that why we have the top end stuff?
 
What a load of old pish. This would have been news years ago, but we've known about optimisations in drivers on both sides for yonks now. And considering the hoops they had to jump through to make it barely noticeable, it's really just a yawn fest. Must be a slow news day for Guru3D...
 
bru said:
If Nvidia can go and do the same thing, then it's fair game.

Well, of course they can, but I hope they don't; we really don't need them both seeing how low they can go before end users start complaining.

Bit late for that, methinks... hence this thread... :D
bru said:
Everybody is saying there is no drop in image quality... well, obviously there is, however slight, otherwise this entire issue wouldn't have sprung up in the first place.

In my honest opinion it is not a cheat but an optimization; to my knowledge the real issue lies in the way it has been done. Did AMD tell people that the default settings were changing to improve performance, or was it just done under the table? That is what it seems like when I read threads like this one.

OK, this is what I have posted in this thread up to this point.

Then you posted this:

If you'd read the thread before jumping on the "AMD should not optimize their drivers, Nvidia doesn't" bandwagon, you'd know better and wouldn't spread the rumour that AMD didn't tell people. Lol at this statement. Anisotropic filtering optimization was one of the things promoted with the Radeon 6800s (along with morphological AA).

Why people are allowed to post when they clearly don't even bother to read the thread, I don't even...

Well, I can't see me even mentioning Nvidia, let alone jumping on any bandwagons, and since when has asking a question been classed as spreading rumours?

You accuse me of not reading the thread... that's a good 'un. Have you heard the one about Mr Krugga walking into a bar... he said ouch.
 
How can it be reducing IQ if it's not noticeable? Surely that's an oxymoron! :p



It's always at high quality anyway, isn't that why we have the top end stuff?

The problem is that even High Quality mode on the 6xxx series is considerably worse than that of the 5xxx. It's rather suspicious that they didn't also degrade 5xxx IQ if this is their new "standard", is it not?

So either the 6xxx has less accurate texture filtering at the hardware level, or AMD are intentionally holding back the 5xxx to make the 6xxx appear faster than it is; either way they are pulling a fast one.

Back in the Quack days I'm sure there were lots of people who didn't notice the degradation either; that doesn't make it right, and we shouldn't be encouraging it, otherwise, as Nvidia stated, it will quickly turn into a race to the gutter.
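
For anyone wondering what kind of filtering optimization is actually being argued over, a well-known example (used in various forms by both vendors over the years) is "brilinear" filtering: the driver narrows the band in which two mip levels are blended and falls back to cheaper bilinear lookups the rest of the time. A toy Python sketch of the idea, with the band width made up purely for illustration:

    def trilinear_weight(lod_frac):
        # Full trilinear: always blend the two nearest mip levels
        # in proportion to the fractional LOD.
        return lod_frac

    def brilinear_weight(lod_frac, band=0.25):
        # "Brilinear": snap to the nearest mip level except inside a
        # narrow transition band around the crossover point. Fewer
        # blended samples saves bandwidth, but the abrupt transitions
        # can show up as banding or shimmer in motion.
        if lod_frac < 0.5 - band / 2:
            return 0.0  # higher-resolution mip only
        if lod_frac > 0.5 + band / 2:
            return 1.0  # lower-resolution mip only
        return (lod_frac - (0.5 - band / 2)) / band  # blend inside the band

The narrower the band, the bigger the saving and the more visible the mip transitions, which is exactly the quality-versus-FPS trade-off this thread is about.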
 
What driver do I need to get to try this out on my system, so I can see for myself in real-world play, not in specific little clips, whether this is an issue or not? Raven, sorry, but I looked at those videos of yours and, same as someone else said, they all flickered. Yes, some a bit less than others, but that amount of flickering in any real-world game for any length of time would pee me right off.
 
Of course, this is why IQ has progressed over the years. It makes total sense to improve the AA and AF as the generations go by.

Let's look at one example: Crysis vs Crysis: Warhead. The image quality was slightly worse in the sequel, even though they both looked fabulous. The latter was significantly smoother as well. Image quality should not always be the priority.

My main passion is AV and yes, my display is set up correctly. The IQ differences were noticeable, and I know a few 4870 X2 owners who have changed to a Fermi and have mentioned the same thing. I really do not think it's a placebo effect at all. The IQ just looks a bit better. It was not something I expected to see at all when I changed cards.

If you're such an AV enthusiast, tell me: what are the exact differences between the two images? Sharpness, colours, contrast? Better AA, AF, mipmapping?
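
If anyone does want to pin down exact differences, here's a quick sketch, assuming Python with numpy and Pillow and two screenshots captured at identical settings (the filenames are hypothetical):

    import numpy as np
    from PIL import Image

    a = np.asarray(Image.open("radeon_shot.png").convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open("geforce_shot.png").convert("RGB"), dtype=np.float64)

    # Per-channel bias hints at colour/contrast differences; PSNR
    # summarises overall similarity (higher = more alike).
    print("Per-channel mean difference (R, G, B):", np.mean(a - b, axis=(0, 1)))
    mse = np.mean((a - b) ** 2)
    psnr = float("inf") if mse == 0 else 10 * np.log10(255**2 / mse)
    print(f"PSNR: {psnr:.2f} dB")

That would at least separate "different colours or contrast" from "different filtering".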

But I am an ATI/AMD owner. Why does everyone have to be categorised in this forum as being either on the red or green side?

My main point is that the default setting should not be a reduced-IQ setting, even if for the most part it is not noticeable. :)

You are, you aren't, what does it matter? You're discussing the subject without first-hand experience. I never implied that you are or were not an owner of an ATI/AMD graphics card. What strikes me is that you complain about IQ based on the videos that some German website posted to prove that the IQ is indeed worse. If you'd bothered to read what is on that website, or Guru3D for that matter, you'd know you have to look very carefully for those flickering textures/anisotropic filtering distortions, as they're not fully noticeable in a still image.

Your main point is that graphics card manufacturers should not optimize their drivers any further because there is a supposed "standard" for IQ.

Well, I can't see me even mentioning Nvidia, let alone jumping on any bandwagons, and since when has asking a question been classed as spreading rumours?

You accuse me of not reading the thread... that's a good 'un. Have you heard the one about Mr Krugga walking into a bar... he said ouch.

You implied that AMD did it "under the table", which is not true at all. Claiming that you've read the thread doesn't help either.

Please do tell me how I suggested that you mentioned Nvidia. It's only that this subject is being exaggerated by angry Nvidia owners who think it's unfair to compare the new Radeons with GeForces in reviews, and I can't help but... agree with them. Unfortunately, I disagree with a lot of the misinformation being spread in this topic, and one such thing is the claim that AMD did the optimization in secret.
 
Can people really not see the issue here? If the default driver settings become a mess of non-standard optimisations, regardless of how small the IQ difference is, things are gonna get very messy for developers and end users alike.
 