both will be cheating for the new futuremark as always
That's a given, and one set of fans will denounce the test as a load of biased rubbish. I can't wait.

Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
In comparison to the optimum AF image on the right, they are identical; now compare the 6*** left image with the right and the flickering is bad.
Just skimmed through the thread; has anyone raised the question of whether it's a noticeable IQ change?
OK, I can see the point that it's not strictly fair for an apples-to-apples comparison, which is the point Nvidia is making. HOWEVER, I don't blame ATi (or even Nvidia) if they can get away with a 10% performance increase with no easily noticeable IQ degradation.
Just wondering.
Ah, the flickering issues. Yep, had them before, haven't noticed for a long time though.
Pot calling the kettle black tbh.
I'm concentrating more on the fact that ATi are doing it now, rather than Nvidia having done it before. It's not an excuse, is it?
I think a big fuss is being made because AMD have made the 'Quality' setting the default in the 10.1 drivers, and that is the setting with the supposedly dodgy AF optimizations. Reviewers will now test at the default settings and may miss the fact that AMD's IQ is reduced to gain FPS, which makes it an unfair comparison against Nvidia cards that deliver better IQ at their defaults. AMD need to leave any lowering of IQ down to the user.
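To make that point concrete, here is a minimal sketch of the kind of check a reviewer would need before calling a comparison apples to apples; the class, the setting names and the FPS figures below are purely hypothetical, for illustration only.

```python
# Minimal sketch of the review methodology argued for above: only treat
# two benchmark runs as comparable when they were captured at the same
# texture-filtering setting, rather than at whatever each driver ships
# as its default. Names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class BenchRun:
    card: str
    af_setting: str   # driver texture-filtering mode, e.g. "HQ" or "Quality"
    fps: float

def apples_to_apples(a: BenchRun, b: BenchRun) -> bool:
    """A fair FPS comparison requires matching AF/filtering settings."""
    return a.af_setting == b.af_setting

run_a = BenchRun("Card A", "Quality", 60.0)  # hypothetical figures
run_b = BenchRun("Card B", "HQ", 55.0)

if not apples_to_apples(run_a, run_b):
    print("Settings differ; retest both cards at the same AF mode.")
```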
The fact that the usual suspects are jumping to AMD's defence on this issue is frankly laughable.
Seriously, stand away from the keyboard and have a good think about it and see how pathetic you look.
This forum becomes more unreadable by the day.
FP16 Render Observations
In addition to the above recent findings, for months AMD had been performing a background optimization for certain DX9 applications where FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate.
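For anyone wondering what that demotion buys in practice, here is a rough back-of-the-envelope sketch; the DXGI-style format names and the 1920x1080 target size are assumptions for illustration, not anything the driver exposes.

```python
# Rough sketch: memory footprint of an FP16 render target vs. an
# R11G11B10 one, assuming a 1920x1080 target. Format sizes follow the
# DXGI definitions: R16G16B16A16_FLOAT is 64 bits/pixel,
# R11G11B10_FLOAT is 32 bits/pixel.
WIDTH, HEIGHT = 1920, 1080

FORMATS = {
    "R16G16B16A16_FLOAT (FP16)": 64,  # 4 channels x 16-bit float
    "R11G11B10_FLOAT": 32,            # 11+11+10-bit floats, no alpha
}

for name, bits_per_pixel in FORMATS.items():
    size_mib = WIDTH * HEIGHT * bits_per_pixel / 8 / (1024 ** 2)
    print(f"{name}: {size_mib:.1f} MiB")

# Approximate output: the FP16 target is ~15.8 MiB, the R11G11B10
# target ~7.9 MiB, i.e. half the size -- but R11G11B10 drops the alpha
# channel and keeps fewer mantissa bits per channel (6/6/5 vs. 10),
# hence the loss of accuracy mentioned above.
```

Halving the bytes per pixel cuts bandwidth and footprint, which is where the extra FPS comes from, while the reduced precision and missing alpha are what "less accurate" refers to.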