I find it funny that they've increased their FP16 filtering throughput, when Nvidia have complained for years that AMD's drivers choose FP16 filtering over FP32 where there's no image quality improvement, and they've gone into detail about how AMD supposedly cheat in games such as Far Cry 1, Serious Sam, etc.
When you look at the baseline speed improvement in some games from the GTX 480 to the GTX 580, and then see certain games get another ~10% boost (about what you'd expect from using FP16 over FP32), you do wonder whether the increased FP16 filtering capability, the very thing they previously complained about AMD using for dramatic performance gains, isn't the reason here. Which, by Nvidia's own definition, would be cheating, but there you go.
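For what it's worth, here's a quick toy sketch (my own numbers, nothing to do with either vendor's actual filtering hardware, and just a two-texel blend standing in for full bilinear filtering) of why FP16 filtering is usually indistinguishable from FP32 on ordinary 8-bit textures: a half float carries roughly 11 significant bits, and an 8-bit colour channel only needs 8, so the extra rounding almost never moves the final quantised pixel more than a single step.

```python
# Toy sketch: blend 8-bit texel pairs in FP32 vs FP16 and compare the
# 8-bit results that would land in the framebuffer. Not real GPU code.
import numpy as np

rng = np.random.default_rng(0)

# Random 8-bit texel pairs and bilinear-style blend weights.
a = rng.integers(0, 256, 100_000).astype(np.float64) / 255.0
b = rng.integers(0, 256, 100_000).astype(np.float64) / 255.0
w = rng.random(100_000)

# Blend at FP32 precision and at FP16 precision.
blend32 = (a.astype(np.float32) * (1 - w.astype(np.float32))
           + b.astype(np.float32) * w.astype(np.float32))
blend16 = (a.astype(np.float16) * (1 - w.astype(np.float16))
           + b.astype(np.float16) * w.astype(np.float16))

# Quantise both back to 8 bits, as the framebuffer would.
out32 = np.round(np.clip(blend32, 0, 1) * 255).astype(np.uint8)
out16 = np.round(np.clip(blend16.astype(np.float32), 0, 1) * 255).astype(np.uint8)

# FP16's rounding error is a fraction of one 8-bit step, so as far as
# I can tell any mismatches land at most one 8-bit step apart.
diff = np.abs(out32.astype(int) - out16.astype(int))
print("fraction of pixels that differ:", np.mean(diff > 0))
print("worst difference in 8-bit steps:", diff.max())
```

Run that and the worst-case difference is a single 8-bit step, which is exactly why nobody spots it in screenshots, and also why the performance is "free".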
Nvidia are a big complaining pile of poop: not a single serious website managed to notice or care about any IQ changes, nor did any of the end users who bought the cards, and, most importantly, not all review sites even stick to default settings, which makes Nvidia's whole position rather stupid.
Also, with AMD, things like the FP16 optimisation are ONLY enabled with the advanced Catalyst AI option, something no one seems to use for benchmarking. Nvidia, by contrast, optimises for EVERY game with per-game settings, so at default their driver is already optimised for basically every benchmarking title; for AMD that only happens with the advanced setting in Cat AI, which no one uses.
I'd also question the "all these sites" claim, seeing as they've linked to exactly one site that spots a difference, one I've never heard of. As for performance, who says it changes by 10%?
Has anyone gone from all "fastest" settings in the drivers to all "quality" settings and seen a 10% performance difference, let alone from moving a single setting up one level? I certainly haven't. It sounds like Nvidia saw one review that questioned it and jumped all over it.
http://www.computerbase.de/bildstrecke/31423/19/
Those are the first pics I clicked on, from an obscure German website. I can't see ANY difference between HQ AF and standard AF on Barts, I can't see any difference between 10.9 and 10.10 on HQ, and I can't see any difference between normal AF on Barts and HQ on Fermi.
EDIT: there are very marginal differences, but without seeing the original texture to know which is closest, I have no idea which is "right"; they look equally good. The shots are something like 98% identical on each card, so for all intents and purposes they are all very good, effectively identical quality with no difference between them.
http://www.computerbase.de/bildstrecke/31423/14/
Again, I can't see a difference between HQ and normal, and both look "marginally" better than Fermi; there's one patch of ground, not that far away, where the AMD AF is clearer, but it's a very small difference.
You can see here again that 10.10 is BETTER than 10.9, VERY marginally. The difference is best described like this: 10.9 seems to change the darkness of the ground textures very close to the character, while 10.10 maintains a uniform darkness of the ground into the distance. That's the difference with Fermi as well; there's a band of "lighter" ground where the quality clearly changes at a certain distance. 10.10 looks very marginally better than 10.9 and than Fermi's, and HQ and normal look all but identical.
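For anyone wondering what that "quality changes at a certain distance" band actually is: it's presumably the mip/LOD transition. A rough illustration of the standard mip-level maths (toy numbers of my own, not anything pulled from either driver): the mip level is picked from the texture footprint of a pixel, and anisotropic filtering effectively pushes the blurry-mip transition further away on oblique surfaces like the ground, so any driver fiddling with AF shows up as exactly this kind of band.

```python
import math

# Toy illustration, my own assumptions throughout: classic LOD maths,
# lambda = log2(footprint), with the footprint shrunk by up to the
# anisotropy level before the mip is chosen.
def mip_level(texels_per_pixel: float, max_aniso: float = 1.0) -> float:
    footprint = max(texels_per_pixel / max_aniso, 1.0)
    return math.log2(footprint)

for dist in (1, 2, 4, 8, 16):
    # On a ground plane at a glancing angle the footprint grows
    # roughly with distance (toy assumption for illustration).
    footprint = float(dist)
    print(f"distance {dist:2d}: trilinear mip {mip_level(footprint):.1f}, "
          f"16x AF mip {mip_level(footprint, 16.0):.1f}")
```

With 16x AF the switch to blurrier mips happens far further out than with plain trilinear, which is why where and how sharply that lighter band appears is a decent tell for what the driver is doing with AF.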
If it were a "standard" setting where AMD gained 10% across the board, frankly those two games would not look BETTER on Barts with the 10.10 drivers than on Fermi or with the 10.9 drivers. So there you go: disproved, and rubbish.