
Nvidia says ATI are cheating again

OK, it must be the truth then, coming from Nvidia :rolleyes:... both are as bad as each other IMO, in drivers, cards and prices lately.
 
If you can't see/tell the difference in image quality and you get better performance, who cares?

Nvidia should just follow suit, or if the difference is apparent, not worry, as it will be pointed out in reviews.
 
If you can't see/tell the difference in image quality and you get better performance, who cares?

Nvidia should just follow suit, or if the difference is apparent, not worry, as it will be pointed out in reviews.

I read that article and thought the same thing, although you have to ask why they felt the need to raise the settings on the ATI card instead of lowering the settings on the Nvidia cards. Or perhaps that's because lowering the settings on the Nvidia cards doesn't produce the magic 10% they are claiming.
 
They all found that changes introduced in AMD's Catalyst 10.10 default driver settings...

Yeah, the lower IQ is reported to be with the default settings now; obviously AMD's IQ at default must be lower than Nvidia's at default, or it wouldn't be reported by all these sites. So it looks like they introduced this in the 10.10s to edge out extra performance over Nvidia at the sacrifice of image quality. I don't mind changing settings in the CP to lower IQ and get better FPS, but I don't want lower IQ as the default.
 
And Nvidia know how to spin lol.


We have had internal discussions as to whether we should forego our position to not reduce image quality behind your back as AMD is doing. We believe our customers would rather we focus our resources to maximize performance and provide an awesome, immersive gaming experience without compromising image quality, than engage in a race to the IQ gutter with AMD.

IQ gutter... ouch.
 
I find it funny: they've increased the ability to do FP16 filtering, when Nvidia have complained for years that AMD have the drivers choose FP16 filtering over FP32 where there's no image quality improvement, and they've gone into detail about how AMD cheat in games such as Far Cry 1 and Serious Sam.

When you look at the basic speed improvement in some games from the GTX 480 to the GTX 580, and then some games get another 10% boost (what you'd see from using FP16 over FP32), you do wonder if the increased FP16 filtering rate, which they previously complained about and then dramatically boosted performance for, isn't the reason here. Which, by Nvidia's own definition, would be cheating, but there you go.
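For anyone wondering why FP16 vs FP32 filtering makes any speed difference at all, here's a rough back-of-the-envelope sketch. The texel sizes are real; the tap count, resolution and frame rate are my own illustrative numbers, not measurements from any review:

```python
# Back-of-the-envelope sketch: raw texture-read bandwidth for filtering
# FP16 vs FP32 textures. An RGBA FP16 texel is half the size of an RGBA
# FP32 one, so fetches cost roughly half the bandwidth (and most hardware
# of this era also filters FP16 at a higher rate).

BYTES_RGBA_FP16 = 4 * 2   # 4 channels x 16-bit float = 8 bytes/texel
BYTES_RGBA_FP32 = 4 * 4   # 4 channels x 32-bit float = 16 bytes/texel

def filtering_bandwidth(width, height, texel_bytes, taps=8, fps=60):
    """GB/s of raw texture reads, assuming `taps` texel fetches per
    pixel (e.g. trilinear/aniso samples) at a given frame rate."""
    bytes_per_frame = width * height * taps * texel_bytes
    return bytes_per_frame * fps / 1e9

fp16 = filtering_bandwidth(1920, 1200, BYTES_RGBA_FP16)
fp32 = filtering_bandwidth(1920, 1200, BYTES_RGBA_FP32)
print(f"FP16: {fp16:.1f} GB/s, FP32: {fp32:.1f} GB/s "
      f"({fp32 / fp16:.0f}x the bandwidth for FP32)")
```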


Nvidia are a big complaining pile of poop. Not a single serious website managed to notice or care about any IQ changes, nor did any end users who bought the cards, and, most importantly, review sites don't all use default settings anyway, which makes their whole position rather stupid.

Also, with AMD, things like the FP16 optimisation are ONLY used with the advanced Catalyst AI option, something no one seems to use for benchmarking. Nvidia optimises for EVERY game with different settings, so at default the Nvidia driver is optimised for basically every benchmarking game; for AMD that only happens with the advanced setting in Cat AI, which no one uses.

I'd also question "all these sites", seeing as they've linked to one site that spots a difference, one I've never heard of. As for performance, who says the performance changes by 10%?

Has anyone changed from all "fastest" settings in the drivers to all "quality" settings and seen a 10% performance difference, let alone from moving one setting up one level? I certainly haven't. Sounds like Nvidia saw one review that questioned it and have jumped all over it.
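If anyone wants to actually test that instead of eyeballing it, here's a minimal sketch of the comparison, assuming you've dumped per-frame times (FRAPS-style, milliseconds per frame, one value per line) once per driver preset. The file names are made up:

```python
# Minimal sketch: compare average FPS between two driver quality presets
# from per-frame time logs (milliseconds per frame, one value per line).
# File names are hypothetical; any frametime dump in this shape works.

def average_fps(path):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    return 1000.0 * len(frame_ms) / sum(frame_ms)

fps_quality = average_fps("frametimes_quality.csv")
fps_fastest = average_fps("frametimes_fastest.csv")
gain = 100.0 * (fps_fastest - fps_quality) / fps_quality
print(f"quality: {fps_quality:.1f} FPS, fastest: {fps_fastest:.1f} FPS, "
      f"gain: {gain:+.1f}%")
```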

http://www.computerbase.de/bildstrecke/31423/19/

First pics I clicked on an obscure German website: I can't see ANY difference between HQ AF and standard AF on Barts, I also can't see any difference between 10.9 and 10.10 HQ, and I can't see any difference between normal AF on Barts and Fermi HQ.

EDIT: there are very marginal differences, but without seeing the original texture to know which was closest, I have no idea which is "right"; they look equally good, maybe 98% identical on each card. For all intents and purposes they are all very good: identical quality with no real difference between them.
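If anyone wants to put a number on "98% identical" rather than squinting at the viewer, here's a quick sketch of a per-pixel comparison of two saved screenshots. It assumes Pillow and NumPy are installed, and the file names are placeholders:

```python
# Quick sketch: quantify how similar two AF screenshots actually are.
# Requires Pillow and NumPy; file names are placeholders.
import numpy as np
from PIL import Image

def percent_matching(path_a, path_b, tolerance=4):
    """Percentage of pixels whose max per-channel difference is within
    `tolerance` (0-255 scale), so near-identical pixels count as equal."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("screenshots must be the same resolution")
    diff = np.abs(a - b).max(axis=2)
    return 100.0 * (diff <= tolerance).mean()

print(f"{percent_matching('barts_hq_af.png', 'fermi_hq_af.png'):.1f}% "
      "of pixels match within tolerance")
```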

http://www.computerbase.de/bildstrecke/31423/14/

Can't see a difference again between HQ and normal, and both look "marginally" better than Fermi; there's one area not that "far away" where the AMD AF is clearer in one patch, but it's a very small difference.

You can see here again that 10.10 is BETTER than 10.9, VERY marginally. The difference is best described as: 10.9 seems to change the darkness of the ground textures very close to the character, while 10.10 seems to maintain uniform darkness of the ground into the distance, and that's the difference with Fermi as well; there's a layer of "lighter" ground where the quality clearly changes at a certain distance. 10.10 looks very marginally better than 10.9 and Fermi, and HQ and normal look all but identical.

If it was a "standard" setting where AMD gained 10% across the board, frankly those two games would not look BETTER on Barts with 10.10 than on Fermi or with the 10.9 drivers, so there you go: disproved, and rubbish.
 
This thread will deliver very soon :)

On a side note, I'm using Catalyst 10.11 and my CCC hasn't even changed, so I have no idea what they're talking about :p
 
We have had internal discussions as to whether we should forego our position to not reduce image quality behind your back as AMD is doing. We believe our customers would rather we focus our resources to maximize performance and provide an awesome, immersive gaming experience without compromising image quality, than engage in a race to the IQ gutter with AMD.

Amen to that. :)
 
http://www.computerbase.de/bildstrecke/31423/19/

First pics I clicked on an obscure German website: I can't see ANY difference between HQ AF and standard AF on Barts, I also can't see any difference between 10.9 and 10.10 HQ, and I can't see any difference between normal AF on Barts and Fermi HQ.

EDIT: there are very marginal differences, but without seeing the original texture to know which was closest, I have no idea which is "right"; they look equally good, maybe 98% identical on each card. For all intents and purposes they are all very good: identical quality with no real difference between them.

http://www.computerbase.de/bildstrecke/31423/14/

Can't see a difference again between HQ and normal, and both look "marginally" better than Fermi; there's one area not that "far away" where the AMD AF is clearer in one patch, but it's a very small difference.

If you read the blog article, Nvidia make it clear that the degradation is mainly visible in motion and thus will not show up in screenshots.
 
I'm liking how DM is hand-waving about this and posting walls of text to say how this doesn't matter and that Nvidia are big poopy heads for daring to say anything negative about his precious stock portfolio. When we all know that if the situation were reversed (as it has been in times past) he and many others would be raging and banging on about what big green meanies they are.

Fact is, AMD have been caught with their pants down (I'm liking how it appears that AMD have a specific AF path when a popular AF testing app is detected...) and I would expect many other sites to pick up on this shortly. This thread should get very entertaining very quickly.

 
I find it funny: they've increased the ability to do FP16 filtering, when Nvidia have complained for years that AMD have the drivers choose FP16 filtering over FP32 where there's no image quality improvement, and they've gone into detail about how AMD cheat in games such as Far Cry 1 and Serious Sam.

When you look at the basic speed improvement in some games from the GTX 480 to the GTX 580, and then some games get another 10% boost (what you'd see from using FP16 over FP32), you do wonder if the increased FP16 filtering rate, which they previously complained about and then dramatically boosted performance for, isn't the reason here. Which, by Nvidia's own definition, would be cheating, but there you go.


Nvidia are a big complaining pile of poop. Not a single serious website managed to notice or care about any IQ changes, nor did any end users who bought the cards, and, most importantly, review sites don't all use default settings anyway, which makes their whole position rather stupid.

Also, with AMD, things like the FP16 optimisation are ONLY used with the advanced Catalyst AI option, something no one seems to use for benchmarking. Nvidia optimises for EVERY game with different settings, so at default the Nvidia driver is optimised for basically every benchmarking game; for AMD that only happens with the advanced setting in Cat AI, which no one uses.

I'd also question "all these sites", seeing as they've linked to one site that spots a difference, one I've never heard of. As for performance, who says the performance changes by 10%?

Has anyone changed from all "fastest" settings in the drivers to all "quality" settings and seen a 10% performance difference, let alone from moving one setting up one level? I certainly haven't. Sounds like Nvidia saw one review that questioned it and have jumped all over it.

http://www.computerbase.de/bildstrecke/31423/19/

First pics I clicked on an obscure German website: I can't see ANY difference between HQ AF and standard AF on Barts, I also can't see any difference between 10.9 and 10.10 HQ, and I can't see any difference between normal AF on Barts and Fermi HQ.

EDIT: there are very marginal differences, but without seeing the original texture to know which was closest, I have no idea which is "right"; they look equally good, maybe 98% identical on each card. For all intents and purposes they are all very good: identical quality with no real difference between them.

http://www.computerbase.de/bildstrecke/31423/14/

Can't see a difference again between HQ and normal, and both look "marginally" better than Fermi; there's one area not that "far away" where the AMD AF is clearer in one patch, but it's a very small difference.

You can see here again that 10.10 is BETTER than 10.9, VERY marginally. The difference is best described as: 10.9 seems to change the darkness of the ground textures very close to the character, while 10.10 seems to maintain uniform darkness of the ground into the distance, and that's the difference with Fermi as well; there's a layer of "lighter" ground where the quality clearly changes at a certain distance. 10.10 looks very marginally better than 10.9 and Fermi, and HQ and normal look all but identical.

If it was a "standard" setting where AMD gained 10% across the board, frankly those two games would not look BETTER on Barts with 10.10 than on Fermi or with the 10.9 drivers, so there you go: disproved, and rubbish.

I think your comment needs more explanation, because all I see is a well-reasoned post by DM.

So go ahead & point out & disprove the numerous blind-hate parts while I take a bath.

Well, there you go... It just seems every Nvidia thread I go on, DM is always trashing Nvidia and defending AMD... But then he has got shares in AMD, so... ;)
 