Nvidia says ATI are cheating again

I think your comment needs more explanation, because all I see is a well-reasoned post by DM.

So go ahead & point out & disprove the numerous 'blind hate' parts while I take a bath.

He is basically saying 'well, I can't see it, therefore everyone else is wrong and I am right'. No reasoned rebuttal of Nvidia's blog post at all, just a wall of text designed to make people think he actually has a valid point beyond AMD=good, Nvidia=bad.
 
He is basically saying 'well, I can't see it, therefore everyone else is wrong and I am right'. No reasoned rebuttal of Nvidia's blog post at all, just a wall of text designed to make people think he actually has a valid point beyond AMD=good, Nvidia=bad.

Yup
 
If they are using those kinds of optimisations, they may also be using the age-old trick of re-rendering the buffer with higher settings if something tries to read it, i.e. screenshots are a lie. Not saying that is the case, but both nVidia and ATI have done it in the past.
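Purely to illustrate the kind of trick being described above (not a claim that either vendor's driver actually works this way), here is a toy Python sketch with made-up names: a renderer that normally draws at reduced quality but quietly re-renders at full quality the moment something reads the buffer back, so the screenshot never shows the corner-cutting.

Code:
# Toy illustration of the "screenshots are a lie" trick described above.
# Hypothetical names only; no relation to any real driver code.

class ToyRenderer:
    def __init__(self):
        self.framebuffer = None

    def draw_scene(self, quality):
        # Stand-in for real rendering: just record which quality was used.
        self.framebuffer = f"frame rendered at {quality} quality"

    def render_frame(self):
        # Normal path: render with the cheaper, optimised settings.
        self.draw_scene(quality="reduced")

    def read_pixels(self):
        # Readback path (e.g. a screenshot tool grabbing the buffer):
        # quietly re-render the same frame at full quality first, so the
        # captured image looks better than what was actually displayed.
        self.draw_scene(quality="full")
        return self.framebuffer

renderer = ToyRenderer()
renderer.render_frame()        # what the player actually sees
print(renderer.read_pixels())  # what the screenshot shows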
 
I'm liking how DM is hand-waving about this and posting walls of text to say that this doesn't matter and that Nvidia are big poopy heads for daring to say anything negative about his precious stock portfolio, when we all know that if the situation were reversed (as it has been in times past) he and many others would be raging and banging on about what big green meanies they are.

Fact is, AMD have been caught with their pants down (I'm liking how it appears that AMD have a specific AF path when a popular AF testing app is detected...) and I would expect many other sites to pick up on this shortly. This thread should get very entertaining very quickly.
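For the curious, this is roughly what 'a specific AF path when a popular AF testing app is detected' would mean in practice. The Python sketch below is purely hypothetical; the executable names and path labels are invented, and it is only meant to show the shape of application detection, not anything taken from a real driver.

Code:
# Toy sketch of application detection: pick a different anisotropic-filtering
# path when a known test tool is recognised. Names are invented for illustration.

KNOWN_AF_TEST_APPS = {"af_tester.exe", "filter_test.exe"}  # hypothetical list

def choose_af_path(executable_name: str) -> str:
    if executable_name.lower() in KNOWN_AF_TEST_APPS:
        # Full-precision filtering so the synthetic test shows a clean pattern.
        return "reference_af"
    # Everything else (i.e. actual games) gets the faster, optimised path.
    return "optimised_af"

print(choose_af_path("af_tester.exe"))  # -> reference_af
print(choose_af_path("somegame.exe"))   # -> optimised_af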


The IQ between ATI & NV has never been exactly the same & sometimes one does better than the other & vice versa; it's only when the difference is significant that it bothers me, e.g. NV 7xxx vs ATI 18xx.

It's gotten to the point that some serious hair-splitting is needed in this day & age, & it's no different to the numerous LCD panels that show very little difference in IQ but a huge difference in cost, where most people would not notice the difference.

Since the NV 8xxx the IQ difference has not entered my mind, & I'm fussy but not anal-retentive.
 
Just skimmed through the thread; has anyone raised the question of whether it's a noticeable IQ change?

OK, I can see the point Nvidia is making, that it's not strictly a fair apples-to-apples comparison. HOWEVER, I don't blame ATi (or even Nvidia) if they can get away with gaining a 10% performance increase with no easily noticeable IQ degradation.

Just wondering.
 
He is basically saying 'well I cant see it, therefore everyone else is wrong and I am right'. No reasoned rebuttal of Nvidia's blog post at all, just a wall of text designed to make people think he actually has a valid point beyond AMD=good Nvidia=bad.

Well, I'm sure he is not speaking for everyone. Moving from a 3870 to a 5970, I have not noticed any IQ improvement outside of DX11, and even though we all know that statistically the 5970 has the better IQ all round, I have not bothered to do a side-by-side comparison to see it, because I simply don't care.
 
There goes DM again with his hate for Nvidia....

Nice contribution :(

How about you try to attack what he says and not him personally? Then you might have said something worth reading. As it is, you come over as a troll.

And to comment on the actual thread topic: Nvidia can't have much of a sense of irony as a corporation, given that back when the FX cards were out against the ATi 9500/9700 they were 'optimising' their drivers like crazy. At the time they needed to do it so they looked like they could compete.

I haven't seen any posts about reduced IQ on AMD cards with Cat 10.10s, just some more NVidia muck being thrown, and to be honest this is one of the main reasons I will stay an ATi/AMD graphics card user for years to come. NVidia's PR machine has already spent years 'in the gutter'. I'll never rule out buying another Nvidia card, as I've had a few before, but I will try to avoid it.
 
Just skimmed through the thread; has anyone raised the question of whether it's a noticeable IQ change?

OK, I can see the point Nvidia is making, that it's not strictly a fair apples-to-apples comparison. HOWEVER, I don't blame ATi (or even Nvidia) if they can get away with gaining a 10% performance increase with no easily noticeable IQ degradation.

Just wondering.

I hope no one here plays any lossy-compressed movies or music files.
 
This thread is going to make for some quality quotes the next time Nvidia is caught with their hand in the cookie jar.
 
Well, there you go... It just seems that in every Nvidia thread I go on, DM is always trashing Nvidia and defending AMD... But then he has got shares in AMD, so... ;)

Sorry, but the balance of his post is not about hate, & I'm not surprised that all you could pick out was 1% of his post, & even that is hardly a hate comment.

He also had shares in NV but sold them when they were worth more.
 
Nice contribution :(

How about you try to attack what he says and not him personally? Then you might have said something worth reading. As it is, you come over as a troll.

It's not a personal attack on him; it's just noting what he says in pretty much everything he posts.

The last ATI card I had was an HD5870, then a GTX480 and now a GTX470, so I'm not biased one little bit.

The only time I really noticed a difference in image quality was when the 8800GTX came out, which was awesome.
 
AMD has the upper hand in IQ anyway (IMO, in 2D; it's hard to notice a difference in 3D as all the games I play are fast-paced), so depending on the size of the drop they might have simply matched NV's IQ.

If they did drop below that then it's a little pants...
 
I thought I was imagining it when I started questioning the AF. I didn't realise ATi had been playing around with it in the 10.10s.


AMD has the upper hand in IQ anyway (IMO, in 2D; it's hard to notice a difference in 3D as all the games I play are fast-paced), so depending on the size of the drop they might have simply matched NV's IQ.

If they did drop below that then it's a little pants...

It states quite explicitly in the OP's link that it's not comparable to NV now.


Well, there you go... It just seems that in every Nvidia thread I go on, DM is always trashing Nvidia and defending AMD... But then he has got shares in AMD, so... ;)

If you are going to accuse him of anything, at least prove him wrong first.

drunkenmaster said:

In the first pics I clicked on, on an obscure German website, I can't see ANY difference between HQ AF and AF on Barts; I also can't see any difference between 10.9 and 10.10 HQ, and I can't see any difference between normal AF on Barts and Fermi HQ.

EDIT: there are very marginal differences, but without seeing the original texture to know which was closest I have no idea which is "right"; they look equally good, with some very, very marginal differences and 98% identical on each card. For all intents and purposes they are all very good, identical quality with no difference between them.

I can see it on some of those pics, but only because I know what I'm looking for. Those pics are really too small, but you can see problems with filtering and flickering at 45-degree angles, the same problems I've been having with my 5850. Some games more than others, though.
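If anyone wants to put a number on claims like '98% identical' instead of squinting at small screenshots, something along these lines works; a rough Python sketch using Pillow and NumPy, with placeholder file names:

Code:
# Rough way to quantify how similar two same-sized screenshots are.
# Requires: pip install pillow numpy. File names below are placeholders.
import numpy as np
from PIL import Image

def percent_identical(path_a, path_b, tolerance=2):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("Screenshots must be the same resolution")
    # A pixel counts as identical if every channel is within `tolerance`.
    same = (np.abs(a - b) <= tolerance).all(axis=-1)
    return 100.0 * same.mean()

print(f"{percent_identical('barts_hq_af.png', 'fermi_hq_af.png'):.1f}% identical")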
 
Check out this comparison of the 10.10 default quality setting and the high quality setting; the texture flickering is very bad. Notice how there is no flickering on the Nvidia default IQ setting.

So comparing still images will not always reveal the lower IQ, as shown here.

http://www.tweakpc.de/hardware/tests/grafikkarten/amd_radeon_hd_6870_hd_6850/s09.php

The Radeon HD 6000 cards show a broad band of flickering, which appears almost as clearly in HQ mode as in Q mode. The flickering occurs across several parts of the image and is striking.

Flickering? On the GeForce 400 series it is more or less unheard of; there is effectively no recognisable difference here between the left and right sides. The GeForce clearly handles this better than the Radeon cards.
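Because still images hide this sort of shimmer, one crude home-grown check is to capture a short clip while panning slowly and measure how much consecutive frames differ: a flickering texture band keeps the frame-to-frame difference high even when the camera barely moves. A rough Python/OpenCV sketch (the file name is a placeholder, and this is not how tweakpc measured it):

Code:
# Crude frame-to-frame difference measure for texture shimmer/flicker.
# Requires: pip install opencv-python numpy. File name is a placeholder.
import cv2
import numpy as np

def mean_frame_difference(video_path):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    diffs = []
    while ok:
        ok, frame = cap.read()
        if not ok:
            break
        # Average absolute per-pixel change between consecutive frames;
        # a flickering region keeps this high even with little camera motion.
        diffs.append(np.mean(cv2.absdiff(frame, prev)))
        prev = frame
    cap.release()
    return float(np.mean(diffs)) if diffs else 0.0

print(mean_frame_difference("af_pan_capture.mp4"))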
 
Check out this comparison of the 10.10 default quality setting and the high quality setting; the texture flickering is very bad. Notice how there is no flickering on the Nvidia default IQ setting.

So comparing still images will not always reveal the lower IQ, as shown here.

http://www.tweakpc.de/hardware/tests/grafikkarten/amd_radeon_hd_6870_hd_6850/s09.php

Yep, the 460 has no flickering.

But you also have to take this into account:

Of course, assessing flicker is a relatively subjective matter. It depends on the game/program, on the texture, and not least on the resolution and even on the screen itself. We noticed fine textures flickering particularly badly on our 30" 2560x1600 test screens.
 