
Nvidia says ATI are cheating again

Compared to the optimum AF image on the right, they are identical. Now compare the 6*** image on the left with the one on the right: the flickering is bad.

The 6000 series is worse, but as my edit spells out, it has to be in specific situations for it to be that bad.
 
Just skimmed through the thread; has anyone raised the question of whether it's a noticeable IQ change?

OK, I can see Nvidia's point that it's not strictly a fair apples-to-apples comparison. HOWEVER, I don't blame ATi (or even Nvidia) if they can get away with gaining a 10% performance increase with no easily noticeable IQ degradation.

Just wondering.

Reviewers noticed it; how long before gamers notice it? If ATI is still plagued by shimmering issues, as this video seems to suggest, why would you reduce your default image quality even further?

http://www.tweakpc.de/hardware/tests/grafikkarten/amd_radeon_hd_6870_hd_6850/s09.php

Google translate version

http://translate.google.com/transla...afikkarten/amd_radeon_hd_6870_hd_6850/s09.php
 
Ah, the flickering issues. Yep, had them before, haven't noticed for a long time though.

I have noticed it in some games, and playing at 2560x1600 since 2006 makes it more noticeable, but I have never really cared, as the AF was set low, which blurs the distant textures anyway, and AA reduces the flicker. My gfx cards in the past have not had the power for high AF & AA until now anyway, but I would always favour AA over AF.
 
I'm concentrating more on the fact that ATI are doing it now, rather than Nvidia having done it before. It's not an excuse, is it?

Reviewers have mentioned the ATI flickering for years; it's just lately that it has not been mentioned as much, maybe because it's a given.

Personally, there are other graphical priorities more important to me.
 
I think a big fuss is being made because AMD have made the quality setting the default for the 10.1 drivers; this is the setting with the supposed dodgy AF optimizations. Now reviewers will use the default settings and may miss the fact that AMD IQ is reduced to gain FPS. This makes it an unfair comparison with Nvidia cards tested at default IQ settings, which deliver better IQ. AMD need to leave any lowering of IQ down to the user.
 

Good point.
 
The fact that the usual suspects are jumping to AMD's defence on this issue is frankly laughable.

Seriously, stand away from the keyboard and have a good think about it and see how pathetic you look.

This forum becomes more unreadable by the day.
 
AMD doesn't like me.

I have owned 4 different AMD cards in the last 2 years and I get the same problem every time: red flickering dots.

I got my GTX 580 a few days ago; no problems so far. The IQ does look better than on my last card, which was a 5970.
 
I wonder if the videos posted showing the shimmering are accurate, because if they are, then nVidia has better IQ - the shimmering is noticeable in both Quality and High Quality on AMD cards, but not on nVidia cards.

To me the situation is simple. If AMD is intercepting calls made for one quality setting and substituting a lower one, then that is cheating if it is enabled by default. However, I also feel it is cheating for nVidia to throw money and programmers at developers to ensure better optimisations for their cards.
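As a rough illustration of the "intercepting calls and applying a lower setting" idea, here is a minimal C++ sketch. Everything in it is a hypothetical stand-in (the shim function, the sampler struct, the 4x cap), not the vendors' actual driver code:

```cpp
// Conceptual sketch only: a driver-style shim that silently lowers a
// quality setting requested by the game. All names and values here are
// hypothetical, not real AMD/Nvidia driver internals.
#include <algorithm>
#include <cstdio>

struct SamplerState {
    unsigned maxAnisotropy = 1;   // the AF level the driver will actually apply
};

// Hypothetical hook: the game asks for 16x AF, the driver quietly caps it
// at a lower level, trading image quality for fill rate and bandwidth.
void SetSamplerStateShim(SamplerState& s, unsigned requestedAniso) {
    const unsigned driverCap = 4;                 // assumed "optimised" limit
    s.maxAnisotropy = std::min(requestedAniso, driverCap);
}

int main() {
    SamplerState s;
    SetSamplerStateShim(s, 16);                   // application requests 16x AF
    std::printf("requested 16x AF, driver applied %ux\n", s.maxAnisotropy);
}
```

The point of the sketch is simply that the application never finds out: it asks for one quality level and silently gets another, which is why default-on optimisations of this kind matter for reviews run at default settings.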
 
 
The fact that the usual suspects are jumping to AMD's defence on this issue is frankly laughable.

Seriously, stand away from the keyboard and have a good think about it and see how pathetic you look.

This forum becomes more unreadable by the day.

There will always be the usual suspects jumping to the defence on either side, but just mentioning that fact does not solve or achieve anything without putting some reasoned comments forward to debunk their defences.
 
Just skimmed through the thread; has anyone raised the question of whether it's a noticeable IQ change?

OK, I can see Nvidia's point that it's not strictly a fair apples-to-apples comparison. HOWEVER, I don't blame ATi (or even Nvidia) if they can get away with gaining a 10% performance increase with no easily noticeable IQ degradation.

Just wondering.

Funny, I don't remember ATI fans having this opinion when it was NVidia cutting corners. It seems like double standards to me.

When NVidia used to force FP16 on the GeForce FX, they were slaughtered for it, even though the visual difference was not noticeable in normal gaming (and performance was 50% faster as a result).

AMD has now been caught doing something similar...

FP16 Render Observations
In addition to the above recent findings, for months AMD had been performing a background optimization for certain DX9 applications where FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate.
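To put the "half the size" claim in numbers: an FP16 render target stores four 16-bit float channels (64 bits per pixel), while R11G11B10 packs three smaller floats into 32 bits per pixel with no alpha channel. A quick sketch, with the 1920x1200 target size chosen purely as an illustrative assumption:

```cpp
// Back-of-the-envelope arithmetic for FP16 vs R11G11B10 render targets.
#include <cstdio>

int main() {
    const double pixels = 1920.0 * 1200.0;        // example render-target size
    const double fp16Bytes      = pixels * 8.0;   // 4 x 16-bit channels = 64 bits/px
    const double r11g11b10Bytes = pixels * 4.0;   // 11 + 11 + 10 bits   = 32 bits/px
    std::printf("FP16 RGBA target: %.1f MB\n", fp16Bytes / (1024.0 * 1024.0));
    std::printf("R11G11B10 target: %.1f MB\n", r11g11b10Bytes / (1024.0 * 1024.0));
    // FP16 also keeps a 10-bit mantissa per channel plus alpha, versus
    // 6/6/5 mantissa bits and no alpha for R11G11B10, so the demotion
    // costs precision as well as memory and bandwidth.
    return 0;
}
```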
 