
Will GTX570 be caught in new app detection cheat?

Is there a difference and is the difference real?
OK, so enough preamble, what is it we’re really talking about? Below is a simple animated GIF that (according to the posts we have received) seems to show that anti-aliasing image quality (IQ) drops off for the GTX570/580 if the driver detects that HAWX (commonly used for benchmarking) is running. Looking along the plane’s edge, one of the images definitely appears to show more detail. How was this achieved? By renaming the application, we’re being told. It could all be an elaborate ruse, but if benchmarks are being detected to allow image quality to be dropped and benchmark scores to be raised, then that’s pretty serious stuff. The animated GIF will take a few seconds to load. We recommend that you let it roll through a few times and you’ll see that more detailed sampling seems to be done when the same program is called HACKS rather than HAWX.
If these shots tell the whole truth and you had to write this ‘logic’ into a sentence (that anyone could understand), it would say “If you’re being asked to run a game that’s commonly used as a benchmark, then do less work”. Have a look and tell us if you can see less sampling when the drivers detect HAWX and not HACKS.
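To make that one-sentence ‘logic’ concrete, here is a purely hypothetical sketch of the behaviour being alleged. This is not actual driver code; the executable list and function names are invented for illustration only.

Code:
# Hypothetical sketch of the alleged behaviour, NOT real driver code.
BENCHMARK_EXES = {"hawx.exe", "hawx2.exe"}  # made-up list of review titles

def select_aa_quality(exe_name: str) -> str:
    """Pick an anti-aliasing sampling level based on the running executable."""
    if exe_name.lower() in BENCHMARK_EXES:
        return "reduced sampling"   # fewer samples -> higher benchmark score
    return "full sampling"          # normal image quality for everything else

# Renaming the binary defeats the lookup, which is what the GIF claims to show:
print(select_aa_quality("HAWX.exe"))   # -> reduced sampling
print(select_aa_quality("HACKS.exe"))  # -> full sampling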


GTX570-Possible-IQ-Issue.gif

slide0.png

slide2.png

Edit:

Oops forgot link...
http://www.kitguru.net/components/graphic-cards/jules/will-gtx570-be-caught-in-new-app-detection-cheat/

Original thread in forum...
http://www.kitguru.net/forum/showthread.php?p=22541#post22541
 
If you look at other parts of the image there's no difference though, so it's possible they just didn't capture two quite identical frames. Being as it's KitGuru I'm just gonna laugh, as they seem to have a level of technical expertise well below even the average for this forum.

EDIT: Also, it's a GIF... even slight variations in the two source images could produce quite a difference due to the use of color palettes.
 
Highly likely - a GIF uses an 8-bit palette. To save a long-winded post on the topic: two slightly different colors in a 24-bit source image can come out wildly different due to the lack of precision of an 8-bit palette.

Show me the uncompressed, original-resolution source images and then we can start a meaningful analysis.
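As a rough illustration of how much precision a 256-colour palette throws away, here's a quick Pillow/NumPy sketch. The noisy test image is just a stand-in, not the KitGuru shots:

Code:
# Measures the error introduced by quantising a 24-bit image to a
# GIF-style 8-bit (256 colour) adaptive palette. Test image is random noise.
import numpy as np
from PIL import Image

rng = np.random.default_rng(0)
src = rng.integers(0, 256, size=(128, 128, 3), dtype=np.uint8)
img = Image.fromarray(src, "RGB")

pal = img.quantize(colors=256)                       # 8-bit adaptive palette
back = np.asarray(pal.convert("RGB"), dtype=np.int16)

err = np.abs(back - src.astype(np.int16))
print("max per-channel error :", err.max())
print("mean per-channel error:", round(float(err.mean()), 1))
# Two source pixels only a couple of values apart can land in different
# palette entries, so their on-screen difference after quantisation can be
# far larger than in the originals - hence the need for uncompressed PNGs.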
 

slide30.png


slide40.png
 
No prizes for guessing AMD are pointing this out then? It's always fun to see a good mudslinging match between two corporate giants. :p
 
Yes, the performance goes down by an amount that, I would guess, is potentially to do with FP16 vs FP32 filtering.

FP16 filtering is the thing Nvidia complained for YEARS about AMD using; then all of a sudden they've bumped up FP16 filtering capacity in the hardware, and in some games there's an abnormal growth in speed for the 580GTX vs the 480GTX. I.e. most things should be around 16% faster (shaders + overclock), yet a small selection of games seem to show between a 20-40% performance gain, which is roughly the same performance difference AMD got using FP16 filtering where it thinks IQ isn't negatively affected (which is also ONLY used in something like 6-7 games going back as far as Far Cry 1 and Serious Sam), and only when Cat AI was set to Advanced, which basically no one uses as a driver setting in reviews anyway.
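Rough back-of-envelope for that expected gain, using the commonly quoted shader counts and shader clocks (my numbers, not from the article):

Code:
# Expected shader-limited speedup of a GTX 580 over a GTX 480,
# assuming the usually quoted specs (512 shaders @ 1544 MHz vs 480 @ 1401 MHz).
gtx480 = {"shaders": 480, "shader_mhz": 1401}
gtx580 = {"shaders": 512, "shader_mhz": 1544}

shader_ratio = gtx580["shaders"] / gtx480["shaders"]       # ~1.07
clock_ratio = gtx580["shader_mhz"] / gtx480["shader_mhz"]  # ~1.10
expected = shader_ratio * clock_ratio

print(f"expected shader-limited speedup: {(expected - 1) * 100:.0f}%")  # ~18%
# Consistent 20-40% gains in a handful of titles would sit above what raw
# shader throughput alone predicts.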

EDIT: I'll say I have no idea what effect downgrading filtering would have, but I found it highly suspect that Nvidia, up till not long ago, would send out "please don't use these games, pretty please, AMD cheats in them in settings you won't use for reviews" rants in their "review advice packs". Without both a 480/580GTX here to compare myself, it was always just a suspicion that quality had been reduced in some of the better performance-gaining titles. HAWX 2 was one of those that scaled quite a bit beyond the expected 15% performance increase, wasn't it?

But when I saw they'd bumped up FP16 filtering, versus previously complaining about its lack of quality, not thinking it was good enough and complaining about AMD doing it, I suggested this a week or two ago on a couple of forums as a reason for the bigger performance gains between the 580/480GTX.
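For anyone wondering why FP16 vs FP32 filtering is even an image quality question, here's a toy NumPy comparison of a bilinear-style blend done at half vs single precision. The texel values and weights are made up and this says nothing about either vendor's actual filtering hardware:

Code:
# Half precision keeps ~11 bits of significand, so texel values and blend
# results get rounded more coarsely than in single precision.
import numpy as np

texels = np.array([0.7213, 0.7216, 0.7218, 0.7221])  # neighbouring texel values
weights = np.array([0.15, 0.35, 0.35, 0.15])          # bilinear-style weights

fp32 = np.dot(texels.astype(np.float32), weights.astype(np.float32))
fp16 = np.dot(texels.astype(np.float16), weights.astype(np.float16))

print(f"fp32 result: {float(fp32):.6f}")
print(f"fp16 result: {float(fp16):.6f}")
print(f"difference : {abs(float(fp32) - float(fp16)):.6f}")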
 