
Will GTX570 be caught in new app detection cheat?

It's pretty immaterial, but it does make me chuckle!

This is Nvidia...last week!

For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimization gone bad, and vowed to never again perform any optimizations that could compromise image quality. During that time, the industry agreed that any optimization that improved performance, but did not alter IQ, was in fact a valid “optimization”, and any optimization that improved performance but lowered IQ, without letting the user know, was a “cheat”. Special-casing of testing tools should also be considered a “cheat”.

We have had internal discussions as to whether we should forego our position to not reduce image quality behind your back as AMD is doing. We believe our customers would rather we focus our resources to maximize performance and provide an awesome, immersive gaming experience without compromising image quality, than engage in a race to the IQ gutter with AMD.

Both companies are just as bad.
 
Hi Everybody,

What is being experienced is not an “Antialiasing cheat” but rather a HawX bug that is fixed by our driver using an application specific profile.
In a nutshell, the HawX application requests the highest possible AA “sample quality” at a particular AA level from our driver. Without our driver fix, the game would be running 16xCSAA instead of standard 4xAA when you select 4xAA in-game. It runs the proper 4xAA with the driver fix. You defeat the fix by changing the .exe name, causing it to run at 16xCSAA.

You may remember that Geforce 8800 introduced Coverage Sampling AA (CSAA) technology, which added higher quality AA using little extra storage. Prior to 8800 GTX and CSAA, there was only one “sample quality level” for each AA level, so if an application requested four AA samples, the hardware performed standard 4xAA. However, with 8800 GTX GPUs onwards, our drivers expose additional sample quality levels for various standard AA levels which correspond to our CSAA modes at a given standard AA level.

The “sample quality level” feature was the outcome of discussions with Microsoft and game developers. It allowed CSAA to be exposed in the current DirectX framework without major changes. Game developers would be able to take advantage of CSAA with minor tweaks in their code.
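
As a purely illustrative aside (this is not HawX's actual source, and the helper name and surface format are made up), a minimal Direct3D 10-style sketch of how an application ends up in a CSAA mode simply by asking the driver for the highest exposed "sample quality" at a given sample count might look like this:

```cpp
// Hypothetical sketch: request the highest exposed multisample "quality"
// at 4 samples, which on GeForce 8800+ drivers corresponds to a CSAA mode
// (e.g. 16xCSAA) rather than standard 4xMSAA.
#include <d3d10.h>
#include <dxgi.h>

DXGI_SAMPLE_DESC ChooseSampleDesc(ID3D10Device* device, UINT sampleCount)
{
    UINT qualityLevels = 0;
    device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM,
                                          sampleCount, &qualityLevels);

    DXGI_SAMPLE_DESC desc = {};
    desc.Count = sampleCount;                      // e.g. 4 for "4xAA"
    // Quality 0 is standard MSAA at this sample count; higher values are
    // vendor-specific. Picking the top exposed level (as HawX reportedly
    // does) selects a CSAA mode on NVIDIA hardware instead of plain 4xAA.
    desc.Quality = (qualityLevels > 0) ? qualityLevels - 1 : 0;
    return desc;
}
```

Requesting Quality = 0 would give plain 4xMSAA; requesting the top exposed level is what the HawX driver profile has to correct for.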

Unfortunately, HawX requests the highest quality level for 4xAA, but does not give the user the explicit ability to set CSAA levels in their control panel. Without the driver profile fix, 16xCSAA is applied instead of standard 4xAA. Recall that 16xCSAA uses 4 color/Z samples like 4xAA, but also adds 12 coverage samples. (You can read more about CSAA in our GeForce 8800 Tech Briefs on our Website).

When you rename the HawX.exe to something else, the driver profile bits are ignored, and 16xCSAA is used. Thus the modest performance slowdown and higher quality AA as shown in the images.

To use “standard” 4xAA in a renamed HawX executable, you should select any level of anti-aliasing in the game, then go into the NVIDIA Control Panel and set 4xAA for “Antialiasing Setting” and turn on “Enhance the application setting” for the “Antialiasing mode”.
Nick Stam, NVIDIA


Fair enough?
 
My opinion about all this is...
  • If a review site is unable to set comparable quality settings to test the card, then it's cheating.
  • If a company claims "fastest GPU" and another company has the faster one with comparable quality, then it's false advertising.
  • If a company claims "##% increase over previous generation" and this is in any significant part due to a driver improvement that is also applicable to the previous generation, then it's false advertising.

The main blame for these things should lie with the review sites, if they could work around these variables to give a sensible comparison but fail to do so.
 
I think it's long since due that review sites upped their game: made more of an effort to document which quality settings they were using and that the settings were comparable, and presented the % of time a GPU was under 30fps and over 60fps instead of meaningless min/max/avg numbers.
 
And yeah, the dodgy IQ on the AMD cards is more apparent when you see it in action rather than in stills.


Such as? Name all these benchmarks with worse IQ and provide some proof.


ATI Quack, anyone?

If you really want to be pedantic, read what the IQ issue is, and work out what the videos show.

The IQ issue is SUPPOSED to be WORSE AF at the default setting, NOT shimmering.

The videos show shimmering with BOTH AF qualities, and both are BETTER than Cypress/Nvidia AF quality.

Nvidia are complaining about the performance supposedly gained from the WORSE AF, and that increasing the setting is the only way to make it visually on par with an Nvidia card.

Here's where they are wrong: the default setting has BETTER AF than Nvidia, and those videos (the only suggestion of anything I've seen yet) show shimmering in both the supposedly faster AND THE SLOWER driver settings.

Again, Nvidia are not complaining about the shimmering, but about the IQ of the AF. The 6870, in newer 6870-supporting drivers, obviously offers an increase in IQ; obviously the driver has to be aware there is new AF hardware and a new mode to use.

Nvidia were moaning that IQ is worse than in 10.9 along with an increase in performance, nothing to do with shimmering. The fundamental flaw is that AF is improved over 10.9.

I have no clue what that game or benchmark is. I have no doubt new architectures and modes bring with them shimmering and other bugs, but none of this is what Nvidia is complaining about.
 
I think it's long since due that review sites upped their game: made more of an effort to document which quality settings they were using and that the settings were comparable, and presented the % of time a GPU was under 30fps and over 60fps instead of meaningless min/max/avg numbers.

To be fair, they do, that IS what the average is.

If you have min 0fps, average 60fps, and max 61fps, can you not deduce that the minimum fps is incredibly rare and that it's basically spot on 60fps throughout?

What about min 25fps, average 60fps, max 364fps? Again, it's pretty damn easy to work out that the 364fps is an outlier and performance isn't there often.

However, a lot of sites have moved away from showing maximums lately and just do averages, or min and average, which doesn't tell you the complete picture.


At a certain point, tabulating 50 cards' results while also showing mean deviation from the average fps and the 1st and 4th quartiles would just make for a 500-page-long review of graphs that hold no meaning by the end.

I'd like it if a site focused on the benchmark it finds most realistic and showed some more intricate frame rate detail, but for lots of games it would be a nightmare.
 
Unfortunately the numbers are rarely that cut and dried. It doesn't take that much extra effort to produce the results I'm talking about - simply run FRAPS in benchmark mode to get frametime data and graph it.
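
As a rough illustration (assuming a simple two-column CSV of frame number and cumulative timestamp in milliseconds, broadly what a FRAPS frametimes log contains; the file name here is made up), the "% of time under 30fps / over 60fps" figures fall out of the frametime data with very little work:

```cpp
// Rough sketch: read per-frame timestamps (ms) and report the share of
// total time spent below 30 fps and above 60 fps. CSV layout and file
// name are assumptions for illustration, not a documented format.
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main()
{
    std::ifstream in("frametimes.csv");
    std::string line;
    std::getline(in, line);                        // skip header row

    std::vector<double> stamps;                    // cumulative ms per frame
    while (std::getline(in, line)) {
        std::istringstream row(line);
        std::string frame, ms;
        if (std::getline(row, frame, ',') && std::getline(row, ms, ','))
            stamps.push_back(std::stod(ms));
    }

    double total = 0, under30 = 0, over60 = 0;
    for (std::size_t i = 1; i < stamps.size(); ++i) {
        double dt = stamps[i] - stamps[i - 1];     // this frame's duration, ms
        total += dt;
        if (dt > 1000.0 / 30.0) under30 += dt;     // slower than 30 fps
        if (dt < 1000.0 / 60.0) over60  += dt;     // faster than 60 fps
    }

    if (total > 0) {
        std::cout << "Time under 30 fps: " << 100.0 * under30 / total << "%\n";
        std::cout << "Time over 60 fps:  " << 100.0 * over60  / total << "%\n";
    }
}
```

Graphing the same per-frame durations over time would show the stutters and spikes that a bare min/avg/max triple hides.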
 