
Nvidia Shenanigans again?

Let's be honest, ATI don't seem to help game developers anywhere near as much as Nvidia (TWIMTBP) do, so maybe there's your answer (support them).
 
Please show us a link to a case where AMD worked with a game dev to increase performance on their hardware whilst at the same time showing bad performance on Nvidia hardware; I don't think you'll find one.

Show us a case where ATI helped a game developer full stop.
 
Please show us a link to a case where AMD worked with a game dev to increase performance on their hardware whilst at the same time showing bad performance on Nvidia hardware; I don't think you'll find one.

ATi don't tend to do this, that's more an nVidia thing.

What ATi do is add .exe detection to their drivers to use reduced quality 'tweaks' to give higher performance at the cost of eye-candy.


Such as:

FP16 demotion is something the ATI Catalyst driver uses to slightly boost 3D rendering performance in benchmarks or games. What is FP16 demotion? Simply the use of a 32-bit render target instead of a 64-bit render target. In HDR rendering, it's common to use a render target where each channel (R, G, B and A) is coded as a 16-bit floating point number, called FP16. So an FP16 render target uses 16×4 = 64 bits for each texel. In order to reduce memory size and speed up access, the Catalyst drivers use the R11G11B10 mode: the red channel is coded with 11 bits, the green with 11 bits and the blue with 10 bits, for a total of 32 bits per texel. The alpha channel is no longer used.
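To put rough numbers on the savings described above, here's a quick sketch (illustrative arithmetic only, not anything from AMD's driver) of what halving the per-texel size means for a 1080p render target:

```python
# Rough sketch of the storage difference between an FP16 render target
# and an R11G11B10 one. Purely illustrative arithmetic, not driver code.

def render_target_bytes(width, height, bits_per_texel):
    """Total size in bytes of a render target of the given dimensions."""
    return width * height * bits_per_texel // 8

FP16_BITS = 16 * 4           # R, G, B, A each as a 16-bit float -> 64 bits/texel
R11G11B10_BITS = 11 + 11 + 10  # packed format, no alpha channel -> 32 bits/texel

w, h = 1920, 1080
print(render_target_bytes(w, h, FP16_BITS))       # 16588800 bytes (~15.8 MB)
print(render_target_bytes(w, h, R11G11B10_BITS))  # 8294400 bytes (~7.9 MB), exactly half
```

Halving the footprint also halves the bandwidth needed to read and write the target, which is where the benchmark gains come from.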

...

NVIDIA has included a warning about AMD in the NVIDIA GTS 450 Reviewer's Guide:

Important note if you are testing the following applications:

* Dawn of War 2
* Empire Total War
* Need for Speed: Shift
* Oblivion
* Serious Sam II
* Far Cry 1

AMD has admitted that performance optimizations in their driver alters image quality in the above applications. The specific change involves demoting FP16 render targets to R11G11B10 render targets which are half the size and less accurate. The image quality change is subtle, but it alters the workload for benchmarking purposes. The correct way to benchmark these applications is to disable Catalyst AI in AMD’s control panel. Please contact your local AMD PR representative if you have any doubts on the above issue.
NVIDIA’s official driver optimization policy is to never introduce a performance optimization via .exe detection that alters the application’s image quality, however subtle the difference. This is also FutureMark’s policy regarding legitimate driver optimizations.

NOTE: If you wish to test with Need for Speed: Shift or Dawn of War 2, we have enabled support for FP16 demotion – similar to AMD – in R260 drivers for these games. By default, FP16 demotion is off, but it can be toggled on/off with the AMDDemotionHack_OFF.exe and AMDDemotionHack_ON.exe files which can be found on the Press FTP.
For apples-to-apples comparisons with our hardware versus AMD, we ask that you run the AMDDemotionHack_ON.exe when performing your graphics testing with these games. In our own internal testing, speedups of up to 12% can be seen with our hardware with FP16 demotion enabled.



ATi back in the day also used aniso and AA quality tweaks, plus various .exe detection efforts that reduced quality.



Neither company plays entirely fair when it comes to winning benchmarks ;)
 
ATI have cheated in the past, but the examples I can think of are from years ago; Doom 3 is one, and I think they got caught out with 3DMark03 once as well.
 
Yeah, ATI are well known for using dodgy optimisations that reduce IQ to gain faster frames. Quack3, anyone? Word is it's still going on in titles as recent as Crysis.
 
"Given that in their own documents, NVIDIA indicates that the R11G11B10 format 'offers the same dynamic range as FP16 but at half the storage', it would appear to us that our competitor shares the conviction that R11G11B10 is an acceptable alternative"

"The section that the ATI response quoted was included in the NVIDIA DirectX10 Technical Brief (PDF, Page 11), verbatim, seemingly fully supporting the use of FP16 Demotion. Wait, what? Didn't NVIDIA just claim this was a bad thing?"


Read the whole article here, if you haven't already :

http://www.atomicmpc.com.au/Feature...s-and-degrading-game-quality-says-nvidia.aspx


"We also can't help but wonder why NVIDIA attempted to make such an issue over this rendering technique; one that they wholeheartedly support and simultaneously condemn"
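The "same dynamic range, half the storage" claim comes down to the bit layouts: FP16 and the packed 11/10-bit channel formats all use a 5-bit exponent, so they cover the same range of magnitudes; what R11G11B10 gives up is mantissa precision (6 or 5 explicit bits instead of FP16's 10). A small sketch of that trade-off (illustrative only, using the mantissa widths from the DirectX 10 format definitions):

```python
import math

# Explicit mantissa widths of the formats discussed above. All three share
# a 5-bit exponent, which is why their dynamic range is the same.
FP16_MANTISSA = 10     # 16-bit half float
FLOAT11_MANTISSA = 6   # the 11-bit channels of R11G11B10
FLOAT10_MANTISSA = 5   # the 10-bit channel of R11G11B10

def quantize(x, mantissa_bits):
    """Round a positive value to the nearest float with the given number
    of explicit mantissa bits (exponent kept exact)."""
    if x == 0:
        return 0.0
    m, e = math.frexp(x)              # x = m * 2**e with 0.5 <= m < 1
    steps = 2 ** (mantissa_bits + 1)  # representable values per power of two
    return math.ldexp(round(m * steps) / steps, e)

value = 0.123456
for bits in (FP16_MANTISSA, FLOAT11_MANTISSA, FLOAT10_MANTISSA):
    q = quantize(value, bits)
    print(f"{bits:2d} mantissa bits: {q:.9f} (error {abs(q - value):.2e})")
```

The rounding error grows as mantissa bits are removed, which is exactly the subtle image-quality loss both sides are arguing about: same brightness range, coarser colour steps.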
 
ATi don't tend to do this, that's more an nVidia thing.

What ATi do is add .exe detection to their drivers to use reduced quality 'tweaks' to give higher performance at the cost of eye-candy.


ATi back in the day also used Anniso and AA quality tweaks, plus various .exe detection efforts that reduced quality.

Nvidia do this one also. And I think there's a big difference between optimising/cheating in your own drivers and paying others to cheat as well, and then blackmailing those who didn't know what was good for them...
 
Nvidia do this one also. And I think there's a big difference between optimising/cheating in your own drivers and paying others to cheat as well, and then blackmailing those who didn't know what was good for them...

There is no right way to cheat (this applies to both of them), so stop trying to justify anything ATI does as different. It's getting boring.
 
"Given that in their own documents, NVIDIA indicates that the R11G11B10 format 'offers the same dynamic range as FP16 but at half the storage', it would appear to us that our competitor shares the conviction that R11G11B10 is an acceptable alternative"

"The section that the ATI response quoted was included in the NVIDIA DirectX10 Technical Brief (PDF, Page 11), verbatim, seemingly fully supporting the use of FP16 Demotion. Wait, what? Didn't NVIDIA just claim this was a bad thing?"


Read the whole article here, if you haven't already :

http://www.atomicmpc.com.au/Feature...s-and-degrading-game-quality-says-nvidia.aspx


"We also can't help but wonder why NVIDIA attempted to make such an issue over this rendering technique; one that they wholeheartedly support and simultaneously condemn"

It's slightly different though isn't it?

Ati were putting in sneaky driver optimisations that reduce quality to improve performance in specific games that are generally used for benchmarking, in order to inflate scores. It's not a matter of whether or not Nvidia think it's a good idea.

That's not what's happening in the case of HAWX 2 here. AMD are performing worse because the Nvidia cards are simply better at the tests. Nvidia are trying to push a test they know they'll do better in, sure, but I don't see that as anywhere near the same as reducing quality to improve benchmarks.

I guess it's too much to expect games developers to be competent at developing software these days without receiving large sums of 'help'...

Jog on. If Nvidia are willing to help fund development and assist with the actual development work, why would any sane developer turn them down?

If you were a developer wouldn't you want your game to work as well as possible on given hardware? If you're offered help to do this you're not going to turn it down just to be "fair" on AMD. AMD should offer to help more developers.
 
Of course I believe what I am typing; what is that supposed to mean?

So saying that I believe cheating is wrong no matter what form it takes is a strange statement?

Do you advocate cheating as long as it suits your needs?
 