Nvidia cheating again, this time Crysis

How could it be a bug that is fixed when you change the name of the executable? Yeah, right...
It's very simple if you understand how drivers are made these days. Every driver release identifies the exe you're running and applies specific optimizations for that game. All drivers have done this for a long time now, and it's a good thing, because the manufacturers can tweak buffer sizes, data transfers, texture placement etc., all for a specific game, giving a genuine performance increase. So all we have here is an optimization for a specific game, in a beta driver, that isn't quite compatible with the brand-new GT. Renaming the exe means the Crysis-specific optimizations aren't used, so the bug doesn't show up. End of story.
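
Roughly, the lookup works like the sketch below. This is a minimal illustration only, not real NVIDIA/ATi driver code; it assumes a user-mode component that keys tuning off the exe name, and all the profile fields, names and values are invented:

```cpp
// Hypothetical sketch -- not real driver code. Assumes a user-mode driver
// component that selects per-game tuning by the host executable's name.
#include <cstddef>
#include <iostream>
#include <map>
#include <string>

struct GameProfile {
    std::size_t vertexBufferKiB;    // illustrative "buffer sizes" knob
    bool        asyncTextureUpload; // illustrative "data transfers" knob
    int         texturePoolSlot;    // illustrative "texture placement" knob
};

// Profiles shipped inside the driver, matched purely on exe name.
static const std::map<std::string, GameProfile> kProfiles = {
    {"crysis.exe", {8192, true, 2}},
    {"bf2.exe",    {4096, false, 1}},
};

GameProfile ProfileFor(const std::string& exeName) {
    auto it = kProfiles.find(exeName);
    if (it != kProfiles.end()) return it->second; // game-specific tuning
    return {2048, false, 0};                      // generic defaults
}

int main() {
    // A renamed exe misses the lookup and falls back to the generic path,
    // which is why renaming makes a profile-specific bug (or boost) vanish.
    std::cout << ProfileFor("crysis.exe").vertexBufferKiB << '\n';  // 8192
    std::cout << ProfileFor("renamed.exe").vertexBufferKiB << '\n'; // 2048
}
```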
 
Yeah, they are, but then why do the final 169.02 WHQL drivers do the same (according to those who've tested them)?

They will fix it; there's another set of drivers coming this week that will miraculously render the game correctly, you can be sure of that now that they've been rumbled. And how can you say it's not cheating when it's only the 169.xx drivers that do it? All drivers before the 169s render the game correctly, so it's no driver bug, and no, it isn't just because the game's new. It's obvious they've added the cheats into these drivers to boost the performance, otherwise they would have been in every driver set before the 169s as well.

Oh dear, you're so adamant that this is a cheat that you completely ignore/forget/fail to understand why it probably isn't.

The profile managers in both ATi's and nVidia's drivers are keyed on the executable's name, and the matched profile changes how the card does its magic. Why else would the driver care what the executable is called?

You yourself know that ATi released a driver during the X1800's reign that changed how the ring bus was programmed to boost performance in OpenGL games. New driver releases have this sort of optimisation all the time; sometimes they just don't have the desired outcome. After all, despite coding for a set number of cards, nVidia simply can't compensate for all the conditions their cards will be used in (different platforms etc.).

This is no different to the aforementioned OpenGL tweak going wrong, although the OpenGL tweak most likely didn't need executable-name recognition, as a driver should be able to recognise a different API without needing the file name.

The 6800 series, as I said earlier, had a problem in CS:S with the grass and water when they tried to tweak the way shaders were used to render those items faster. The outcome was shiny, white-specked grass and wee-wee-coloured water.

Was that their intention? No; the intention was to render those objects faster with the end user none the wiser, because there was supposed to be no change in appearance.

Thank god we get value tin foil, you must go through rolls of it!
 
Oh dear, you're so adamant that this is a cheat that you completely ignore/forget/fail to understand why it probably isn't.

The profile managers in both ATi's and nVidia's drivers are keyed on the executable's name, and the matched profile changes how the card does its magic. Why else would the driver care what the executable is called?

You yourself know that ATi released a driver during the X1800's reign that changed how the ring bus was programmed to boost performance in OpenGL games. New driver releases have this sort of optimisation all the time; sometimes they just don't have the desired outcome. After all, despite coding for a set number of cards, nVidia simply can't compensate for all the conditions their cards will be used in (different platforms etc.).

This is no different to the aforementioned OpenGL tweak going wrong, although the OpenGL tweak most likely didn't need executable-name recognition, as a driver should be able to recognise a different API without needing the file name.

The 6800 series, as I said earlier, had a problem in CS:S with the grass and water when they tried to tweak the way shaders were used to render those items faster. The outcome was shiny, white-specked grass and wee-wee-coloured water.

Was that their intention? No; the intention was to render those objects faster with the end user none the wiser, because there was supposed to be no change in appearance.

Thank god we get value tin foil, you must go through rolls of it!

Great post, thanks a bunch!
 
Cheating is bad, m'kay! :(

Nvidia and AMD should just let the card render the game the way the developers intended, not mess with it to make it look worse for the sake of a few frames per second. We can do that in the game options if we have to.

The only tweak drivers should make, IMHO, is adding AA/AF if the game has no native support.
 
It's very simple if you understand how drivers are made these days. Every driver release identifies the exe you're running and applies specific optimizations for that game. All drivers have done this for a long time now, and it's a good thing, because the manufacturers can tweak buffer sizes, data transfers, texture placement etc., all for a specific game, giving a genuine performance increase. So all we have here is an optimization for a specific game, in a beta driver, that isn't quite compatible with the brand-new GT. Renaming the exe means the Crysis-specific optimizations aren't used, so the bug doesn't show up. End of story.

That's so not the case. By the sounds of it, Nvidia has drivers which identify games likely to be used in reviews and put a load of optimisations in for them. This is not a great idea, as it means they have to release drivers whenever a "review" game comes out. Now, as I stated in my previous post, this is not cheating, just a very rigid/high-maintenance way of doing things. It also hides the fact that they might underperform in certain situations, and means that if you buy a game that isn't mainstream, your prized graphics card might not perform all that well, as Nvidia are ignoring it because it's not a "review" game.

I will wait on further news as to whether these are game optimisations or corner-cutting cheats.
 
Nvidia has obviously put something into their drivers that picks up the game and does something specific for it. Is this cheating? Maybe; at best it shows they are unable to do things "right". I don't think putting in optimisations for benchmark/review games is a good thing to do.

Maybe they're unable to get it right, but let's be honest: with all the rubbish ATi had pre-7.10 drivers, Nvidia's drivers have generally been superb. I remember the only time a 2900 XT was getting above the GTX was in World in Conflict, and that was due to an ATi bug.
 
Cheating is bad, m'kay! :(

Nvidia and AMD should just let the card render the game the way the developers intended, not mess with it to make it look worse for the sake of a few frames per second. We can do that in the game options if we have to.

The only tweak drivers should make, IMHO, is adding AA/AF if the game has no native support.

Sounds stupid tbh. When they make the drivers originally they have no idea what game developers are going to do in the future; if they can improve performance with no loss of quality, they'd be shooting themselves in the foot not to.

If you don't want to update your drivers you don't have to; stick with the release drivers for your card.
 
Sounds stupid tbh. When they make the drivers originally they have no idea what game developers are going to do in the future; if they can improve performance with no loss of quality, they'd be shooting themselves in the foot not to.

If you don't want to update your drivers you don't have to; stick with the release drivers for your card.
Erm, these drivers came out after the beta and demo, so they did know. Apparently previous driver versions did not have this "optimisation", or am I missing something? Either way, GTFO of my face; I'm not in the mood for forum drama today, so let's play nice or just find someone else to argue with.
 
That's so not the case. By the sounds of it, Nvidia has drivers which identify games likely to be used in reviews and put a load of optimisations in for them. This is not a great idea, as it means they have to release drivers whenever a "review" game comes out. Now, as I stated in my previous post, this is not cheating, just a very rigid/high-maintenance way of doing things. It also hides the fact that they might underperform in certain situations, and means that if you buy a game that isn't mainstream, your prized graphics card might not perform all that well, as Nvidia are ignoring it because it's not a "review" game.

I will wait on further news as to whether these are game optimisations or corner-cutting cheats.

You're wrong: game-engine optimisations by file name are not just put into drivers for reviews. They're used in every driver you've installed on your system for as long as you've owned a graphics-accelerator card.

Game profiles exist in both ATi's and nVidia's driver suites. Not every game is rendered the same way, which is why drivers require a profile (note: not put in to cheat) to make the card handle things differently and increase performance. If they didn't, everyone would be complaining about poor performance.

You see it all the time in ATi's release notes: games at a certain resolution get performance improvements, and they even give percentages. The graphics card is still the same in terms of specification, so they've simply found a way to make the card perform better with how the engine uses it to render a scene.

Everyone has seen that Crysis takes a lot of power to run; would you rather they sat around and did nothing about it?

When the BF2 online demo was released (the Gulf of Oman map), nVidia packed an updated ForceWare with it to fix a bug in previous drivers where the ground was being rendered in black. You can't just write a driver and expect it to be compatible with every game, current, past and future.
 
You see it all the time in ATi's release notes: games at a certain resolution get performance improvements, and they even give percentages. The graphics card is still the same in terms of specification, so they've simply found a way to make the card perform better with how the engine uses it to render a scene.
Aye, I agree with you on this; it's something that's always been done, but can't they do it without interfering with image quality? I mean, the optimisations they usually do seem to manage without creating graphical glitches and things.

Seems like they just optimise the driver code most of the time, as opposed to chopping stuff out.
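
For the record, "chopping stuff out" usually means something like shader replacement. Here's a minimal, purely illustrative sketch of the idea; the names, hash choice and bytes are all invented, not taken from any real driver:

```cpp
// Hypothetical sketch: one way a driver could "chop stuff" per game.
// It fingerprints a game's shader bytecode and swaps in a hand-tuned variant;
// if the variant isn't exactly equivalent (e.g. lower precision), image
// quality changes -- the 6800/CS:S grass-and-water case.
#include <cstdint>
#include <unordered_map>
#include <vector>

using Bytecode = std::vector<std::uint8_t>;

// FNV-1a (64-bit), used here purely for illustration.
std::uint64_t Fingerprint(const Bytecode& bc) {
    std::uint64_t h = 0xcbf29ce484222325ull;
    for (std::uint8_t b : bc) { h ^= b; h *= 0x100000001b3ull; }
    return h;
}

// Replacement table the driver would ship with (illustrative).
std::unordered_map<std::uint64_t, Bytecode> gReplacements;

const Bytecode& MaybeSubstitute(const Bytecode& original) {
    auto it = gReplacements.find(Fingerprint(original));
    // Equivalent substitute: pure speed-up. Approximate substitute:
    // faster frames, but visible differences.
    return it != gReplacements.end() ? it->second : original;
}

int main() {
    Bytecode shader = {0x01, 0x02, 0x03};        // stand-in for real bytecode
    gReplacements[Fingerprint(shader)] = {0x0a}; // register a "tuned" variant
    return MaybeSubstitute(shader).size() == 1 ? 0 : 1;
}
```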
 
Erm, these drivers came out after the beta and demo, so they did know. Apparently previous driver versions did not have this "optimisation", or am I missing something? :confused: Either way, GTFO of my face; I'm not in the mood for forum drama today, so let's play nice or just find someone else to argue with.

GPU drivers that improve performance keep being released months down the line after the games themselves; just look at some of ATi's public changelogs to see this, with some of the games years old.

You can't just leave a game to run however the developer desired; if you take that stance, then something as good as the Chuck Patch would never have existed.
 
GPU drivers that improve performance keep being released months down the line after the games themselves; just look at some of ATi's public changelogs to see this, with some of the games years old.

You can't just leave a game to run however the developer desired; if you take that stance, then something as good as the Chuck Patch would never have existed.
Aye, I know what you mean, and while the Chuck Patch was great, it too did some nasty things to image quality (the Oblivion gate shimmering effect being visible through objects, etc.).

Maybe the Nvidia/AMD driver teams and game developers just need to work together a bit more on issues like this.

Edit: I think we're replying to each other too quickly. :p
 
Aye, I agree with you on this; it's something that's always been done, but can't they do it without interfering with image quality? I mean, the optimisations they usually do seem to manage without creating graphical glitches and things.

Seems like they just optimise the driver code most of the time, as opposed to chopping stuff out.

I'm sure they can and most probably do, but you get the odd glitch here and there in all software. Taking into consideration different cards, platforms, resolutions, operating systems and much more, it would be hard to test a driver under all circumstances.
 
Aye, I know what you mean, and while the Chuck Patch was great, it too did some nasty things to image quality (the Oblivion gate shimmering effect being visible through objects, etc.).

Maybe the Nvidia/AMD driver teams and game developers just need to work together a bit more on issues like this.

Edit: I think we're replying to each other too quickly. :p

It's like MSN :D

It's probably one of those things where two companies come together, but each obviously specialises in its own field; they may know a bit about each other's fields, but not enough for both to change things about the game/driver to make it run as smoothly as possible.
 
It would be cheating if it only happened in the benchmark, but this happens when playing the game as well, so I would call it an optimisation bug in BETA drivers.
 
It would be cheating if it only happened in the benchmark, but this happens when playing the game as well, so I would call it an optimisation bug in BETA drivers.
True, but don't the gameplay and the timedemo share the same .exe file and code? If so, I would think it'd be impossible for the drivers to distinguish between them. Then again, I don't know much about code etc.
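
For what it's worth, here's a hedged illustration of that point: a user-mode driver component loaded into the game's process can see the exe's path via a standard Win32 call, and it gets the same answer whether you're playing or running the timedemo, so a purely name-based check can't separate the two:

```cpp
// Hypothetical, Windows-only sketch. GetModuleFileNameA is a real Win32
// call; the scenario around it is assumed for illustration.
#include <windows.h>
#include <iostream>

int main() {
    char path[MAX_PATH] = {};
    // A null module handle means "the executable that started this process".
    GetModuleFileNameA(nullptr, path, MAX_PATH);
    // Prints e.g. C:\...\Crysis.exe in both gameplay and timedemo mode.
    std::cout << path << '\n';
}
```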
 
True, but don't the gameplay and the timedemo share the same .exe file and code? If so, I would think it'd be impossible for the drivers to distinguish between them. Then again, I don't know much about code etc.
True; I therefore can't imagine Nvidia taking the chance and 'cheating' on purpose.
 
Mekrel: Did you see the part where I said I was "waiting to see if it's a fudge or a corner-cutting cheat"? I think it's fairly obvious that having "optimisations" for individual games/engines is stupid. It shows poor foundations in either the gfx card or the game engine. Imagine if each car needed specialised tyres.

Also, in terms of "both" companies doing it: yes, I see the release notes, but I don't recall ATI's fps dropping when you change the exe, which means one of a few things:
* Things have changed and I'm mistaken!
* ATI has a better way of working out what game is running.
* ATI finds an "issue" with how the drivers were working and fixes it as an optimisation for ALL games that work in a given way, but advertises the change for a single game (i.e. they say Crysis can become 10% faster, but other games would also gain from the change they have made).

I wouldn't be totally surprised if ATI did something similar, but as I said, it sounds like a bodge job if you need to "fix" things for single games.
 
Do we know that the image problems are a direct result of the driver optimisations, or are they two separate issues?

I would actually like this question to be answered, please. The usual people have just jumped straight on the hate train without giving it a thought.
 