
Nvidia unofficially, officially caught cheating in Crysis 2

Em... isn't that a bit of a double standard here? Because I am getting the sense of people here painting a clear picture of "whenever AMD/ATI performance suffers, Nvidia MUST BE the big bad behind it", despite most of these being baseless accusations without solid evidence.

By that logic, maybe there should be a conspiracy theory about game developers secretly working with AMD to code their games to "over-use" VRAM to make AMD 2GB cards seem better than Nvidia cards (i.e. Shogun 2)? :o

Even IF Nvidia was responsible for the "over-use" of tessellation in Crysis 2 (not that there's ANY evidence pointing to that), does AMD have anyone to blame but themselves for failing to deliver tessellation performance competitive with their rival's, when they clearly market their cards as DX11 cards? If anything, it is their weakness in tessellation that created the opportunity for their rival to exploit, thus leading to crippled performance across ALL graphics cards.

It's getting a bit tiring to see people keep painting AMD as a "victim" and Nvidia as a "big bully", even though the truth is that there are areas of AMD products that can be improved. Bashing Nvidia for what they might or might not have done is pointless... the simple truth is that if AMD is truly Nvidia's rival, their customers should expect them to deliver comparable DX11/tessellation performance... it is as simple as that.

You beat me to it; this post is lacking so much reason and logic that I just don't know where I would begin, and even if I did, I would just be repeating what others have already said...

Now I remember why I can only tolerate this section of the forum in very small infrequent doses.
 
The best way for AMD to combat this deviousness is to design a card that doesn't suck ass at tessellation.

I agree, but then they are just being forced to waste transistors and die space for no good reason, and it's not like they suck at tessellation either when it's implemented as it should be.
I just don't get why most Nvidia users don't realise that Nvidia is hurting its own customers as well by pulling this crap, me included.
 
Without wishing to open another can of worms, I'd like to know who decides whether something has been 'implemented as it should be'. I'm sure both groups of fanatics could argue that their favoured manufacturer has the correct implementation of a specific feature.
 
Exactly, if it runs like crap on high end AMD cards, then anything less than a 480/580 GTX will run just as bad.

Does Crysis 2 run like crap on high end AMD cards? Serious question, I've not looked into benchmark performance on this game.

As for the performance vs NV low-mid range cards, the chart at the bottom of this page does show that under heavy tessellation loads the 6970 just beats out the 460. http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/23
 
It's the Illuminati. They have added tessellation all over the game to make it unplayable for the poor Muslims and blacks, leaving the white supremacists and Jews to finish this game because they can afford better hardware!
 
Laughing out loud, I didn’t realise it was

NATIONAL SENSATIONALIST HEADLINE DAY.

You do realise you're arguing about the fact that a game is too detailed? Isn't this exactly what we want as gamers, more detail in games? Yes, I agree that going by that article the tessellation seems to be used on what are unnecessarily detailed objects, but is this Nvidia's fault? Hardly. In that article Nvidia are mentioned 4 times.

http://techreport.com/articles.x/21404/6

'The trouble comes when, as sometimes happens, the game developer and GPU maker conspire to add a little special sauce to a game in a way that doesn't benefit the larger PC gaming community. There is precedent for this sort of thing in the DX11 era. Both the Unigine Heaven demo and Tom Clancy's HAWX 2 cranked up the polygon counts in questionable ways that seemed to inflate the geometry processing load without providing a proportionate increase in visual quality.'

' Few games have shown a similar effect, simply because they don't push enough polygons to strain the Radeons' geometry processing rates. However, with all of its geometric detail, the DX11 upgraded version of Crysis 2 now manages to push that envelope. The guys at Hardware.fr found that enabling tessellation dropped the frame rates on recent Radeons by 31-38%. The competing GeForces only suffered slowdowns of 17-21%.'

'As a publication that reviews GPUs, we have some recourse, as well. One of our options is to cap the tessellation factor on Radeon cards in future testing. Another is simply to skip Crysis 2 and focus on testing other games. Yet another is to exclude Crysis 2 results from our overall calculation of performance for our value scatter plots, as we've done with HAWX 2 in the past. We haven't decided exactly what we'll do going forward, and we may take things on a case-by-case basis. Whatever we choose, though, we'll be sure to point folks to this little article as we present our results, so they can understand why Crysis 2 may not be the most reliable indicator of comparative GPU performance.'

It's funny how lesser developers can handle tessellation with no problem, and better than the very adept Crytek:

[attached image]


Considering that this is an animated object, not a plank of flat wood or a flat piece of concrete!
 
Without wishing to open another can of worms, I'd like to know who decides whether something has been 'implemented as it should be'. I'm sure both groups of fanatics could argue that their favoured manufacturer has the correct implementation of a specific feature.

Tessellation is not being implemented correctly if either of the following applies:

a) Excessive amounts of tessellation are packed into extremely small areas, even areas smaller than a single pixel, which doesn't actually provide better image quality.

b) Tessellation/polygons are squandered needlessly where they could be better spent elsewhere, somewhere the difference can actually be observed (a rough sketch of the usual screen-space heuristic follows below).
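For what it's worth, here is a minimal C sketch of the kind of screen-space adaptive heuristic people usually mean by "implemented as it should be": derive the tessellation factor from how big the patch edge actually is on screen, so flat or distant geometry doesn't get subdivided into sub-pixel triangles. The function name, the simplified projection and the pixel target are all illustrative assumptions on my part; nothing here is taken from CryEngine or from either vendor.

#include <math.h>

#define MAX_TESS_FACTOR 64.0f   /* DX11 per-edge hardware limit */

/* Illustrative sketch only (not Crysis 2's actual code): choose a
 * per-edge tessellation factor so the generated triangles end up
 * roughly target_px pixels across on screen, instead of blindly
 * applying the maximum factor everywhere.
 *
 * edge_len_world: length of the patch edge in world units
 * edge_dist:      distance from the camera to the edge midpoint (> 0)
 * viewport_h:     viewport height in pixels
 * fov_y:          vertical field of view in radians
 * target_px:      desired size of the generated segments in pixels
 */
float adaptive_tess_factor(float edge_len_world, float edge_dist,
                           float viewport_h, float fov_y, float target_px)
{
    /* Rough projected size of the edge in pixels. */
    float pixels_per_unit = viewport_h / (2.0f * edge_dist * tanf(fov_y * 0.5f));
    float edge_len_px = edge_len_world * pixels_per_unit;

    /* Subdivide so each generated segment covers about target_px pixels.
     * A flat concrete slab far from the camera projects to only a few
     * pixels and gets a factor near 1; a close-up curved surface gets a
     * higher factor, where the extra geometry is actually visible. */
    float factor = edge_len_px / target_px;

    if (factor < 1.0f) factor = 1.0f;          /* never below "no subdivision" */
    if (factor > MAX_TESS_FACTOR) factor = MAX_TESS_FACTOR;
    return factor;
}

Capping the factor, like the driver tessellation slider mentioned later in this thread or the test cap TechReport talks about, is effectively just the final clamp in that function; the point is that the factor should come from what is actually visible on screen, not from a fixed maximum applied everywhere.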
 
Does Crysis 2 run like crap on high end AMD cards? Serious question, I've not looked into benchmark performance on this game.

As for the performance vs NV low-mid range cards, the chart at the bottom of this page does show that under heavy tessellation loads the 6970 just beats out the 460. http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/23

As it should, being ATI's top-of-the-range graphics card, compared to an upper mid range Nvidia card.
 
I don't care who's at fault, it's just stupid.

That amount of tessellation is not needed; it adds nothing and wastes power over NOTHING.

It's balls, utter balls.

And the fact that the tessellated ocean appears when you look at a puddle is just lol-worthy.

Good job failtek.
 
Who even said Crytek had anything to do with the add-on anyway? It wouldn't be the first time Nvidia have worked on an update for a game and then given the code to a dev; I'm pretty sure all the CUDA stuff in Just Cause 2 was developed by Nvidia, the AA in Batman, etc.
 
The question is, of course: does Crysis 2 actually run badly with all the settings turned on? I believe it is technically the most advanced game engine out there with all the patches applied.
 
The question is, of course: does Crysis 2 actually run badly with all the settings turned on? I believe it is technically the most advanced game engine out there with all the patches applied.

Runs like crap NOW on full settings.


I'll wait for the 11.8 drivers & try the tessellation slider.
 
Is it any surprise that Crysis 2 has been designed to be unplayable at Ultra settings on any current card, plus any card produced in the next 3-4 years?

The original Crysis was released in 2007, and current top end cards still struggle. There have been much prettier games released since Crysis that do not require the same amount of horsepower. It has been made primarily as a benchmark, and the game itself is a bonus :).

It's not stunning quality, and NO cards can play it at all; it's an increase in GPU demands, with a reduction in performance, for NO IQ improvement in the slightest.

This is the problem: Crysis 1 on super mega ultra tweaked settings played like a dog turd on the cards of the time, but it looked better than anything before by a long margin, and for a good couple of years afterwards things got closer but I don't think it was surpassed by anything.

Crysis 2 now needs stupid hardware at max settings for really good framerates, but it doesn't look any better.

If Nvidia added all that tessellation and the difference was visible across many, many different textures and areas, and Nvidia lost less performance than AMD, that's one thing. When they add it to things where it makes no difference at all, and it kills performance for the singular reason of killing performance, they are harming AMD users and, critically, harming Nvidia users.

You buy a £350 GTX 580, maybe two of them, then the company you gave the money to kills your performance JUST to harm people who bought AMD hardware. Honestly it's beyond ridiculous, it's pathetic, and it should be widely known.

IF AMD did the same, hurting their own performance just to hurt Nvidia as well, or hurting Nvidia-only performance in any of the ways Nvidia has, I'd be all over them; it's bad business, it's screwing your own customers, it's screwing the industry.

How much time was wasted making flat objects take several times more GPU power to still be flat objects, when the same people could have been working on improving the damn game?
 