
Nvidia unofficially, officially caught cheating in Crysis 2

OK rancidelf, tell me what coding has to do with the model of that concrete sleeper shown above. Tell me how coding has anything to do with that model being overly dense. Tell me how that model, whose maximum poly count iteration was created by a modeller at Crytek, has anything to do with coding.

We are not talking about AMD vs NVIDIA tessellation efficiency. We are talking about the models in the games being pointlessly overly dense.

Yes, you are right. And I am talking about a company doing this deliberately to make Ati look bad. Trust me, the programmer knows what the models are doing, because it's his code rendering the models on the screen. Any programmer would have noticed the frame rate drops between two different bits of hardware and would have looked into it. He would either have optimised his code to handle the rendering of the objects differently depending on the card, got the modeller to do the objects differently, or just settled for the end result.

What is being said in here is that Crytek deliberately did it to show up Ati. This is my argument: it's a ridiculous idea, and I have stated my reasons why.
 
They do the same with PhysX: they add gimmicky effects at a level that is just not required and impacts performance far too much, making CPU PhysX a no-go for most.

It's what Nvidia do best: screw over their users for the sake of some marketing.
 
Laughing out loud, I didn’t realise it was

NATIONAL SENSATIONALIST HEADLINE DAY.

You do realise you're arguing about the fact that a game is too detailed? Isn't this exactly what we want as gamers, more detail in games? Yes, I agree that going by that article the tessellation seems to be used on unnecessarily detailed objects, but is this Nvidia's fault? Hardly. In that article Nvidia are mentioned four times.

As a very high-profile title, Crysis 2 has gotten lots of support from Nvidia in various forms. In and of itself, such support is generally a good thing for PC gaming. In fact, we doubt the DX11 patch for this game would even exist without Nvidia's urging. We know for a fact that folks at Nvidia were disappointed about how the initial Crysis 2 release played out

Unnecessary geometric detail slows down all GPUs, of course, but it just so happens to have a much larger effect on DX11-capable AMD Radeons than it does on DX11-capable Nvidia GeForces.

So where does it say that it is Nvidia's fault for the way Crytek have used tessellation?



I'll make a flat out no questions about it accusation/statement.

Crysis 2's devs could have released the patch with tessellation disabled, working much faster, with no drop in IQ. They have BEEN PAID to sabotage performance by adding in lots of woeful-quality tessellated objects that remain almost completely flat, just to hurt AMD performance worse than Nvidia performance.

WOW, that is a very strong accusation/statement to make. Can we have your proof/evidence please, or is it just another case of

NATIONAL SENSATIONALIST HEADLINE DAY.
 
Yes, you are right. And I am talking about a company doing this deliberately to make Ati look bad. Trust me, the programmer knows what the models are doing, because it's his code rendering the models on the screen. Any programmer would have noticed the frame rate drops between two different bits of hardware and would have looked into it. He would either have optimised his code to handle the rendering of the objects differently depending on the card, got the modeller to do the objects differently, or just settled for the end result.

What is being said in here is that Crytek deliberately did it to show up Ati. This is my argument: it's a ridiculous idea, and I have stated my reasons why.

But they clearly didn't do the bit in bold, so clearly the programmer didn't know what he was doing, by your own example.
 
Laughing out loud, I didn’t realise it was

NATIONAL SENSATIONALIST HEADLINE DAY.

You do realise you're arguing about the fact that a game is too detailed? Isn't this exactly what we want as gamers, more detail in games? Yes, I agree that going by that article the tessellation seems to be used on unnecessarily detailed objects, but is this Nvidia's fault? Hardly. In that article Nvidia are mentioned four times.





So where does it say that it is Nvidia's fault for the way Crytek have used tessellation?





WOW, that is a very strong accusation/statement to make. Can we have your proof/evidence please, or is it just another case of

NATIONAL SENSATIONALIST HEADLINE DAY.

At last, someone who can read without his rose-tinted glasses!

Same old people jumping on the same old bandwagons.
 
Bru, that poly count could have gone somewhere that counted.

-EDIT original comment, somehow managed to edit instead of post

This is the problem though: either Crytek have gone amazingly incompetent, something fishy is going on, or Crytek really don't give a **** about PC gaming anymore.

What modeller in their right mind would tessellate a flat surface to that degree, and what lead artist would let such sloppy work past his approval? What programmer would see the maximum tessellation version of the model and say, hey, that's fine, I don't mind optimising my backside off so you can waste it all on pointless detail that won't be seen?

OK, after that I think Crytek don't care about PC gaming and thought, hey, DX11 patch, let's mash the tessellate button in MAX and leave it at that.
 
This is the problem though: either Crytek have gone amazingly incompetent, something fishy is going on, or Crytek really don't give a **** about PC gaming anymore.

What modeller in their right mind would tessellate a flat surface to that degree, and what lead artist would let such sloppy work past his approval? What programmer would see the maximum tessellation version of the model and say, hey, that's fine, I don't mind optimising my backside off so you can waste it all on pointless detail that won't be seen?

OK, after that I think Crytek don't care about PC gaming and thought, hey, DX11 patch, let's mash the tessellate button in MAX and leave it at that.

And it doesn't help to clear Crytek of the claims that Crysis is an unoptimised mess, either.
 
This is not the point I have been making. My point from the start has been: "To think Crytek deliberately did it that way to show up Ati is ludicrous."

Crytek did it because NV paid for it & not because Crytek personally wanted to hurt ATI.

But either way you look at it, the tessellation usage in that game is just flat-out wrong.

And in every case of bad overuse of tessellation, it's been in an NV-sponsored game.
 
I do things at work which I know full well are rubbish at times, way below the standard that I can produce, & the simple fact is that if I'm told to do it that way, then I do it.
 
You do realise you're arguing about the fact that a game is too detailed? Isn't this exactly what we want as gamers, more detail in games?

No, we're arguing that they have purposely used too many polygons in an area that could have been made up with much, much less and still had exactly the same graphical quality. They have made what is effectively a box out of thousands, if not hundreds of thousands, of polygons, whereas they could have done it with half a dozen, with a lot more used on the chain part at the top, which is what would actually have been required and wouldn't have tanked the framerate so much.
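The waste being described above is easy to put rough numbers on. This is a back-of-envelope sketch, not Crytek's actual asset data: it assumes a box made of six quad patches and the usual rule of thumb that a quad patch tessellated at a uniform integer factor f yields about 2 * f^2 triangles.

```python
def quad_patch_triangles(factor: int) -> int:
    """Approximate triangle count for one quad patch at a uniform
    tessellation factor: an f x f grid of sub-quads, 2 triangles each."""
    return 2 * factor * factor

# A flat box needs roughly six quad patches (half a dozen faces).
flat_box_patches = 6

low = flat_box_patches * quad_patch_triangles(1)    # sensible flat box
high = flat_box_patches * quad_patch_triangles(64)  # factor maxed out

print(f"factor 1:  {low} triangles")                       # 12
print(f"factor 64: {high} triangles, {high // low}x the work for the same flat box")
```

At a maximum DX11 factor of 64, the same visually flat box costs 49,152 triangles instead of 12, which is the "thousands of polygons where half a dozen would do" complaint in numbers.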
 
You realise this also hurts most of Nvidia's own cards, right? I'm guessing not, because you don't actually seem to understand the article at all and just jump in with fanboy comments.

Hehe, fanboy comments. I am no fanboy; I buy Nvidia because I have never had problems. I've never tried AMD simply because I have had zero issues with Nvidia; if it's not broke, why try and fix it?
All my comments in this thread are light-hearted and jokey, to try and have a laugh, lol. If you can't take that, I guess you shouldn't be using a forum.
 
Crytek did it because NV paid for it & not because Crytek personally wanted to hurt ATI.

But either way you look at it, the tessellation usage in that game is just flat-out wrong.

And in every case of bad overuse of tessellation, it's been in an NV-sponsored game.

Can you supply me a link or something that actually shows that NV paid Crytek? I do not mean a link to a site where people are speculating; I mean actual evidence that NV pays a company.

You mention sponsorship. Well, you know this can come in the form of tech support for developers. It does not mean NV are actually paying companies.
 
Can you supply me a link or something that actually shows that NV paid Crytek? I do not mean a link to a site where people are speculating; I mean actual evidence that NV pays a company.

You mention sponsorship. Well, you know this can come in the form of tech support for developers. It does not mean NV are actually paying companies.

Seriously, just use Google...
 
Can you supply me a link or something that actually shows that NV paid Crytek? I do not mean a link to a site where people are speculating; I mean actual evidence that NV pays a company.

You mention sponsorship. Well, you know this can come in the form of tech support for developers. It does not mean NV are actually paying companies.

You're right about my general use of the word sponsorship, but other than that, the specifics are that they actually paid for tessellation to be implemented as well.

This was well before anyone got hold of any DX11 code for the game, & the predictions came true.
http://www.kitguru.net/components/graphic-cards/faith/nvidias-2-million-crysis/
 
There is no need for that amount of tessellation on a model like that; there is literally no need at all. Take it from someone with experience making things like this day in, day out: you would be laughed out of any office for an atrocity like the one in the example picture tommybhoy provided.

Yup, I agree. It's so poorly implemented I struggle to believe even nVidia had anything to do with it... I even struggle to believe someone working for Crytek had anything to do with it. There are, however, times when you do need a higher level of tessellation overall on a model than is required across all of it, to correctly get some parts at the resolution needed for them to work best - though not to the level shown in these screenshots, unless you absolutely have no idea what you're doing...
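The sane alternative being alluded to above is adaptive tessellation: pick the factor per edge from its projected screen size, so flat or distant geometry gets a low factor while close-up detail gets a high one. This is an illustrative sketch only; the function name and the target of ~8 pixels per tessellated segment are assumptions for the example, not engine code from Crysis 2.

```python
def edge_tess_factor(edge_screen_px: float,
                     px_per_segment: float = 8.0,
                     max_factor: float = 64.0) -> float:
    """Pick a tessellation factor from an edge's projected length in
    pixels, clamped so a flat or far-away edge never gets subdivided
    more than its on-screen size can justify."""
    factor = edge_screen_px / px_per_segment
    return max(1.0, min(max_factor, factor))

print(edge_tess_factor(4.0))     # tiny on screen -> floor of 1.0
print(edge_tess_factor(200.0))   # big on screen  -> 25.0
print(edge_tess_factor(4000.0))  # huge           -> clamped at 64.0
```

A uniform maximum factor, by contrast, spends the same budget on a flat sleeper as on a close-up curved surface, which is exactly the behaviour the screenshots show.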

Yes, and that would harm Nvidia performance........... how?

You're missing the point (plus my comment was mostly rhetoric) - by removing one thing that hinders their performance, it gives their cards a better chance to shine in other areas where they do better.

How many people here predicted such a response from Rroff?

@Rroff

Honestly, and maybe you can't see it now... but one day you will look back and cringe at the place you are at, and thank above for whatever caused you to wake up...

The only thing making me cringe is that I even posted in this thread in the first place. While I would not put it past being the work of nVidia by a long shot, there is not a single scrap of credible evidence directly linking them to it at this point. No one with even a scrap of objectivity or intelligence would jump straight on the bash-nVidia bandwagon at this point. It may yet be proved they were behind it, who knows, but it's truly pathetic the way some people have blindly jumped not only into slamming nVidia but also lined up to slam me for essentially saying "wait a minute, let's not jump to conclusions".
 
The only thing making me cringe is that I even posted in this thread in the first place. While I would not put it past being the work of nVidia by a long shot, there is not a single scrap of credible evidence directly linking them to it at this point. No one with even a scrap of objectivity or intelligence would jump straight on the bash-nVidia bandwagon at this point. It may yet be proved they were behind it, who knows, but it's truly pathetic the way some people have blindly jumped not only into slamming nVidia but also lined up to slam me for essentially saying "wait a minute, let's not jump to conclusions".

If it were always easy to prove directly, then they would not do it, & you hide behind that all the time.

There is a point where the statistical probability of coincidence becomes so low that it is not coincidence at all, & that has been used to convict.

And it takes intelligence to see things before everyone else does, before obvious proof is given, & it has served me well, because too often the proof comes too late to matter.

Don't wait until you're crashing to put your seatbelt on or turn on your airbag.
 
He has already backtracked quietly on other things like GPU PhysX, which he was second only to Pottsey in promoting as the second coming, saying it would be used to great effect, & he has now backed off a bit with NV 3D as well, while both matters are progressing in the way I thought they would under NV.

You are truly pathetic... I have never backtracked over GPU PhysX; my opinion of it is exactly the same as it always was. I have always promoted hardware-accelerated physics (not PhysX specifically) like the second coming, as I do believe that hardware-accelerated physics will go a long way towards moving games on generationally when used properly. I've promoted PhysX because, with the lack of a mature open alternative, it is well supported, polished/stable and, unfortunately, light years ahead of any potential competition. I've always said it needs a killer title to push PhysX into the mainstream, and unfortunately we aren't likely to see that, due to the fact that it cuts off a large part of the potential gaming audience. I was also very vocal in denouncing nVidia when they locked it out for people not using nVidia hardware for rendering... maybe you've conveniently forgotten that.

I've never promoted NV 3D (except, rightfully, over the competition when people are hell-bent on going for a 3D solution of one form or another); in fact, inside the GPU forums I don't think I've even mentioned it - not sure on that - but most of my posts on it have been in general hardware and monitors... and you will find I've always been of the opinion that it was not a logical step forward for gaming.
 
If it were always easy to prove directly, then they would not do it, & you hide behind that all the time.

There is a point where the statistical probability of coincidence becomes so low that it is not coincidence at all, & that has been used to convict.

And it takes intelligence to see things before everyone else does, before obvious proof is given, & it has served me well, because too often the proof comes too late to matter.

Nicely put...
 
If it were always easy to prove directly, then they would not do it, & you hide behind that all the time.

There is a point where the statistical probability of coincidence becomes so low that it is not coincidence at all, & that has been used to convict.

And it takes intelligence to see things before everyone else does, before obvious proof is given, & it has served me well, because too often the proof comes too late to matter.

Don't wait until you're crashing to put your seatbelt on or turn on your airbag.


So lock someone away because they happen to have a history of robbery and happened to be in the area when a bank robbery took place? Don't worry about the evidence; they have to be the right person because they have a history of it, right?


Also, your examples are preventative measures and can't be used as a basis for proving or suggesting someone was behind something.
 