PCGH asks Nvidia on Gameworks

You have to remember that Crysis 2 was the first major title to use tessellation, so it could well have been lazy development, or working with something new and not getting to grips with it fully.
 
OK so let me just clarify where we are right now.

We are arguing that GameWorks hurts AMD performance significantly more than Nvidia's (even though testing has shown otherwise, and no evidence has been given beyond an AMD mouthpiece) on the basis of a single game released over three years ago, which had such high levels of tessellation that performance for both camps was affected, which didn't use any Nvidia binary libraries, and which didn't carry over to the sequel.

Yes?
 
OK so let me just clarify where we are right now.

We are arguing that GameWorks hurts AMD performance significantly more than Nvidia's (even though testing has shown otherwise, and no evidence has been given beyond an AMD mouthpiece) on the basis of a single game released over three years ago, which had such high levels of tessellation that performance for both camps was affected, which didn't use any Nvidia binary libraries, and which didn't carry over to the sequel.

Yes?

I would say yes. As GW features can be turned off, AMD users aren't being hindered anyway; it's Nvidia users that are gaining. That's how I understand it.
 
You have to remember that Crysis 2 was the first major title to use tessellation, so it could well have been lazy development, or working with something new and not getting to grips with it fully.

Crysis 2 wasn't one of the first major games to use tessellation, though; far from it.
Crytek, if I recall, blamed the engine (lol).
Crysis 2, IMO, was the start of EA engines that all look too blurry and covered in Vaseline, but that's a separate issue.
 
Crysis 2 wasn't one of the first major games to use tessellation, though; far from it.
Crytek, if I recall, blamed the engine (lol).
Crysis 2, IMO, was the start of EA engines that all look too blurry and covered in Vaseline, but that's a separate issue.

Fair enough, and my memory isn't as good as it used to be.
 
The big story with Crysis 2 was the fact that it did not ship with DX11 or tessellation; it was added later with help from Nvidia, as I recall, which is why it was picked up at the time.

So in many ways it's a bit like now with GW. It just seems AMD have a hard time dealing with any work Nvidia had a hand in; it's just that back then it was the level of tessellation used on flat surfaces, and it's something different now.
 
I have to say, I quite enjoyed Crysis 2 on my 670 with the high-res textures. I thought it looked pretty and played quite well as a single-player game.

I am not that bothered about too much tessellation as, after all, it's called a game for a reason.
 
From the picture in post #119 (if it can be trusted and is not just an early dev-cycle picture), it just looks like the implementation of the tessellation factor fall-off was not done particularly well - one wants the tessellation level to fall off significantly at large distances by using something like a 1/x^2 reduction factor (except when viewing the scene orthogonal to the plane of the map, where ideally we want consistently high tessellation).

The image suggests to me that this was not the case and there were still high levels of tessellation at large distances from the camera. Unfortunately for those with an ulterior motive and alternate agenda here, we do not know who is to blame for this and whether or not it was just a quickly done job due to the devs being pushed for time. So to continue with this constant blame game against Nvidia is both illogical and stupid.
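
To illustrate the kind of fall-off being described, here is a minimal sketch in C++ of a distance-based tessellation factor with a 1/x^2 reduction. The function name, constants and clamping are my own assumptions for illustration, not anything taken from CryEngine or GameWorks.

    #include <algorithm>

    // Hypothetical helper: choose a tessellation factor for a patch from its
    // distance to the camera, falling off roughly as 1/x^2 beyond a reference
    // distance. All names and constants are illustrative, not from any engine.
    float TessFactorForPatch(float distanceToCamera,
                             float referenceDistance = 10.0f, // full detail inside this range
                             float maxFactor = 64.0f,
                             float minFactor = 1.0f)
    {
        // Inside the reference distance, keep full tessellation.
        if (distanceToCamera <= referenceDistance)
            return maxFactor;

        // Beyond it, scale by (referenceDistance / distance)^2, i.e. a 1/x^2 fall-off.
        const float ratio  = referenceDistance / distanceToCamera;
        const float factor = maxFactor * ratio * ratio;

        // Never drop below the minimum factor (effectively an untessellated patch).
        return std::max(factor, minFactor);
    }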
 
I don't buy it that it's more costly to do occlusion testing and remove the water than it is to render the water with this amount of tessellation whenever an ounce of water is on screen.


But anyway, water was only one part of it, and that doesn't explain the ludicrous amounts of tess on the barriers, lol.



You don't agree that it's extremely convenient? What's your explanation, then?

This is what I'm talking about. The usual way to do water (without a tessellation shader) is often basically just to draw it all the time; that bit isn't unusual.

Throw in some haphazard tessellation by a lazy or inexperienced dev, and the fact that you've got not only water that's drawn all the time but water that's drawn with a high level of detail all the time isn't so surprising. That doesn't mean it wasn't done maliciously, but there's a lot more to it than the face-value potential smoking gun - which is my point with GameWorks as well: people are lapping up AMD's often face-value accusations of what look like smoking guns, as AMD knows they will, when the truth is a lot more complicated.

Regarding the barriers (and things like brickwork, etc.): if the original high-detail mesh/source assets also had definition for smaller features like scars in the surface, bolts, etc., then the first-pass tessellated version can sometimes be hideously unoptimised while trying to preserve the smaller, finer details of the overall object. Optimising that so you have lower levels of tessellation in areas that don't need it is fairly tricky, with a lot of potential side effects, like objects having torn seams or deformed shapes that, for instance, let the player see through areas they shouldn't. That's not to say it's impossible to optimise, but a mixture of early days and/or inexperienced or incompetent devs could easily explain it, and in fact that is the more likely (though by no means exclusive) possibility - even if you took away any possibility of it being intentional, it's still something that is likely to happen with the nature of game development being what it is.
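
As a rough illustration of the seams problem (again just a C++ sketch with made-up names, not how any particular engine does it): if per-edge tessellation factors are derived from something both neighbouring patches agree on, such as the shared edge's midpoint, adjacent patches end up with matching edge factors and the mesh doesn't crack. Derive them from per-patch data instead and you get exactly the torn seams described above.

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Hypothetical sketch: base an edge's tessellation factor on the distance
    // from the camera to the edge midpoint. Both patches sharing the edge see
    // the same midpoint, so they compute the same factor and no crack opens
    // along the seam.
    float EdgeTessFactor(const Vec3& a, const Vec3& b, const Vec3& cameraPos,
                         float referenceDistance = 10.0f, float maxFactor = 64.0f)
    {
        const Vec3 mid { (a.x + b.x) * 0.5f, (a.y + b.y) * 0.5f, (a.z + b.z) * 0.5f };
        const float dx = mid.x - cameraPos.x;
        const float dy = mid.y - cameraPos.y;
        const float dz = mid.z - cameraPos.z;
        const float dist = std::sqrt(dx * dx + dy * dy + dz * dz);

        // Same 1/x^2-style fall-off as the earlier sketch, clamped to [1, maxFactor].
        if (dist <= referenceDistance)
            return maxFactor;
        const float ratio = referenceDistance / dist;
        const float f = maxFactor * ratio * ratio;
        return f < 1.0f ? 1.0f : f;
    }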
 
OK so let me just clarify where we are right now.

We are arguing that GameWorks hurts AMD performance significantly more than Nvidia's (even though testing has shown otherwise, and no evidence has been given beyond an AMD mouthpiece) on the basis of a single game released over three years ago, which had such high levels of tessellation that performance for both camps was affected, which didn't use any Nvidia binary libraries, and which didn't carry over to the sequel.

Yes?

Pretty much :D
 
What the original article that kicked this whole thing off was on about was 'GameWorks' and its adverse effect on AMD hardware. I have checked several sites that have benched many games (not just GameWorks titles), and they all seem to show the 780Ti slightly ahead of the 290X in most cases. If nVidia was gimping performance on AMD hardware, then wouldn't it stick out like a sore thumb?

After going through a few popular games on GameGPU, these were the differences in % comparing the 290X against the 780Ti:

  • Assassins Creed 4 - The 780Ti was 10% faster
  • Sniper Elite V3 - The 290X was 4% faster
  • Battlefield Hardline - The 780Ti was 17% faster
  • Murdered: Soul Suspect - The 780Ti was 11% faster
  • Watch Dogs - The 780Ti was 17% faster
  • Wolfenstein - The 290X was 3% faster (and the 280X was 3% faster)
  • Titanfall - The 780Ti was 5% faster
  • Thief - The 780Ti was 7% faster

Sadly there were no results for Batman: Arkham Origins, but from our own bench thread we know that the 290X is faster than the 780Ti (unless an insane overclock is added to the 780Ti).

So can someone show me what nVidia are supposedly doing? If someone can show me some evidence, I am open to being swayed, but all the GameWorks games look as fair as the others to me, and some win and some lose for both sides. Coincidentally, the 280X is faster than a 780Ti in Wolfenstein: The New Order... What's up with that?

I used GameGPU for the results shown. http://gamegpu.ru/тест-gpu/action-/-fps-/-tps/
 
You have been shown countless times. You're not going to change your opinion, but that's cool; it's just the way things are, some agree, some disagree. :)

I have? I have read speculation but no proof. If someone showed me where GameWorks was crippling performance on AMD, I missed it. Care to show me again?
 
It isn't quantum physics that needs conclusive proof; I'm pretty sure everyone has opinions that don't need conclusive proof in a whole range of life matters in general.

There are plenty of threads/articles about Nvidia tactics in GW and in previous titles, walls of silence from the developers involved, etc., from various sources; we've seen them all before.
 
[animated GIF]
 
It isn't quantum physics that needs conclusive proof; I'm pretty sure everyone has opinions that don't need conclusive proof in a whole range of life matters in general.

There are plenty of threads/articles about Nvidia tactics in GW and in previous titles, walls of silence from the developers involved, etc., from various sources; we've seen them all before.

So your argument is that you don't need actual proof; the fact that AMD people say so should be enough?


This picture is ridiculously watchable...
 
That's spot on, Matt. 'Can you show me?' Round and round we go, again and again and again...
So your argument is that you don't need actual proof; the fact that AMD people say so should be enough?

It's not an argument when plenty of non-AMD people (you know, websites that put the boot into both camps without batting an eyelid; they aren't all **** lickers, you know) create articles going way back; it's happened too many times now.
 
It isn't quantum physics that needs conclusive proof; I'm pretty sure everyone has opinions that don't need conclusive proof in a whole range of life matters in general.

There are plenty of threads/articles about Nvidia tactics in GW and in previous titles, walls of silence from the developers involved, etc., from various sources; we've seen them all before.

[benchmark charts]

Good enough evidence for me that GameWorks isn't hindering performance on AMD hardware.
 