Gameworks, Mantle and a pot calling a kettle black

Maybe so, but I think the difference in results between AA on and FXAA is enough to make people wonder what is happening.

Can you cite any examples where a 7970 beats a 780 Ti at 1080p? I've had a good look but can't find any.

Maybe you could ask one of your mates on Twitter why the 290X is faster than the 780 Ti when the game is an nVidia-sponsored title.

As for a 7970 beating a 780 Ti, what has that got to do with GameWorks?
 
Could be. Could be any number of reasons. I can't say for sure what the reason is and neither can you.

I don't know anyone 'in the know' on Twitter though, matey, so I have no means to find out. Hence my use of the word 'speculating' ;)

Could be an alien race is gimping AMD performance because it prefers the colour green? Could be; we'll never know.
 
Maybe you could ask one of your mates on Twitter why the 290X is faster than the 780 Ti when the game is an nVidia-sponsored title.

As for a 7970 beating a 780 Ti, what has that got to do with GameWorks?

When AA is used? You know why.

Well, this is the first game I've ever seen a 770 beat a 290X in. I was just wondering if there were any other titles where such a thing occurred. Seemed a fair question, I think.
 
As Marine rightly said, none of us know how deeply it's integrated into the engine.

I already answered this :p

GW is an API, as I've specified before; you can see the documentation here: http://docs.nvidia.com/gameworks/index.html

As for the claim that it might be more "tightly integrated" into games, why then does Unreal Engine 4 ship without GW but offer very simple "plug and play" support for it, if it needed to be tightly integrated?
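
To illustrate what I mean by "plug and play" (a rough sketch only; the class and function names below are made up for illustration and are not UE4's or GameWorks' actual API), an engine can expose a generic hook for optional modules, and an effects library just registers against it without the core engine being rewritten:

```cpp
#include <memory>
#include <string>
#include <vector>

// Generic hook the engine already provides for optional modules (hypothetical).
struct IEffectsPlugin {
    virtual ~IEffectsPlugin() = default;
    virtual std::string Name() const = 0;
    virtual void OnFrame(float dt) = 0;
};

// A vendor effects module dropped in alongside the engine (hypothetical).
class VendorTurbulencePlugin : public IEffectsPlugin {
public:
    std::string Name() const override { return "VendorTurbulence"; }
    void OnFrame(float dt) override { (void)dt; /* call into the vendor library here */ }
};

class Engine {
public:
    void Register(std::unique_ptr<IEffectsPlugin> plugin) { plugins_.push_back(std::move(plugin)); }

    void Frame(float dt) {
        // Core update and rendering would happen here, then any registered optional modules run.
        for (auto& plugin : plugins_) plugin->OnFrame(dt);
    }

private:
    std::vector<std::unique_ptr<IEffectsPlugin>> plugins_;
};

int main() {
    Engine engine;
    engine.Register(std::make_unique<VendorTurbulencePlugin>()); // opting in is one line
    engine.Frame(1.0f / 60.0f);
}
```

That's the sense in which it's an add-on the game calls, rather than something woven through the engine.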
 
TL;DR

Something is causing the drop-off in performance. No proof that GameWorks is causing it, but also no proof that it's not. Truth be told, we just don't know, and we likely never will. Thankfully Batman AO seems to be a worst-case scenario in this regard. Not much more to say on the matter, I think. Let's get on with our lives.
 
When AA is used? You know why.

Well, this is the first game I've ever seen a 770 beat a 290X in. I was just wondering if there were any other titles where such a thing occurred. Seemed a fair question, I think.

If MSAA was so bad for nVidia, why the hell did they allow 8x the stuff? Surely if they wanted the massive advantage, they would have restricted it to 2x.

[GIF]


That GIF seems apt, and honestly, mate, that is what you are doing. You are ignoring the people who know coding, and for whatever reason you seem desperate to point the finger at nVidia/GameWorks, but clearly it isn't to blame.
 
TL;DR

Something is causing the drop-off in performance. No proof that GameWorks is causing it, but also no proof that it's not. Truth be told, we just don't know, and we likely never will. Thankfully Batman AO seems to be a worst-case scenario in this regard. Not much more to say on the matter, I think. Let's get on with our lives.

So to clarify, you mean disabling GW features and seeing the same performance difference between AMD/Nvidia isn't evidence (even if not conclusive) that GW isn't harming things?
 
If MSAA was so bad for nVidia, why the hell did they allow 8x the stuff? Surely if they wanted the massive advantage, they would have restricted it to 2x.

[GIF]


That GIF seems apt, and honestly, mate, that is what you are doing. You are ignoring the people who know coding, and for whatever reason you seem desperate to point the finger at nVidia/GameWorks, but clearly it isn't to blame.

Nvidia didn't make the game, Warner Brothers did. They are not the same company. I doubt Nvidia had any say in how much AA was used. Do you think AMD had any say in how much AA was available in BF4?

Those people who work in coding are happy to ignore the words of game developers when it suits them, though. :p

I'm not saying GameWorks is the cause; I'm just saying I don't see enough reliable proof to know either way. Nor am I that bothered anymore. I will leave it up to you guys to try to ram your point home. At the end of the day we're all allowed our own opinion.

So to clarify, you mean disabling GW features and seeing the same performance difference between AMD/Nvidia isn't evidence (even if not conclusive) that GW isn't harming things?

How do we know that, with GW features disabled, it still does not have any negative effect on performance? We only have Nvidia's word for it. But this thread has taught us that AMD/Nvidia lie and we are not to trust anything they say. Or are we to trust what Nvidia says this time? :p
 
TL;DR

Something is causing the drop-off in performance. No proof that GameWorks is causing it, but also no proof that it's not. Truth be told, we just don't know, and we likely never will. Thankfully Batman AO seems to be a worst-case scenario in this regard. Not much more to say on the matter, I think. Let's get on with our lives.

This is true.
However it's an argument that could be applied to a number of things.
No proof that AMD's drivers are causing it, but also no proof that they're not.
 
This is true.
However it's an argument that could be applied to a number of things.
No proof that AMD's drivers are causing it, but also no proof that they're not.

Agreed GM.

However, I think given the press surrounding this, they would've done their very best to get things working as well as they could. Anything else just doesn't make sense for a big title. But yes, it is also a possibility.
 
Personally I think the game is just a badly written pile of dung.

I mean, let's face it, if a Titan rendering the game AND the physics is scoring 14% higher FPS than a Titan rendering the game and offloading the physics to the CPU in a lightly threaded game, then something is very messed up.
 
How do we know that, with GW features disabled, it still does not have any negative effect on performance? We only have Nvidia's word for it. But this thread has taught us that AMD/Nvidia lie and we are not to trust anything they say. Or are we to trust what Nvidia says this time? :p

Because it would be immediately obvious to the devs making the game? Let's assume for a minute that they didn't profile it before/after GW features to determine its impact (and they would have; Nvidia developers go on location to help them integrate and optimise the features). The other clue would be just as obvious to any developer: if toggling the option off in the game's settings still left the GW library doing something, even a blind developer could detect it.

Just to extend that further, GW doesn't control whether it's on or off - the game does. If no methods in the library are ever called then it can't do squat; it's not some executable running alongside the game in control of itself (and this isn't specific to GW, it's how things work in the programming world). If for some reason GW was being used with the features toggled off, then that would be the fault of the developer.
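
For the non-coders, here's roughly what that looks like (a hypothetical sketch, not actual GameWorks code; FurEffect and the toggle names are made up): the game owns the setting, and if it's off, no library function is ever called, so there is nothing for the library to execute during the frame:

```cpp
#include <memory>

// Stand-in for a vendor effects library exposed as an ordinary API (hypothetical).
class FurEffect {
public:
    void Initialise() { /* allocate GPU resources */ }
    void Simulate(float dt) { (void)dt; /* run the effect's simulation step */ }
    void Render() { /* submit the effect's draw calls */ }
};

class Game {
public:
    explicit Game(bool furEnabled) : furEnabled_(furEnabled) {
        if (furEnabled_) {            // the library is only touched when the option is on
            fur_ = std::make_unique<FurEffect>();
            fur_->Initialise();
        }
    }

    void Frame(float dt) {
        // ...regular game update and rendering...
        if (furEnabled_ && fur_) {    // the game checks the toggle, not the library
            fur_->Simulate(dt);
            fur_->Render();
        }
    }

private:
    bool furEnabled_ = false;
    std::unique_ptr<FurEffect> fur_;
};

int main() {
    Game game(false);          // feature toggled off: no FurEffect call is ever made
    game.Frame(1.0f / 60.0f);
}
```

So unless the developer wired it up wrongly, a disabled feature simply isn't in the frame at all.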
 
Agreed GM.

However, I think given the press surrounding this, they would've done their very best to get things working as well as they could. Anything else just doesn't make sense for a big title. But yes, it is also a possibility.

Run the benchmark with the latest drivers then; let's see if AMD have done anything.
All the noise surrounding poor performance is recent, and my 13.12 drivers won't be helping.

You've got a 290, and an i7. Play the game, load it up and let's see. It doesn't prove anything either way, just shows you're willing to help and see the root cause rather than playing the blame game (normally aiming your blame as far away from AMD as you can)
 
Because it would be immediately obvious to the devs making the game? Let's assume for a minute that they didn't profile it before/after GW features to determine its impact (and they would have; Nvidia developers go on location to help them integrate and optimise the features). The other clue would be just as obvious to any developer: if toggling the option off in the game's settings still left the GW library doing something, even a blind developer could detect it.

Just to extend that further, GW doesn't control whether it's on or off - the game does. If no methods in the library are ever called then it can't do squat; it's not some executable running alongside the game in control of itself (and this isn't specific to GW, it's how things work in the programming world). If for some reason GW was being used with the features toggled off, then that would be the fault of the developer.

Some good posts, Deceptor, and I have learnt a bit from you posting in this thread. I always like it when someone puts things into layman's terms so I can understand more easily.

Top man :)
 
Run the benchmark with the latest drivers then; let's see if AMD have done anything.
All the noise surrounding poor performance is recent, and my 13.12 drivers won't be helping.

You've got a 290, and an i7. Play the game, load it up and let's see. It doesn't prove anything either way, just shows you're willing to help and see the root cause rather than playing the blame game (normally aiming your blame as far away from AMD as you can)

I would, but I don't have the game, I'm afraid. As Batman is not my cup of tea, I'm not in any hurry to buy it either. Please try not to read anything into my wish not to pay for it.
 
I would, but I don't have the game, I'm afraid. As Batman is not my cup of tea, I'm not in any hurry to buy it either. Please try not to read anything into my wish not to pay for it.

Do you have Watch Dogs or COD Ghosts or AC IV or Splinter Cell?
 
Because it would be immediately obvious to the devs making the game? Let's assume for a minute that they didn't profile it before/after GW features to determine its impact (and they would have; Nvidia developers go on location to help them integrate and optimise the features). The other clue would be just as obvious to any developer: if toggling the option off in the game's settings still left the GW library doing something, even a blind developer could detect it.

Just to extend that further, GW doesn't control whether it's on or off - the game does. If no methods in the library are ever called then it can't do squat; it's not some executable running alongside the game in control of itself (and this isn't specific to GW, it's how things work in the programming world). If for some reason GW was being used with the features toggled off, then that would be the fault of the developer.

Here's the thing: Warner Brothers refused to help AMD; they refused to help them implement any performance optimization into the game or at driver level. Now, this could be because of their GameWorks contract, or it could be because they're a crappy dev. No one knows for sure. However, they did work with Nvidia to fix the problems Greg mentioned previously. That did strike me as strange, but again, it's not concrete proof either way. It's clear that the game dev had no interest in working with AMD to improve performance. Why is that? They refused to comment when approached by a website in France called Hardware FR, which investigated the issue at the time.


Fair play to you, Bru. :p

However, I was talking about average fps; I probably should've said that, I just assumed it was obvious. Who's to say that minimum fps wasn't a one-off glitch? Average fps, on the other hand, will tell a better story.

Do you have Watch Dogs or COD Ghosts or AC IV or Splinter Cell?

Nope, but I've played COD Ghosts and it is disgusting on CrossFire, or was when I tried it. Other people here have also said that, to be fair. It's the first COD game to work terribly with CrossFire, and I've played them all.

I have Splinter Cell Conviction, but I assume you mean the other Splinter Cell. No, I've not played that. Nor AC IV; not my type of game. I'm told CrossFire does not work on AC IV either, or is that AC 3? Not sure. On one of them CrossFire just does not work, full stop.
 
Here's the thing: Warner Brothers refused to help AMD; they refused to help them implement any performance optimization into the game or at driver level. Now, this could be because of their GameWorks contract, or it could be because they're a crappy dev. No one knows for sure. However, they did work with Nvidia to fix the problems Greg mentioned previously. That did strike me as strange, but again, it's not concrete proof either way. It's clear that the game dev had no interest in working with AMD to improve performance. Why is that? They refused to comment when approached by a website in France called Hardware FR, which investigated the issue at the time.

Like I've stated before, the idea of a witch-hunt to determine what's going on is, in my opinion, absolutely the right thing to do; no one wants shoddy stuff going on harming one company or another. It'd be great, though, if that witch-hunt wasn't stalled/focused on GW alone and began digging deeper elsewhere.
 