Richard Huddy - Gaming Scientist Interviewed

lol...

The only question that we should be asking is "Is there proof of Gameworks crippling AMD performance?"

The answer to that is no.

You can't possibly be suggesting that people should also ask if there's proof of it not doing that? That's the guilty until proven innocent argument that never really works. You can apply it to almost anything and make anything look bad for any reason.

The answer is yes.

Why do you think users, AMD and other game developers are questioning the massive performance discrepancies and criticising the closed nature of GW, in that it locks AMD out of optimising for it?

It's not just users; Twitter is full of game devs openly questioning and criticising both Ubi and Nvidia.

Nvidia are doing a great job with their game developer relations, not.
 
Care to show us the proof, Humbug? Since all the tests so far say no.
And yes, I mean show AMD being harmed, not Nvidia being optimized.
 
Yawn - Tone and I tested the relevant drop-offs when enabling features which called GW libraries. The drop-off in performance was practically identical, so going through the GW library affected performance the same for nVidia as it did for AMD. You really must stop posting unsubstantiated nonsense.

AMD's comparative lack of performance is due to something else. I wouldn't expect you lot to actually put pressure on AMD over it, as it's far easier and more convenient to blame anybody but AMD.
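The test being described, comparing the relative performance drop per vendor when a GW-backed setting is toggled, comes down to simple percentage arithmetic. A minimal sketch, using hypothetical FPS figures rather than the thread's actual results:

```python
def pct_drop(fps_off: float, fps_on: float) -> float:
    """Percentage of performance lost when the feature is enabled."""
    return (fps_off - fps_on) / fps_off * 100.0

# Hypothetical numbers for illustration only.
nvidia = pct_drop(fps_off=80.0, fps_on=72.0)   # 10.0% drop
amd = pct_drop(fps_off=60.0, fps_on=54.0)      # 10.0% drop

print(f"NV drop: {nvidia:.1f}%  AMD drop: {amd:.1f}%")
```

If the percentage drops match while absolute FPS differs, the GW code path is costing both vendors proportionally the same, which is the point being argued above: the baseline gap must come from somewhere else.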
 
The thing is, Tommy (and I do agree with your sentiment), that if you're accusing somebody of doing something 'bad' you must provide proof of the wrongdoing. The proof that was put forward before has been shown to be nonsense. So just because you can't prove that nVidia aren't doing anything wrong doesn't mean there's any particular reason to be suspicious.

The overuse of tessellation in the bat cape (and other titles) has been shown to be detrimental to AMD, and at a cost to Nvidia users on lesser hardware. The fact that Nvidia can win benchmarks using this technique, implying Nvidia's performance is 'better', is the ultimate goal.

The argument put forward by the people who think that GW is harming AMD is flaky at best.

Could it be used in future to potentially harm AMD? Definitely yes, but until that happens there's not much point discussing it.

As above, it's already happened in BAO (I honestly can't say for sure it has been used in WD, hence my lack of input/discussion on that game). The question is, will it happen again, bearing in mind Nvidia's track record of getting caught out with dodgy practice (AC, C2 and whatever else they have done)?
 
The overuse of tessellation in the bat cape (and other titles) has been shown to be detrimental to AMD, and at a cost to Nvidia users on lesser hardware. The fact that Nvidia can win benchmarks using this technique, implying Nvidia's performance is 'better', is the ultimate goal.

As above, it's already happened in BAO (I honestly can't say for sure it has been used in WD, hence my lack of input/discussion on that game). The question is, will it happen again, bearing in mind Nvidia's track record of getting caught out with dodgy practice (AC, C2 and whatever else they have done)?

Nowt to do with GW though (unless tessellation is a GW library).
 
Yawn - Tone and I tested the relevant drop-offs when enabling features which called GW libraries. The drop-off in performance was practically identical, so going through the GW library affected performance the same for nVidia as it did for AMD. You really must stop posting unsubstantiated nonsense.

AMD's comparative lack of performance is due to something else. I wouldn't expect you lot to actually put pressure on AMD over it, as it's far easier and more convenient to blame anybody but AMD.

It's not quite as simple as you might think. These are two GPU vendors, and having a game patched several weeks later doesn't really matter, since a big part of the PR is about the benches made on release day. Take your time to get the release-day benches of GW titles, and tell me again there is no difference.
To you this might seem trivial, but it translates into a lot of sales for these vendors.
 
Haven't you just changed the goalposts there, Tommy (before it was all about the FXAA benchmarks)? Over-tessellation is the only thing I can see as a problem for earlier AMD DX11 cards, but AMD's tessellation performance isn't *bad* anymore, so new cards shouldn't be "crippled" by it?
 
The biggest cause for concern for me is that, despite Nvidia saying they will give source code access to game devs, the GameWorks contract stipulates that the game dev is not able to share that source code with AMD, or to provide any performance optimizations that would improve AMD's performance in GameWorks-enabled titles.
 
It's not quite as simple as you might think. These are two GPU vendors, and having a game patched several weeks later doesn't really matter, since a big part of the PR is about the benches made on release day. Take your time to get the release-day benches of GW titles, and tell me again there is no difference.
To you this might seem trivial, but it translates into a lot of sales for these vendors.

Well, it kind of does matter, given the argument was that it couldn't be fixed.

And since people have only actually compared the results (because apparently the onus is on the "defendant" now...), can we even say with any degree of certainty that these results wouldn't have been the same had they been tested close to launch?
 
It's not quite as simple as you might think. These are two GPU vendors, and having a game patched several weeks later doesn't really matter, since a big part of the PR is about the benches made on release day. Take your time to get the release-day benches of GW titles, and tell me again there is no difference.
To you this might seem trivial, but it translates into a lot of sales for these vendors.

No - the benches on release day are now irrelevant. Where do you draw the line between AMD either not optimising, or not being able to optimise in time, and third-party influence?

PR on who wins the benchmark isn't important to this debate. The original argument was that AMD couldn't optimise GW features, and evidence was shown of card x from 2011 beating card y from 2013. But the respective drop-offs when enabling FXAA are the same, so the GW library is not affecting performance at all. In fact, if you look at the no-AA result, AMD's performance is just not as good in this game. It happens. It's up to the game developer and AMD to resolve it.
 
The biggest cause for concern for me is that, despite Nvidia saying they will give source code access to game devs, the GameWorks contract stipulates that the game dev is not able to share that source code with AMD, or to provide any performance optimizations that would improve AMD's performance in GameWorks-enabled titles.

Which IMO explains why Rusty's min FPS are over 100% higher than Tonester's.

 
That's nothing to do with the GameWorks libraries themselves, though (even if true).

As I said above, that's up to AMD and the game developer to resolve. The differences in performance aren't a result of the GW libraries, as illustrated by us testing settings which enabled no GW features.

The biggest cause for concern for me is that, despite Nvidia saying they will give source code access to game devs, the GameWorks contract stipulates that the game dev is not able to share that source code with AMD, or to provide any performance optimizations that would improve AMD's performance in GameWorks-enabled titles.

Source for that specifically being in the contract?
 
Tessellation is just tessellation. The Crysis 2 situation was somewhat questionable optimization-wise, but complaining about Batman's cape is just awfully silly.

Might as well complain about the absolutely absurd motion blur in Star Swarm, designed to trump up Mantle.

What did nvidia do about that? They optimized and surpassed AMD in that application.

What does AMD do about tessellation? Refuse to improve their tess performance and complain online.
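For context on why over-tessellation is contentious at all: with uniform tessellation, triangle count grows roughly with the square of the tessellation factor, so cranking the factor multiplies geometry work dramatically for little visual gain. A rough back-of-envelope sketch, where the mesh size and factors are hypothetical:

```python
def approx_triangles(base_tris: int, tess_factor: int) -> int:
    # Rough model: uniform tessellation subdivides each input triangle
    # into about tess_factor**2 smaller triangles.
    return base_tris * tess_factor ** 2

# A hypothetical 1,000-triangle mesh at a moderate vs an extreme factor:
moderate = approx_triangles(1000, 16)   # 256,000 triangles
extreme = approx_triangles(1000, 64)    # 4,096,000 triangles
print(extreme // moderate)              # 16x the geometry work
```

That quadratic growth is why a high fixed factor disproportionately punishes hardware with weaker tessellation throughput, and why driver-level tessellation caps became a workaround.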

The answer is yes.

Why do you think users, AMD and other game developers are questioning the massive performance discrepancies and criticising the closed nature of GW, in that it locks AMD out of optimising for it?

It's not just users; Twitter is full of game devs openly questioning and criticising both Ubi and Nvidia.

Nvidia are doing a great job with their game developer relations, not.

Should probably look into it more if you think the answer is yes.

Because absolutely no one, not even AMD, has been able to provide a single shred of proof of GameWorks harming AMD cards.

If you think Watch Dogs is an issue, you can disable HBAO (since it's the only feature in it using GameWorks) and see if it makes a difference for AMD vs NV. The answer is no.

E: Besides, what's with the assumption that NV-sponsored titles have to perform as well on AMD hardware? Of course they don't. There are also plenty of AMD-sponsored titles that perform, or have performed, terribly on Nvidia hardware. It doesn't mean there's something malicious going on.
 
Disabling or enabling HBAO makes no difference to the performance at all that I can tell ^^^^

That's nothing to do with the GameWorks libraries themselves, though (even if true).

As I said above, that's up to AMD and the game developer to resolve. The differences in performance aren't a result of the GW libraries.

It's only GW games where this happens, and Robert has made it abundantly clear he is not allowed access to anything in those libraries, so he can't optimise anything. Other game developers say he has reason to be concerned about that lack of access.
 
Which IMO explains why Rusty's min FPS are over 100% higher than Tonester's.

I suggest you look at the results again.
Tonester's minimum gains 100% when GameWorks is enabled.

Another Humbug fail.

EDIT: That said, I didn't actually look at the results at first; I was posting about how unreliable minimums can be in benchmarks (Heaven, for example). But because you're using them against Nvidia, let's use them for Nvidia.

GAMEWORKS DOUBLES MINIMUM FRAMES WHEN ENABLED!!!!
 
Nowt to do with GW though (unless tessellation is a GW library).

Haven't you just changed the goalposts there, Tommy (before it was all about the FXAA benchmarks)? Over-tessellation is the only thing I can see as a problem for earlier AMD DX11 cards, but AMD's tessellation performance isn't *bad* anymore, so new cards shouldn't be "crippled" by it?

The relevance is: if Nvidia are prepared to cripple performance in one instance, where does it end?

FXAA isn't 'important' or worth arguing about, I have been told on many occasions here; 'just buy another card' is the answer I'm usually given.

Which I did, but some of the USPs are disabled by Nvidia because I choose to run it alongside AMD hardware / it's not on the approved list. But, but, Nvidia are the good guys and aren't setting ultimatums about what I can/cannot use in my system.
 
It's only GW games where this happens, and Robert has made it abundantly clear he is not allowed access to anything in those libraries, so he can't optimise anything. Other game developers say he has reason to be concerned about that lack of access.

Firstly, Thracks works for AMD. Don't believe what he says any more than you would believe something an nVidia guy says.

Secondly, it's irrelevant that he can't optimise if enabling a feature which calls a GW library affects AMD's performance as much as it does nVidia's (ethically it may not be irrelevant, but in performance terms it is).

Thirdly, the performance drop-off with a GW feature enabled is the same in % terms, so what he is saying is not strictly true.

With no GW feature enabled, AMD's performance is still not as good. Now, this might be because WB don't want to help AMD for whatever reason, but that isn't the same as it being due to the GW libraries.
 
Rusty, look at my reply to Humbug; he wasn't even looking at your results.
The minimum FPS difference between you and Tone is 2 FPS / ~3% on minimums at FXAA High.
 