Richard Huddy - Gaming Scientist Interviewed

Disabling or enabling HBAO makes no difference to the performance at all that I can tell. ^^^^



It's only GW games where this happens, and Robert has made it abundantly clear he is not allowed access to anything inside those libraries, so he can't optimise anything there. Other game developers say he has reason to be concerned about that lack of access.

Do you have proof it's GameWorks and not lazy coding on the part of the dev?
And if he can't optimise anything, how do AMD get better results with new drivers?
 
Do you have proof it's not? ^^^^ I said he cannot optimise anything inside the GW libraries.

I suggest you look at the results again.
Tonester's minimums gain 100% when GameWorks is enabled.

Another Humbug fail.

EDIT: That said, I didn't actually look at the results at first; I was posting about how unreliable minimums can be in benchmarks (Heaven, for example). But because you're using them against Nvidia, let's use them for Nvidia.

GAMEWORKS DOUBLES MINIMUM FRAMES WHEN ENABLED!!!!
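
On why raw minimums are so unreliable: a single hitch frame sets the entire "minimum FPS" figure, while a percentile low averages over the worst slice of the run. A minimal C++ sketch with invented frame times (not numbers from any benchmark in this thread) shows the difference:

```cpp
// Minimal sketch with made-up frame times: one 100 ms hitch in an
// otherwise steady ~60 FPS run. The raw minimum collapses to ~10 FPS,
// while the 1% low (average of the slowest 1% of frames) barely moves.
#include <algorithm>
#include <functional>
#include <iostream>
#include <vector>

int main() {
    std::vector<double> frame_ms(500, 16.7);  // ~60 FPS baseline
    frame_ms[250] = 100.0;                    // a single hitch frame

    // Sort slowest-first so the worst frames sit at the front.
    std::sort(frame_ms.begin(), frame_ms.end(), std::greater<>());

    double min_fps = 1000.0 / frame_ms.front();        // set by one frame

    std::size_t n = std::max<std::size_t>(1, frame_ms.size() / 100);
    double worst_sum = 0.0;
    for (std::size_t i = 0; i < n; ++i) worst_sum += frame_ms[i];
    double one_pct_low_fps = 1000.0 / (worst_sum / n); // averages 5 frames

    std::cout << "min FPS: " << min_fps                // ~10.0
              << ", 1% low FPS: " << one_pct_low_fps   // ~30.0
              << '\n';
}
```

A one-off hitch from background activity can halve a reported minimum on its own, which is why a single benchmark run's minimums prove very little either way.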

Yeah, I just looked at it again. That's not even Watch Dogs, that's an entirely different game. What game is that?

It's not what we are talking about here.
 
The relevance is: if Nvidia are prepared to cripple performance in one instance, where does it end?

FXAA isn't 'important' or worth arguing about, I have been told on many, many occasions here; 'just buy another card' is the answer I'm usually given.

Which I did, but some of the USPs are disabled by Nvidia because I choose to run it alongside AMD hardware / it's not on the approved list. But, but, Nvidia are the good guys and aren't setting ultimatums on what I can and cannot use in my system.

I don't think anybody is saying nVidia are angelically amazing, but on the subject of this conversation I don't see the relevance of what you're saying.

I'm certainly not in the business of defending nVidia, and what you are saying there is in my view a genuine grievance, but as Martini says, that's moving the goalposts from what we were originally discussing.
 
The relevance is: if Nvidia are prepared to cripple performance in one instance, where does it end?

FXAA isn't 'important' or worth arguing about, I have been told on many, many occasions here; 'just buy another card' is the answer I'm usually given.

Which I did, but some of the USPs are disabled by Nvidia because I choose to run it alongside AMD hardware / it's not on the approved list. But, but, Nvidia are the good guys and aren't setting ultimatums on what I can and cannot use in my system.

But it's not FXAA causing the performance hit on AMD. FXAA isn't important; it can be turned off, and you can run MLAA on AMD through CCC.

Nvidia shouldn't be able to cripple performance via tessellation, because AMD have "fixed" theirs, and AMD users have the ability through CCC to change the tessellation levels.
 
Do you have proof it's not? ^^^^ I said he cannot optimise anything inside the GW libraries.



Yeah, I just looked at it again. That's not even Watch Dogs, that's an entirely different game. What game is that?

It's not what we are talking about here.

Don't play coy. You got caught with your pants down (again), and of course you were talking about it; you flipping linked to it when trying to say how terrible GameWorks is. Why are you still asking for proof when you've JUST seen it?
 
Do you have proof it's not? ^^^^ I said he cannot optimise anything inside the GW libraries.



Yeah, I just looked at it again. That's not even Watch Dogs, that's an entirely different game. What game is that?

It's not what we are talking about here.

It's BAO, a GW title.

I don't need to prove it isn't happening. That isn't how it works.

If I say you're a mass murderer, you don't need to prove you aren't to avoid jail; it's up to me to prove you are.
 
Rusty, he knows exactly what game it is.
He tried saying that your mins in Batman: AO were 100% higher; he linked to the benchmark, and it didn't show what he said. He saw Tones' GW-on/GW-off benchmark and thought it was yours and his. The mins on Tones' GW run are over 100% higher than his non-GW run :p.
 
Do you have proof it's not? ^^^^ I said he cannot optimise anything inside the GW libraries.



Yeah, I just looked at it again. That's not even Watch Dogs, that's an entirely different game. What game is that?

It's not what we are talking about here.

It's only GW games where this happens, and Robert has made it abundantly clear he is not allowed access to anything inside those libraries, so he can't optimise anything there. Other game developers say he has reason to be concerned about that lack of access.

It doesn't mention anywhere in your quote that he can only optimise inside those libraries, and if any GW title has had an increase on a GW feature then clearly he can.
So what you're saying is guilty until proven innocent, right?
Can you prove that every dev doesn't spend less time or money developing a game on the DX11 side due to having to work on Mantle as well?
Answer: nope. So thanks, humbug, for proving how Mantle is detrimental to most gamers :rolleyes:
This is why, when you don't have proof, you don't keep accusing something.
 
Yeah, I know, I did read it, but I'll go mad trying to untangle humbug, as he purposefully ties himself up to avoid being 'wrong'.

Only joking 'bug.
 
It's BAO, a GW title.

I don't need to prove it isn't happening. That isn't how it works.

If I say you're a mass murderer, you don't need to prove you aren't to avoid jail; it's up to me to prove you are.

BAO is not the game in question here, because its performance is as it should be; it's completely out of context. What are you doing?

Using a game that has performance parity and trying to pass it off as the game which does not, the game actually in question.
 
The biggest cause for concern for me is that, despite Nvidia saying they will give source code access to game devs, the GameWorks contract stipulates that the game dev is not able to share that source code with AMD, or provide any performance optimisations that would improve AMD's performance in GameWorks-enabled titles.

Since it's Nvidia middleware, I don't see that as a big problem. Just because it's not what AMD chose to do with TressFX doesn't mean that everyone has to do it like that.
Is it ideal? No.
Does it prove that GameWorks is causing issues beyond AMD not having much chance to optimise for it? (Apparently, after five years of DirectX, there's not much left for them to optimise in their drivers anyway.)
Even Ryan brought up that Nvidia had issues on Tomb Raider, the TressFX game. I believe there was a quote floating around where Nvidia said they only got sight of Tomb Raider late on.

As said, GameWorks could in theory be used maliciously, and I think it's poor form on Nvidia's part if they do. But just having a middleware library that you're not openly releasing source code to doesn't have to be a problem.
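
To make the "black box" shape of the argument concrete: closed middleware ships as headers plus a prebuilt binary, so outsiders see the entry points but nothing inside them. A rough C++ sketch of that structure (all names invented for illustration, not GameWorks' actual API):

```cpp
// Invented illustration of closed middleware; not GameWorks' real API.
#include <iostream>

// --- What the game dev (and anyone else) can see: the public header ---
namespace mw {
struct FurContext;                               // opaque type
FurContext* CreateFur();
void        SimulateFur(FurContext* ctx, float dt);
void        DestroyFur(FurContext* ctx);
}  // namespace mw

// --- What would normally ship only as a prebuilt .lib/.dll, stubbed here
// so the sketch compiles. In the closed-source case, everything below
// this line (shaders, draw calls, tessellation settings) is invisible to
// a third-party driver team.
namespace mw {
struct FurContext { float sim_time = 0.0f; };
FurContext* CreateFur() { return new FurContext{}; }
void SimulateFur(FurContext* ctx, float dt) { ctx->sim_time += dt; }
void DestroyFur(FurContext* ctx) { delete ctx; }
}  // namespace mw

int main() {
    mw::FurContext* fur = mw::CreateFur();
    mw::SimulateFur(fur, 0.016f);  // a driver can only observe the GPU work
    mw::DestroyFur(fur);           // this call emits, not the source behind it
    std::cout << "frame simulated\n";
}
```

The structure itself only determines who can look inside; whether that opacity is abused is the actual point in dispute here.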

I have to say, for the most part I thought Ryan did a good job.
 
BAO is not the game in question here, because its performance is as it should be; it's completely out of context. What are you doing?

Using a game that has performance parity and trying to pass it off as the game which does not, the game actually in question.

We're talking about GW as a general thing. It wouldn't be any different in WD, BAO, or any other game with GW libraries if it were the GW libraries themselves, so the point stands.

I think you've read something and got confused because everybody else is following it just fine :p.
 
No, we are talking about GW in Watch Dogs; the suggestion is that it's since BAO that GW has had this problem. ^^^^

Since it's Nvidia middleware, I don't see that as a big problem. Just because it's not what AMD chose to do with TressFX doesn't mean that everyone has to do it like that.
Is it ideal? No.
Does it prove that GameWorks is causing issues beyond AMD not having much chance to optimise for it? (Apparently, after five years of DirectX, there's not much left for them to optimise in their drivers anyway.)
Even Ryan brought up that Nvidia had issues on Tomb Raider, the TressFX game. I believe there was a quote floating around where Nvidia said they only got sight of Tomb Raider late on.

As said, GameWorks could in theory be used maliciously, and I think it's poor form on Nvidia's part if they do. But just having a middleware library that you're not openly releasing source code to doesn't have to be a problem.

I have to say, for the most part I thought Ryan did a good job.

You're right, it doesn't have to be a problem, but if they need access to optimise for it, then it is a problem. Robert says it is a problem; it appears to some that he is right, and others disagree.
 
I don't think anybody is saying nVidia are angelically amazing, but on the subject of this conversation I don't see the relevance of what you're saying.

I'm certainly not in the business of defending nVidia, and what you are saying there is in my view a genuine grievance, but as Martini says, that's moving the goalposts from what we were originally discussing.

The relevance is: if Nvidia are prepared to cripple performance in one instance, where does it end?

Relevant.

But it's not FXAA causing the performance hit on AMD. FXAA isn't important; it can be turned off, and you can run MLAA on AMD through CCC.

Nvidia shouldn't be able to cripple performance via tessellation, because AMD have "fixed" theirs, and AMD users have the ability through CCC to change the tessellation levels.

They shouldn't, but here we are. What is the root problem, then?

The performance discrepancies are so far apart that you need a salmon smacked off your face, rapid style, if you ignore the relevance of the performance drop-off.

You can't run MLAA in 3D, and disabling tess via CCC has an overall detrimental effect on IQ: you turn it down for the cape, and it has a knock-on effect on snow, scenery, etc.
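
Why the CCC slider has that knock-on effect: the driver override is one global clamp applied to every tessellated draw, with no way to single out the cape. A rough C++ sketch of the idea (invented names and factors, not AMD's actual driver code):

```cpp
// Hypothetical illustration of a driver-level tessellation cap: one
// global clamp hits every tessellated draw, not just the offending one.
#include <algorithm>
#include <iostream>

// Per-draw tessellation factor requested by the game.
struct Draw { const char* name; float requested_tess; };

// What an override conceptually does: a single user cap, no per-object choice.
float ApplyDriverCap(float requested, float user_cap) {
    return std::min(requested, user_cap);
}

int main() {
    Draw scene[] = {
        {"cape",    64.0f},  // the draw you actually want to tame
        {"snow",    16.0f},
        {"scenery", 12.0f},
    };
    float user_cap = 8.0f;   // slider set low enough to fix the cape

    for (const Draw& d : scene)
        std::cout << d.name << ": " << d.requested_tess << " -> "
                  << ApplyDriverCap(d.requested_tess, user_cap) << '\n';
    // Output shows snow and scenery lose detail too: the knock-on effect.
}
```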
 
It suited you just fine to use Batman: AO to support your argument, but because it contradicts it, you're pretending it doesn't exist.
 
No, we are talking about GW in Watch Dogs; the suggestion is that it's since BAO that GW has had this problem. ^^^^

No, that is what you thought we were talking about. We were talking about GW in general.

Anyway, you can't arbitrarily attach it to a game, because then you're introducing another variable: the actual optimisation work AMD/nVidia have done outside the GW libraries.

Relevant.

I don't disagree completely, Tommy, but it's still a fallacious statement to make, because if I steal an apple (over-tessellate), will I also steal a car (purposefully gimp AMD through GW)?

The probability that I will is higher, but that's not to say I will definitely become a career thief.
 
No, we are talking about GW in Watch Dogs; the suggestion is that it's since BAO that GW has had this problem. ^^^^



You're right, it doesn't have to be a problem, but if they need access to optimise for it, then it is a problem. Robert says it is a problem; it appears to some that he is right, and others disagree.

Nvidia have said it's not a problem and that they can, and have, optimised without source code.
It's just as valid to believe Nvidia as AMD; just because there's not an Nvidia video...

So you're saying that GameWorks isn't a problem, but there's something else in Watch Dogs that's an issue now?
Or is GameWorks only an issue sometimes?
I'm sure LtMatt started a thread about GameWorks in general, saying why it was bad, but now it's only bad in Watch Dogs?
I'm sure there were also threads about it being bad in B:AO when it was released, and probably a similar thread when AC4 was released too.
 
No, that is what you thought we were talking about. We were talking about GW in general.

Anyway, you can't arbitrarily attach it to a game, because then you're introducing another variable: the actual optimisation work AMD/nVidia have done outside the GW libraries.

GW in BAO was optimised; Watch Dogs is the first title where the driver developer was denied access.
 