
Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

IF Mantle ran on Nvidia (and seriously, we don't even know what it's going to do for AMD performance yet), does anyone actually believe it will ever run as well or better on Nvidia than it will on AMD? I certainly don't.



Where is the guarantee of that? It's like saying PhysX will run on a CPU. It does. But not as well.

Mantle will almost certainly be to Nvidia users what any Nvidia-optimised feature is to AMD users.

But such is life. If it were THAT important to me, I'd own AMD.



Oh, I can't argue with this. It's something I still don't understand about them. And as I have said, I often don't like Nvidia's practices.

But having said that, I still don't believe Gameworks will be any kind of threat to gaming on AMD hardware in the future.

For the same reason that PhysX, as I mentioned above, is nothing more than token eye candy.

If these GameWorks libraries don't work well on AMD, there will be alternatives, and what they add will continue to be token additions.
You are overlooking one VERY simple point. Regardless of whether Mantle would improve performance for Nvidia cards as much as it does for AMD cards (if Nvidia do decide to take advantage of it and incorporate support for it in their future-gen cards)...as long as it gives ANY improvement over the performance Nvidia cards get on DX11, it would ALREADY be a benefit for Nvidia users, as they will be getting MORE performance than they originally would have had; GameWorks' "locking up the bonnet" approach, however, certainly ain't gonna benefit AMD users in ANY way.
 
Last edited:
The main thrust of the argument is that with other 3rd party libraries, you could consider both vendors at an equal disadvantage when optimising performance. Whereas with GameWorks any feature (big or small) will naturally be fine-tuned for one vendor (the creator) while the other still faces the same disadvantage. The creator is unlikely to help and the game developer cannot. One vendor can showcase performance to the full, while the other is hindered slightly.
I agree, but that still doesn't equal 'locked out completely'. AMD can still optimise their drivers for GameWorks titles; it's just more difficult for them than for Nvidia. I don't really have that much of an issue with that, frankly. I'd rather neither side acted like that, but GameWorks isn't some massive escalation of what was going on before, as the article claims.
 
I know. The thrust of matt's argument is that he believes AMD should have access to the full source code for all of Nvidia's GameWorks features so that they can optimise for it, not realising that no one gives out source code for anything, yet vendors can still optimise for it. It's a complete non-point.

No, you misunderstand or just choose to ignore what I'm saying. It is not normal practice to block AMD, at dev or driver level, from being able to provide optimisations for a game; even when GameWorks is used, Nvidia are still able to do this. It's pretty basic stuff. Up until now this is not how it works. It's normally the practice with sponsored titles that once the game launches, the other side is able to optimise drivers and/or work with the dev to improve performance. See the Tomb Raider example we talked about before. If AMD had had a GameWorks-style library in Tomb Raider, Nvidia would not have been able to improve TressFX performance in cooperation with the developer. With GameWorks this is no longer possible, even if the DEV WANTS TO HELP AMD. This is the problem.
 
If Josh had started his article by making it clear it wasn't his findings, but that AMD had made him aware of them, a lot of users' views may have swayed. Makes him look like a puppet, surely. Whole thing is a whirlwind of windup.

Some proper clarification on the GameWorks standpoint needs to be addressed by NV themselves, or possibly by another developer other than WB Studios.
 
I agree, but that still doesn't equal 'locked out completely'. AMD can still optimise their drivers for GameWorks titles; it's just more difficult for them than for Nvidia. I don't really have that much of an issue with that, frankly. I'd rather neither side acted like that, but GameWorks isn't some massive escalation of what was going on before, as the article claims.
It's not an issue IF we could trust Nvidia to do the right thing, but I don't think I can trust them to do that, purely based on their history of locking down PhysX when it detects the presence of an AMD GPU, even when people got an Nvidia card as a dedicated PhysX card.
 
I conveniently removed it because I couldn't be bothered to comment on it.

So,

1. AMD release Mantle support for BF4; FPS and higher IQ are optimised on all AMD cards.

2. Nvidia cards still run the same IQ and FPS in BF4.

3. Nvidia use GameWorks libraries to optimise Batman that are locked to Nvidia.

4. AMD cards still run the game at higher settings as fast as Nvidia cards.

As a BF4 fan I would not be happy if a £200 AMD card had the same performance as a £400 Nvidia card.

AMD put a lot of time and enhancements into making BF4 a GE title.

Nvidia put a lot of time and enhancements into making Batman a TWIMTBP title.

Hypothetically speaking, of course.

I don't like Batman btw and don't own it.
 
If Josh had started his article by making it clear it wasn't his findings, but that AMD had made him aware of them, a lot of users' views may have swayed. Makes him look like a puppet, surely. Whole thing is a whirlwind of windup.

Some proper clarification on the GameWorks standpoint needs to be addressed by NV themselves, or possibly by another developer other than WB Studios.

So it's not the findings, it's who made him aware of them that's the issue?

Lol, trying to discredit him because he's reported something that reflects poorly on Nvidia. :D


 
matt, on this very page you've stated that you believe Nvidia should provide the source code for GameWorks to AMD, and that AMD will provide Nvidia with the full source code for Mantle - AMD have stated that this is not the case. MS don't hand out the source code for DX; it is totally not required

AMD can run a GameWorks title on their own hardware, inspect what it is doing, and alter their drivers to suit - the article talks about AMD asking WB to change the game's code to add in AMD's alternative and WB declining; that is a matter between WB and AMD, it has nothing to do with NV

looking at a range of GameWorks titles on a range of cards and settings, there are situations where certain settings favour one set of hardware or the other; looking at GE titles, the same occurs. The only way this becomes "an issue" is if you stick your blinkers on and only look at a tiny subset of the data instead of looking at the whole picture

potato tomato: a GW game makes a 660 look like a 7950 and a GE title makes a 680 look like a 7870. Neither of these cases is "normal", yet they both happen

there are plenty of libraries that devs use that AMD or Nvidia don't get the source code for; it IS normal practice

TressFX is a library; I can bet you any money that AMD have not provided the source code for TressFX to either developers or Nvidia

it's not in developers' interests to deliberately gimp performance on one set of hardware across the board - that's why they have settings that you can enable and disable; in TR you can turn off TressFX, in Batman you can turn down or off HBAO+ and tessellation
 
Actually, I'm out. I suddenly realised how much time I had wasted in this thread.

Have fun in here guys!

Hope you all learn to just get along :D
 
So it's not the findings, it's who made him aware of them that's the issue?

The findings aren't conclusive though. The 290X is a new card; why is everyone ignoring that immature drivers could be a contributor?

FXAA is inherently an NV technology. Also, Josh is making digs at heavy tessellation, then goes back on himself in an edit, saying that after a lot of digging there isn't anything cryptic. Credibility meltdown.

It stinks, and I don't understand how anyone could take it seriously without seeing definitive proof that the GW library is at fault. There isn't anything to suggest AMD cannot submit amendments other than this one example of refusal.

Another thing: if AMD had anything on GW, what would stop them taking legal action? Why go to a 3rd-rate site to try and unearth sabotage, which we now know does not exist? For him to assume such, AMD must have implied it at least.

Stinks. I just think it's best to forget the whole thing until such time as a GW title has clear performance problems, or AMD come forward with what exactly they're losing out on. As at the moment all we have is that Batman AO favours the 290X with multisampling, which could imply anything, including issues with the game code. No sabotage, remember.
 
I agree, but that still doesn't equal 'locked out completely'. AMD can still optimise their drivers for GameWorks titles; it's just more difficult for them than for Nvidia. I don't really have that much of an issue with that, frankly. I'd rather neither side acted like that, but GameWorks isn't some massive escalation of what was going on before, as the article claims.

I think the main downside is that it is another of those things customers will need to be aware of: performance in X title in Y review is due to Z and is not representative. In itself, I agree there is nothing malicious about nVidia creating its GameWorks library.
Perhaps the article had an accusatory edge (maybe because it also brought up the over-tessellation issue as a precedent) which overshadowed what should have been a more lamentable take.
 
matt, on this very page you've stated that you believe Nvidia should provide the source code for GameWorks to AMD, and that AMD will provide Nvidia with the full source code for Mantle - AMD have stated that this is not the case. MS don't hand out the source code for DX; it is totally not required

AMD can run a GameWorks title on their own hardware, inspect what it is doing, and alter their drivers to suit - the article talks about AMD asking WB to change the game's code to add in AMD's alternative and WB declining; that is a matter between WB and AMD, it has nothing to do with NV

looking at a range of GameWorks titles on a range of cards and settings, there are situations where certain settings favour one set of hardware or the other; looking at GE titles, the same occurs. The only way this becomes "an issue" is if you stick your blinkers on and only look at a tiny subset of the data instead of looking at the whole picture

there are plenty of libraries that devs use that AMD or Nvidia don't get the source code for; it IS normal practice

Well, you can download the SDK code for TressFX, Andy, but you were using that as your argument a few posts back, saying it was the same as GameWorks. Can you direct me to the SDK for GameWorks?

When Mantle is released it will be able to be examined in full by Nvidia so they can optimise for it. Same for Intel and all the other players.

Andy, you need to re-read the article, as you're incorrect. I'm not going over it again; read it yourself. Check the comments section as well. Joel clearly explains which parts of the GameWorks library are locked to devs and AMD, even at driver level. So AMD cannot even optimise at driver level for several different parts of GameWorks, though it seems to vary from game to game what can be done: tessellation, HBAO, multi-GPU scaling and probably other things I'm not aware of. Explains why crossfire just does not work in some GW titles. Call of Duty ran flawlessly on AMD hardware and crossfire (scaling and performance were always outstanding) until it became GameWorks. Now look at the 7990. Aside from that, the game just runs horribly on AMD. So GameWorks got SLI working, but didn't optimise for crossfire. That's OK, AMD can optimise at driver level, right? Wrong, it's GameWorks.


[attached benchmark screenshots]
 
If Josh had started his article by making it clear it wasn't his findings, but that AMD had made him aware of them, a lot of users' views may have swayed. Makes him look like a puppet, surely. Whole thing is a whirlwind of windup.

Some proper clarification on the GameWorks standpoint needs to be addressed by NV themselves, or possibly by another developer other than WB Studios.
Because had AMD carried out the investigation directly themselves, it wouldn't make them look like a sore loser? Get a 3rd party to investigate, and the 3rd party would automatically be "compromised", and their evidence means nothing?

Fortunately for Nvidia they don't need any of that...all they need to do is say a few words, and those words would be passed around by loyal followers like gospel, with no need for a 3rd party to verify their statements - and anyone that challenges their claims MUST BE with AMD. A few pages back, as soon as someone quoted "Nvidia said" GameWorks is open (whatever that means), so many people immediately acted like God had spoken, banging on about this thread being pointless and it being a non-issue.

To be honest, what we need is a "neutral" developer to comment on the limitations of coding/optimising for AMD under GameWorks, prior to and post launch. Only time will tell.
 
matt, all you are showing is that you don't know anything about libraries, game development or SDKs

an SDK is a set of example code that shows you how to implement a library. Nvidia allow you to download their SDKs too if you register on their website; however, the libraries themselves from both sides are closed - you cannot download the source code for TressFX any more than you can download the source code for PhysX

Ghosts didn't work in SLI for the entire time it took me to complete the single player campaign. Can't say I've reinstalled it to check what the situation is now, but on launch it was the same situation for both vendors (performance scaling might have been good, but SLI led to scope glitches, making the game unplayable with SLI on)

same with BF4: I still get artefacting with SLI - is that AMD's fault or DICE's?
 
Ghosts didn't work in SLI for the entire time it took me to complete the single player campaign. Can't say I've reinstalled it to check what the situation is now, but on launch it was the same situation for both vendors (performance scaling might have been good, but SLI led to scope glitches, making the game unplayable with SLI on)

same with BF4: I still get artefacting with SLI - is that AMD's fault or DICE's?

Ghosts launched on the 5th. When Guru3d did their review on the 7th, SLI was working, as I posted. Perhaps you were not using the correct driver version; Guru3d were using 'GeForce cards use the latest 331.70 Beta driver.' Regardless, Ghosts is a mess for AMD. The GameWorks effect.

Everyone still has the flickering textures from time to time in BF4. That's a game/multi-GPU issue, I believe. I get the same. If I remember correctly, SLI worked on launch with BF4, didn't it? Funny, that.

In fairness, it appears to be working on the 690 in Guru3d's results. But I still can't see how GPU scaling issues can be blamed on any optimisations within the GW libraries.

It's explained in the article. It's also why crossfire never works on one of the Assassin's Creed games, yet SLI works fine. IF SLI can work, crossfire can work. The only obstacle is GameWorks and AMD being unable to provide complete optimisation. It appears they can provide some, especially for things which don't require game code/dev cooperation, like AA etc., but there are key things, which I've mentioned and that were mentioned in the article, that cannot be optimised for, and herein lies the problem.
 
Jesus Christ. How is it a mess because there isn't any crossfire support? The GameWorks effect? Care to explain to me how that is related to the GW libraries?

You're making the connection to BF4 just because Nvidia had an SLI profile available? How does that make any sense???

Incredible.
 