
Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

TL;DR

Gameworks blocks AMD from providing any optimization for any games using the libraries. :eek:

Just another reason for me to avoid TWIMTBP titles. :(

because AMD can’t examine or optimize the shader code, there’s no way of knowing what performance could look like. In a situation where neither the developer nor AMD ever has access to the shader code to start with, this is a valid point. Arkham Origins offers an equal performance hit to the GTX 770 and the R9 290X, but control of AMD’s performance in these features no longer rests with AMD’s driver team — it’s sitting with Nvidia.

There’s a second reason to be dubious of Arkham Origins: it pulls the same tricks with tessellation that Nvidia has been playing since Fermi launched. One of the differences between AMD and Nvidia hardware is that Nvidia has a stronger tessellation engine. In most games, this doesn’t matter, but Nvidia has periodically backed games and benchmarks that include huge amounts of tessellation to no discernible purpose. Arkham Origins is one such title.
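To put rough, illustrative numbers on that (my own back-of-envelope arithmetic, not from the article): the triangle output of a tessellated patch grows roughly with the square of the tessellation factor, so pushing the factor far beyond what is visible multiplies the geometry workload enormously.

```python
# Back-of-envelope sketch (my own arithmetic, not from the article):
# triangles produced by one tessellated patch grow roughly with the
# square of the tessellation factor, so "huge amounts of tessellation"
# means a disproportionately huge geometry workload.

def approx_triangles(tess_factor: int) -> int:
    """Rough triangle count for a single patch at a given factor."""
    return tess_factor ** 2

for factor in (1, 4, 16, 64):
    print(f"factor {factor:2d} -> ~{approx_triangles(factor):4d} triangles per patch")
```

The exact counts depend on the tessellation mode and patch topology; the point is only the quadratic growth, which is why hardware with a weaker tessellation engine falls further behind as the factor climbs.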

Full Article
http://www.extremetech.com/extreme/...surps-power-from-developers-end-users-and-amd

The Gossip Goats amongst you will remember I picked up on this a while ago and stated the case in the Batman Origins benchmark thread. It was picked up by a hardware site in France: AMD tried to provide performance optimizations for Batman Origins but were blocked from doing so by WB.

One for the Gossip Goats amongst us...


[Image: apsp6ZP.jpg]


Via Google Translate from the reviewer at Hardware.fr, it seems!

Nvidia up to their dirty tricks again...AMD not allowed to provide any optimizations for Batman AO.


[Image: GcXVeHF.jpg]


EDIT


The chap who quotes it is the Hardware.fr senior editor:

http://beyond3d.com/member.php?u=1799

They are the main hardware review website in France.
 
Back to the trenches we go :D

This is a bit of a non-story. Reading that article properly, Nvidia has CUDA locked down to Nvidia GPUs, and Nvidia sends out their devs to work alongside the game's devs. The libraries Nvidia are using are specific to Nvidia hardware, which is why AMD don't have access.

AMD have the option of working alongside the devs from the start, but choose to send out code for AMD GPUs instead and expect the devs at WB to implement it. There is no mention of AMD working with WB... I remember BF3 and so many people claiming that Nvidia had not allowed AMD near it, and yet they had the same option as Nvidia to work with the devs from the off but chose not to.

AMD are tied into many projects lately (which is great) and it seems they are a little short on men for other projects. Don't read more into this than what is being said.
 
Back to the trenches we go :D

:D

This is a bit of a non-story. Reading that article properly, Nvidia has CUDA locked down to Nvidia GPUs, and Nvidia sends out their devs to work alongside the game's devs. The libraries Nvidia are using are specific to Nvidia hardware, which is why AMD don't have access.

AMD are tied into many projects lately (which is great) and it seems they are a little short on men for other projects. Don't read more into this than what is being said.

It can never be right, good or acceptable to block one side from providing performance optimizations for a game. To the best of my knowledge Nvidia have never been blocked from providing performance improvements for a GE-focused title. Fixes or tweaks may have had to arrive after the game has launched, but there has never been a block on it. This is an unhealthy practice, if true.

I have no problem with Nvidia using their own tools to optimize things but when it stops AMD users from getting any performance improvements that makes me want to avoid that title like the plague.

AMD have the option of working alongside the devs from the start, but choose to send out code for AMD GPUs instead and expect the devs at WB to implement it. There is no mention of AMD working with WB... I remember BF3 and so many people claiming that Nvidia had not allowed AMD near it, and yet they had the same option as Nvidia to work with the devs from the off but chose not to.

[Image: GcXVeHF.jpg]

A hardware website in France first reported on this a while back:
http://beyond3d.com/member.php?u=1799
 
It's the same as using any other middleware API, i.e. a physics engine: the developer knows they won't have low-level control over the code unless they license the original source and rebuild their own version of the library binary.
 
Non-issue for me until it starts happening on genuine AAA titles, which I doubt it ever will (Origins is nowhere near as good as the previous two; also not a Rocksteady title).
 
Why didn't AMD send out their devs from the start? You can read that response in the spoiler in many ways, and what was picked up in the ExtremeTech article was sending out the code, not actually sending out devs to optimize the game for AMD GPUs. I know Nvidia (nVidia just doesn't look right) are toys-out-the-pram guys, but it wouldn't be in WB's interests to lock out half the market. I feel there will be more to it, and AMD sending out the code instead of sitting with the devs is probably why AMD have lacked optimizations in B:AO.

You have to remember that AMD are extremely busy with Mantle at the moment and this could be why some games are not getting their full attention. Both AMD and Nvidia have limited resources (Nvidia have over 300 devs) and something has to give. I would like to hear from Warner Bros on this, in honesty, before making judgement.
 
Why didn't AMD send out their devs from the start? You can read that response in the spoiler in many ways, and what was picked up in the ExtremeTech article was sending out the code, not actually sending out devs to optimize the game for AMD GPUs. I know Nvidia (nVidia just doesn't look right) are toys-out-the-pram guys, but it wouldn't be in WB's interests to lock out half the market. I feel there will be more to it, and AMD sending out the code instead of sitting with the devs is probably why AMD have lacked optimizations in B:AO.

You have to remember that AMD are extremely busy with Mantle at the moment and this could be why some games are not getting their full attention. Both AMD and Nvidia have limited resources (Nvidia have over 300 devs) and something has to give. I would like to hear from Warner Bros on this, in honesty, before making judgement.

Fair shout, and I'll tweet someone from AMD and ask them to comment on it. I expect, as it's a TWIMTBP title, they were just flat-out blocked from providing any performance improvements for it.
 
Origins is nowhere near as good as the previous two- also not a rocksteady title

Off-topic, but that is so true. The weakest of the three by a big margin.

The entire red vs. green thing... I hoped for a Christmas miracle where both sides join hands and sing songs by the camp fire while the PC-MASTER-RACE benefits, but 'twas just a dream :D
 
Pouring triangles into these surfaces can make them look subtly more realistic, but it’s also a cheap way to disadvantage a competitor.

Developer gets support from Nvidia, adds features that make the game more realistic for Nvidia users and can be turned down and/or off on other GPUs... shocker.
 
Fair shout, and I'll tweet someone from AMD and ask them to comment on it. I expect, as it's a TWIMTBP title, they were just flat-out blocked from providing any performance improvements for it.

It would be good to hear something from AMD or WB on it. Looking at bench results, AMD are massively behind, but then when you look at other TWIMTBP titles AMD are roughly on par, in fairness. I know GameWorks is Nvidia's new set of libraries (and looking promising as well, from Montreal '13) and, without getting into the nitty gritty, AMD could strip down the code and reassemble it for AMD optimizations. For the devs at AMD this should be done already, in truth, and then it is just a case of optimizing.

Christmas is over men, back in your trenches until new years eve when we all stop for cheese and crackers for one night.

I will bring the wine :p
 
Developer gets support from Nvidia, adds features that make the game more realistic for Nvidia users and can be turned down and/or off on other GPUs... shocker.

This takes it one step further. AMD was denied any chance to actually look at the game. It's not about some added feature. Even in AMD-sponsored games, Nvidia will get access to at least some of the code, so they can do some optimisations. FFS, what do you think driver development entails?? Both AMD and Nvidia need access to some of the game code:

http://i.imgur.com/GcXVeHF.jpg

Read that quote again carefully and read the article again. AMD offered resources to the dev too, so they were trying to support the dev, but Nvidia did not want them to as part of the programme.

So what is Nvidia going to do with multi-platform games?? All of the consoles run AMD hardware, two of them GCN-based, so they are just going to artificially cripple performance on PCs??

I can see where this is going though, and this time AMD can actually fight dirty tactics with their own.

It looks like it is back to two rigs with an AMD and Nvidia card for me at some point.

Good bloody way to go. Let's have more people on consoles then, as it means they won't need to give a damn about what card they have. No more "driver" problems for any of them.

While both are scrabbling over the declining discrete card market, the 800 lb gorilla, i.e. Intel, is looming behind them. Idiotic fools. Their IGPs are improving at a massive rate and they can spend 10X the amount AMD and Nvidia can on dev bribes; they have already done so in the past.
 
Developer gets support from Nvidia, adds features that make the game more realistic for Nvidia users and can be turned down and/or off on other GPUs... shocker.

Yes, it's when Nvidia add massive amounts of tessellation to FLAT objects, which absolutely does not improve quality and is only there to harm AMD performance, because AMD have optimised for a sensible amount of tessellation and not extreme and worthless amounts of it.

In Crysis 3, Nvidia added extreme tessellation to road bollards, on the flat surfaces of the sides. It brings literally no improvement at all, but the real kicker is that it reduced Nvidia cards' performance to do this. It also hurt AMD performance, to a larger degree... until you use the AMD driver to simply limit the amount of tessellation, at which point the only users being hurt by Nvidia's own stupid attempts at cheating were, once again... Nvidia users.

They also showed the ocean... tessellated... UNDER the city map (I would guess only in the levels with the ocean visible at least somewhere), not visible to anyone but bringing down performance anyway.
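That driver-side tessellation limit can be sketched like this (illustrative only; the real AMD Catalyst setting works inside the driver, not through any public API, and the names here are made up):

```python
# Illustrative sketch of a driver-side tessellation cap (hypothetical names;
# the real AMD Catalyst "Maximum Tessellation Level" setting lives inside
# the driver, not in any public API like this).

def clamp_tess_factor(requested: float, driver_cap: float = 16.0) -> float:
    """Clamp whatever tessellation factor the game asks for to a
    driver-chosen maximum before it reaches the hardware."""
    return min(requested, driver_cap)

# A game requesting an extreme factor of 64 gets 16 instead. Since triangle
# count grows roughly with the square of the factor, that is on the order of
# 16x less geometry work, with no visible loss on flat surfaces.
requested = 64.0
capped = clamp_tess_factor(requested)
print(f"requested {requested:.0f}, driver allows {capped:.0f}")
```

Sensible requests below the cap pass through unchanged, which is why the setting only bites on the extreme cases described above.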


Also Greg, the entire point of the TWIMTBP programme is that they pay the game maker to support them and only them. AMD could not have sent devs along as well; with TWIMTBP titles, Nvidia pays companies not to work with AMD. It's that simple.
 
You know something's not right when a 770 is faster than a 290X though. :D

I guess, but I played City just last week and it was diabolical compared to Origins.

I'd rather have my card nerfed compared to the competitor's but giving fine performance, rather than just having plain crap performance.

EDIT: I just looked at their figures; the 770 isn't winning. We don't know what fan profile they're running on the 290X; it could be throttling.
 
TL;DR

Gameworks blocks AMD from providing any optimization for any games using the libraries. :eek:

Just another reason for me to avoid TWIMTBP titles. :(



Full Article
http://www.extremetech.com/extreme/...surps-power-from-developers-end-users-and-amd

The Gossip Goats amongst you will remember I picked up on this a while ago and stated the case in the Batman Origins benchmark thread. It was picked up by a hardware site in France: AMD tried to provide performance optimizations for Batman Origins but were blocked from doing so by GameWorks.

Why shouldn't they? It's their hardware; they can do whatever they like.
 