
Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

This is what it boils down to when it comes to the level of understanding of GameWorks and its implications. First, look to see what GPU the poster has, then apply the relevant picture. :p


Nvidia GPU: [image]

AMD GPU: [image]
 
For God's sake, spell-check that before Rusty gets here ;)

:D

LtMatt
GameWorks: Optimized for Nvidia. Prevents AMD from optimizing its drivers for DX11 games.
So AMD have never put out a driver update for a GameWorks game at all?
If they have, they can clearly optimize.
And by the way, I'm asking; I've never looked at what the patches for my AMD cards actually do.

They cannot optimize, via drivers or game code, features like tessellation, multi-GPU scaling and performance, HBAO etc. because of the closed libraries of GameWorks. Until now it was always possible for either side to optimize after a game had launched; with GameWorks that is only possible for Nvidia. It should be noted that even a pro-AMD game dev could not optimize for AMD cards if they use the GameWorks libraries, even if they wanted to.
 
No, I'd be of exactly the same opinion, because I don't let my opinions be formed based on what GPU I've got in my machine. It's nothing to do with what GPU I've got. You're assuming everyone thinks along AMD/Nvidia lines again.

Regarding Splinter Cell: perhaps, God forbid, the AMD drivers are just not up to scratch in this game? Where does the line get drawn?

Interesting theory. Does seem odd how it only becomes apparent in GameWorks titles though. Just coincidence I suppose, right? :D
 
Blacklist isn't a GameWorks title, Matt. Says so in your initial article.

Correct, not formally. However, Nvidia used part of GameWorks to provide the game's HBAO. Probably the reason why he states a 290X is only 16% faster than a 770 in Blacklist.

To answer this question, I’ve spent several weeks testing Arkham Origins, Assassin’s Creed IV, and Splinter Cell: Blacklist. The last is not a formal GameWorks title, but Nvidia worked with Ubisoft to implement the game’s ambient occlusion, and early benchmarks showed a distinct advantage for Nvidia hardware when running in that mode. Later driver updates and a massive set of game patches appear to have resolved these issues; the R9 290X is about 16% faster than the GTX 770 at Ultra detail with FXAA enabled.

A Titan is probably 35% faster than a 770, I'd bet, possibly more than that, and a 290X is generally a bit faster than a Titan in most games.
 
Nvidia Attempts To Create A Moat Around GPU Market Share With Proprietary Technologies

Other than the negative press, Nvidia is also releasing proprietary GPU technologies that will only benefit Nvidia cards; namely G-Sync and GameWorks.

For more information on G-Sync, read this article. I will say that this is one of the Nvidia initiatives I am paying attention to. It looks promising and could provide a moat for Nvidia if it catches on.

GameWorks is more of a direct assault. GameWorks essentially provides "black box" libraries that will help developers, but also make it extremely difficult for AMD or Intel to optimize drivers to improve performance. For more information, read Joel's article. But let me draw attention to his conclusion:

In the long run, no one benefits from this kind of bifurcated market - especially not end users.

Titles that use GameWorks libraries will put AMD and Intel at a distinct disadvantage by preventing driver level optimizations after a game is launched.

Full Article
http://seekingalpha.com/article/1921341-amd-so-what-now-part-1-gvs-revenues?source=feed
 
I am still puzzled as to how AMD released driver optimizations giving up to 35% gains with 13.11 Beta 6...

http://www.tomshardware.com/news/amd-driver-catalyst-13.11-radeon,24876.html

I've already explained that to you, Greg. In fact, all your post does is back up what I said earlier in the thread.

AMD improved AA performance in Batman with that driver. GCN is more powerful than Kepler with AA activated, so they can overpower the GameWorks advantage and pull ahead with that feature activated. MSAA is something you don't need game code/dev cooperation for. ExtremeTech are factually 100% correct.

I'll say it again for the 18th time. The part that concerns me is AMD being unable to optimize for their own GPUs like you would traditionally. Having to leave tessellation/HDAO optimization and multi-GPU scaling and performance in the hands of their rival is concerning, especially given Nvidia's track record of using large amounts of unnecessary tessellation to harm AMD performance as well as their own.
 
Really Matt, after pages of being proven wrong you post another article that has a throwaway comment aimed at Nvidia?
GameWorks is not proprietary, it does run on AMD/Intel hardware, and as end-user benches show, the GameWorks games run just as well on AMD hardware as on Nvidia, with AMD even successfully releasing driver updates for said games.

I get the feeling this thread is just Matt's personal chagrin at the black screen thread that is constantly on the front page.

Three different articles written by three different journalists are now all saying the same thing. As for being proven wrong, just because you or others say it's wrong does not make it the case. Joel Hruska, who wrote one of the articles for ExtremeTech, is very respected and I certainly take his unbiased, well-researched opinion over yours. You're welcome to try and bring issues with other GPU vendors into this if you think it helps support your claim, but it has no relevance here.
 
Lol Matt, you haven't explained anything. I asked you if you understood how DLLs work and you didn't answer. Joel stated that AMD were locked out and I have clearly shown on several occasions that they are not, and posted facts which you chose to ignore.

GameWorks is an open standard. If Nvidia are saying it, I will believe that over some guy's blog in all honesty, and this latest article is even worse than Joel's.

Believe what you want, but seeing AMD outperform me on a flashed 290 in Batman: AO makes the OP and Joel's article look stupid.



But AMD are locked out, so how can they release performance drivers?

Did you try running your test without AA and just FXAA to see if the claim is correct? If performance drops massively on AMD cards then you know it's correct. The only optimization AMD can do is via AA, so test without it and see if what I said is true. I think you're extremely naive if you think GameWorks is an open standard. Nothing Nvidia do is an open standard.

But AMD are locked out, so how can they release performance drivers?

LtMatt said:
AMD improved AA performance in Batman with that driver. GCN is more powerful than Kepler with AA activated so they can overpower the gameworks advantage and pull ahead with that feature activated. MSAA is something you don't need game code/dev cooperation for.
 
If GameWorks features don't run on AMD hardware then how can they affect AMD performance? Surely they would just be disabled, lowering IQ but increasing frame rates?

Tessellation, HBAO, multi-GPU scaling and performance. All things that come under GameWorks and features that are available for AMD. There may be more, but these are the things I know of. Things that AMD are unable to optimize for at driver level because of the closed libraries of GameWorks.

Explains why Crossfire never worked in one of the Assassin's Creed games and why Crossfire never worked for Ghosts at launch. Crossfire had always worked flawlessly with COD until it became a GameWorks title.
 
Unlike most of the other articles on the subject, which show a marked ignorance of what actually happens in game development and an almost Daily Mail level of reporting complete with definite bias, Joel Hruska's article is indeed very well written with a lot of insight. However, I don't think it supports your views as much as you appear to think it does; some of his comments could be taken in a narrow context to suit one or another agenda or opinion, but most of them are meant from the perspective of a wider view of game development as a whole.

He demonstrates very well that the problem with Batman: AO isn't with GameWorks itself and that the closed nature of the libraries is not in itself anything unusual*, and shifts the blame performance-wise to tessellation, ultimately pointing the finger at WB rather than Nvidia.

*Though he's not a fan of that aspect in the wider scope of game development, regardless of who is behind the libraries.

Agreed. I have it on good authority that GameWorks is closed though, and neither the dev nor AMD are able to optimize for the things I've mentioned, which is the problem I have with it.
 
That's the thing: GameWorks being closed isn't any different to other similar libraries like Enlighten, SpeedTree and so on, but people are carrying on like it's different, probably because it's an easy thing to latch onto to bash Nvidia, and in doing so they take focus away from the real issues here. To your credit, you are the only person I've seen here who has gone away and asked AMD/WB "hey, what actually happened with AMD's attempt to optimise for this game?"

I think we can all agree there's a problem when AMD are unable to optimise their own drivers when a game uses GameWorks. This is not something they'd say, or journalists would report on, without good reason. In my opinion this is Nvidia's proprietary version of Mantle and will ensure they can outperform any AMD card, if they so desire. If I'm wrong on this, then I'll happily apologise down the line. I don't think that will be the case though. I can only hope Nvidia adjust GameWorks to allow AMD some wiggle room. After all, it will still favour their cards, but there's no need to put up impossible boundaries for AMD either.
 
Oh I agree it's not a good thing at all. I just find it a bit tedious and a red herring when people focus on "Nvidia has 'locked' the GameWorks libraries to keep AMD out" as though it's something specifically intentional and unique to GameWorks, when all binary libraries work in the same way in that regard (rough sketch of what that looks like at the end of this post).

Neither am I saying Nvidia won't try to use this as a way to marginalise AMD, or that pressure shouldn't be put on developers and/or Nvidia to stop them using the natural functionality here to hide intentional degradation of performance on other platforms.



When you get past the hysteria, the real issue here isn't the performance as it stands but whether AMD are being shut out from being able to optimise to show their true performance. It could be that, as things are, the 290/290X and 780/780 Ti show comparable performance, but with optimisations the AMD cards could run even faster in the game(s).
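To make the "closed binary library" point concrete, here's a very rough sketch of what a game team typically gets from any middleware vendor: a header plus a prebuilt DLL/import library. All names below are hypothetical, not the real GameWorks API; it just illustrates why neither the developer nor a rival driver team can read or change what happens inside the call.

```cpp
// Hypothetical illustration only -- these are NOT the real GameWorks names.
// In reality the struct and function below would live in a vendor-supplied
// header (e.g. "VendorAO.h"), and the implementation would ship only as a
// prebuilt .dll/.lib. That compiled binary is what "closed library" means.

struct VendorAOParams {
    float radius;     // sampling radius
    float intensity;  // occlusion strength
};

// Implemented inside the vendor's DLL; no source is available to the game
// developer, AMD or Intel -- only this signature.
extern "C" int VendorAO_Render(void* deviceContext, const VendorAOParams* params);

// Game-side usage: the call site is visible and tweakable, but the shaders,
// passes and tessellation settings inside the library are compiled code, so
// optimisation there is limited to whatever parameters the vendor exposes.
void renderPostEffects(void* deviceContext) {
    VendorAOParams params{0.5f, 1.0f};
    VendorAO_Render(deviceContext, &params);
}
```

A driver team can still observe the API calls such a library emits at runtime, which is presumably why forcing or improving AA remains possible in these games; what nobody outside the vendor can do is rewrite the library's own shaders or tessellation settings.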

Agreed. I have no problem with what you're saying and agree with most, if not all, of it. Most don't see it your way though; they ignore what has been said and just look at the FPS figures. Anyway, I think there is more to come from this story. ;)

So pages of benchmarks showing comparable results on comparable AMD/Nvidia hardware in a variety of GameWorks titles aren't enough evidence for you then, Matt?

Gaming Evolved titles showing a similar bias towards AMD hardware is also totally not an issue for "gamers" in general?

When has GE ever blocked Nvidia at driver level from providing optimizations for tessellation/HBAO/HDAO/TressFX/multi-GPU scaling?
 
I wish both game developers and GPU companies would do away with this sort of ********. The only ones inconvenienced/hurt by this are the consumers.

Yep, I'll drink to that, mate.

No problem with TWIMTBP. Nvidia get earlier optimizations, that's fine. GE works the same way with AMD. It's fair, as once the game launches both sides can fine-tune optimizations. But GameWorks changes that playing field. No longer will AMD be able to optimize like before, aside from improving AA performance. This is a worry and it's not healthy. Hopefully things change regarding what can be done with it.
 
What are you even talking about? It's an open library to developers; the only thing you're basing this on is one studio, WB Studios, refusing an update whose contents we don't even know. AMD will be able to work with developers just as directly. You have no idea what the reasoning behind WB's refusal is, and nor has it happened anywhere else yet.

You're just posting blanket statements based on one crummy article.
There are no facts that state AMD optimisations will be directly affected. Stop going round in circles; it's really quite boring. I could start up a thread about how Mantle could directly affect the time developers can afford to spend on DirectX optimisations, but I won't, because I don't have a website low on hits that needs some attention.

Just because you don't understand or agree does not make it a crummy article. Article > your opinion, in my opinion.

If you find it boring don't reply or use the ignore feature.

Feel free to start a Mantle thread if you want though.
 
@Greg The author of the article, Joel, has replied to your comment.

Maybe this will clear up some confusion amongst us all... this is for the specific title in question.

I used FXAA because switching to MSAA is precisely what allows the 290X to leverage enough raw power to overcome the GTX 770. My benchmark results are not "massively wrong," simply because you don't like the tests I chose. My test results accurately summarize performance in the mode tested.

When you turn MSAA to full, you push fill rate hard enough that card performance comes down to brute force. Yes, the 290X wins that comparison. It does not explain the gap in FXAA performance between AO and AC under identical circumstances.

Finally: GameWorks locks out optimization of specific functions. It does not lock out everything -- just the cutting-edge parts. In Arkham Origins, the following GameWorks libraries are used:

GFSDK_GSA
GFSDK_NVDOF_LIB (Depth of Field)
GFSDK_PSM
GFSDK_ShadowLib (Soft shadows)
GFSDK_SSAO (Ambient Occlusion)

Assassin's Creed IV, on the other hand, uses:
GFSDK_GSA
GFSDK_GodRaysLib
GFSDK_ShadowLib
GFSDK_SSAO

Clearly the GW library loadout is customized and tailored depending on the title. These are the libraries and functions AMD cannot optimize. The fact that AMD can optimize the game and improve performance by 35% due to other changes does not change the fact that GW-specific code is locked out. And I believe the original story makes this distinction quite clear.
 
http://www.techspot.com/review/733-batman-arkham-origins-benchmarks/page2.html

Honestly don't see anything untoward there, bearing in mind that is before the patch.

LOL that backs up the claim perfectly!!

So the 290X is faster than a Titan with AA but 19% slower with FXAA. That is the GameWorks penalty of around 20%. Maybe more than 20%; I'm not sure how much faster the 290X was over the Titan with AA applied. Without AA letting AMD brute-force their way to the top, roughly 20% is the performance penalty of GameWorks for a 290X vs a Titan in Batman.

See, now if AMD could actually optimize a bit more for this, I wonder what the performance penalty would be. Would there be one at all?
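For anyone trying to follow the percentages being thrown around here, a tiny worked example may help. The FPS figures below are made up purely for illustration (they are not taken from any of the linked benchmarks); it just shows how an "X% faster/slower" figure is normally derived from two average-FPS numbers.

```cpp
#include <cstdio>

// Relative advantage of card A over card B, as a percentage.
// E.g. 60 fps vs 50 fps -> (60/50 - 1) * 100 = +20%.
static double percentFaster(double fpsA, double fpsB) {
    return (fpsA / fpsB - 1.0) * 100.0;
}

int main() {
    // Made-up figures for illustration only.
    const double titanFXAA = 100.0, r290xFXAA = 81.0; // FXAA run
    const double titanMSAA = 60.0,  r290xMSAA = 66.0; // 8x MSAA run

    std::printf("FXAA:    290X vs Titan: %+.1f%%\n",
                percentFaster(r290xFXAA, titanFXAA)); // about -19%
    std::printf("8x MSAA: 290X vs Titan: %+.1f%%\n",
                percentFaster(r290xMSAA, titanMSAA)); // about +10%
    return 0;
}
```

Read the other way round, a "19% slower" result in a given mode simply means the slower card delivers roughly 0.81x the faster card's frame rate in that mode.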

Am I allowed to ask how he knows these functions are locked out? They're fundamental DX technologies.

Ask him. He'll reply to your quote. Check the comments section.

http://www.extremetech.com/extreme/...om-developers-end-users-and-amd#disqus_thread
 
I was looking more at 7950 performance; it seems to scale well enough with competing hardware, in this case the GTX 580. Of course the 290X is a newer card and thus its drivers are in their infancy, something you choose to ignore in this case. In fact, yes, let's look at the performance in the mid-section, shall we?

Anything you care to add on that?

On why a GTX 580 and a 7950, which are both very similarly performing cards, are performing very similarly?

Maybe we should ask your friend what it is in AMD's current drivers that they're unable to optimise, and exactly which optimisations were rejected? Would he be able to comment on that at all?

Take a look at these. ;)

The GTX 580 gets near to a 7950 with FXAA (somehow):

[benchmark chart: FXAA results]

Yet once we pile on 8x AA, what happens to the 580? It vanishes into oblivion. I wonder how it managed to get so close to a 7950 with FXAA?

[benchmark chart: 8x MSAA results]

The GameWorks effect. Only nullified via GCN's superior brute-force AA performance.

Maybe you should ask Joel, as you don't quite seem to grasp it. I think you should, because we're just going round in circles and he has better knowledge of this than either of us.
 
You'll also notice there are no other 1GB or 1.5GB cards in that list. MSAA 8x, right, well. That's proven me wrong then. Crikey.


I grasp exactly what Joel is saying, but Joel doesn't know any more than you or I, other than what he's being told. Presumably by AMD.

OK, let's try this again. Let's take your VRAM argument out of things. So a basic 660 2GB card is 3.5% faster than a 7950 Boost at 1080p with FXAA and the GameWorks effect. How this is possible, I've no idea. It's a goddamn 660, low spec.

Slap on some AA and what do we have? The 7950 Boost is now 36% faster. 36%, not far off the 35% AA boost, eh? ;)

The GameWorks effect: Nvidia spending their $$ by sending their employees out to work with game devs at the start of projects to ensure games run well for purchasers of their hardware. **fixed

Whilst also ensuring AMD cannot provide 'complete' optimization for their users. Great for the gaming industry, eh?

I grasp exactly what Joel is saying, but Joel doesn't know any more than you or I, other than what he's being told. Presumably by AMD. News flash: FXAA works better on NV cards.

Lol, you're getting desperate, aren't you? So now you're telling me that Nvidia are better than AMD at providing FXAA, which normally costs 1-2 FPS max in more demanding games? I rest my case, Frosty.
 