Exclusive – Nvidia Talks GameWorks And DirectX 12 Plans For 2016

GameWorks effects are not game-critical, nor are they on some kind of subscription service.

In the real world, 'open and free' only applies to certain wares - normally ones that didn't take much time to develop and don't directly benefit your competition when you hand them over. That's the outline; as a consumer you cannot make that decision. You can either not use them, or purchase an NVIDIA product in order to use its technologies.


This has been covered so many times that it hardly bears repeating. If one finds this decision taxing, then the answer should honestly be simple enough when looked at objectively.

The GW program is not free, hence its focus. It gives developers an incentive to add (more) PC-exclusive technologies to games where otherwise there would be none, or next to none. Open and free, no matter how good your technology may be, gets you a licence with one or two games, if past examples are anything to go by.

If you cannot understand why NVIDIA chose not to open-source these to AMD, and why AMD chooses to make its technologies open, then that is something one will have to come to an understanding of in their own time - and, sadly, only if the media will let them.


Equally easy to understand is that if one doesn't like what GW has to offer, there's an on/off toggle - provided one isn't naive enough to discount the developers' involvement in implementing these effects, as that part is up to them.
 
I like the idea of GameWorks, but it should be open source / from another vendor. Nvidia having their hands in how games are designed is a little dubious, and even if you choose not to be sceptical, it still has the inherent issue of being biased towards Nvidia anyway; since it's their own tech, they will of course build it in a way that suits them.

I don't mind the idea of GameWorks, but it needs to be fair / open, and when you see the kind of performance drops involved, it clearly has room for improvement; I don't know if Nvidia are the ones who will honestly do that without trying to hamper the competition. Overall it's good, and when I use it it does improve the game, but it's entirely missable in my opinion. There's nothing that makes me feel I must have it, and there never has been, which is quite sad when the interview makes it out to be stuff that is meant to be years ahead. The main things I'd be interested in are simply HBAO+ and other anti-aliasing techniques. Keep the game clear and sharp and that will do, Nvidia; the rest of your techniques tend to look okay but not great anyway.

If you take any GameWorks title and strip out the GameWorks effects, that is the day-to-day game that we all get. Then add some effects that enhance the game, and that is what GameWorks brings. Some don't like the effects for Nvidia-related reasons, some don't want them because of the performance hits (which baffles me, because they are optional), and some are just so anti-Nvidia that they get drawn into all the politics and will dig and dig and whine, etc. But realistically, some of these effects do enhance the game.

This has been one of the longest-running arguments I have seen, and it gets brought up often. The effects give Nvidia users a performance hit as well as AMD users, and they require more grunt than they should, but they are add-on libraries for enhancements, developed to make devs' lives easier and free up time for other things. I love AC:S, but man, that is a crippler to run on my system, and there isn't much in the way of GameWorks effects in it (HBAO+/PCSS and TXAA, which I am not a great fan of) - and all of this on a really old engine, the Anvil Next engine first used in Prince of Persia back in 2008. Then you look at Battlefront: it is stunning, runs really well, and is on a much newer engine (Frostbite) that is superbly optimised.

My point being that better optimisation and better engines would result in smaller performance drains and better-looking games; but while those engines cost a lot of money and the big publishers look at profit, profit and profit, I am happy to have a few effects tacked on to make a game look good, even if it does come at a cost. Oddly enough, the biggest-selling game of 2015 was BO3, and that is buggy/ugly and on an old-ass engine.

Nvidia are not always the bad guys.
 
The shrewd buyer looks at the VW, the Audi and the SEAT or Skoda and says: well, I can get the same performance, and for my money get all these extras, if I buy the SEAT or Skoda rather than the Audi or the VW. While some of the extras might not be quite as refined, the underlying car is still the same identical car.

The brand-conscious buyer sticks with the Audi or the VW if that's what they normally buy.

Is the underlying car essentially the same? And what is the underlying car: the engine and chassis? But what about the standard and quality of the rest of the components - bodywork, paint, lights, etc.? Performance is only one aspect of what we enjoy in automobiles. I personally wouldn't buy a Skoda or SEAT and believe I'm getting a proper VW or Audi, and it's nothing to do with the branding. Anyone with the money, or more accurately the right level of wealth, to buy a VW or Audi will, IMO, likely do so.
 
This thing about GameWorks is funny. The fact that Nvidia allow AMD GPUs to work with GameWorks should ring a bell for people, because it's out of Nvidia's character; to me there is more to it than people think.
It wouldn't have bothered me as much if Nvidia had closed GameWorks to AMD GPUs - that would be their right, like they did with Nvidia PhysX for years.
Alternatively, AMD should disable these effects in their drivers, if that's possible. That's my take on this.
 
This thing about GameWorks is funny. The fact that Nvidia allow AMD GPUs to work with GameWorks should ring a bell for people, because it's out of Nvidia's character; to me there is more to it than people think.
It wouldn't have bothered me as much if Nvidia had closed GameWorks to AMD GPUs - that would be their right, like they did with Nvidia PhysX for years.

PhysX is probably the reason why nVidia chose not to lock AMD out. One would assume PhysX saw... very few titles using it because it locked AMD out - devs don't want to alienate users. Allowing GW to work on AMD hardware makes it more likely that devs will use the libraries, and thus money for nVidia. I don't see this as "out of character" but perfectly within character, it's not generosity - it's profit.
 
You can disable all the GameWorks effects, AlamoX - AMD or Nvidia users alike.

I know, Greg, but that's not the point here. The majority of users won't care whether they can disable it or not; for a lot of users these are simply game features that will work better on Nvidia than they would on AMD, and that's as far as their interest will go (assuming there is a problem with a feature, or an unoptimised one).
Having the features work on both discourages devs from providing an alternative to the feature, and it always depends on Nvidia whether they want to play nice with AMD or not.
That's why the best solution is to keep it as Nvidia's ecosystem, but for the first time they seem to be really generous with their R&D and investment outside of their ecosystem :D, and really insistent on making AMD users benefit from it :p
 
I know, Greg, but that's not the point here. The majority of users won't care whether they can disable it or not; for a lot of users these are simply game features that will work better on Nvidia than they would on AMD, and that's as far as their interest will go (assuming there is a problem with a feature, or an unoptimised one).
Having the features work on both discourages devs from providing an alternative to the feature, and it always depends on Nvidia whether they want to play nice with AMD or not.
That's why the best solution is to keep it as Nvidia's ecosystem, but for the first time they seem to be really generous with their R&D and investment :D

It would be interesting to compare the performance hits for both AMD and Nvidia, and I doubt there would be much in it. Those debris effects give me quite a big hit on performance in FO4 (I know it is Nvidia-only PhysX), but so does something like WaveWorks in JC3 or HBAO+ in many of the other GameWorks titles. Just to add, TW3 ran very well on my Fury X and dealt with the GameWorks effects far better than I was expecting. Still a biggish hit, but it was also a biggish hit for my Titan X.
 
PhysX is probably the reason why nVidia chose not to lock AMD out. One would assume PhysX saw... very few titles using it because it locked AMD out - devs don't want to alienate users. Allowing GW to work on AMD hardware makes it more likely that devs will use the libraries, and thus money for nVidia. I don't see this as "out of character" but perfectly within character, it's not generosity - it's profit.

Essentially, PhysX is the main feature that Nvidia locks AMD out of and uses as a selling point. The new Fallout 4 Nvidia-only debris feature is a PhysX particle effect, but they call it Flex now.

Devs could easily do better effects on the CPU, as proven by older games such as BFBC2 and Red Faction: Guerrilla, and now R6: Siege, but it seems some devs don't want to put in the effort or are given a cash incentive by Nvidia.
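
To put the CPU angle in perspective, here is a toy sketch (nothing from any of those games, just an illustration of the principle) that steps ten thousand "debris" particles on the CPU with plain Euler integration and a crude ground bounce - the sort of workload that is trivial for a single modern core:

```cpp
#include <vector>
#include <random>

// Illustrative only: basic debris particles integrated on the CPU.
struct Particle {
    float x, y, z;    // position (metres)
    float vx, vy, vz; // velocity (metres per second)
};

void step(std::vector<Particle>& debris, float dt) {
    const float gravity = -9.81f;     // m/s^2
    const float restitution = 0.3f;   // energy kept when bouncing off the ground
    for (Particle& p : debris) {
        p.vy += gravity * dt;         // apply gravity
        p.x  += p.vx * dt;            // integrate position
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
        if (p.y < 0.0f) {             // crude ground plane at y = 0
            p.y  = 0.0f;
            p.vy = -p.vy * restitution;
        }
    }
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> vel(-5.0f, 5.0f);

    // Spawn 10,000 particles at a point with random outward velocities.
    std::vector<Particle> debris(10000);
    for (Particle& p : debris) {
        p = {0.0f, 2.0f, 0.0f, vel(rng), vel(rng) + 5.0f, vel(rng)};
    }

    // Ten seconds of simulation at 60 Hz updates.
    for (int frame = 0; frame < 600; ++frame) {
        step(debris, 1.0f / 60.0f);
    }
    return 0;
}
```

Real engines obviously add collision against level geometry, sleeping, threading and SIMD on top of this, but none of that needs a particular GPU vendor.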

From a performance point of view these effects are pretty heavy and not worth it in games that already run poorly, but nevertheless the option would be nice to have if your system could run it. By closing the feature to AMD, and to a lesser extent Intel, Nvidia just make themselves look greedy when every non-Nvidia tech is open to everyone.
Also, by using GPU-only PhysX they are in effect cunningly forcing their users to upgrade their cards, when we all know multicore CPUs have some cores doing nothing most of the time.

HairWorks has a competitor in TressFX, but AMD really need to catch up on the PhysX side with their own version that can run on the consoles too. That's the only way they can counter GameWorks and also ensure an open standard.
 
It would be interesting to compare the performance hits for both AMD and Nvidia, and I doubt there would be much in it. Those debris effects give me quite a big hit on performance in FO4 (I know it is Nvidia-only PhysX), but so does something like WaveWorks in JC3 or HBAO+ in many of the other GameWorks titles. Just to add, TW3 ran very well on my Fury X and dealt with the GameWorks effects far better than I was expecting. Still a biggish hit, but it was also a biggish hit for my Titan X.

Well, GameWorks has had a lot of controversy about how it affects AMD. Even if nothing much shows now, Nvidia would still keep the rope around AMD's neck, and they don't have to pull it all at once; they can ease you into it and pull it tighter slowly over time. The end game would be the same, but the reaction to it wouldn't be.
This is purely speculative on my part; it all depends on how morally sound or corrupt the firm holding the rope is. And believe me, from experience, you shouldn't go there: companies get tempted, and they can dive really deep into grey areas if things get tough.
 
HairWorks has a competitor in TressFX, but AMD really need to catch up on the PhysX side with their own version that can run on the consoles too. That's the only way they can counter GameWorks and also ensure an open standard.

I don't think either side should be doing them, tbh. But PhysX already has competitors: Havok, Bullet, Vortex, etc. And they, surprisingly, offer better tech that's vendor-neutral. Don't tell that to the nVidia guys though.
 
Devs could easily do better effects on the CPU, as proven by older games such as BFBC2 and Red Faction: Guerrilla, and now R6: Siege, but it seems some devs don't want to put in the effort or are given a cash incentive by Nvidia.

I prefer the way Valve did it with the Source engine and Half-Life 2, using Havok physics. It's something that is enabled by default and has a great effect on the overall game. Whereas with PhysX it only works if you enable it; it's just eye candy - other players don't see it in multiplayer / it doesn't affect them.
 
I prefer the way Valve did it with the Source engine and Half-Life 2, using Havok physics. It's something that is enabled by default and has a great effect on the overall game. Whereas with PhysX it only works if you enable it; it's just eye candy - other players don't see it in multiplayer / it doesn't affect them.

Microsoft bought out Havok, so I have no idea if they are still in the running any more, which sadly makes PhysX more popular. I always wondered why AMD couldn't just buy Havok, but I suppose Intel wouldn't have let it go to a competitor so easily.

Microsoft should start getting devs to use Havok in more and more console games, which would negate PhysX quite a bit.
 
Bullet is something AMD jumped in on; it is open source, works well with OpenCL, and has been used in a few games (Hotwheels Battleforce/Gavatronix/GTA IV/V), but it hasn't seen much adoption.


As you can see, it is very good, but I'm not sure why it doesn't get used much. I am sure someone will enlighten us.
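
For anyone who hasn't looked at it, a minimal Bullet setup is genuinely small. Here's a sketch using the stock Bullet C++ API (not tied to any of the games above): a world with a static ground plane, a 1 kg box dropped onto it, and a fixed 60 Hz simulation loop.

```cpp
#include <btBulletDynamicsCommon.h>

int main() {
    // Standard Bullet world plumbing: collision config, dispatcher,
    // broadphase and constraint solver.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // Static ground plane at y = 0 (mass 0 means immovable).
    btStaticPlaneShape groundShape(btVector3(0, 1, 0), 0);
    btDefaultMotionState groundMotion;
    btRigidBody ground(btRigidBody::btRigidBodyConstructionInfo(
        0, &groundMotion, &groundShape));
    world.addRigidBody(&ground);

    // A 1 kg box dropped from 10 m.
    btBoxShape boxShape(btVector3(0.5f, 0.5f, 0.5f));
    btVector3 inertia(0, 0, 0);
    boxShape.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState boxMotion(
        btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody box(btRigidBody::btRigidBodyConstructionInfo(
        1.0f, &boxMotion, &boxShape, inertia));
    world.addRigidBody(&box);

    // Two seconds at a fixed 60 Hz timestep; the box falls and settles
    // on the plane.
    for (int i = 0; i < 120; ++i) {
        world.stepSimulation(1.0f / 60.0f, 10);
    }

    world.removeRigidBody(&box);
    world.removeRigidBody(&ground);
    return 0;
}
```

The per-frame work is just the stepSimulation call; everything above it is one-off setup, which is part of why it is vendor-neutral and easy to drop into an engine.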
 
PhysX is probably the reason why nVidia chose not to lock AMD out. One would assume PhysX saw... very few titles using it because it locked AMD out - devs don't want to alienate users. Allowing GW to work on AMD hardware makes it more likely that devs will use the libraries, and thus money for nVidia. I don't see this as "out of character" but perfectly within character, it's not generosity - it's profit.

Well, that seems awfully convenient :p. But if PhysX is the reason they didn't lock down the GW API, they could simply have kept PhysX out of the GW library; and even if we concede that their motive is broader adoption, they could have included PhysX in the library and locked only PhysX, not the whole library. A lot of the choices made with GW show a pattern of an ulterior motive - at least that's my perspective on things. I am not as trusting as you are, I guess.
 
I don't think any of us (well, many of us) are under any illusions here.

Up to and including even *******, I reckon they all know GW is a bit of a stinky pinky, but they're never going to go public. :p

nVidia are absolutely ruthless when it comes to getting on top and will pretty much do anything to stay there :)

They are dominating AMD rotten; they are that good at punting GPUs that the majority are under the impression that most, if not all, of Nvidia's slower cards are actually faster. :D

Just because I have one of their cards (beaming all my dodgy internet business to them no doubt) doesn't mean that I have to like them ;) :D

+1

I just want to game. From personal experience, AMD is sometimes practically useless for running GW; ultimately, if you can't beat 'em, join 'em.

At the rate Nvidia are churning out GW titles, and with some of the exclusive IQ, unless you get unlucky with some kind of Nvidia-specific issue there is next to no reason to go back to AMD, as AMD have next to nothing in the way of game tie-ins (and don't do any binaries in the ones they do have).
 
As you can see, it is very good, but I'm not sure why it doesn't get used much. I am sure someone will enlighten us.

I reckon it's Nvidia bankrolling GW left, right and centre; AMD are full of ideas about openness yet only open their wallet once in a blue moon.
 
Change GW to Nvidia IQ then; my underlying opinion is the same: Nvidia pay for GPU PhysX/GW, AMD hardly ever pay to implement their IQ.
 
Here's a video I found comparing the physics in CryEngine 3, PhysX and Bullet (all CPU versions). Unsurprisingly, PhysX runs the worst here. The comparison shows how CPU-optimised physics engines can run just as well as, or better than, GPU PhysX. The fancy new Nvidia Flex effects are easily replicated in CryEngine.

 