Why isn't PhysX used more?

I don't see why NVIDIA gets called idiots over this tbh. It's the developers who choose to use the SDK in their games, and when you look at the backlog of games we have, there aren't many that do. :)

Heck, not even BF3 uses it, and that game was meant to be NVIDIA optimized.

nVidia provide "incentives" for developers who include or use PhysX in their games.
 
What is interesting to me in these discussions is how nVidia is often made to be evil *******s, when in reality if AMD bought PhysX instead they would probably act the same way.
Not to mention it has been, what, 5 years since nVidia bought PhysX?
In that time AMD could have introduced their own physics solution, but they have absolutely nothing on that front, and what a shame that is.

I am playing Alice: Madness Returns at the moment, and it has probably the best use of PhysX next to Borderlands. Fantastic particle effects and physically simulated black goo everywhere. It looks much better than the normal version.
 
What is interesting to me in these discussions is how nVidia is often made to be evil *******s, when in reality if AMD bought PhysX instead they would probably act the same way.

I am playing Alice: Madness Returns at the moment, and it has probably the best use of PhysX next to Borderlands. Fantastic particle effects and physically simulated black goo everywhere. It looks much better than the normal version.

This is very true. AMD would be no different if it was their proprietary tech.

And Alice is a fantastic game. Enjoy it. :)
 
What is interesting to me in these discussions is how nVidia is often made to be evil *******s, when in reality if AMD bought PhysX instead they would probably act the same way.
Not to mention it has been, what, 5 years since nVidia bought PhysX?
In that time AMD could have introduced their own physics solution, but they have absolutely nothing on that front, and what a shame that is.

I am playing Alice: Madness Returns at the moment, and it has probably the best use of PhysX next to Borderlands. Fantastic particle effects and physically simulated black goo everywhere. It looks much better than the normal version.

AMD didn't buy PhysX though so it's a moot point. nVidia has a track record of doing things like this with most things it has, AMD doesn't.

nVidia has a habit of trying to support things that only benefit themselves and lock you in to having to use their cards. PhysX, CUDA, 3D Vision and so on.

They used to do it with SLi too: they wouldn't allow SLi on a motherboard that supported CrossFire, and demanded that manufacturers use one of their rubbish chipsets, claiming SLi wasn't possible without the chip, just to make sure a dual-card AMD setup wouldn't work on it. nVidia personified is like a jealous, bratty kid who throws a tantrum when they don't get their own way and sabotages anyone they don't like out of some twisted jealousy. They're pathetic.
AMD has a track record of using and supporting open standards that will work on any brand of graphics card, and letting the industry around it develop it.

Things like this should never be owned and developed by the company that's selling you the product. I don't think people really understand what they're suggesting when they say things like this. What benefit would it be to anyone for AMD to develop their own proprietary physics API that runs on the GPU?

None, because then you'd have AMD pushing theirs that doesn't work on nVidia, and nVidia pushing PhysX that doesn't work on AMD. What developer would use either, or take either seriously, when it cuts out half the market?

So yeah, AMD would be very different, because they're not obsessed with proprietary software like nVidia are.

As for Alice, that's another thing people don't get. The physics effects in it certainly don't require a GPU to process them; while they look nice, they are fairly basic, which leads back to what I was saying before. GPU PhysX is usually included at the expense of those who don't run it. The effects get cut down so much when you turn PhysX off that the game looks rubbish, which again adds credibility and validation to the idea that having your GPU do the physics makes such a big difference. They actually suggest, or at least imply, that those effects are only possible when running on the GPU.
 
AMD didn't buy PhysX though so it's a moot point. nVidia has a track record of doing things like this with most things it has, AMD doesn't.

nVidia has a habit of trying to support things that only benefit themselves and lock you in to having to use their cards. PhysX, CUDA, 3D Vision and so on.

I'm no fan at all of the way nVidia lock things down... not at all but...

AMD has a track record of using and supporting open standards that will work on any brand of graphics card, and letting the industry around it develop it.

Things like this should never be owned and developed by the company that's selling you the product. I don't think people really understand what they're suggesting when they say things like this. What benefit would it be to anyone for AMD to develop their own proprietary physics API that runs on the GPU?

Seriously? I'm usually very careful about bandying about accusations of fanboyism, but you can't hide behind "and let the industry around it develop it". AMD has a track record of making a big noise about technologies, especially open standards, and then doing very little if anything to support them, including not giving the "industry around it" the support it needs to properly support AMD, and then dropping the ball entirely a few months later and quietly pushing everything under a rug. To suggest otherwise is at best blind fanboyism. See my post here for a little more detail: http://forums.overclockers.co.uk/showpost.php?p=23144207&postcount=59 The most relevant part of this thread being:

Rroff said:
Hardware physics? ATI was one of the first to make a big splash about it in 2005/2006 (https://www.youtube.com/watch?v=gLgb9AdnaBI), followed by even bigger publicity from AMD in 2009 (https://www.youtube.com/watch?v=MCaGb40Bz58 / https://www.youtube.com/watch?v=xfrM973spw0), and as recently as 2011 they made a big noise about OpenCL and Bullet:
bit-tech.net said:
Interestingly, Hegde also didn't rule out the possibility of GPU-accelerated Havok rearing its head again either, saying that ' it is possible that we'll see it in the future, but right now our gaming strategy at AMD on GPGPU is based on the Bullet Physics engine.'

In short, it looks as though AMD is now putting some serious money behind gaming physics, and with a developer-friendly business model, not to mention wide-ranging hardware support, Bullet Physics has the potential to take over from CUDA-accelerated PhysX. Whether this will translate into fully fledged game-changing physics remains to be seen, but if future consoles use OpenCL-compatible GPU hardware (and they probably will), and GPU-accelerated physics on PCs indeed opens up to multiple hardware platforms, then it looks as though gaming physics might actually start to take off in future.

Where are the fruits of all this? What applications are using it today?

If you look at AMD's attempts at physics (as above), compute APIs (Stream), 3D (various flavours) and almost any other relevant technology they have pushed, you see the same story over and over: lots of hype and a big media push, and then a few months later it's quietly retired. The Never Settle programme is possibly the first time in AMD's history they have seriously put their money where their mouth is; you really can't suggest otherwise.

I'm no fan of the way nVidia lock stuff down, but they do at least put the effort in to turn things into working technologies.

While I don't believe it's in anyone's best interests for a GPU manufacturer to produce their own version of hardware-accelerated physics, there aren't really any other options: there is no other body of people, except possibly Microsoft, with the experience of the GPU hardware and the expertise and resources to implement something like that. If you look at stuff like Bullet and OpenCL, it's taken them years just to get the basics up and running, and they are a long, long way from developing something like PhysX - and most of the support came from Sony, despite AMD's apparent interest in pushing Bullet.
 
I swapped in my previous Nvidia card to replay Batman just for all the supposedly amazing PhysX effects, and I was really disappointed. There were a few bits of eye candy here and there, but it fell well short of what I'd been told about how much better it made the game. I really didn't notice much, and I was specifically looking out for it.

Really? I thought all the extra smoke and sparks in AA made the game look amazing. I need to do this again, as I removed the card for some reason; I think the special driver caused a few frame drops because my main card is AMD. Is this the case, or did I maybe do something wrong?
 
Some of the stuff in Batman AA is nice, but it doesn't really push physics capabilities much - some nice vortex effects in the Scarecrow areas (though most people wouldn't notice if they were non-physics debris).

The standout one for me is the volumetric smoke/steam, as that's one thing that looks far, far better in motion than the primitive versions and adds a lot to the immersion. The sparks and water particles were quite disappointing in Batman, as they only collided with the world hull and went through the player and other characters as if they weren't even there.

They did some world environment destruction too, but only in the first couple of areas of the game, and after that they didn't bother :( and even that was fairly limited, with mostly just tiles you could break and the odd corner that would realistically deform with damage. That's always one of the big problems with widespread use of physics in the game geometry though: it's a lot of effort to implement, it requires a lot of foresight so it doesn't have unintended game-breaking consequences, and it really needs GPU compute to be used widely without the artificial constraints CPU physics would force on you. For example, the broadphase filtering (narrowing down which objects actually need processing without having to test every single instance of an object) for a fully fleshed-out level like Batman AA with full destruction physics would take around 70-80ms on the CPU, so you'd never see above ~14fps at best, whereas the same work running on the GPU would only take ~6ms, leaving you at least 160+fps worth of frame budget before processing anything else (then the actual physics processing takes another chunk).
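To give a rough idea of what I mean by broadphase filtering, here's a toy uniform-grid version in C++ (nothing to do with the actual PhysX code, just me illustrating the idea): instead of testing every object against every other one, you bin their bounding boxes into grid cells and only treat objects sharing a cell as candidates for the expensive narrowphase tests.

```cpp
// Toy uniform-grid broadphase over 2D axis-aligned bounding boxes (AABBs).
// Illustration only - a real engine would work in 3D, deduplicate pairs,
// handle dynamic updates, and run this sort of thing in parallel.
#include <cstdint>
#include <cstdio>
#include <unordered_map>
#include <utility>
#include <vector>

struct AABB { float minX, minY, maxX, maxY; };

// Pack a 2D cell coordinate into one hashable key.
static int64_t cellKey(int cx, int cy) {
    return (static_cast<int64_t>(cx) << 32) ^ static_cast<uint32_t>(cy);
}

// Return candidate pairs (indices into 'boxes') whose AABBs share a grid cell.
std::vector<std::pair<int, int>> broadphasePairs(const std::vector<AABB>& boxes,
                                                 float cellSize) {
    std::unordered_map<int64_t, std::vector<int>> grid;
    for (int i = 0; i < static_cast<int>(boxes.size()); ++i) {
        const AABB& b = boxes[i];
        int x0 = static_cast<int>(b.minX / cellSize), x1 = static_cast<int>(b.maxX / cellSize);
        int y0 = static_cast<int>(b.minY / cellSize), y1 = static_cast<int>(b.maxY / cellSize);
        for (int cx = x0; cx <= x1; ++cx)
            for (int cy = y0; cy <= y1; ++cy)
                grid[cellKey(cx, cy)].push_back(i);
    }
    std::vector<std::pair<int, int>> pairs;
    for (const auto& cell : grid) {
        const std::vector<int>& ids = cell.second;
        for (size_t a = 0; a < ids.size(); ++a)
            for (size_t b = a + 1; b < ids.size(); ++b)
                pairs.emplace_back(ids[a], ids[b]); // narrowphase would test these
    }
    return pairs; // may contain duplicates if boxes span several cells
}

int main() {
    std::vector<AABB> boxes = {{0, 0, 1, 1}, {0.5f, 0.5f, 1.5f, 1.5f}, {10, 10, 11, 11}};
    auto pairs = broadphasePairs(boxes, 2.0f);
    std::printf("candidate pairs: %zu\n", pairs.size()); // only the two nearby boxes
    return 0;
}
```

The point is that this pair-finding (and the narrowphase after it) is embarrassingly parallel across thousands of objects, which is largely why the same workload that eats 70-80ms on a CPU drops to a few milliseconds on a GPU.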
 
The standout one for me is the volumetric smoke/steam, as that's one thing that looks far, far better in motion than the primitive versions and adds a lot to the immersion.

It can also be done through DX so everyone can benefit from it. Perhaps that implementation wasn't as good as the one in Batman AA, but it's still a step forward towards a "free standard": http://www.youtube.com/watch?v=nfW_b3c00kM

Vegetation and cloth physics are nice in CE 3, and as time passes I'm sure other solutions will appear that are at least on the same level as PhysX. BF and RF show just that: a lot of stuff can be done on the CPU - granted, nothing very complex, but judging by the gaming industry nowadays, even small steps can be perceived as a big leap.
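For what it's worth, the kind of "not very complex" CPU cloth I mean looks roughly like this - a toy Verlet mass-spring sketch, not CryEngine's or anyone else's actual code, just to show the basics don't need GPU acceleration:

```cpp
// Toy cloth: Verlet-integrated particles connected by distance constraints.
// Illustration of simple CPU cloth only, not any real engine's implementation.
#include <cmath>
#include <cstdio>
#include <vector>

struct Particle { float x, y, px, py; bool pinned; };   // px/py = previous position
struct Constraint { int a, b; float rest; };             // rest = desired distance

void step(std::vector<Particle>& ps, const std::vector<Constraint>& cs, float dt) {
    const float gravity = -9.81f;
    // Verlet integration: new position from current and previous position.
    for (auto& p : ps) {
        if (p.pinned) continue;
        float nx = 2 * p.x - p.px;
        float ny = 2 * p.y - p.py + gravity * dt * dt;
        p.px = p.x; p.py = p.y;
        p.x = nx;   p.y = ny;
    }
    // Relax distance constraints a few times to keep the cloth together.
    for (int iter = 0; iter < 4; ++iter) {
        for (const auto& c : cs) {
            Particle& a = ps[c.a];
            Particle& b = ps[c.b];
            float dx = b.x - a.x, dy = b.y - a.y;
            float dist = std::sqrt(dx * dx + dy * dy);
            if (dist < 1e-6f) continue;
            float diff = (dist - c.rest) / dist * 0.5f;
            if (!a.pinned) { a.x += dx * diff; a.y += dy * diff; }
            if (!b.pinned) { b.x -= dx * diff; b.y -= dy * diff; }
        }
    }
}

int main() {
    // Two particles: one pinned, one hanging one unit below it.
    std::vector<Particle> ps = {{0, 0, 0, 0, true}, {0, -1, 0, -1, false}};
    std::vector<Constraint> cs = {{0, 1, 1.0f}};
    for (int i = 0; i < 60; ++i) step(ps, cs, 1.0f / 60.0f);
    std::printf("hanging particle: (%.2f, %.2f)\n", ps[1].x, ps[1].y); // stays ~1 unit below the pin
    return 0;
}
```

Scale the particle grid up and add self-collision and it gets expensive fast, which is where GPU solutions start to pay off.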

Bullet may catch up and make its way into next-gen games, even on consoles, as a GPU physics API.
 