PhysX goes free

http://www.developmag.com/newsitem.php?id=24895

For people that don't know, PhysX is a physics engine that developers can integrate into their games to allow realistic physics interactions. It can also be hardware-accelerated by a PhysX card (one of these).

What effect do you think this could have on PC gaming? I think it could lead to lots of good, innovative games being made. It's clear Ageia are in it for the money, since the more games that use the PhysX engine, the more people will buy the PhysX add-on card. But even so, giving amateur developers free access to an advanced physics engine could be great.

It could give them an edge over the competition (i.e. Havok). Havok are working on using a graphics card to do the physics calculations, which not only costs the end user money (money that won't even go to Havok, since they don't make the cards), but also means the developer has to pay for a license. With Ageia's strategy it'll work out at roughly the same cost to the user, but developers will be more inclined to develop with it since it's free for them too.
 
Once games like Crysis come out there will be a massive leap in graphics, but players will want to see more than just prettier visuals. I think having physics as advanced as the PhysX engine will massively increase what you can do in a game. Just think of the infinite spells you could have in an RPG with this engine. If you have seen the video where all that rubbish spins around in a big whirlwind-type effect you will know exactly what I mean; no graphics could ever replicate that.

I personally can't wait for the new generation of physics engines to be implemented in games, as it will give us a whole new outlook on gaming.
 
The PhysX card is a waste of time and money. With the new DX10 cards coming out, with architectures that adapt to whatever calculations are needed, most if not all of the physics can be done by the graphics card.

Physics can add a new dimension to a game, but new technology has already put the PhysX card on a slippery slope towards obsolescence! :p

SiriusB
 
With games such as Alan Wake on the horizon, where multi-core CPUs will be used to a greater degree for physics calculations, not to mention better graphics cards, the PhysX card will become a bit redundant unless they can come up with a need for it.
 
SiriusB said:
The PhysX card is a waste of time and money. With the new DX10 cards coming out, with architectures that adapt to whatever calculations are needed, most if not all of the physics can be done by the graphics card.

Disagree; although they can do the calculations, they have limited MHz, and you don't want physics slowing down your fps.
 
It was demoed on a quad core, so you have three other cores plus the graphics card; that's enough for most games right now. :)
 
The way DX10 cards are designed makes sure they don't do any work that doesn't need to be done. It's also why DX9 cards won't work with it: the DX10 API has new hardware specifications to allow for that streamlining.

DX9 games will run better on DX10 cards because of this. With the card doing less work there is room for physics. Besides, today's processors are utter beasts and there is no reason why a game can't offload some of the physics to the CPU with no performance hit.

I am not saying we have the perfect solution right now, but it isn't very far away. I would be loath to spend any money on an add-in card that has no games which support it and will no doubt soon be obsolete.

SiriusB
 
The free licensing is a good thing; it gives more developers a chance to try out and experiment with the PhysX SDK.
Just a shame the actual card is still £180+ :(
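
For anyone curious what "experimenting with the SDK" actually involves, here's a rough sketch of a minimal setup (one box dropped onto a ground plane, stepped at 60 Hz). I'm writing this from memory of the 2.x/NovodeX-era API, so treat the exact names as illustrative rather than gospel:

```cpp
// Minimal Ageia PhysX 2.x-style setup: one dynamic box falling under gravity.
// Names are from memory of the 2.x SDK and may not match your headers exactly.
#include "NxPhysics.h"

int main()
{
    // Create the SDK object.
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) return 1;

    // Describe and create a scene with normal gravity.
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    NxScene* scene = sdk->createScene(sceneDesc);

    // Static ground plane.
    NxPlaneShapeDesc planeDesc;
    NxActorDesc planeActor;
    planeActor.shapes.pushBack(&planeDesc);
    scene->createActor(planeActor);

    // Dynamic 1x1x1 box starting 10 units up.
    NxBodyDesc bodyDesc;
    NxBoxShapeDesc boxDesc;
    boxDesc.dimensions = NxVec3(0.5f, 0.5f, 0.5f);   // half-extents
    NxActorDesc boxActor;
    boxActor.shapes.pushBack(&boxDesc);
    boxActor.body    = &bodyDesc;
    boxActor.density = 10.0f;
    boxActor.globalPose.t = NxVec3(0.0f, 10.0f, 0.0f);
    scene->createActor(boxActor);

    // Step the simulation at 60 Hz for a couple of seconds.
    for (int i = 0; i < 120; ++i)
    {
        scene->simulate(1.0f / 60.0f);
        scene->flushStream();
        scene->fetchResults(NX_RIGID_BODY_FINISHED, true);
    }

    sdk->releaseScene(*scene);
    NxReleasePhysicsSDK(sdk);
    return 0;
}
```

That's basically it: describe actors, step the scene, read back the results; the rest is hooking the resulting transforms up to your renderer.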
 
In a strange way this could almost be viewed as a bad thing (for the majority of people who don't have the hardware), since it means that potentially we could see more development time spent on implementing PhysX rather than on other things.
 
smoove said:
Can you give me a demonstration of what it does exactly?

It gives real-life physics, which means you could have completely destructible buildings/landscapes and realistic shrapnel. At the moment it's limited, but the possibilities are endless.
 
HangTime said:
In a strange way this could almost be viewed as a bad thing (for the majority of people who don't have the hardware), since it means that potentially we could see more development time spent on implementing PhysX rather than on other things.
As far as I know a game that uses PhysX doesn't require the card. It's just that if the card is there, it can use that and you'll get better frame rates / can turn more effects on.

SiriusB said:
The PhysX card is a waste of time and money. With the new DX10 cards coming out, with architectures that adapt to whatever calculations are needed, most if not all of the physics can be done by the graphics card.
I think Havok's solution requires a second graphics card that's used for physics calculations, although there may be an option to use a single one at the expense of a bit of graphics grunt.

SiriusB said:
I am not saying we have the perfect solution right now, but it isn't very far away. I would be loath to spend any money on an add-in card that has no games which support it and will no doubt soon be obsolete.
Indeed. Given that, you can see why Ageia have made this move. Right now no one wants a PhysX card because there are so few games that use them. And developers aren't bothered about using PhysX because so few people have the cards. Now that the license is free, developers will be tempted to use PhysX over the competition, giving people more reason to buy the cards.

I can't say whether their plan will be successful though. Personally I think Havok's graphics card based solution makes much more sense. With that, if you upgrade your graphics card the old one can be put on physics duty.
 
Psyk said:
As far as I know a game that uses PhysX doesn't require the card. It's just that if the card is there, it can use that and you'll get better frame rates / can turn more effects on.

I'm aware of that; what I'm getting at is that the time spent implementing PhysX support - something only a small minority of users will be able to use - could perhaps have been spent elsewhere.

Supporting proprietary APIs is OK when it is something with a sizeable installed user base (e.g. 3dfx Glide, Creative EAX), but PhysX simply doesn't warrant a lot of development time being thrown at it.

The cynic in me might suggest that some developers might be coerced into implementing PhysX due to the marketing dollars thrown their way by Ageia.
 
HangTime said:
I'm aware of that; what I'm getting at is that the time spent implementing PhysX support - something only a small minority of users will be able to use - could perhaps have been spent elsewhere.
Ah right, gotcha. But still, as far as I know there is very little work that needs to be done to add support for a PhysX card. The API is designed to run either in software or in hardware. You still have a good point though. They might be tempted to spend a long time on loads of physics effects that are too intensive to run in software mode, hence useless to the majority of people.
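
To show what I mean about the same API running in software or hardware: in the 2.x-era SDK the choice is basically a single field on the scene descriptor, and if a hardware scene can't be created (no PPU in the machine) you just fall back to software. Again, this is a rough sketch from memory and the exact names may be off:

```cpp
// Sketch of the hardware/software split, assuming a 2.x-style API.
// Ask for a hardware (PPU) scene and fall back to software if none is available.
#include "NxPhysics.h"

NxScene* createSceneWithFallback(NxPhysicsSDK* sdk)
{
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);

    sceneDesc.simType = NX_SIMULATION_HW;          // try the PhysX card first
    NxScene* scene = sdk->createScene(sceneDesc);
    if (!scene)
    {
        // No PPU present (or hardware scene creation failed):
        // run the same content in software on the CPU instead.
        sceneDesc.simType = NX_SIMULATION_SW;
        scene = sdk->createScene(sceneDesc);
    }
    // Everything else (creating actors, simulate/fetchResults) is identical
    // either way, which is why card support is supposed to be so little work.
    return scene;
}
```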
 
I can't see them really taking off if multi-core CPUs become the norm. Alan Wake and Crysis are both offloading physics etc. onto a different core if you have them.
 
I agree with Gord - I can see the extra cores becoming the way forward for physics.

I expect Pottsey to be along at any second to kick all your asses though! :D ;)
 
I thought Pottsey was always banging on about how the Ageia PhysX license was free anyway and bashing Havok because they charged for their API license. Was he talking out of his bum? :confused:

This news doesn't make a difference though; paying nearly £200 for a PPU is never going to be popular. Ageia was a sinking ship from the beginning and is only going to hit the bottom faster with the dawn of true multi-core technology for even the mid-range consumer and GPU physics for the enthusiast.
 
I think physics cards are a waste of money as well. We've already got quad-core chips out, and ones with more cores on the way. It would be a lot better if you could just assign one or more cores to physics processing and let it get on with it.
 
Doesn't seem too surprising, as I doubt there were many PhysX developers given how few people bought the cards. The developers are probably waiting for the ATI/Nvidia physics solutions to come out as well.
 
Sleepery said:
I think physics cards are a waste of money as well. We've already got quad-core chips out, and ones with more cores on the way. It would be a lot better if you could just assign one or more cores to physics processing and let it get on with it.
True, but CPUs aren't as good at the physics calculations, so there's more potential with a dedicated card. Although people are much more likely to part with their money for a multi-core CPU, as ultimately it can do a lot more stuff and is useful for more than just games. So I wouldn't be surprised at all if that's the way it went.
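
For what it's worth, "give physics its own core" in practice just means running the simulation on a worker thread while the game thread renders, swapping results once per frame. This is a purely illustrative sketch using standard-library threads, not how PhysX, Havok or any particular game actually does it:

```cpp
// Generic sketch: run physics on its own thread/core while the main thread
// renders, exchanging the latest results through a mutex-guarded snapshot.
// Purely illustrative; not tied to PhysX, Havok, or any particular engine.
#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>
#include <vector>

struct BodyState { float x, y, z; };         // position of one rigid body

std::mutex             gSnapshotMutex;
std::vector<BodyState> gSnapshot(100);        // latest results for the renderer
std::atomic<bool>      gRunning{true};

void physicsThread()
{
    std::vector<BodyState> bodies(100);       // private working copy
    const float dt = 1.0f / 60.0f;            // fixed 60 Hz physics step
    while (gRunning)
    {
        for (BodyState& b : bodies)           // crude stand-in for a real solver step
            b.y -= 9.81f * dt * dt;

        {   // publish the new state for the game/render thread
            std::lock_guard<std::mutex> lock(gSnapshotMutex);
            gSnapshot = bodies;
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}

int main()
{
    std::thread physics(physicsThread);       // the OS schedules this on a spare core
    for (int frame = 0; frame < 600; ++frame)
    {
        std::vector<BodyState> view;
        {   // grab the most recent physics snapshot and render from it
            std::lock_guard<std::mutex> lock(gSnapshotMutex);
            view = gSnapshot;
        }
        // ... draw `view` here ...
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    gRunning = false;
    physics.join();
    return 0;
}
```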
 