
AMD: Devs only use PhysX for money

As AMD announced the next step of its Open Physics Initiative today, the company also hit out at Nvidia’s PhysX technology, saying that most game developers only use it for the money.

Speaking to THINQ, AMD’s senior manager of developer relations, Richard Huddy, said: “What I’ve seen with physics, or PhysX rather, is that Nvidia create a marketing deal with a title, and then as part of that marketing deal, they have the right to go in and implement PhysX in the game.”

There’s nothing wrong with this, you might think. However, Huddy spends a lot of time talking with game developers in his role, and he reckons that most devs would much rather Nvidia kept its hands off. “The problem with that is obviously that the game developer doesn’t actually want it,” he says.

“They’re not doing it because they want it; they’re doing it because they’re paid to do it. So we have a rather artificial situation at the moment where you see PhysX in games, but it isn’t because the game developer wants it in there.”

In fact, Huddy reckons that no developers outside Epic genuinely wanted to implement GPU-accelerated PhysX in their game. “I’m not aware of any GPU-accelerated PhysX code which is there because the games developer wanted it with the exception of the Unreal stuff,” he says. “I don’t know of any games company that’s actually said ‘you know what, I really want GPU-accelerated PhysX, I’d like to tie myself to Nvidia and that sounds like a great plan.’”

Unsurprisingly, Huddy is very confident in AMD’s open approach to GPU-accelerated physics as an alternative, and thinks that it will eventually force PhysX to join Glide and A3D in the proprietary API museum.

“I think the proprietary stuff will eventually go away,” he says. “If you go back ten years or so to when Glide was there as a proprietary 3D graphics API, it could have coexisted, but instead of putting their effort into getting D3D to go well, 3dfx focused on Glide. As a result, they found themselves competing with a proprietary standard against an open standard, and they lost. It’s the way it is with many of the standards we work with.”

This is a point that AMD plans to hammer home at the Game Developers Conference (GDC) tomorrow, where Huddy says he will unveil a “rather nice chart” with a “list of proprietary standards that have tumbled, because what the world wants is open standards.”

Of course, whether game developers want to use PhysX or not, AMD’s bigger problem is that GPU-accelerated PhysX is already being used in a fair few games. Meanwhile, AMD has so far only demonstrated a GPU-accelerated version of Havok at the GDC last year, and there aren’t any games available that take advantage of GPU-accelerated physics on AMD GPUs yet either. It also doesn’t help that Nvidia has openly offered to share its PhysX technology with AMD, but AMD hasn’t taken up the offer.

Either way, we certainly need a standard that everybody can agree on in order for GPU-accelerated physics to take off, and AMD’s Open Physics Initiative is currently the only system that will theoretically work on anyone’s hardware. The question is whether game devs want to use it, and AMD is hoping that it can tempt more developers its way with the offer of free physics tools.

As Huddy says, “When you have an open standard, everyone can join in and everyone can make free and well-informed choices, and it’s not about skewing the market with money.” We’ll get there, one day.
http://www.thinq.co.uk/news/2010/3/8/amd-game-devs-only-use-physx-for-the-cash/
 
So... he's promoting open source collaboration? Cool, now he just needs to explain how developers are just using DirectX because they get money for it and call it a day :)
 
Why would ATI bother with GPU accelerated physics if developers didn't actually want it in their games?

D3D isn't an open standard and it has for the most part won out against OpenGL so there goes another of his arguments.

Granted, developers aren't keen on being tied to PhysX; they'd rather have a choice of implementations, but that's a different story.

And A3D isn't a great example: it was (and probably still is) the best at what it did, and it didn't die because it was a proprietary standard.
 
Makes perfect sense; all the NV PhysX titles so far have had poorly implemented placebo effects. It's just another way for NV to push crap down the consumer's throat with little thought or effort put into it.
 
To be fair, developers aren't asking for GPU-accelerated physics, but more complex physics effects are becoming more and more required in games. As their usage increases, along with the demand for other things that need the CPU (AI, etc.), performance becomes an issue, and that's something the GPU can address very effectively.
 
It's fairly simple: even if PhysX were fantastic and used in all the big "destructible" games, it would have a hard time against two companies that dominate the market between them. Basically, even if AMD wanted it, it would be going up against Intel, who outsell both AMD and Nvidia quite easily, and who own Havok, which is already massively more widely implemented than PhysX.

Even if Intel don't end up in the discrete market, they still have the most used physics engine in gaming and sell the most GPUs, by far. AMD's market share is growing and they hate PhysX, and over the next year Nvidia will quite obviously be losing market share.

Supermicro have started using FireGL cards in a big way in the server market, mostly because Fermi isn't available. AMD are gaining market share in every segment; Nvidia are going to lose the low end, and have been losing high-end market share for over a year. That shift has only accelerated in the past few months and will continue in AMD's direction over the next 12 months, until Nvidia get a new architecture or TSMC delivers a truly fantastic process, which unfortunately is highly unlikely to happen.

Currently about 75% of GPUs sold come from the two companies that hate PhysX, and even when Nvidia had a stronger (though not great) market share, PhysX adoption was awful.

Aren't the two latest "destructible" games Just Cause 2 and Bad Company? Well, neither uses PhysX... again. In fact, none of the titles considered big "physics-using" games, with destructible environments, people being thrown around and good explosions, like Crysis, use PhysX.
 
AMD Open Physics Initiative Expands Ecosystem with Free DMM for Game Production and Updated Version of Bullet Physics

SUNNYVALE, Calif. — March 8, 2010 — AMD (NYSE: AMD) today announced that, along with partners Pixelux Entertainment and Bullet Physics, it has added significant support to the Open Physics ecosystem by providing game developers with access to the newest version of Pixelux Digital Molecular Matter (DMM), a breakthrough in physics simulation. In addition to enabling a superior development experience and helping to reduce time to market, Pixelux has tightly integrated its DMM technology with Bullet Physics, allowing developers to integrate physics simulation into game titles that run on both OpenCL- and DirectCompute-capable platforms. And both DMM and Bullet work with Trinigy’s Vision Engine to create and visualize physics offerings in-game.

“Establishing an open and affordable physics development environment is an important accomplishment for both game developers and gamers, signaling a move away from exclusionary or proprietary approaches,” said Eric Demers, chief technology officer, AMD Graphics Division. “Not only does the integration of Bullet Physics into partner middleware help drive broader adoption of physics in games, it ensures that when those games are released, all gamers, regardless of the hardware in their PC, can benefit from the more realistic experience enabled by those effects.”

All of the Bullet Physics implementations described above can be run on any OpenCL- or DirectCompute-capable platform. On AMD platforms, ATI Stream technology is used to drive the enhanced game experience. As a further enhancement, AMD has developed new parallel GPU-accelerated implementations of Bullet Physics’ Smoothed Particle Hydrodynamics (SPH).
http://www.amdzone.com/index.php/co...duction-and-updated-version-of-bullet-physics
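For anyone wondering what the SPH mentioned in that press release actually is: smoothed particle hydrodynamics models a fluid as a cloud of particles, and each particle's density is a kernel-weighted sum over its neighbours. Here's a minimal single-threaded sketch of just the density step, using the standard poly6 smoothing kernel — purely illustrative (the function names are made up), not AMD's actual OpenCL/DirectCompute code, where this same per-particle sum would simply run in parallel across GPU threads:

```python
import math

def poly6(r, h):
    """Poly6 smoothing kernel: non-zero only within smoothing radius h."""
    if r >= h:
        return 0.0
    return (315.0 / (64.0 * math.pi * h ** 9)) * (h * h - r * r) ** 3

def densities(positions, mass, h):
    """Density at each particle: rho_i = sum_j mass * W(|x_i - x_j|, h)."""
    out = []
    for xi, yi, zi in positions:
        rho = 0.0
        for xj, yj, zj in positions:
            r = math.sqrt((xi - xj) ** 2 + (yi - yj) ** 2 + (zi - zj) ** 2)
            rho += mass * poly6(r, h)
        out.append(rho)
    return out

# Three particles; the one at the origin sits closest to both neighbours,
# so it ends up with the highest density.
pts = [(0.0, 0.0, 0.0), (0.05, 0.0, 0.0), (0.0, 0.05, 0.0)]
rho = densities(pts, mass=0.02, h=0.1)
```

The GPU win is obvious from the shape of the loop: the sum for each particle is independent of every other particle's sum, which is exactly the kind of embarrassingly parallel work OpenCL and DirectCompute are built for.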
 
“Not only does the integration of Bullet Physics into partner middleware help drive broader adoption of physics in games, it ensures that when those games are released, all gamers, regardless of the hardware in their PC, can benefit from the more realistic experience enabled by those effects.”


Quoted for truth.
 
Problem is, Bullet is still somewhat rudimentary in the areas that count most for video games, and is more focused on film-style usage than gaming. Bullet has a way to go before it can match what PhysX offers in a strictly video game context. What looks great in a pre-rendered sequence doesn't translate into physics that enhance the gaming experience; PhysX has had to compromise somewhat on realism in some areas so that the physics don't intrude on gameplay.
 
Developers certainly aren't thinking of customers when they choose to implement PhysX or any propriety tech.

Come on Nvidia, make it run on OpenCL/DirectCompute and let the hardware do the talking.
 
Developers certainly aren't thinking of customers when they choose to implement PhysX or any propriety tech.

Come on Nvidia, make it run on OpenCL/DirectCompute and let the hardware do the talking.

This is why we haven't seen more complex or widespread use of physx, most developers don't want to cut out a large slice of their potential market.

DirectCompute is hardly any more open than CUDA etc., but it would at least solve the ATI/Nvidia side of the story, and it's not like you'd be using anything else for gaming physics.
 
I'd like to see some developer use PhysX as a vital component of a game; then you'd see if it's going to take off.

It's this sort of thing that stunts game development. It causes everyone to procrastinate and wait for each other to choose what to go with. They all wait and no one moves forward.

It's what happened with the HD DVD/Blu-ray war, but to a lesser extent. They left it up to the customers to decide, but the customers didn't, because they didn't know which to go with. Then one of the manufacturers finally decided to end the farce, but somewhat too late, as the take-up of high-def even now isn't too great.
 
I'd like to see some developer use PhysX as a vital component of a game; then you'd see if it's going to take off.

It's this sort of thing that stunts game development. It causes everyone to procrastinate and wait for each other to choose what to go with. They all wait and no one moves forward.

It's what happened with the HD DVD/Blu-ray war, but to a lesser extent. They left it up to the customers to decide, but the customers didn't, because they didn't know which to go with. Then one of the manufacturers finally decided to end the farce, but somewhat too late, as the take-up of high-def even now isn't too great.

They tried that with Cryostasis — but the game design was pretty rubbish anyhow.
 
That's all Nvidia need to do: get someone to put PhysX into a really good game, like Grand Theft Auto 5, so that you can't play it without it, or it's really gimped. Then Nvidia will have won. ATI will be forced to buy the tech off Nvidia or face disgruntled customers who may migrate.

Then other developers will know who to go with, and physics in games will take off like a rocket.
 
It will never take off when decent performance requires going out and buying another GPU; it's crazy.

Depends. People said the same thing about hardware 3D rendering when the first generation of 3D hardware came along and games were still using software 3D engines, like Quake 1.

People will go out and buy another GPU if a killer title comes out that shows people why they should do that - and hardware accelerated physics is capable of just such a distinction if someone has the balls to run with it.
 
Upgrading a single GPU is one thing; going out and buying another GPU just for PhysX is crazy, even more so when you look at the games that have incorporated it so far. With nothing exciting on the horizon, it looks doomed.
 