
AMD to demo GPU physics at GDC next week

Author: Ben Hardwidge
Published: 20th March 2009

AMD says that the Havok physics API can sit on top of OpenCL and Stream.
AMD might have publicly declared its support for Havok’s gaming physics technology, but the company has been curiously quiet about GPU-accelerated physics since Intel bought Havok in 2007. Since then, AMD has revealed that it’s still working with Havok, but has only really talked about running Havok on AMD’s x86 CPUs. However, AMD has now revealed that it plans to demonstrate its own GPU-accelerated physics technology at the Game Developers Conference (GDC) next week.

AMD’s Catalyst product manager, Terry Makedon, revealed on his Twitter feed that AMD would reveal its “ATI GPU Physics strategy,” and added that there may also be “a demo being thrown down next week at GDC.” Makedon also said that "Havok is indeed our partner of choice," when asked about the talk. The 60-minute talk, called “Having Your Cake and Eating it Too: Increasing Game Realism, Scale and Reach” will take place at GDC on 26 March.

The summary for the session says that AMD will discuss “the latest on game computing featuring open, standards-based physics with OpenCL and ATI Stream.” AMD’s stream computing director, Patti Harrell, explained to us a while ago that “the beauty of Havok is that ultimately we would expect it to sit on top of these industry standard APIs as they become available. So we’re working with them, and in fact there’s a team in our consumer group who works very closely with them on a daily basis.”
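To illustrate why physics maps so naturally onto standards-based data-parallel APIs such as OpenCL: the core of a rigid-body simulation is a small per-body integration step with no dependencies between bodies, so the same routine can be fanned out across thousands of work-items. Here is a minimal sketch in Python (all names are invented for illustration; this is not Havok's actual API):

```python
# Illustrative sketch: the heart of a rigid-body physics step is a small,
# per-body update that is independent across bodies -- exactly the shape
# of work that OpenCL-style data-parallel APIs accelerate.
# (Names here are hypothetical, not Havok's real interface.)

GRAVITY = -9.81  # m/s^2
DT = 1.0 / 60.0  # 60 Hz fixed timestep

def integrate_body(body):
    """Semi-implicit Euler: update velocity first, then position."""
    vx, vy = body["vel"]
    vy += GRAVITY * DT                      # apply gravity to velocity
    x, y = body["pos"]
    body["vel"] = (vx, vy)
    body["pos"] = (x + vx * DT, y + vy * DT)
    return body

def step(bodies):
    # On a GPU this loop would be a single kernel launch: one work-item
    # per body, with no synchronisation needed between them.
    return [integrate_body(b) for b in bodies]

bodies = [{"pos": (0.0, 10.0), "vel": (1.0, 0.0)} for _ in range(4)]
bodies = step(bodies)
```

Because each `integrate_body` call touches only its own body, the loop in `step` is what OpenCL or Stream would parallelise across the GPU, regardless of which vendor made the hardware.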

AMD found itself in a difficult situation after Intel bought Havok and Nvidia bought Ageia, and the company claimed that talks about GPU acceleration using the Havok FX API effectively broke down after Intel bought Havok. Since then, the focus for AMD has been CPU support for the physics API. At the time, Havok’s managing director, David O’Meara, explained the priority for CPUs, saying that “the feedback that we consistently receive from leading game developers is that core game play simulation should be performed on CPU cores.”

However, he added that GPU physics acceleration could become a feature in the future, saying that “the capabilities of massively parallel products offer technical possibilities for computing certain types of simulation. We look forward to working with AMD to explore these possibilities.”
http://www.bit-tech.net/news/hardware/2009/03/20/amd-to-demo-gpu-physics-at-gdc/1
 
Epic kudos to Intel for letting Havok be implemented with technologies other than its own.

Edit: That said, if this is true, PhysX will struggle in comparison to Havok if Havok is implemented in OpenCL, as that will make it vendor independent. For developers, that means more of the market can use the shiny physics effects than if they implemented PhysX. Hopefully this will encourage Nvidia to implement PhysX in OpenCL or DirectX 11 Compute Shaders too.
 
Last edited:
mmm, tasty!! Sounds really good to me :D I look forward to this! And from what I've read on NGOHQ, Nvidia are now helping them port PhysX to the ATI cards, and they're doing really well with it - almost finished, apparently!
 
Apparently, Nvidia have licensed the PhysX API to be used on the PS3 and the Wii, but if the Wii only has an ATI graphics card, how does that work? I'm guessing it's just software based...
 
Apparently, Nvidia have licensed the PhysX API to be used on the PS3 and the Wii, but if the Wii only has an ATI graphics card, how does that work? I'm guessing it's just software based...

PhysX is a middleware API that runs on many platforms - however, the only GPGPU-based solution it runs on is CUDA. Also note that the RSX in the PS3 is not a GeForce 8-series or above class GPU and therefore cannot accelerate PhysX; instead, physics is processed by the Cell CPU. PhysX also runs on standard x86 CPUs, although its performance there is... not so good.
 
To developers, that means more of the market can use the shiny physics effects than implementing PhysX.

IIRC from the last two hardware surveys, including the Steam one, which is pretty big, Nvidia cards capable of physics processing are far more prevalent than the ATI equivalent - something of the order of 3:1.

Not to mention nVidia throw $$$ at developers to use things like physx and provide a dedicated support team for getting these features into the games... for developers this is going to be far more attractive in the main.

While havok itself has quite a strong following, ATI isn't particularly liked by the majority of developers.
 
Last edited:
PhysX is a middleware API that runs on many platforms - however, the only GPGPU-based solution it runs on is CUDA. Also note that the RSX in the PS3 is not a GeForce 8-series or above class GPU and therefore cannot accelerate PhysX; instead, physics is processed by the Cell CPU. PhysX also runs on standard x86 CPUs, although its performance there is... not so good.

Not sure if it ever came to fruition or if it was all talk - but Sony was saying a while back that the PS3 would have a hardware scheduler for optimising PhysX throughput. While not providing the performance of a dedicated GPU, it would increase performance on the CPU substantially.
 
IIRC from the last two hardware surveys, including the Steam one, which is pretty big, Nvidia cards capable of physics processing are far more prevalent than the ATI equivalent - something of the order of 3:1.

Not to mention nVidia throw $$$ at developers to use things like physx and provide a dedicated support team for getting these features into the games... for developers this is going to be far more attractive in the main.

So? That's an extra ~30% of the market that they can tap, since if it's using OpenCL it can target both platforms with minimal extra development cost (apart from, obviously, debugging on the appropriate platforms and such). That's not a small number when you're talking about millions of units of software sales that could potentially use GPGPU-accelerated physics.
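The arithmetic behind that point is worth spelling out: with the 3:1 install-base split quoted above, a CUDA-only path reaches roughly 75% of capable cards, while a vendor-neutral OpenCL path reaches all of them - about a third more potential customers. A quick back-of-envelope check (the shares are the thread's rough estimates, not real survey data):

```python
# Back-of-envelope reach calculation for the 3:1 split mentioned above.
# Shares are the thread's rough estimates, not actual survey figures.
nvidia_share = 0.75   # ~3 parts of the 3:1 ratio
ati_share = 0.25      # ~1 part

cuda_only_reach = nvidia_share              # PhysX via CUDA: Nvidia cards only
opencl_reach = nvidia_share + ati_share     # vendor-neutral: both vendors

extra = opencl_reach / cuda_only_reach - 1  # relative gain in addressable market
print(f"{extra:.0%} more potential customers")
```

So "an extra ~30%" is really a ~33% relative increase in the addressable market for GPU-accelerated effects.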

Not sure if it ever came to fruition or if it was all talk - but Sony was saying a while back that the PS3 would have a hardware scheduler for optimising PhysX throughput. While not providing the performance of a dedicated GPU, it would increase performance on the CPU substantially.

I think the SPEs in the Cell processor are more than adequate for decent physics processing - my point was more that it's not the GPU that determines how the technology can be targeted.
 
Last edited:
Rroff, I thought you would be happy that ATi are doing hardware physics, they were holding it back weren't they? In your eyes, Nvidia are always ahead eh?
 
So Havok (which by all accounts is considered superior to PhysX) will be able to run on the OpenCL platform, therefore being platform agnostic and available to all? As for developer support, Havok is an Intel technology - you know, the biggest CPU producer on the planet, renowned for unparalleled developer support channels.

I'm sure some of the more tiresome fanboys here will try to put an ATI or NV spin on this excellent piece of news for end users.
 
So Havok (which by all accounts is considered superior to PhysX) will be able to run on the OpenCL platform, therefore being platform agnostic and available to all? As for developer support, Havok is an Intel technology - you know, the biggest CPU producer on the planet, renowned for unparalleled developer support channels.

I'm sure some of the more tiresome fanboys here will try to put an ATI or NV spin on this excellent piece of news for end users.

Very good post. I can't really say much more than what I've already said and what you have said - it is indeed an extremely good thing for the end user. I can't say I'm really supporting AMD or Nvidia in this; as far as I'm concerned, Intel is the saviour in all this. If this goes through as it seems it will, it will be the true dawn of GPU-accelerated physics, all because of a company that is primarily a CPU maker. Brilliant. :D
 
Rroff, I thought you would be happy that ATi are doing hardware physics, they were holding it back weren't they? In your eyes, Nvidia are always ahead eh?

If we can keep ATI out of the loop and have Havok running on OpenCL then I'd be a little happier - they have a bad tendency to push stuff that's all style over substance, at the expense of what's really needed for game development, and then not even support that feature down the line anyhow...

Personally though, I find PhysX better for gaming purposes - it feels more fluid than Havok, especially in multiplayer, where objects tend to bounce off you without impeding your progress, whereas with Havok you tend to stick on them :( e.g. the barrels in CSS.
 
If we can keep ATI out of the loop and have Havok running on OpenCL then I'd be a little happier - they have a bad tendency to push stuff that's all style over substance, at the expense of what's really needed for game development, and then not even support that feature down the line anyhow...

we? you work for nvidia?
That's the biggest fanboy post I've read in ages.

I can't see why everyone isn't thrilled that there's going to be a cross-platform, open physics standard, regardless of GPU preferences.
 
Havok is no more open than PhysX... the only thing that's gonna change is the software interface layer between the physics API and the hardware.

If you had any idea of what goes into video game design, you wouldn't see that as a fanboy post. I have little love for nVidia, but they are a lot easier to work with in game development than ATI.
 
Last edited:
Havok is no more open than PhysX... the only thing that's gonna change is the software interface layer between the physics API and the hardware.

If you had any idea of what goes into video game design, you wouldn't see that as a fanboy post. I have little love for nVidia, but they are a lot easier to work with in game development than ATI.

A little love is a bit of an understatement - everything is ATi-negative and Nvidia-positive with you :D

No matter how you spin things - and I've said this to you before - the consumer needed a price-competitive GPU market and no more £500 GPUs with features that aren't really needed. If ATi hadn't hit back as hard, you would be paying insane sums of money for a decent GPU. Your favourite company is far from consumer friendly now, is it? They're rehashing all their GPUs with new names...
 
Last edited:
Havok is no more open than PhysX... the only thing that's gonna change is the software interface layer between the physics API and the hardware.

If you had any idea of what goes into video game design, you wouldn't see that as a fanboy post. I have little love for nVidia, but they are a lot easier to work with in game development than ATI.

I'd rather have a physics API that sits on top of an open industry-standard interface than one which sits on a proprietary interface that serves no purpose other than to lock out other vendors and sell their own hardware.

I wonder what will happen to NV once Intel really start to throw their weight around the industry.
 
A little love is a bit of an understatement - everything is ATi-negative and Nvidia-positive with you :D

No matter how you spin things - and I've said this to you before - the consumer needed a price-competitive GPU market and no more £500 GPUs with features that aren't really needed. If ATi hadn't hit back as hard, you would be paying insane sums of money for a decent GPU. Your favourite company is far from consumer friendly now, is it? They're rehashing all their GPUs with new names...

go troll someone else...

I'd rather have a physics API that sits on top of an open industry-standard interface than one which sits on a proprietary interface that serves no purpose other than to lock out other vendors and sell their own hardware.

I wonder what will happen to NV once Intel really start to throw their weight around the industry.

Depends on the level of documentation too - I'd go for the one that was the most stable, accessible and best documented, personally.

Intel have had plenty of time to throw their weight around without much impact so far...
 