
GPU physics Havok FX dropped from first game

In RB6 Vegas, if you shoot something and it moves, the other players can't see it. I tried to play football with a box but my mate couldn't see the box moving. Surely this would be the same if some players had PhysX and others didn't? But if I can do it in RB6 without a PhysX card... why do I need one to be able to shoot boxes in other games? lol.
 
From a hardware perspective, I have no doubt that dedicated physics hardware (like PhysX cards) is dead in the water.

The direction of CPU development is now largely parallel, and particularly in games this is something which is very difficult to take full advantage of. Physics calculations, however, can be easily parallelised, and so can take advantage of 'idle' CPU cores.

Similarly, GPUs are (and always have been) massively parallel, so can be (relatively) easily modified to run physics-type calculations. So, having another bit of hardware specialised to do physics (which is basically just another floating point co-processor - just like a graphics card) makes little sense when you already have a bunch of untapped parallel potential sitting idle in your CPU and GPU.


I am writing a "physics" code to take advantage of the GPU through CUDA (nvidia's API for using the GF8 for general floating point calculations). It's an engineering code (computational fluid dynamics) rather than a games-related engine, but the principles are the same. Anyway, the GPU has orders of magnitude more power for such calculations than a CPU, but in the end I think CPUs will win out for game physics, because:


a) There will be no conflicting standards (GPU acceleration would almost certainly require games developers to write two different physics implementations for ATI and nvidia cards)

b) Future multi-core processors (8 and 16 cores) will have more than enough spare capacity for high-end physics

c) Programming for the CPU allows more flexibility than programming for specialised FPU accelerators (like a GPU or physics card). The CPU is far more "forgiving" of poor coding practices.

d) Programming for CPUs will not require extensive re-education for programmers. They can use the compilers and other tools that they are already familiar with.


Anyway, that's my 2p. I think the GPU has massive potential for general FPU calculations, but I think that the (relative) simplicity of programming for multi-core CPUs will win out. Physics-specific hardware is an unhappy compromise, since it has all the drawbacks of GPU implementation (specialised parallel architecture that is difficult to program, along with not having the raw power of a GPU), and also requires the user to buy an extra bit of hardware.
 
“In RB6 Vegas, if you shoot something and it moves, the other players can't see it. I tried to play football with a box but my mate couldn't see the box moving. Surely this would be the same if some players had PhysX and others didn't? But if I can do it in RB6 without a PhysX card... why do I need one to be able to shoot boxes in other games? lol.”

Good point - what you are saying is that PPUs are actually a hindrance to online multiplayer rather than an aid (unless everyone has one).
 
“Not looking good for Ageia is it? Havok is used a lot more than PhysX and now Intel own them.“
I don’t believe Havok is used a lot more than Ageia, judging by the games that are out. Have you got any evidence to support that? You don’t get many Havok games any more.
 
Has anyone heard about Nvidia including a PPU in the 9 series? It may just have been a rumour, but I've heard something about it.

Matt.

The Geforce8 series already has this capability, through CUDA (nvidia's development environment for using the GPU for generalised floating point calculations rather than just graphics). There is no such thing as "a dedicated PPU" on a graphics card - at least not for the current generation of cards with programmable pixel pipelines. The whole idea of the new architecture is that you don't need such specialisation - you simply use some of the pixel pipelines for physics. The amount of GPU potential given over to physics can be altered dynamically, but fundamentally every physics calculation you send means a pixel pipe not being used for graphics calculations.

I'm not sure whether ATI has this capability yet. Perhaps they are working internally on an environment equivalent to nvidia's CUDA, but it certainly hasn't been released to the public domain yet. Certainly the architecture is capable of handling it - what is missing is a software interface to allow developers easy access.
 
“The Geforce8 series already has this capability, through CUDA (nvidia's development environment for using the GPU for generalised floating point calculations rather than just graphics). There is no such thing as "a dedicated PPU" on a graphics card - at least not for the current generation of cards with programmable pixel pipelines. The whole idea of the new architecture is that you don't need such specialisation - you simply use some of the pixel pipelines for physics. The amount of GPU potential given over to physics can be altered dynamically, but fundamentally every physics calculation you send means a pixel pipe not being used for graphics calculations.

I'm not sure whether ATI has this capability yet. Perhaps they are working internally on an environment equivalent to nvidia's CUDA, but it certainly hasn't been released to the public domain yet. Certainly the architecture is capable of handling it - what is missing is a software interface to allow developers easy access.”

Cheers for that Duff-Man, cleared that up for me.:D
 
How about all the Source engine games, for starters?
What Pottsey doesn't realise is that it doesn't matter how many games use PhysX or Havok if the games are crap. Look at the list of PhysX-supported games and, aside from Unreal Tournament 3, try not to fall asleep. On the other hand, look at the list of Havok-supported games, which not only don't require a PPU, but include amazing titles such as Half-Life 2 (and everything else based on the Source engine), the Company of Heroes series, the Halo series, Heavenly Sword, Lost Planet, LOTR Online, Stranglehold, World in Conflict etc.

Let's not also forget upcoming games like Assassin's Creed, Alan Wake, Fable 2, etc.

Ageia PhysX is an expensive yawn-fest and I can't wait for the multi-core CPU to kill it.
 
The PPU is not bad in UT3; it purely provides a speed boost, and people with and without a PPU can all play together online. I'd say that's a successful implementation of the PPU in UT3, although it's not 90 quid's worth of PPU implementation. The card needs to drop to 50 quid.


How can they?

Some people will have physics for stuff blowing up etc. and some won't!

Not fair and shouldn't be allowed IMO.

CPU cores for physics for the win
 
Hey Duff-man, can physics on the GPU be interactive or just eye-candy?

The physics on the GPU can be whatever you program them to be - just like with a CPU or a dedicated physics card. It's really up to the programmer to decide how he wants the physics to appear, and how much of the GPU's potential he wants to give over to physics calculations.

You have to be careful how you send stuff to the GPU though. It effectively has a lot of small processors with a shared memory, so you want to be careful to send the right calculations to the various pipes. Send too much at a time and you will waste effort searching through memory. Send too little at a time and the communication cost of re-assembling the data and sending it back to the CPU will be greater than the calculation itself.

It's a bit of a balancing act, and one reason that I think multi-core CPU physics will prevail (although I don't doubt that the GPU has the potential to be more powerful).
 
“How about all the Source engine games, for starters?”
What, all 3 or 4 of them over how many years? Source failed as a game engine; barely any commercial games used it. Don’t get me wrong, the engine itself is great - no idea why it did so badly. I can only think of 3 commercial games if you don’t count add-on packs and mods. It’s not like the Quake or UT engines, which get used a lot. Hasn’t UT3 already got over 120 games licensed for it? (I recall reading there are 126 games licensed but cannot find the link, so I could be wrong on the number.)




“Look at the list of PhysX-supported games and, aside from Unreal Tournament 3, try not to fall asleep.”
I was talking about PhysX, which has hundreds of games supporting it, not just the PPU games.
There are tons of great games, from GRAW 2, Gothic 3, Medal of Honor: Airborne, Clive Barker's Jericho and Warhammer Online: Age of Reckoning, to BioWare's Eclipse engine. Have BioWare ever made a bad game?

Then there are the non-PC games like Splinter Cell, Gears of War etc.

Hate PhysX as much as you want, but it's not going anywhere anytime soon. Not now there are over 100 games licensing the UT3 engine with PhysX.





“Ageia PhysX is an expensive yawn-fest and I can't wait for the multi-core CPU to kill it.”
Well, you’re in for a long wait, considering Ageia PhysX’s main market is making money as a software solution for multi-core CPUs, and it seems to be selling better than Havok recently.
 
Well I for one hope PhysX gets supported; the M1730 comes with one (no choice to remove it) and I don't want a pointless add-in card in my laptop making it more of a brick and sapping power :p

Still, Hellgate London sucks so I'm not that bothered about this. Bring on more games! :D
 
“There are tons of great games, from GRAW 2, Gothic 3, Medal of Honor: Airborne, Clive Barker's Jericho and Warhammer Online: Age of Reckoning, to BioWare's Eclipse engine. Have BioWare ever made a bad game?”

You should be shot for saying Gothic 3 is a good game. Do you own the game? Do you know that JoWooD decided to publicly announce that they aren't going to patch it, and that it's up to the community to do so? Did you know that there are so many bugs in the game that it's really difficult to complete? In fact, have they even fixed it so that you can actually complete it yet? Because there was a bug that stopped you getting the last talisman.

That game ****es me off - I was so hyped up over it and it wasn't even a game that could be completed! In fact I'd say the game should still have been in alpha; there were too many bugs. Loads of "features" that hadn't been implemented were still listed in the manual.

GRRR
 