
Physics Card
Associate · Joined 6 Oct 2005 · Posts: 669 · Location: West Midlands
Ello,
I was looking at a 'graphics card guide' the other day and it mentioned something called a 'Physics card' or chip. I'd never heard of one before so I don't know what they're for exactly.

So I was just wondering if anybody can share some insight into what they do and why you would need one?
 
Basically it does what it says on the tin. It works out real-world physics for games: realistic explosions, and you can destroy/manipulate objects and background objects. However, as with all new technology, it's not worth getting yet. Wait for the bugs to be ironed out and for games to utilise it.

A physics card is an expansion card for computers, similar to a graphics card but which is used to process physics interactions as opposed to graphics. By taking over the processing of these effects, the CPU can use more of its power for other tasks. A physics card is centered around a physics processing unit, similar to the graphics processing unit on a graphics card, and also contains RAM for use in its processes. The first physics card created was the PhysX by AGEIA, released in 2006.
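The division of labour described above can be sketched in miniature. This is a toy Python illustration, not the real AGEIA SDK (all names here are made up): the main loop hands the physics step to a separate worker, the way a game would hand it to a PhysX card, so the main thread is free for other work like AI, input and sound.

```python
# Toy sketch of offloading physics (illustrative only, not any real SDK):
# the game hands the physics step to a worker, the way a PhysX card
# would take that work off the CPU.
from concurrent.futures import ThreadPoolExecutor

def step_physics(positions, velocities, dt):
    """Advance every object one time step (a stand-in for the PPU's job)."""
    return [(p + v * dt, v) for p, v in zip(positions, velocities)]

def game_frame(executor, positions, velocities, dt=0.016):
    # Submit the physics step to the "card" ...
    future = executor.submit(step_physics, positions, velocities, dt)
    # ... the main thread could now do AI, input, sound, etc. ...
    state = future.result()  # then collect the results before rendering
    return [p for p, _ in state], [v for _, v in state]

with ThreadPoolExecutor(max_workers=1) as ppu:
    pos, vel = game_frame(ppu, [0.0, 10.0], [1.0, -2.0])
```

A Python thread pool only mimics the idea, of course; the point of a real PPU is that the step runs on separate silicon rather than stealing CPU time.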

Also, graphics cards need improving, as the extra graphical realism strains current cards.
 
Physics processing is still seriously in its infancy, it's not even been through the teething stage yet as it's still riddled with performance problems. I would wait some time before even considering taking the plunge.
 
Also there is the Havok engine coming out, possibly sometime soon, which allows you to do physics calculations on a graphics card. Wait and see, I say.
 
neo-omega said:
Also there is the Havok engine coming out, possibly sometime soon, which allows you to do physics calculations on a graphics card. Wait and see, I say.


I still reckon a standalone card is the way forward. If you do the calculations on a gfx card, think what the power of the card will need to be. Just look at the gfx of the new games coming out.

What's that game with the jungle scene? It just looks so awesome. However, I reckon that'll push even a top-end card.
 
AcidHell2 said:
I still reckon a standalone card is the way forward. If you do the calculations on a gfx card, think what the power of the card will need to be. Just look at the gfx of the new games coming out.
Keep in mind that you will need two graphics cards (three optimally) to do physics processing on the GPU. ATi are pushing the idea of using one card in a Crossfire system as a graphics card and the other as a physics processor; they are also trying to get people to use two cards for Crossfire graphics and a third card (a cheap X1300, say) for physics.

AcidHell2 said:
What's that game with the jungle scene? It just looks so awesome. However, I reckon that'll push even a top-end card.
You're thinking of Crysis, the new game by the folks who made Far Cry. It should be noted that every bit of video we have seen of that game so far has been rendered purely on a single ATi Radeon X1900XT graphics card, and as we have all seen, it's smooth as silk (Crytek are just fantastic coders; this game should run fine on any modern system). That game will only be a problem when people want to run it in DirectX 10 mode.
 
But we've seen DX10 graphics in the demos, yeah?

What I was thinking is that DirectX is the application/hardware interface, right? And if they simply coded the demos for a single system, say the X1900XT, then they wouldn't need it. The same as a console: you can just code for a specific system specification and not require the application/hardware interface like you would when coding games for many different specs.
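That abstraction argument can be sketched in miniature (illustrative Python only; every class name here is invented): a game written against an interface layer runs on any driver that implements it, the way DirectX sits between games and cards, whereas a console-style demo built for one known card could skip the layer and target that card directly.

```python
# Illustrative sketch of an application/hardware interface layer.
# All names are made up for the example.
class GraphicsAPI:
    """The abstraction layer: the game calls this, drivers implement it."""
    def draw(self, scene):
        raise NotImplementedError

class X1900XTDriver(GraphicsAPI):
    """Vendor-specific driver behind the common interface."""
    def draw(self, scene):
        return f"X1900XT rendered {scene}"

class GenericDriver(GraphicsAPI):
    """Any other card's driver, same interface."""
    def draw(self, scene):
        return f"generic card rendered {scene}"

def render(api: GraphicsAPI, scene):
    # The game never needs to know which card is underneath.
    return api.draw(scene)
```

A demo targeting one fixed spec could call the X1900XT code path directly; a shipping PC game keeps the interface so the same build runs on many cards.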
 
naffa said:
But we've seen DX10 graphics on the demos yeah?
Nope, pure X1900XT DirectX 9 rendering in all footage we have seen so far.

What DirectX 10 will bring to the table for Crysis is faster and more elegant/immersive effects thanks to the new shader coding etc. We haven't seen it yet, but we'll definitely be blown away.
 
All I can say is.. :eek:

I thought that was DX10 because it looks so damn nice... I can't wait to see what DX10 looks like :D
 
speedy2004 said:
Does your graphics card work together with the physics card, or do you get rid of your graphics card and use the physics card?
If you mean when using an Ageia PhysX card, the physics card works alongside the graphics card.
 
Thanks guys...

There doesn't seem to be much use for physics cards at the moment then, seeing as realistic in-game physics is still quite rare. Even in Oblivion, for example: although it's a great game and the physics are nifty, they aren't anything that special; everything seems to have a similar weight, etc.

I don't see the point in physics cards until in-game physics become a bit more sophisticated.
 
"I don't see the point in physics cards until in-game physics become a bit more sophisticated."
It is a bit of a catch-22: no one wants a physics card until in-game physics are more common, and no one wants to make games like that until there are lots of physics cards. Right now there are only six games that make use of a physics card, and we really need a killer game to come out that uses it.
 
At the moment, if I understand the two systems correctly, Havok will provide eye candy only. I.e. you blow up a house and the explosion looks spectacular, but the platoon of soldiers hiding inside are still alive and unharmed, and will probably kill you while you are watching the explosion.

With the Ageia system and its PhysX cards, not only do you get the spectacular explosion, but it also calculates that the soldiers in the building will be killed in the collapse. This way it is not just the graphics that improve but also the interaction between the objects that you can see.
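The distinction described above can be sketched with a toy example (hypothetical Python, not either vendor's actual API; all names invented): a visual-only explosion produces particles for the renderer and touches nothing else in the world, while a gameplay explosion also feeds back into game state by damaging entities inside the blast radius.

```python
# Toy sketch: "eye candy" physics vs. physics that affects gameplay.
# Names and numbers are made up for illustration.
import math

def visual_explosion(origin, n_particles=100):
    """Eye candy only: spawns particles for the renderer, changes no game state."""
    return [{"pos": origin, "id": i} for i in range(n_particles)]

def gameplay_explosion(origin, radius, damage, soldiers):
    """Also feeds back into the world: soldiers within the radius take damage."""
    particles = visual_explosion(origin)
    for s in soldiers:
        if math.dist(origin, s["pos"]) <= radius:
            s["hp"] -= damage
    return particles, soldiers

soldiers = [{"pos": (1.0, 0.0), "hp": 100},   # hiding inside the house
            {"pos": (50.0, 0.0), "hp": 100}]  # well outside the blast
particles, soldiers = gameplay_explosion((0.0, 0.0), 10.0, 80, soldiers)
```

In the visual-only case the platoon walks away untouched; in the gameplay case the soldier inside the radius is left badly hurt while the distant one is unaffected.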

Microsoft are supposed to be adding some physics processing into DirectX so I guess that is when it will become popular.
 
It's a bit like the problem when 3D accelerators were first released: only certain games could take advantage of them, but as the game catalogue grew, the sales soared.

Tbh I don't think we will have separate physics cards for very long; either the functionality will be added into a normal GPU's core or there'll be some changes to allow a second GPU to act as the physics processor. ATi want something along the lines of the latter, Nvidia will probably go for the former. As usual ;).
 
Standalone just makes so much more sense. Why use a second very expensive gfx card when you can buy a dedicated physics card, which is going to work better? It's also probably cheaper than a top-end gfx card.
 
AcidHell2 said:
Standalone just makes so much more sense. Why use a second very expensive gfx card when you can buy a dedicated physics card, which is going to work better? It's also probably cheaper than a top-end gfx card.
I was under the impression that you will be able to use an X1300 alongside an X1800/X1900 to process physics, that's pretty much the same cost as an Ageia PhysX processor and an X1800/X1900.
 
Úlfhednar said:
I was under the impression that you will be able to use an X1300 alongside an X1800/X1900 to process physics, that's pretty much the same cost as an Ageia PhysX processor and an X1800/X1900.

To begin with, but as the physics get more and more complicated, it'll take more and more power. Having a dedicated card and processor that they can design to handle a specific engine is always better than trying to piggyback it onto something else.
 
AcidHell2 said:
To begin with, but as the physics get more and more complicated, it'll take more and more power. Having a dedicated card and processor that they can design to handle a specific engine is always better than trying to piggyback it onto something else.
But an X1300 or whatever card you choose to use for physics processing in ATi's solution will be a dedicated card and processor, and in terms of raw computational power an X1300 is actually more powerful than an Ageia PhysX card as far as I know, not to mention having the benefit of being PCI-E x16.

I'm not sure if you think that ATi's solution means the X1300 would be doing graphics and physics at the same time, but if so, that's not correct. ATi wants to push boards with three PCI-E x16 slots: one for graphics, one for physics (X1300), and an optional third for Crossfire graphics and physics at the same time.
 
I think the thinking behind the ATi solution was that when DX10 etc. comes around, people don't sell on their old gfx cards but instead use them for physics processing (meaning second-hand ATi cards will be harder to find, and so ATi can push more cards than before ;)).
 