Are PhysX cards worth it?

"They are worth less than nothing, don't buy one."
Please stop coming out with this rubbish; you know that's a flat-out lie.

You're dead right, it is an extreme call to say they are worth less than nothing. They do have some utility and they are a very clever piece of hardware, but I personally think there is a greater need for transparency about how extensively, and to what depth, they are actually used in games and gameplay.

They aren't a horrible waste of money and they do have a future, but in my personal opinion not in the form they exist in now. Not a waste of money, then, but certainly a luxury item, much like putting glowy lights and coloured water in your PC.
 


WRT the cards, they remind me of the old MPEG cards. Useful for about five minutes, then CPUs and video cards became able to do just as good a job, which made them obsolete.
 
As the number of cores in a CPU grows and support for multithreading in games becomes more widespread, the need for a PhysX card will continue to diminish. As a result, I see little point in getting one; I bet it'll be obsolete within a year.
 
PhysX is a good idea, but I just feel that this implementation isn't going to go anywhere, and that the next stage will be on-CPU physics processing using a specialised core.
 
As the number of cores in a CPU grows and support for multithreading in games becomes more widespread, the need for a PhysX card will continue to diminish. As a result, I see little point in getting one; I bet it'll be obsolete within a year.

I think eventually physics calculations will be moved away from the CPU, but not necessarily onto a dedicated physics card. Most CPUs aren't all that efficient at the sort of calculations that are needed, so a separate chip does make sense. With graphics chips becoming more programmable, I think it's possible we will eventually see cards that aren't specifically marketed as graphics cards or physics cards, but will be more general-purpose number-crunching processors that can be programmed to do all sorts of things. We're already halfway there, really, with Havok FX, which allows physics to be done on a second graphics card, and that's fairly similar to the idea behind the Cell.
 
I think eventually physics calculations will be moved away from the CPU, but not necessarily onto a dedicated physics card. Most CPUs aren't all that efficient at the sort of calculations that are needed, so a separate chip does make sense. With graphics chips becoming more programmable, I think it's possible we will eventually see cards that aren't specifically marketed as graphics cards or physics cards, but will be more general-purpose number-crunching processors that can be programmed to do all sorts of things.
The company I work for uses 8800GTXs for pure number crunching! Very cost effective!
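The reason physics (and number crunching generally) maps so well onto wide parallel hardware like a GPU or PPU is that each particle's update is independent, so one instruction can be applied across thousands of elements at once. A minimal sketch of that idea, using NumPy on the CPU purely for illustration (all names and numbers here are made up, not from any real engine):

```python
import numpy as np

# Toy illustration of data-parallel physics: every particle's update
# is independent, so one vectorised operation stands in for thousands
# of scalar loop iterations. (Illustrative values only.)
n = 100_000                        # particles
dt = 1.0 / 60.0                    # one 60 Hz frame
pos = np.zeros((n, 3))             # positions (m)
vel = np.random.randn(n, 3)        # velocities (m/s)
gravity = np.array([0.0, -9.81, 0.0])

def step(pos, vel):
    """One explicit-Euler integration step over ALL particles at once."""
    vel = vel + gravity * dt       # same instruction applied n times
    pos = pos + vel * dt
    return pos, vel

pos, vel = step(pos, vel)
```

On a GPU the same structure applies, just with the array operation executed across hundreds of shader units instead of one CPU core.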
 
As the number of cores in a CPU grows and support for multithreading in games becomes more widespread, the need for a PhysX card will continue to diminish. As a result, I see little point in getting one; I bet it'll be obsolete within a year.

But general-purpose CPUs are not designed for complex physics; you could have 16 cores and the current PhysX cards would probably still be able to do more complex effects.

The only real competition PhysX has is GPU-based physics, but as graphics cards are constantly struggling to run the latest games decently, surely a dedicated PhysX card is the best option? Or would you rather buy a separate £400 8800GTX for your physics instead?
 
"As the number of cores in a CPU grows and support for multithreading in games becomes more widespread, the need for a PhysX card will continue to diminish. As a result, I see little point in getting one; I bet it'll be obsolete within a year."
According to the specs it's going to take CPUs far more than one year to catch up with the power of a PPU card for physics. More likely a CPU will catch up with today's two-year-old PPU in five to ten years' time, meaning CPUs are seven-ish years behind, and that's assuming 100% of all cores are used for physics and nothing else. Current CPUs are a bad choice; they are too inefficient and will hold physics back unless there is a major change in how CPUs work.





"We're already halfway there, really, with Havok FX, which allows physics to be done on a second graphics card,"
Havok FX has been scrapped, and the games that were going to support it have dropped it. I feel sorry for anyone who bought a three-slot motherboard and a third GPU for physics. It's going nowhere, which is a shame.
 
Havok FX has been scrapped, and the games that were going to support it have dropped it. I feel sorry for anyone who bought a three-slot motherboard and a third GPU for physics. It's going nowhere, which is a shame.

BECAUSE INTEL HAVE BOUGHT HAVOK. What does that tell you? It's not that hard.
 
Just wondering if having a dedicated PhysX card is worth the cash?

Firstly, does it really add anything to games?

Secondly, are there many games that use them, or have I got the wrong end of the stick and all games can use them to some degree?

TIA

Not unless the number of games that support them increases dramatically, which I don't think is going to happen.
 
The only real competition PhysX has is GPU-based physics, but as graphics cards are constantly struggling to run the latest games decently, surely a dedicated PhysX card is the best option? Or would you rather buy a separate £400 8800GTX for your physics instead?

I'd rather have no extra physics at all than shell out for an Ageia card, judging by what I've seen. In any case, you wouldn't have to buy an 8800GTX to get good physics performance.
 
that instead of running on a 2nd GPU it will use a CPU core or two?

That's what I'd like to see. After all, I've only had one game max out my CPU (an E6600), and that was Supreme Commander. With all these quad cores floating around now, I'd be surprised if games maxed out three of the cores. That leaves one free to run Windows and do physics. Sure, it may not be as efficient as a dedicated card, but it's cheaper :)
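The "spare core for physics" idea above is structurally simple: the render loop hands each frame's simulation work to a worker that the OS can schedule onto an idle core. A hedged sketch of that structure (all names are hypothetical; Python's GIL means this only illustrates the hand-off pattern, a real engine would use a native thread per core):

```python
import threading
import queue

def physics_worker(jobs, results):
    """Integrate simple 1-D projectile motion, one step per frame request."""
    g, dt = -9.81, 1.0 / 60.0
    y, vy = 100.0, 0.0             # height (m), vertical speed (m/s)
    while True:
        frame = jobs.get()
        if frame is None:          # sentinel: shut down cleanly
            break
        vy += g * dt               # one explicit-Euler step
        y += vy * dt
        results.put((frame, y))

jobs, results = queue.Queue(), queue.Queue()
worker = threading.Thread(target=physics_worker, args=(jobs, results))
worker.start()

heights = []
for frame in range(3):             # stand-in for the render loop
    jobs.put(frame)                # hand this frame's physics off
    heights.append(results.get()[1])
jobs.put(None)
worker.join()
```

Each frame the object falls a little further, so `heights` comes back strictly decreasing; the main loop never runs the integration itself.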
 
Well, I installed a PhysX card today (second-hand of course, I'm not paying full price) as I wanted to have a play.

In the UT3 demo the fps has fallen by ~4-5 fps, but the average is up slightly, so overall I've no idea.

Ageia's OWN benchmark demos run 2 fps slower than software mode; however, its 'reality mark' runs at ~58 fps in hardware, while in software mode it runs at 2 fps. Something tells me it may be programmed to run badly in software mode, as its other demos run slower in hardware.

Not sure what to make of it so far. Has anyone ever written a Havok-to-Ageia translation layer? Then it would work in most games.

So far, though, it has slowed down everything I've run on it (in peak fps terms).

From the looks of it the hardware is so, so old now. Other than pottsey, where is the technical info on what its throughput is? And more importantly, what about info on the abilities of modern CPUs? I need to investigate how many physics calculations per second a CPU can do.

EDIT: Fluid physics looks awful, really really bad, like DX6 graphics vs DX9c. By that I mean it looks like a very early attempt at fluids; it needs five more years and five more generations of PhysX cards to achieve something good.
 
Are you really sure there is more quad-core support? Could you list some games? I'm a bit out of date with quad-core support, so I could well be wrong.

Supreme Commander, UT3, Crysis, Bioshock (I think), and a multicore patch for the Source engine too. So more than PhysX supports.
CellFactor is the only game worth noting that uses PhysX to a worthwhile extent. The UT3 demos are laughably bad; the physics in Far Cry and HL2 are far superior, and that's before you even look at newer games.

Some 3D rendering software now supports PhysX then, ideal if you want real-time physics when rendering single frames, I guess... :p
3D apps like this have supported multiple cores for years now.

A PhysX card is nowhere near as useful as a quad core, unless the only thing you do on your PC is play CellFactor.
 