Nvidia GPU PhysX almost here

Soldato
Joined
29 May 2006
Posts
5,381
Having trouble reading this due to blocking at work, but from what I can gather, GPU PhysX is almost here. At long last, hardware physics for the masses. I just hope the GPU can run all the older and upcoming PPU games. I don’t want to see three sets of the PhysX API, with each one only working on one device.

If the GPU can run older PPU games, it will be interesting to do benchmarks between the GPU and PPU. It makes sense to give GPU owners a back catalogue of PhysX games to play along with any new ones. I assume newer games will be GPU-only, but GPU PhysX is more likely to take off with a back catalogue of games. It’s just a shame ATI cannot use GPU PhysX. I hope PhysX takes off and ATI get permission to use it.

I wonder how many people who said PhysX PPU games are rubbish will suddenly like them now they run on the GPU. Assuming they run.

“We can’t really recall how long and how often true physics capability has been promised by various parties over the past three, four years. But there are more signs that stunning physics visualization in fact is becoming a reality and it appears that Nvidia is taking the lead for now. With Ageia on board, the company has a key technology and Manju Hegde, co-founder and former CEO of Ageia, confirmed that the port of Ageia’s technology to Nvidia’s CUDA is almost done.”

http://www.tgdaily.com/content/view/36915/135/
 
"extra 8/9 series GPU alongside the graphics GPU for physics only?"
I am hoping those of us with one GPU can use it for both physics and graphics work. But who knows how it’s going to work. One can hope. I don’t see PhysX taking off if you need a second GPU.
 
“Yep you will need a second card if you want to keep your framerates up. There are only so many stream processors and that is what is being used for the physics.”
Surely that depends on the game. Freeing up the CPU by moving physics off it will, in some games, give a bigger FPS boost than the drop from running physics on the GPU.
 
“then silly ideas like this, will always be on the horizon”
If it boosts high-physics games, like the high-physics levels in UT, from 10 fps up to playable framerates, it’s not a silly idea. If the GPU can do something way faster than the CPU, why is it silly to do the job on the GPU?
 
“If i remember, i think ATI's Havok was going to be done with Crossfire, so if the game used the Havok Physics, then it would use your 2nd card as the sole Physics one,”
ATI were only going to support Havok FX, not normal Havok. When Intel bought Havok they dropped all support for FX. All games supporting FX dropped it, and so did ATI. In a semi-recent interview ATI said they scrapped all plans for physics for years to come, if I recall correctly.

ATI are pretty stuck right now: if physics takes off, they will be left out in the cold. Neither their CPUs nor their GPUs have physics support, while Nvidia and Intel do.

EDIT
“Does the Ageia Physx card out now need SLI to work?”
No. Instead of using the Ageia PhysX card, you use the GPU to do PhysX.
 
“P.S. Is anyone else annoyed by Pottsey's inability to use the Quote button?”
What is the problem with forum users on here? Recently there have been some very intolerant ones.

For four-ish years there was no problem with me posting like this, then in the past two-ish years all of a sudden a few people don’t like it. Yet none of the other forums where I visit and post like this have a problem. Why is my posting style a problem now but not in the past? Why on this forum only?
 
I joined in 2000 or thereabouts and lost my account in the big wipe, at least I think it was during the wipe. Anyway, around 2004/2005 there wasn’t any PowerVR news or cards on the market, so I stopped posting. It wasn’t till I came back to the forums in 2006 that I found my account had been wiped, so I signed back up.

A few members still remember me from the old times and have confirmed I am the same person.
 
Crysis on very, very high physics is bottlenecked by the CPU, dropping the framerate to less than 1 fps. Moving physics to the GPU should bump the FPS back up to playable levels. A lot of people seem to overlook this. Yes, physics on the GPU might slow down the FPS, but it should slow it down less than the CPU would doing the same high-level physics.
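To make the point concrete, here is a toy frame-time calculation. All the numbers are made up for illustration (the post gives no measurements), and it assumes CPU and GPU work overlap so the slower of the two sets the framerate:

```python
# Illustrative frame-time arithmetic for "offloading physics helps even
# if the GPU pays a cost". All millisecond figures are invented.
cpu_render_ms = 10.0    # assumed CPU-side game/render work per frame
cpu_physics_ms = 90.0   # assumed heavy physics cost on the CPU
gpu_render_ms = 12.0    # assumed GPU rendering cost per frame
gpu_physics_ms = 8.0    # assumed physics cost once moved to the GPU

# Physics on the CPU: the CPU is the bottleneck.
before = max(cpu_render_ms + cpu_physics_ms, gpu_render_ms)

# Physics moved to the GPU: the GPU pays a bit more, but the frame
# is no longer stuck behind the CPU's physics work.
after = max(cpu_render_ms, gpu_render_ms + gpu_physics_ms)

print(f"{1000 / before:.0f} fps -> {1000 / after:.0f} fps")
```

With these invented numbers the framerate goes from 10 fps to 50 fps even though the GPU's own frame got slower, which is exactly the trade-off described above.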
 
“So as 95% of the gaming population will not have a second GPU for physics, it means the developers have to pour money into an investment that only benefits a very few people –“
Or, if the game is more CPU-intensive than GPU-intensive, you can do the physics on the GPU for a speed boost, and those in multiplayer without the right GPU have the physics fall back onto the CPU. The API is made to fall back to the CPU for those who don’t have the hardware, so it’s hardly a lot of money that’s needed from the devs, as it’s all one package.
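The "one package, automatic fallback" idea can be sketched like this. This is not the real PhysX API; the function names and the trivial integrator are hypothetical stand-ins for the dispatch pattern being described:

```python
# Hypothetical sketch of a single physics API with CPU fallback.
# Names and logic are illustrative, not the actual PhysX SDK.

def integrate_cpu(positions, velocities, dt):
    # Plain CPU path: simple Euler integration for each body.
    return [(p + v * dt, v) for p, v in zip(positions, velocities)]

def integrate_gpu(positions, velocities, dt):
    # Stand-in for a GPU path; here it just mirrors the CPU result.
    # A real port would compute the same answer, only faster.
    return integrate_cpu(positions, velocities, dt)

def physics_step(positions, velocities, dt, gpu_available):
    # The game always calls this one entry point; the runtime picks
    # hardware if present and falls back to the CPU if not.
    backend = integrate_gpu if gpu_available else integrate_cpu
    return backend(positions, velocities, dt)
```

The point for developers is that both code paths produce the same game state, so supporting players without the hardware costs little extra work.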


95%? Pretty sure it’s more like 20% to 40%, if one GPU can do it, and there is no reason why it couldn’t.

“But if you're playing online all those projectiles and paths have to be communicated to the other computer..”
That’s not a problem. Why would it be? The physics results end up in system memory either way, so the netcode sends the same state regardless of which chip computed it.
 
“I see what you're getting at - however the game has to play on their minimum specs.“
True, but Nvidia have said AMD can use it for free, and as it can work on one card, most gaming PCs could have support. If done correctly it’s going to be useful for all gamers: most real gamers have an ATI or Nvidia GPU, and if you have a 3+ year old GPU, chances are you’re below the recommended specs anyway.

”It's the memory bus that is usually the limiting factor (I should point out I've been involved with GPGPU for some time).”
The memory bus isn’t a problem, as very little data needs to cross it for physics. The hard part isn’t sending the data back and forth, it’s processing the data. You need a lot of internal bandwidth to process the data but very little external bandwidth to send the results.

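A rough back-of-envelope check supports the bandwidth claim. The body count and link speed below are assumptions chosen for illustration (roughly PCIe 1.x x16-era figures), not measurements:

```python
# Back-of-envelope estimate of bus traffic for GPU physics results.
# All figures are illustrative assumptions.
bodies = 10_000           # assumed number of simulated rigid bodies
floats_per_body = 7       # position xyz + orientation quaternion
bytes_per_frame = bodies * floats_per_body * 4   # 32-bit floats
fps = 60
mb_per_second = bytes_per_frame * fps / 1e6

pcie_x16_mb = 4000        # ~4 GB/s each way, roughly PCIe 1.x x16
share = mb_per_second / pcie_x16_mb

print(f"{mb_per_second:.1f} MB/s over the bus, "
      f"{share:.2%} of a PCIe x16 link")
```

Even with ten thousand bodies updated every frame, the results traffic is on the order of tens of MB/s, a fraction of a percent of the link, which is why the internal processing bandwidth matters far more than the external bus.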
“One other point - I bet you will not be able to mix and match the AMD, nVidia, Intel cards - thus nVidia”
No one has said anything about needing a second card. The only thing that’s been said is that Nvidia has opened the API for anyone to use, including AMD, for free. It should work fine on one card.
 