
ATi Radeon can use PhysX!

Permabanned · Joined 31 May 2007 · Posts 10,721 · Location Liverpool
Seems someone from NGOHQ has reverse engineered NV's latest PhysX driver and managed to get CUDA running on RV670. It's most probably 'illegal' so I doubt it'll last long, but oh well :o

Here's a link anyway Link

Eran Badit of NGOHQ.com successfully modified NVIDIA CUDA (Compute Unified Device Architecture) to operate on an ATI GPU and has been able to run the NVIDIA PhysX layer on an RV670, the Radeon HD 3850.

He says that enabling PhysX support on Radeon cards is not particularly difficult, leading us to believe that physics on graphics cards may be not so much a technology problem as an issue of corporate dynamics.

On his first run, Eran got a 22,606 CPU score in 3DMark Vantage, lifting the overall score to P4262. A comparable system without PhysX support will cross the finish line at about P3800.
 
Doesn't surprise me. GPUs seem pretty well suited to processing physics so it was only a matter of time before someone hacked it.
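The point above about GPUs suiting physics comes down to data parallelism: each body's update depends only on its own state, so the same tiny program can run across thousands of GPU threads at once. A minimal illustrative sketch (plain Python standing in for what would be a GPU kernel; all names are made up for illustration):

```python
# Illustrative sketch of why physics maps well to GPUs: every particle's
# Euler step is independent of every other particle's, so the loop body
# below is exactly the kind of work a GPU runs as thousands of threads.

def step_particles(positions, velocities, dt, gravity=-9.81):
    """Advance each (x, y) particle one Euler step; updates are independent."""
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy = vy + gravity * dt            # same tiny program per particle...
        new_pos.append((x + vx * dt, y + vy * dt))
        new_vel.append((vx, vy))          # ...no cross-particle dependencies
    return new_pos, new_vel
```

Because no iteration reads another iteration's result, the loop can be split across as many processors as you have, which is why a GPU (or a PPU) gets such an easy win here.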

I wish they'd actually MAKE a decent PhysX game though instead of constantly faffing about who has what... there's no point in it at all until there are some games worth buying.
 
He is surely only able to do it because Havok and PhysX are platform independent (which means they can run on practically anything :D)

Well done, and pray that it somehow gets out!!

Matthew
 
It will. Somewhere else on the web today was the announcement that ATI are getting PhysX for their cards, on the condition that they also implement CUDA, which will create a new business opportunity for Nvidia as a specialist in these massively parallel situations. It's a win/win for us as customers if we get a completely functioning CUDA implementation on ATI, with ATI working with Nvidia on it instead of creating their own. That would mean things like PhysX and folding can be taken for granted now, and for the next generations. This is really a huge development and I really hope it goes through.
 
"I wish they'd actually MAKE a decent PhysX game though instead of constantly faffing about who has what... there's no point in it at all until there are some games worth buying."
There are loads of decent games: Mass Effect, Speedball 2, Age of Empires III etc.
 
"I wish they'd actually MAKE a decent PhysX game though instead of constantly faffing about who has what... there's no point in it at all until there are some games worth buying."
"There are loads of decent games: Mass Effect, Speedball 2, Age of Empires III etc."

Eh?

Those games don't strike me as ones that make extensive use of PhysX.
 
Guess you really have to have a "thing" about PhysX to think those games are of any interest :p
 
"Guess you really have to have a 'thing' about PhysX to think those games are of any interest"
So what do you call a good game? Those are all triple-A games with great gameplay. How can they not be of interest? Sure, not everyone is going to like them, but they are all great games that are popular.

Most PhysX games are not about new effects; they're about more speed when you have a PPU or GPU. Take UT, for example.
 
That's not how PhysX is sold - it's sold as "unleashes amazing new physics effects on your computer".

Which it generally doesn't do.
 
"That's not how PhysX is sold - it's sold as 'unleashes amazing new physics effects on your computer'.
Which it generally doesn't do."

Yes it does: it does physics on the CPU and, when possible, moves it to the GPU or PPU. PhysX is just an API to do physics work.

Take UT: the physics work on the CPU, PPU or GPU. PhysX adds the high-end or normal physics; the PPU just speeds up the physics over the CPU.
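The description above (one API, CPU fallback, PPU/GPU acceleration when present) can be sketched as a backend-selection layer. This is purely illustrative: none of these class or method names are real PhysX SDK calls.

```python
# Hypothetical sketch of a physics API that picks the fastest available
# backend, in the spirit of the poster's description of PhysX: the game
# calls one API; the work lands on GPU, PPU, or CPU depending on what
# hardware was detected. All names here are invented for illustration.

class PhysicsAPI:
    # Preference order: GPU first, then PPU, then the CPU fallback.
    BACKENDS = ("gpu", "ppu", "cpu")

    def __init__(self, available):
        # 'available' is the set of accelerators detected at startup;
        # "cpu" is always expected to be in it.
        self.backend = next(b for b in self.BACKENDS if b in available)

    def simulate(self, bodies, dt):
        # The game-facing call never changes; only the executor differs.
        return "simulated %d bodies on %s" % (len(bodies), self.backend)
```

The point of the sketch is the one the poster makes: the API is the product, and the PPU/GPU only changes how fast the same calls run.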
 
It was always possible, and AMD have said this themselves; it's just that someone skilled enough had to make it work, seeing as AMD/Nvidia won't make it work on Radeon cards.
 
"I wish they'd actually MAKE a decent PhysX game though instead of constantly faffing about who has what... there's no point in it at all until there are some games worth buying."
"There are loads of decent games: Mass Effect, Speedball 2, Age of Empires III etc."
Age of Empires III uses good old Havok, thank you very much. Only the OS X version uses PhysX.
 
"Yes it does: it does physics on the CPU and, when possible, moves it to the GPU or PPU. PhysX is just an API to do physics work.

Take UT: the physics work on the CPU, PPU or GPU. PhysX adds the high-end or normal physics; the PPU just speeds up the physics over the CPU."

Notice I didn't say "universally doesn't do". It's true that one or two games have done a half-decent job of it. But the majority haven't.
 
"I wish they'd actually MAKE a decent PhysX game though instead of constantly faffing about who has what... there's no point in it at all until there are some games worth buying."
"There are loads of decent games: Mass Effect, Speedball 2, Age of Empires III etc."

Eh? Sorry? The only PhysX games I can think of are UT3, CellFactor, WarMonger and Ghost Recon. CF and WM got universal drubbings and Ghost Recon's implementation is hardly groundbreaking. I ran UT3 with the Ageia PhysX map pack installed and the levels are rubbish. The frame rate drops to its knees whenever anything 'physical' is going on, and even when it isn't, the game still runs like a dog.

Mass Effect shows no evidence of PhysX whatsoever; it doesn't even seem to have a physics engine at all. Just because the engine supports PhysX implementation doesn't mean that every game that uses U3 uses PhysX.

The most impressive physics effects I've seen are in Company of Heroes, which runs off the plain old Havok API. When are we going to see a Call of Duty game where the environment is as destructible as it would be in real life? Where debris rains down and cover steadily erodes under enemy fire? That is what I want to see, and until then, PhysX is a waste of time.

That is all :)
 
Eh? Sorry? The only PhysX games I can think of are UT3, CellFactor, WarMonger and Ghost Recon. CF and WM got universal drubbings and Ghost Recon's implementation is hardly groundbreaking. I ran UT3 with the Ageia PhysX map pack installed and the levels are rubbish. The frame rate drops to its knees whenever anything 'physical' is going on, and even when it isn't, the game still runs like a dog.

Frame rate crawls to a halt with or without PhysX hardware?
 
Frame rate crawls to a halt with or without PhysX hardware?

Running it on my 9800GX2, using the latest Nvidia/PhysX drivers. It might be early days yet, I don't know, but Bit-Tech were hardly impressed when they reviewed it with an actual PhysX card....

"Firing three rockets into the planks would often only damage the single plank directly hit, not the planks all around it...if you did hit the plank then the debris which came out was always odd looking. It was all small bits, oddly coloured to be slightly lighter than the surroundings and there was always far, far too many of them."

It's true, the implementation of the effects is over the top and poorly done.
 
Nvidia always said they would make the PhysX API available to other GPUs.

AMD are working with Intel for Havok support on the CPUs; now if they work with Nvidia to get PhysX on the GPU as well, that covers all the bases.
 
Getting the support of physics engines onto GPUs will increase the user base able to take advantage of physics effects. This should help encourage games developers to improve on the quantity and quality of effects in their games.
It's still early days yet, but we should see some significant improvements in around 6 months time as the technology matures and developers get to grips with what can be done.
 
Running it on my 9800GX2, using the latest Nvidia/PhysX drivers. It might be early days yet, I don't know, but Bit-Tech were hardly impressed when they reviewed it with an actual PhysX card....



It's true, the implementation of the effects is over the top and poorly done.
Exactly the problem :) I think AOE3 did a good job, but it's still way over the top. I would rather that buildings in strategy games actually develop with each plank that's added, trees look realistic, etc. Just a lot of stuff thrown around will never impress me, and to be honest the destructible walls in UT3 were horrible.
 
"Nvidia always said they would make the PhysX API available to other GPUs.

AMD are working with Intel for Havok support on the CPUs; now if they work with Nvidia to get PhysX on the GPU as well, that covers all the bases."

AMD and Intel will have GPUs integrated on every CPU in 1-2 years, when we hit the 16-core level maybe (possibly 8, but unlikely). That means each and every last CPU sold in 2 years will support Havok out of the box. PhysX is dead, completely and utterly. No one will buy a 2nd gfx card not to use it in SLI/Crossfire; PhysX actually had a better chance as a 3rd-party card. I might have shelled out £50 for a card in another slot if it was any good (it wasn't). But while it's using a 2nd gfx card, if I buy a 2nd gfx card, considering 99% of games I'll be using SLI/Crossfire, I wouldn't ever remotely bother to switch it to PPU mode and lose frame rate for the sake of a couple of crap levels and some ridiculous, unrealistic effects.

Offered the choice of a 2nd gfx card for graphics, or as a PPU, 99.9% of people would use it for graphics, so they failed massively. Except they didn't: Nvidia bought out PhysX to let it die, rather than spend 10 times the cost competing with them for the next decade. Business-wise it's much, much cheaper; it's as simple as that.

Intel vs Nvidia: Intel are about 7 zillion times bigger, Nvidia cannot compete, and they've been irritating Intel lately as well, so Intel are even more opposed to helping Nvidia and more likely to do things to screw them.

PhysX was dead when it started. Anandtech suggested a LONG time ago that the people behind PhysX had this game plan the whole time: become irritating enough, and slightly embedded in the market, so that someone buys them out for a decent profit. That was their goal the whole time. That's why the marketing campaign was always promising things the hardware could not do; the appearance that PhysX was fantastic was more important than it actually being fantastic. As long as people "thought" it would be useful, Nvidia/ATI would need to compete, and that's what led to Nvidia buying them out.


Every single useful thing you can do with real physics, you can do with estimated, CPU-friendly physics. How walls blow up has NOTHING TO DO WITH THE PHYSICS ENGINE. Every wall that can be destroyed must be designed to do so; a wall that explodes into 5 pieces takes 50 times as long to design as a wall that doesn't destruct, because of clipping issues, the various parts, making it explode to look decent, making the parts fall to the ground and not disappear, and a whole lot of other things. Levels aren't fully destructible because games would then take 5 years rather than 2-3 to make. This is why we've only ever seen 1-2 "PhysX/highly destructible" levels per PhysX-supported title: there is no time to make more.
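The authoring cost described above can be sketched in a few lines: a destructible wall only ever breaks into the pieces an artist built for it in advance; the physics engine just animates those pieces. All names here are illustrative, not from any real engine.

```python
# Illustrative sketch: destruction is authored content, not emergent
# physics. The wall can only ever shatter into the fragment meshes a
# designer made for it; the engine's job is spawning and animating them.

class Wall:
    def __init__(self, hit_points, fragments):
        self.hit_points = hit_points
        self.fragments = fragments    # hand-authored debris pieces
        self.destroyed = False

    def take_damage(self, amount):
        """Return the debris to spawn, or [] while the wall still stands."""
        self.hit_points -= amount
        if self.hit_points <= 0 and not self.destroyed:
            self.destroyed = True
            return self.fragments     # only these authored pieces exist
        return []
```

However good the physics engine, `fragments` is content someone had to model, texture, and test for clipping, which is exactly why fully destructible levels are so expensive to ship.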

New engines, new design tools, and things to speed up the level-building process are what will bring more destructible levels and more realism to games; nothing more, nothing less.

Also, Nvidia will still look for ways to make back some cash on it. Even if it's never used, getting ATI or anyone else to buy a licence still means money in their pocket, and having people use the basic physics engine saves others time and puts a little more money in their pocket, so Nvidia will still "market" it as useful, till it's finally dead.
 