• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

What do you think of PhysX?

Some of the effects are pretty good. I'm about 30 hours in, and I think I'm getting toward the end of my first playthrough. My problem comes with the turret + rockets: it was good up until then, but when the rockets are used the FPS drops, presumably because of the particles that fly everywhere.

Some of the effects are nice. The best effects really don't need GPU acceleration though. My favourite effects are the crap that goes everywhere in a firefight: bits of rubble, sparks, ash and the elemental effects. It looks great.

It also makes it kinda hard, though, because in a big fight it's hard to see anything. However, as I said, the GPU acceleration isn't needed for the effects in Borderlands 2; that part is just a gimmick put in because Nvidia offered Gearbox good money to do it.

Some of the other effects are BAD, though. The free-flowing water looks awful; it looks like horrible thick gel that just blobs about. In fact, it acts and looks pretty much the same as the gels from Portal 2.

I've seen better water effects in games that don't "need" a GPU to process the physics for "best results".
 
Nvidia PhysX is only used on Nvidia cards to process physics effects.

AMD's physics engine is open source, not locked down for use only on AMD hardware, and tbh is just as good.

So shut up, the lot of you.
 
I would be inclined to add that it's not always what one says, it's how it's said/expressed. Then again, everyone speaks a different language anyway, which makes it like trying to 'pin the tail on the donkey while blindfolded'. :D

OT: I have Borderlands 2, though I haven't played it yet. As much as I have been impressed with my current GPU, I can't say the inclusion of PhysX alone would sway me much towards buying an Nvidia GPU the next time I upgrade. If I find the effects in B2 to be something special, and a few more games trickle onto the market that continue that trend, then it might be a different case. At this stage, these are only 'ifs'.
 
Nvidia PhysX is only used on Nvidia cards to process physics effects.

AMD's physics engine is open source, not locked down for use only on AMD hardware, and tbh is just as good.

So shut up, the lot of you.

What's this "AMD Physics Engine" you speak of?

Also, we're all adults, please try not to use shorthand like "u" and so on.
 
Could anyone recommend a good Nvidia card for PhysX alongside crossfired 7850's?

I've been reading about it quite a bit today and seen a few cards being thrown around but the thread is quite dated.
 
i7 3770K @ 4.4GHz
Asus Z77 Sabertooth Intel Z77 (Socket 1155)
16GB Corsair Vengeance Black
850W PSU
2x MSI 7850 Power Edition 2GB GDDR5

Not even built yet, coming later this week.

I just don't wanna build all that and still have some games demanding that little bit extra, which I could easily solve with a low-cost Nvidia card running as backup using that PhysX hack.

Edit: Thanks for that link, very interesting read (while I'm at work :O)
 
Yikes! For the cost of the 560 Ti I might as well cash in my 7850s and go SLI!

Thanks for the advice though, appreciated.

If you sell them both on that Internet auction site they might yield a GTX 660 Ti, or a GTX 670 with a little added out of your pocket; both are good cards.
 
I don't often get involved in PhysX-related threads. However, as it absorbed quite a bit of my life whilst doing my PhD, I'd just like to get the following in to make sure people are working from facts:

1) NVIDIA PhysX is a software library: it is the evolution of the Ageia PhysX software library, which in turn started out as the NovodeX software library.
2) Its purpose is to perform the sorts of physics calculations required to produce physical effects in any 3D simulation (such as computer games). There are many ways to perform these simulations; those that produce results closer to reality require more processing power. PhysX (amongst many others such as Bullet, Havok, ODE etc.) offers a well-optimised solution for performing the necessary calculations to produce the numbers that a game's animator can then use to move the elements of their 3D scene to make it look more realistic.
3) PhysX has two sets of code:
a) The standard CPU code. This will run on any CPU.
b) The GPU code. Not all of PhysX can be calculated on a GPU, but some of it can, and more is being ported as time goes on. It is up to the developer using the library in their software to make sure that their software correctly detects whether a supported GPU is available, and if so whether it should be used to speed up the calculation of the values. The end values will be identical whether calculated on a GPU or CPU, though with a faster processor to hand (i.e. a GPU) more accurate results can sometimes be calculated.
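
The CPU/GPU split in point 3 can be sketched generically. To be clear, this is not the PhysX API; every name below is illustrative, and the integrator is a toy semi-implicit Euler step standing in for the kind of per-frame calculation a physics library performs:

```python
# Illustrative sketch only -- NOT the PhysX API. It mirrors the pattern
# from point 3: the application probes for a supported GPU and falls
# back to the CPU code path when one isn't available.

def gpu_available():
    # Stand-in for a real capability query (e.g. a CUDA driver check).
    return False

def step_particles_cpu(positions, velocities, dt, gravity=-9.81):
    # Semi-implicit Euler: integrate acceleration into velocity first,
    # then velocity into position -- a common choice for game physics.
    new_p, new_v = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy = vy + gravity * dt
        new_v.append((vx, vy))
        new_p.append((x + vx * dt, y + vy * dt))
    return new_p, new_v

def step_particles(positions, velocities, dt):
    if gpu_available():
        # The GPU path (CUDA in PhysX's case) would go here; as noted
        # above, it must produce the same values as the CPU path.
        raise NotImplementedError
    return step_particles_cpu(positions, velocities, dt)

# One particle at (0, 10) moving right at 1 unit/s, stepped by 0.1 s:
p, v = step_particles([(0.0, 10.0)], [(1.0, 0.0)], dt=0.1)
```

The dispatch function is the part the game developer owns: the library provides both code paths, but the application decides at runtime which one to use.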

Clearly, as PhysX is a product paid for by NVIDIA (in that they paid to acquire its intellectual property and now pay to continue to develop it), they are only producing accelerated GPU code using their own GPGPU programming language, CUDA, which quite simply won't work on any other architecture.

Should game developers wish to include realistic physics effects that can be accelerated on more than an NVIDIA GPU, they need to look at using something else such as the Bullet library, which is open source and receives quite a bit of funding from Sony. Its developers are concentrating on accelerating its functionality using OpenCL instead of CUDA, which is available on both AMD and NVIDIA GPU architectures.

This isn't intended as opinion on whether PhysX is good or bad, just to offer facts as people seem to get so emotive about graphics cards and graphics companies!
 

Very interesting, thank you :)
 
I'm really not fussed whether I use Nvidia or ATI/AMD, in all honesty, and I certainly won't be changing my setup to accommodate a feature which I'm hoping my processor can tackle with the patches I've seen flying around.

Was just something I knew nothing about or even knew was possible until this morning, so thanks for the info. Gonna see how my games with PhysX run this weekend after the build before I start going down a hybrid path.
 

Keep us updated :)
 
I'm really not fussed whether I use Nvidia or ATI/AMD, in all honesty, and I certainly won't be changing my setup to accommodate a feature which I'm hoping my processor can tackle with the patches I've seen flying around.

It won't, as explained above. It can run "OK" off the CPU, but it isn't the same. Whether or not it's worth it is of course down to you :)

Oh? Like what?

Wrong topic for such discussions but I was just responding to your comments.

All I'll say is: if you dish it out, expect it to come back, and when it does, don't moan about it. :)
 
Wrong topic for such discussions but I was just responding to your comments.

All I'll say is: if you dish it out, expect it to come back, and when it does, don't moan about it. :)

lol, are you serious? I think you have completely fallen off the trolley now.

Sorry mate, but that is hilarious. :)
 