
EVGA Bans Discussion of PhysX Mod

EVGA may still have the opportunity to go the way of BFG. All their motherboards seem to be aimed at the high end (the EVGA Classified, for example), and I can't imagine them selling in large volume, or supporting the company while they have few GPUs to sell. Come on, Nvidia! We want a price war to drive down prices!
 
I really hope they get screwed for blocking something people have paid for; it's an utter disgrace!

nVidia have screwed themselves with regard to PhysX. There were two more AAA titles that were going to use it but have moved to Havok instead since they disabled it on ATI+NV configurations, and a couple more smaller projects that I've been involved with have swapped to Havok or ODE.
 
Nvidia's stance on this is somewhat ironic, when you consider how much they (quite rightly) ribbed 3dfx for their proprietary Glide API, before they bought them out of existence.
 
I can't understand why Nvidia don't embrace the fact that people want PhysX.

We do?

Personally, I don't see the point of Physx on GPUs, regardless of hardware accelerated CUDA guff. Multicore/multi-processing systems are becoming the norm; you've got plenty of CPU power going spare for physics without having to offload it to the GPU.

In fact, too much physics can be a bad thing, as anyone playing a medic in BF: Bad Company 2 will testify when they run to where the map says a dead player is, only to find they've been rag-dolled over the next mountain range by a tank shell ;)

On topic - ultimately, EVGA are bound by nVidia's ToC; we don't like it and I'm sure they don't like it, but business is business and those are the rules ;)
 
CPU performance even in the next two years isn't going to be anywhere near enough to run a game and do complex physics in realtime. Sure, modern CPUs are capable of most rigid body physics, even in fairly extreme amounts, but they quickly run into performance issues with ragdolls, and extensive use of soft body physics, especially fluid effects, will quickly grind even the latest CPU into the ground. Add in other advances like more complex AI processing, and the situation is even worse.
 
It's not really EVGA's fault though, it's down to NVIDIA and it's out of order. EVGA just have to play the game otherwise get in the bad books. Did anyone really expect anything else?

This thread is slightly pointless.
 
I don't understand how a company that is in jeopardy of losing a lot of its market share (I don't know what the figures currently are) would make such an unpopular decision. I can understand the logic behind it, but you'd think Nvidia would want to win over ATI customers, not alienate them. It just makes no sense to me, even from a business point of view. How much do they have to gain by forcing customers to buy Nvidia (and no one else) to get something that's rapidly becoming redundant? Like kkbigal, I'm after a price war! I want a healthy competitive market that provides good products from both sides.
 
I don't understand how a company that is in jeopardy of losing a lot of its market share (I don't know what the figures currently are) would make such an unpopular decision. I can understand the logic behind it, but you'd think Nvidia would want to win over ATI customers, not alienate them. It just makes no sense to me, even from a business point of view. How much do they have to gain by forcing customers to buy Nvidia (and no one else) to get something that's rapidly becoming redundant? Like kkbigal, I'm after a price war! I want a healthy competitive market that provides good products from both sides.

Totally agree. Denying ATI owners the ability to use an NVIDIA card alongside for PhysX is just baffling.
 
It is a shame, the way they force developers to implement features for their cards only and render Nvidia cards useless for PhysX when an ATI GPU is in the system. ATI have publicly said that they are open with their developer relations program: they will never exclude Nvidia owners from any features they help to implement into games, so everyone will benefit.
 
Damn, I am shocked to read that article and then realise I have an E*** & N**** card. If they don't change that decision, I think I and others are going to have to move our loyalties to an open market... I've never been the type to support bullies, except Microsoft at the moment, because I have no choice.
 
nVidia should really be pushing PhysX as an add-in card. They could still encourage sales by offering developers incentives to increase the requirements, meaning an old 8800GT would cease to be relevant. I was quite interested in PhysX - even if it's not a game changer - until nVidia bought it out and killed it. As it stands, it's only used by developers when money or manpower are thrown at them, and even then only for trivial effects.

nVidia had their chance. Now I'm waiting for DirectCompute to kick off and leave this unsightly era to fade into a distant memory.
 
I would think so, it still has an ATI GPU present.

According to the quote in the first post, yeah. Pretty crazy stuff.

I'm with the people who are suggesting PhysX should be offered as an add-on product. I would be tempted to buy a cheap little card to handle physics if it was hassle free (e.g. if straightforward drivers existed). This, I think, would be a great way for Nv to piggyback off the recent success of ATI using what is, essentially, the only thing they have that ATI doesn't right now.
 
CPU performance even in the next two years isn't going to be anywhere near enough to run a game and do complex physics in realtime. Sure, modern CPUs are capable of most rigid body physics, even in fairly extreme amounts, but they quickly run into performance issues with ragdolls, and extensive use of soft body physics, especially fluid effects, will quickly grind even the latest CPU into the ground. Add in other advances like more complex AI processing, and the situation is even worse.

Please, God, don't drag this down that alley again; there is MORE than enough CPU power.

These are GAMES, not ultra-realistic, accuracy-critical systems. This isn't weather prediction or stock market pattern recognition.

This is where PhysX fails completely and utterly. You can do one calculation to 100 decimal places of accuracy, or estimate it; the more accurate you want the answer, the more power it takes to get it. The problem is, you simply don't need that level of accuracy.

You CANNOT distinguish, in any way, between an object falling at 9.8m/s² in a game and one falling at 9.81284374932m/s², yet the calculations for the more accurate number would kill performance.

PhysX's entire basis is built upon ultimate accuracy and being able to do incredibly complex calculations faster than a CPU. In that, it's right, and it can, and any PhysX API could do so; but game physics aren't that complex. The problem is, you DO NOT need that level of accuracy: a far more simplified Havok API can give identical results at a quarter of the power usage, just by PURPOSEFULLY using less accurate numbers.


Take Just Cause 2: it's got Nvidia crap all over it, and you know Nvidia would have paid them more to use PhysX for everything, so there's no reason for them NOT to have used it except one: PhysX used in place of Havok on that scale isn't feasible; performance would have been horrible (even the PhysX software path, not just the hardware-accelerated stuff).

Physics in games isn't hugely complicated: you need things to come back down if they go up, and to do basic things. Destructible models, better hit boxes, more accurate bullet/rocket travel - these are simply design decisions.

It's more likely PhysX was removed from said AAA titles because, as games naturally add more things that interact with each other, people are finding the performance hit of the far too complex PhysX too much to use on a bigger scale - nothing at all to do with ATI/Nvidia and who can use the hardware acceleration.

Your idea that it had ANYTHING to do with not being able to run PhysX hardware alongside a main AMD card is rubbish, as 99% of the people who would play those games wouldn't be using PhysX hardware acceleration anyway, just the software/CPU version, which runs with an AMD card in the system. Clearly they thought Havok was the better choice, hardware acceleration aside.
 
If CPUs are so powerful, then why does using PhysX in CPU mode drag games like Mirror's Edge down to a 5 fps crawl when breaking a window?

Or is it that PhysX is just horribly unoptimized for CPU usage?
 
Strangely enough, PhysX is usually called with sceneDesc.gravity.set(0, 0, -9.81f); your information is OLD and out of date. For a long time now PhysX has been about what works best in a gameworld, not super-accurate simulation; that was left behind long ago with NovodeX. Internally, PhysX works with the same numerical precision as any other physics API; the only difference is in the approach of some solvers, and that has both advantages and disadvantages. On other implementations the simulation is more likely to explode if care isn't taken with the design, but they may offer slightly better performance. You have to decide which trade-off works better for you.

CPUs are fine for quite a decent number of RIGID body effects but COMPLETELY fall down on complex use of SOFT body effects - which have much more potential to enhance the gameworld and really do need GPU-class performance for realtime usage, except when used in highly specific instances with highly specific solvers.
 
It's more likely PhysX was removed from said AAA titles because, as games naturally add more things that interact with each other, people are finding the performance hit of the far too complex PhysX too much to use on a bigger scale - nothing at all to do with ATI/Nvidia and who can use the hardware acceleration.

Your idea that it had ANYTHING to do with not being able to run PhysX hardware alongside a main AMD card is rubbish, as 99% of the people who would play those games wouldn't be using PhysX hardware acceleration anyway, just the software/CPU version, which runs with an AMD card in the system. Clearly they thought Havok was the better choice, hardware acceleration aside.

That consumers could no longer pop in an nVidia card alongside an ATI card for "ultra" mode was a big factor in the decision to dump PhysX, that and concerns over what random change of direction nVidia would take with it next - developers like things to be predictable.
 
That consumers could no longer pop in an nVidia card alongside an ATI card for "ultra" mode was a big factor in the decision to dump PhysX
Nvidia shooting themselves in the foot there.
When you take into account the bundling required for retailers to receive the 4-series cards, older nVidia GPU sales would probably have seen an increase from ATI users wanting to use PhysX.
 