Batman demo GPU PhysX... running on a quad-core CPU, no NV card required.

Soldato
Joined
7 May 2006
Posts
12,192
Location
London, Ealing
smoke - http://tinypic.com/player.php?v=20pyqev&s=3
sparks+cape - http://tinypic.com/player.php?v=2urv5dz&s=3
blur - http://tinypic.com/player.php?v=242g3r7&s=3
tiles - http://tinypic.com/player.php?v=nn8l52&s=3
fight- http://tinypic.com/player.php?v=2us7bcj&s=3

All files were made with Fraps: all settings maxed, 1600x1200, no AA, forced PhysX mode 1, 720 BE unlocked to 3.6GHz, stock 4890, 1440MHz 7-7-7-18 RAM. Fraps lags a little with caching; you can see where it kicked in every 20 seconds or so in the fight clip.

Go to "Documents\Eidos\Batman Arkham Asylum Demo\BmGame\Config",

then open UserEngine.ini,

and change "PhysXLevel=0" to "PhysXLevel=1".
http://www.xtremesystems.org/forums/showthread.php?t=231871&page=7
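Putting those steps together, the relevant edit to UserEngine.ini looks like this (only the PhysXLevel line changes):

```ini
; Documents\Eidos\Batman Arkham Asylum Demo\BmGame\Config\UserEngine.ini
; change:
PhysXLevel=0
; to (forces the PhysX effects on, with the CPU doing the work):
PhysXLevel=1
```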

These effects & features were previously disabled in the game if you had no GPU PhysX.
 
Last edited:
Soldato
Joined
30 Jun 2006
Posts
6,192
Location
Horsham
That's a triple core. To be honest it looks like there is no real change. The only big stand out feature for me was the tiles picking up and even then it isn't major.

Edit: So I just played with all the physics bits turned on and it isn't all that special, and it does cause considerable slowdown. It is still playable, but at about 20fps. I do game at 1920x1200 rather than 1600x1200, but the rigs are pretty similar.

Edit2: Also, you can turn those features on by ramping them up in the normal settings panel for the game rather than via the config.
 
Last edited:
Soldato
Joined
6 Sep 2006
Posts
6,275
Location
London
Is it just me or is there nothing there really that we haven't seen before in games for years? Granted the tile smashing is quite nice but nothing to make me go wow.

The cape is just the same as when I first saw the shower curtain move because of my model back in the original Splinter Cell.
 
Soldato
OP
Joined
7 May 2006
Posts
12,192
Location
London, Ealing
That's a triple core. To be honest it looks like there is no real change. The only big stand out feature for me was the tiles picking up and even then it isn't major.

Edit: So I just played with all the physics bits turned on and it isn't all that special, and it does cause considerable slowdown. It is still playable, but at about 20fps. I do game at 1920x1200 rather than 1600x1200, but the rigs are pretty similar.

Edit2: Also, you can turn those features on by ramping them up in the normal settings panel for the game rather than via the config.

It's a triple core unlocked to four cores.

PhysX is only using one core of the CPU in that game, so that may be contributing to the slowdown.

The user could not use the in-game menu to change the setting because he was using an older PhysX driver, as the newer drivers gave him lag.

Try these out.

The default is DisableATITextureFilterOptimizationChecks=True; set it to False.

bAllowMultiThreadedShaderCompile=True
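For reference, here are the two tweaks together as the lines should read in the same config file (the first is True by default, so it needs flipping):

```ini
DisableATITextureFilterOptimizationChecks=False
bAllowMultiThreadedShaderCompile=True
```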
 
Last edited:
Caporegime
Joined
9 Mar 2006
Posts
56,289
Location
Surrey
Is it just me or is there nothing there really that we haven't seen before in games for years? Granted the tile smashing is quite nice but nothing to make me go wow.

The cape is just the same as when I first saw the shower curtain move because of my model back in the original Splinter Cell.

I agree, I didn't really see anything special until the tile bit, and even then it was hardly breathtaking.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
The default is DisableATITextureFilterOptimizationChecks=True; set it to False.

bAllowMultiThreadedShaderCompile=True

Am I correct in thinking that the game is actively disabling ATi from using a feature to optimise performance by default?

Or is that just a coincidence that DisableATITexture is there and ATI is some other acronym?

Does anyone really not have at least a dual core any more? Is it not getting ridiculous that Nvidia constantly adds features that are disabled or work badly by default unless you're on their own hardware? If the PhysX were built from the ground up to use a quad core, it would just work fine on most people's computers with little to no slowdown (for very, very little benefit anyway, I'm gathering).

I'm really REALLY starting to dislike the massive amount of money Nvidia pumps into companies to make TWIMTBP games; rather than helping, it seems more about actively sabotaging ATi than about making better games that run great on Nvidia hardware.
 
Caporegime
Joined
26 Dec 2003
Posts
25,666
PhysX is a software API that supports both CPU & GPU acceleration, so it's not really a surprise. The whole point is that, performance-wise, using a GPU to accelerate physics is far superior and frees up the CPU for AI etc.
 
Soldato
OP
Joined
7 May 2006
Posts
12,192
Location
London, Ealing
Am I correct in thinking that the game is actively disabling ATi from using a feature to optimise performance by default?

Or is that just a coincidence that DisableATITexture is there and ATI is some other acronym?

DisableATITextureFilterOptimizationChecks
UseMinimalNVIDIADriverShaderOptimization

This is right underneath it, so it looks like ATI is not some other acronym.

Also, the user is getting much better CPU PhysX with much older drivers, with a minimum of 22 FPS in the game, while others using the latest PhysX drivers have seen a decrease in CPU PhysX performance, with a minimum of 12 FPS.
 
Soldato
Joined
6 Sep 2006
Posts
6,275
Location
London
Makes you wonder why we have a 7 page thread of crying about it doesn't it :D

Surely the 7-page thread is more about the game rather than the physics effects? You can't say that everyone in there is saying how groundbreaking the physics are.

We've seen cloth physics plenty of times before in many games that didn't use PhysX. There really isn't anything there that's new and different. Even so, I think most people in here are just peed off that nVidia are paying to have game companies actually make games run slower and with fewer features on ATi than is possible. No company should be allowed to do that.
 
Associate
Joined
18 Oct 2002
Posts
892
The fact that the 360 and PS3 use the CPU for physics means GPU physics will not take off in a big way yet.

It will just be an added extra, a marketing tool for Nvidia.

Software development is console-orientated these days.
 
Last edited:
Soldato
OP
Joined
7 May 2006
Posts
12,192
Location
London, Ealing
The 7-page thread is about NV disabling PhysX when their cards smell an ATi card present in the system too. Yes, I know it's hard to believe, but I'm telling you, this is just what that big crying thread is about, pmsl :D

Bet you're all wondering why now too :p

No I'm not, that's not what the issue is. The issue is Nvidia disabling it when ATi cards are used; they are trying to force you into only using their cards in your system, telling you you can't have an ATi card as well, etc. But when you see how great PhysX is here, you have to wonder why all the crying, as it's bugger all. PhysX is a gimmick, an utterly pointless, useless feature that's going to be dead soon like everyone says, so why does everyone care that NV's disabling it when they smell ATi cards in the system too? I'm sorry, but IMO what you see here is not worth crying about. Honestly, look at it, it's bloody nothing. Seriously, does anyone give a flying **** that this useless feature is getting disabled when ATi cards are present? I certainly don't. So what if we can't use PhysX; they aren't using it either, are they, just look at the games list. :p

http://physx.cwx.ru/ look at all those games for the PC using an NV card for PhysX :D

The fuss is over the principle of the matter, which you clearly don't understand.
The bit in bold matters, PhysX does not.
 
Last edited:
Soldato
Joined
18 Oct 2002
Posts
2,953
Location
Greater Manchester
The 7-page thread is about NV disabling PhysX when their cards smell an ATi card present in the system too. Yes, I know it's hard to believe, but I'm telling you, this is just what that big crying thread is about, pmsl :D

Bet you're all wondering why now too :p

No I'm not, that's not what the issue is. The issue is Nvidia disabling it when ATi cards are used; they are trying to force you into only using their cards in your system, telling you you can't have an ATi card as well, etc. But when you see how great PhysX is here, you have to wonder why all the crying, as it's bugger all. PhysX is a gimmick, an utterly pointless, useless feature that's going to be dead soon like everyone says, so why does everyone care that NV's disabling it when they smell ATi cards in the system too? I'm sorry, but IMO what you see here is not worth crying about. Honestly, look at it, it's bloody nothing. Seriously, does anyone give a flying **** that this useless feature is getting disabled when ATi cards are present? I certainly don't. So what if we can't use PhysX; they aren't using it either, are they, just look at the games list. :p

http://physx.cwx.ru/ look at all those games for the PC using an NV card for PhysX :D

PhysX is just a physical calculation library, nothing more, nothing less. It won't be "dead" soon, as all it does is allow game developers to not have to re-code basic physical calculation code from scratch.

Some use Havok, some use PhysX, some use Bullet while others make their own (the foolish ones :)).

What may or may not "survive" is the GPU accelerated aspect of the PhysX library.

At the moment, as of SDK 2.8.1, all we have is a very rudimentary CUDA-based implementation of SOME of the aspects of the PhysX engine; actually much less than with the original Ageia PPUs from ages ago. With those, all aspects of the engine could be calculated on the PPU (up to a certain size of simulation), leaving the CPU to do other stuff. The problem is the PPU is just too slow now and can't keep up with the rest of the system.

All you see in that game can be done with the PhysX library whether it's on the GPU or CPU.

The effects are not a result of PhysX; they are a result of the devs using PhysX to generate more realistic effects such as breakable rigid bodies and the cape (cloth simulation). The devs COULD have written this code themselves, or used Havok or Bullet to do exactly the same. Either way it would have been running on the CPU.

With PhysX SDK 3.0, due relatively soon I'm told, there will be a full and well-implemented CUDA implementation of ALL of the PhysX library. From my own experiments, a cloth simulation (i.e. Batman's cape there) in SDK 2.8.1 on a reasonable GPU is about 5x faster than on a reasonable CPU (and of course the CPU is offloaded). I expect it to be better with SDK 3.0. Also, all of the rest of the physical effects you see will be done on the GPU, so it really should become quite apparent.

The reason NVIDIA are "blocking" ATi is that the library uses their own stream-processing API, CUDA, to access the GPU. I am hoping in time they will switch to OpenCL; if not, then accelerated PhysX will always remain an NVIDIA thing, and tbh this could really play against the PhysX library, allowing Havok or Bullet to really take hold.

That aside, the comments in this thread do highlight another issue. Sure, from a physical point of view the "PhysX" cape will be much more physically accurate than the (I assume) simplified home-brew one (it'll still be the same basic particle/spring simulation though, just less accurately calculated, with more physics assumptions in the code etc.), but does the average gamer really care? :)
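For anyone curious what that "same basic particle/spring simulation" looks like in practice, here is a minimal sketch in Python: a toy Verlet mass-spring step with one pinned particle, the basic idea behind a curtain or a cape whether it runs on a CPU or a GPU. All names and numbers are made up for illustration; this is not PhysX API code.

```python
# Toy particle/spring ("mass-spring") cloth step using Verlet integration
# with constraint relaxation. Illustrative only; not PhysX code.

GRAVITY = -9.8  # m/s^2, pulls particles down the y axis

def verlet_step(pos, prev, dt):
    """Advance each particle: x_new = 2x - x_prev + a*dt^2 (velocity is implicit)."""
    new = [(2 * x - px, 2 * y - py + GRAVITY * dt * dt)
           for (x, y), (px, py) in zip(pos, prev)]
    return new, pos  # new positions; old positions become "previous"

def satisfy_springs(pos, springs, rest, pinned, iters=5):
    """Relax each spring toward its rest length (simple iterative constraint solver)."""
    pos = list(pos)
    for _ in range(iters):
        for (i, j), r in zip(springs, rest):
            (xi, yi), (xj, yj) = pos[i], pos[j]
            dx, dy = xj - xi, yj - yi
            d = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = 0.5 * (d - r) / d  # fractional correction for each endpoint
            if i not in pinned:
                pos[i] = (xi + corr * dx, yi + corr * dy)
            if j not in pinned:
                pos[j] = (xj - corr * dx, yj - corr * dy)
    return pos

# Two-particle "cape": particle 0 is pinned (the shoulder), particle 1 hangs below.
pos = [(0.0, 0.0), (0.0, -1.0)]
prev = [(0.0, 0.0), (0.0, -1.0)]
springs, rest, pinned = [(0, 1)], [1.0], {0}

for _ in range(100):
    pos, prev = verlet_step(pos, prev, dt=0.016)
    pos[0] = (0.0, 0.0)  # keep the pinned particle fixed
    pos = satisfy_springs(pos, springs, rest, pinned)
```

The inner relaxation loop over the springs is exactly the kind of work that parallelises well across GPU threads, which is why moving it there can give the speed-up described above.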
 