ATI with nVIDIA PhysX in W7 x64 - 2nd card not detected

OK, sorted it.

For me: I have a 9800GT and a 5870 with the 196.21 drivers installed. I went into safe mode and used PhysX-mod-1.02, then went back into Windows, set PhysX GPU to "Enabled" in the NVIDIA control panel, and it works fine. Must say the fog effect and other things just add a bit of realism, which a lot of games lack IMO.

Worth it for what it cost.

Nice one on the 9800GT. I wanted to get that for PhysX, but I wanted a single-slot NVIDIA card for the cheapest I could get.

So I settled for an 8800GT - 2 x 5850s and 1 x 8800GT - and it does a great job :D.

As a note, I didn't need to go into safe mode.

Just open up Task Manager, end all processes beginning with nv (NVIDIA related) and end explorer.exe, then run the patch and it should work fine.

Then open a new task in Task Manager, type explorer, and everything should be great :)
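
If you'd rather not click through Task Manager, the same steps can be scripted. A minimal sketch in Python, assuming an admin prompt; the patch executable name and path are placeholders - point it at wherever you extracted PhysX-mod-1.02:

```python
# Rough sketch of the same steps, scripted: kill NVIDIA processes and explorer,
# run the PhysX mod patch, then bring explorer back.
# Needs to run from an elevated (admin) prompt.
import subprocess

# End every process whose image name starts with "nv" (the NVIDIA services/tray apps)
subprocess.run(['taskkill', '/F', '/FI', 'IMAGENAME eq nv*'])

# End explorer.exe so nothing keeps the driver files locked
subprocess.run(['taskkill', '/F', '/IM', 'explorer.exe'])

# Run the patch and wait for it to finish (placeholder path)
subprocess.run([r'C:\temp\PhysX-mod-1.02\patch.exe'])

# Start explorer again - same as typing "explorer" into a new task
subprocess.Popen('explorer.exe')
```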
 
http://www.techpowerup.com/reviews/NVIDIA/Shady_9600_GT

I wonder if some brands have done the same thing with the more recent 8600GTS as well - I have run into a problem with the 9600GT before where it would only work in the primary slot of the person's PC, which seemed to be down to it referencing the PCIe bus for the clock gen.

It's also possible the board/4850-in-primary combo just doesn't play nice and locks out the 2nd slot for some arbitrary reason.

Wouldn't that technically mean that such cards aren't eligible to claim "SLI" support?
 
Wouldn't that technically mean that such cards aren't eligible to claim "SLI" support?

It's only certain motherboards that have problems - pretty sure all SLI boards would work. It's just a few (mostly older/cheap OEM) boards that were probably never intended for the 2nd slot to actually be used for multiple video cards (they don't support CF or SLI officially).
 
Well, I'm afraid that I can't call my effort a complete success. The computer has been getting sound corruption after about 20 minutes of gaming, followed by turning itself off.

At a guess I'd say that being in close proximity to the HD4850 is causing my X-Fi to overheat. It could be a few other things, including the PSU not being up to running 2 graphics cards, but I reckon it's heat. If not the X-Fi, then the HD4850.

If I'm going to get PhysX permanently it will have to be a single-slot card that works in my 2nd PCI-E slot. I'm taking the 8600GTS out, moving the HD4850 back to the first slot, and going to play some games to test stability.

Oh well - was worth a try.
 
Interesting thread. I've got an old 8800GT kicking around after upgrading to a 5870 Vapor-X. The size of the 5870 Vapor-X is enough to tell me that it's just going to create problems if I stuff the 8800GT in for just one feature.

My case, with the HDD and Xonar D2 sound card, is going to give me the same troubles as the OP, I would feel.

OP - at least you tried, but is it honestly worth it in the end? It's maybe more of a nice-to-have than a must-have...
 
Is it worth it? For a freebie I'd say yes (without the stability issue I mentioned). I'll generally take eye candy over framerate in single player games.

But really - do we need GPU physics at the moment? To contrast Batman with Ghostbusters, I've been very impressed with the physics in the latter. Both games throw all sorts around. In Batman the sheets of paper are persistent; in Ghostbusters they disappear when they hit the floor. Both games have CPU cycles to spare, at least when I run my CPU unlocked to quad core. But Ghostbusters is rock solid framerate-wise on my rig, whereas Batman is a bit all over the place.

I'd say with more efficient programming there is no reason why the sort of physics we see in Batman (bearing in mind I've not got far into it) couldn't be done fairly comfortably on a quad-core CPU. Unfortunately I don't think it's in nVIDIA's interests to optimise PhysX for CPUs.
 
I got a Zotac 9800GT single-slot low-power version; it doesn't use any extra power connections from the PSU either.

Sits at 60°C under full load, with 25°C ambient.
 
Is it worth it? For a freebie I'd say yes (without the stability issue I mentioned). I'll generally take eye candy over framerate in single player games.

But really - do we need GPU physics at the moment? To contrast Batman with Ghostbusters, I've been very impressed with the physics in the latter. Both games throw all sorts around. In Batman the sheets of paper are persistent; in Ghostbusters they disappear when they hit the floor. Both games have CPU cycles to spare, at least when I run my CPU unlocked to quad core. But Ghostbusters is rock solid framerate-wise on my rig, whereas Batman is a bit all over the place.

I'd say with more efficient programming there is no reason why the sort of physics we see in Batman (bearing in mind I've not got far into it) couldn't be done fairly comfortably on a quad-core CPU. Unfortunately I don't think it's in nVIDIA's interests to optimise PhysX for CPUs.

The softbody effects like the cobwebs would be hard on any current CPU - sure, you can use a small number of instances of them, but putting a number of them in the same place would kill performance on the CPU. The smoke/steam physics would also be fairly hard on the CPU, and there are some very infrequent instances of destructible objects and complex debris effects that would bog down a CPU.
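
To give a feel for why that gets expensive, here's a really rough, made-up mass-spring cloth update in Python - the grid size and constants are arbitrary, and this is nothing like how PhysX actually implements softbodies, just the naive idea:

```python
# Very rough illustration of why stacks of softbody instances hurt a CPU:
# a naive mass-spring "cloth" step in plain Python. Grid size and constants
# are arbitrary; real PhysX softbody solving is far more sophisticated.
import math

N = 40                        # 40x40 grid ~ 1,600 particles per cobweb
REST, K, DT = 1.0, 50.0, 1.0 / 60.0

# positions and velocities for one cloth instance
pos = [[[x * REST, y * REST, 0.0] for x in range(N)] for y in range(N)]
vel = [[[0.0, 0.0, 0.0] for _ in range(N)] for _ in range(N)]

def step():
    """One explicit Euler update: each particle is pulled by springs to its grid neighbours."""
    for y in range(N):
        for x in range(N):
            fx, fy, fz = 0.0, -9.8, 0.0            # start with gravity
            px, py, pz = pos[y][x]
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < N and 0 <= ny < N:
                    qx, qy, qz = pos[ny][nx]
                    ex, ey, ez = qx - px, qy - py, qz - pz
                    dist = math.sqrt(ex * ex + ey * ey + ez * ez) or 1e-9
                    s = K * (dist - REST) / dist   # Hooke's law along the spring
                    fx += s * ex
                    fy += s * ey
                    fz += s * ez
            v = vel[y][x]
            v[0] += fx * DT
            v[1] += fy * DT
            v[2] += fz * DT
    for y in range(N):
        for x in range(N):
            for i in range(3):
                pos[y][x][i] += vel[y][x][i] * DT

step()  # one 1/60s tick; a game does this every frame, for every instance
```

Roughly 1,600 particles x 4 springs, 60 times a second, per cobweb - stack several of those in one scene and a single CPU core is swamped, which is exactly the kind of embarrassingly parallel work a GPU is good at.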

But all told, Batman doesn't really make good enough use of GPU PhysX to justify using it - the effects it does use tend to be limited to one appearance, then they got lazy and couldn't be bothered to put them in throughout the whole game, which kinda defeats the purpose; i.e. in some parts there are objects you can hit to knock tiles off or blow chunks out of, but then the identical wall/object two rooms later will be completely unaffected by damage.
 
I'm not convinced softbody effects would be a problem. Splinter Cell on the original Xbox had soft curtains that moved in response to your character or bullets (no ripping though). That was on a 700-odd MHz CPU that was something between a PIII and a Celeron. Yes, it was usually one or two curtains on screen, but CPUs have come a long way since then.

Ghostbusters is throwing around some softbody stuff (albeit not persistently) - at the same time as some particle effects that are not far off what you get with PhysX, and more rigid body objects than in most games - with cycles to spare on very affordable quad-core CPUs.

I'm sure GPU physics is capable of stuff that isn't yet being used. However, benchmarks of PhysX running on the CPU alone look like the performance is artificially capped, rather than maxing out the CPU (or even stressing it at all - benchies even show CPU utilisation is lower on ATI cards when PhysX is enabled). See here for info: http://www.tomshardware.com/reviews/batman-arkham-asylum,2465-10.html

I accept that a cheap single- or dual-core CPU is going to be out of its depth here, but something like a Phenom II, or certainly an i7, should be able to be utilised a little more efficiently.
 
PhysX doesn't work with an ATI main card because that's the way the Nvidia drivers are set up. There is a workaround that allows this to work; you should be able to find it by doing a search on the internet. :)

You mean
"that's the way it's meant to be played"?

:D
Sorry, couldn't resist.
 
Well, with the Radeon HD4850 back in the top slot, I had about an hour and a half of crash-free gaming.

I may have another crack at it if I can find a very cheap single-slot NVIDIA card to run in the 2nd slot.
 
What I find ridiculous is that Nvidia are trying to promote and implement PhysX in video games, yet they go ahead and disable it if you use ATI cards. There is no reason for this, when it used to work and now it doesn't, other than Nvidia being greedy, money-grubbing ******** trying to use PhysX as a tool to convince people to buy Nvidia-only products, like they did back when SLI only worked on Nvidia motherboards.

If PhysX worked with ATI, and it was supported in more games, I would buy a single-slot GeForce card to use in my third slot. But I suppose Nvidia don't want my money.
 
Yeah, one of the most ridiculous things nVidia have done in a while... we were all set to use PhysX on a project I'm working on, and after that, with some reluctance as we'd done quite a bit of the groundwork and really wanted GPU performance, we swapped to ODE on the CPU.
 
What I find ridiculous is that Nvidia are trying to promote and implement PhysX in video games, yet they go ahead and disable it if you use ATI cards. There is no reason for this, when it used to work and now it doesn't, other than Nvidia being greedy, money-grubbing ******** trying to use PhysX as a tool to convince people to buy Nvidia-only products, like they did back when SLI only worked on Nvidia motherboards.

If PhysX worked with ATI, and it was supported in more games, I would buy a single-slot GeForce card to use in my third slot. But I suppose Nvidia don't want my money.

Hence why I wanted to buy used. Not keen on nVIDIA's business practices at the moment. They can have my money when they get a grip again.

Edit: I'm not really that principled. If they release a card that's a decent improvement on my main HD4850 at a price I fancy paying, I'll consider it.

Last I heard, SLI still only works on SLI motherboards.
 
Officially SLI only works on nForce and X58/P55 boards... unofficially there are "patches" to enable it on most P45 and P35 boards.
 
Back when SLI was released, I upgraded from a 9800 Pro to two 6800s and a shiny SLI board. The mobo died while Windows was installing. The replacement died after another 6-8 months. I haven't bought another Nvidia product since; I went with ATI on AMD or Intel chipsets, which only supported CrossFire, and never had any more motherboard failures.

Now Intel boards support both SLI and CrossFire (P55 + X58, if you didn't know), but that's after the ATI HD 4000 and 5000 series are already out. Back when the shiny 9600 and 8800 GTs were around, I had P35/X48 so could only use CrossFire, and ATI have had great <£150 cards to use in pairs since the HD 3850. I liked my 3850s, 4850s, and now 5770s. I hated my 4870s, because my PSU didn't have enough cables and they were too power hungry. Now that I have a new PSU I can use two cards with 2 power inputs, but stuff like 5850s would be way over my budget unless I'm just buying one, and the two 5770s are much better in most games (I just eBay my old stuff and spend it on new, and <£150 cards don't cost too much as they don't lose as much value :))

If I could have changed anything, I wouldn't have bought any 4800 cards and instead gotten the brilliant 4770s =D. Now I just plan on selling and hopping up to the next *770 cards each time. But my third slot is empty and waiting for a single-slot PhysX card, should Nvidia allow it to work. I was planning on getting a nice low-profile 9800 GT a while ago, until I found out that it wouldn't work.

Here's one, nicely priced and a Green Edition:

http://www.overclockers.co.uk/showproduct.php?prodid=GX-109-GW&utm_source=froogle

I'll buy it if Nvidia allow PhysX to work with ATI. Or, on closer inspection, it has a double I/O panel, which ruins it, so I'd need the Zotac.

Or this Club3D one looks single-slot:

http://www.overclockers.co.uk/showproduct.php?prodid=GX-021-C3&groupid=701&catid=56&subcat=1009
 