Dedicated PhysX card

Associate
Joined: 28 Jan 2009
Posts: 400
Location: Gloucestershire
I know there are benefits to having a separate graphics card dedicated to PhysX processing... I have a few questions though.

First and foremost, does a board supporting Crossfire allow two nVidia cards, with one dedicated to PhysX?

Second... what would the performance hit (if any) be like on my board? I have 2 full-length PCIe slots. If I use just one card, it's at x16 bandwidth. If I have 2 cards in there, then they both run at x8. Do nVidia cards even work in an x8 slot? My main card is a GTX260, core 216. My secondary card, if I use one, will be an 8800GT.

Would I just be better off sticking with my single GTX260 with PhysX enabled?

Cheers.
 
It would probably have been quicker for you to just stick the extra card in and see if it works than to think up the questions :p
 
I'm guessing that since your board is Crossfire, it won't work. However, you can get PCI PhysX cards for around £30 now.
 
Yes, I believe that (assuming you can run at least 2 PCIe slots at x8 each) you can use an nVidia card for PhysX on a non-SLI board.
 
Thanks Amnesia, seems having a few thousand posts doesn't guarantee you're a helpful person...

Thanks everyone else though.

Has anyone tried this setup on a similar board? I'm asking the question rather than just doing it because there is quite a bit of effort involved in finding the old graphics card and then sticking it in my PC. Yes, I'll admit I'm a little lazy, but I don't want to risk corrupting drivers and things if it's not supported.
 
I'm guessing that since your board is Crossfire, it won't work. However, you can get PCI PhysX cards for around £30 now.

Except nVidia has dropped support for the old PhysX cards - newer games and drivers just won't use it.
 
I think up to 4890-level cards an x8 lane will be fine for a graphics card; any higher and I think it would bottleneck?

It depends on your chipset whether it supports Crossfire or SLI.
 

I can't see a reason why it wouldn't work... it's not like you'll be trying to Crossfire/SLI them.

The performance hit should be very small... possibly a couple of fps from the drop from x16 to x8.

Try the two-card solution to see if you can get it working... post back and let us know if it works.

BTW, don't try just the latest drivers as nVidia doesn't always play nice... try some earlier versions if you're not having any luck.
 
If you're using an ATI card and for some reason want a GeForce card to do PhysX, then you may as well not bother, as you're only flushing your money down the drain. nVidia have dropped PhysX support for GeForce cards being used in conjunction with a non-GeForce video card as of driver revision 186.

Link.
 
Unless you stick with old drivers, which AFAIK shouldn't be an issue. Can anyone clarify?

Yes, the old drivers still work. The only problem is that the version that dropped the support was the one that saw massive performance gains in PhysX, just when they had finally started to get it right.
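To make the driver cutoff discussed above concrete, here's a small sketch of the rule as described in this thread. The function name and structure are my own illustration, not anything from nVidia's drivers; the 186 revision number is the one quoted earlier in the thread.

```python
HYBRID_PHYSX_CUTOFF = 186  # first driver revision said to block mixed-vendor PhysX


def hybrid_physx_allowed(driver_version: int, display_vendor: str) -> bool:
    """Can a secondary GeForce run PhysX alongside this display adapter?

    With an nVidia display card, any driver version works. With a
    non-nVidia (e.g. ATI) display card, only drivers before revision
    186 allow the GeForce to act as a PhysX device.
    """
    if display_vendor.lower() == "nvidia":
        return True
    return driver_version < HYBRID_PHYSX_CUTOFF


print(hybrid_physx_allowed(185, "ATI"))     # True  (old driver still works)
print(hybrid_physx_allowed(186, "ATI"))     # False (support dropped)
print(hybrid_physx_allowed(190, "nvidia"))  # True  (two nVidia cards are fine)
```

So for the setup asked about here (GTX260 + 8800GT, both nVidia), the cutoff doesn't apply; it only bites for ATI-display + GeForce-PhysX combinations.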
 
SLI support is not required to run 2 Nvidia GPUs where one is a display adapter and the other is a PhysX device.

As far as bandwidth is concerned, your board will provide each slot with 8 lanes of PCIe2, which will be enough. (8 lanes of PCIe1 would not be). The difference in performance between 8 lanes of PCIe2 and 16 lanes of PCIe2 on current high end single GPU cards is less than 2% as far as frame rates are concerned.
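As a rough illustration of the bandwidth point above, the arithmetic works out like this (the per-lane figures are the nominal PCIe rates; real-world throughput is a little lower, and the helper function is just my own sketch):

```python
# Nominal one-direction bandwidth per lane, in MB/s.
# PCIe 1.x: 2.5 GT/s with 8b/10b encoding -> ~250 MB/s per lane.
# PCIe 2.0: 5.0 GT/s with 8b/10b encoding -> ~500 MB/s per lane.
PER_LANE_MB_S = {"PCIe1": 250, "PCIe2": 500}


def slot_bandwidth(gen: str, lanes: int) -> int:
    """Total nominal one-direction bandwidth of a slot, in MB/s."""
    return PER_LANE_MB_S[gen] * lanes


print(slot_bandwidth("PCIe2", 8))   # 4000 MB/s
print(slot_bandwidth("PCIe1", 16))  # 4000 MB/s
print(slot_bandwidth("PCIe2", 16))  # 8000 MB/s
```

In other words, 8 lanes of PCIe2 give the same nominal bandwidth as a full x16 PCIe1 slot, which is why dropping both slots to x8 on a PCIe2 board costs so little in practice.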

What isn't clear is just how much GPU power is needed for any current PhysX implementation in any game (My suspicion is that the answer is not much, and your GTX260 won't suffer much running it at the same time as being the primary display adapter.) However I haven't seen anyone try any decent PhysX requirements testing.
 
If you're using an ATI card and for some reason want a GeForce card to do PhysX, then you may as well not bother, as you're only flushing your money down the drain. nVidia have dropped PhysX support for GeForce cards being used in conjunction with a non-GeForce video card as of driver revision 186.

Link.

He is using 2 nVidia cards.


What isn't clear is just how much GPU power is needed for any current PhysX implementation in any game (My suspicion is that the answer is not much, and your GTX260 won't suffer much running it at the same time as being the primary display adapter.) However I haven't seen anyone try any decent PhysX requirements testing.

Don't be too sure... not that I've tested it myself, but I recall someone saying enabling PhysX in Batman crippled the framerates, so a secondary card could help... how much, I have no idea, but I'm guessing a graphics card dedicated to that one job should do it damn well.
 
Wow, I didn't expect to have any more responses to this. Thanks for the explanations and things.

Amnesia: Apologies for the snip, just seemed a bit like you were telling me to "go away" though in different words...

At the moment, my 260 runs the games I have perfectly well, so the 1-2% drop from going to 8 lanes doesn't seem too bad. I think my most demanding game at the moment is Fallout 3. Maybe a PhysX card would increase performance... maybe not. Since there's not much knowledge of this exact setup, I may save my PCIe slots by not putting in a graphics card to block them... I'll keep it as a possibility for future releases though, especially if I get my hands on a PhysX-intensive game.

Cheers for the input!


I wonder if Portal utilises lots of PhysX...
 