
NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

I guess it would be overkill to keep my 280 for PhysX when I upgrade to the 480. If I do, at least I'll know it's not the PhysX holding me back :)
 
This is a move in the right direction, can't believe nVidia locked it out in the first place really.
 
Trying it is all it's good for; you'll soon realize what a waste of time, energy and heat a PhysX card is and subsequently remove it from your system.
 

YEAH BUT OMG!!!111!!! IT'S PHIZZIKS SO WE HAVE TO HAVE IT!!111!!

/joke :D

I wonder how differently games like Mafia 2 will actually play depending on whether you have an NVIDIA or an ATi card...
 
NVIDIA will be reinstating the restriction in the WHQL build of these drivers

Originally Posted by Anandtech
We just got done talking with NVIDIA about the matter and they clarified the issue for us. In what we expect is going to be a disappointment for many of you, the lack of a PhysX restriction on the current 257.15 beta drivers is a bug, not a feature - the restriction should have been in those drivers and it was not. NVIDIA will be reinstating the restriction in the WHQL build of these drivers, and presumably all drivers thereafter.
http://www.anandtech.com/show/3744/...terogeneous-gpu-physx-its-a-bug-not-a-feature
 
Stupid question, but does this restriction only apply to ATI cards, or to any card by any company other than Nvidia? In that case, what about the i3 and i5 CPUs with built-in Intel GPUs?
 
Why would you use integrated graphics with a PhysX card? Even the crappiest card capable of PhysX is miles ahead of the IGP in an i3/i5.
 
Well, you wouldn't, but it's not a case of would you; it's a case of could you. I mean, it could be a matter of not being able to use the feature of an Nvidia card when it's the only graphics card in your system... or are you meant to remove your CPU first? :p
 
Out of interest, how do you get an Nvidia card to work with an ATI card? Do you just plug the PhysX card into another PCI-e slot without any other connectors? Does it need to be connected to the monitor, etc.?

I plug my 5870 into my monitor, the 9600GT sits in a spare PCIe slot connected to nothing, then I just install these drivers.
 
The CUDA option is there under 3D Settings for me, but trying to change the selection just crashes the control panel (and with nVidia cards in the system for rendering it just does CUDA on the first card anyway). Hopefully that's something that's fixed in the WHQL release.

Thanks Rroff, I see it now; changing these makes no difference. I can select my 9600GT as the CUDA card, but I can't enable the funky water effects in JC2.
 
Yeah, I don't think it's implemented in these drivers; even when I force the settings, CUDA is still done on the default GPU.
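For anyone wanting to check which GPU the CUDA runtime actually treats as the default (rather than trusting the control panel setting), a minimal sketch using the standard CUDA runtime API is below. This assumes the CUDA toolkit is installed and is only an illustration; device names and ordering will vary by system, and it doesn't prove what the PhysX layer itself does.

```cuda
// Enumerate CUDA-capable GPUs and report which one the runtime
// currently considers the active (default) device.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable device found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s (compute %d.%d)\n",
                    i, prop.name, prop.major, prop.minor);
    }
    int current = -1;
    cudaGetDevice(&current);  // device 0 unless cudaSetDevice() was called
    std::printf("Default CUDA device: %d\n", current);
    return 0;
}
```

Compile with `nvcc` and run it; if CUDA work keeps landing on device 0 regardless of the control panel selection, that matches the behaviour described above.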
 
This probably just means somebody with brains at Nvidia realised that alienating some 35% of your target market is not a good sales tactic.

I was interested in where the 35% came from exactly?

I'm pretty sure it alienated every ATi/AMD user, and Rroff and others thought it was a very poor idea that didn't reflect well on Nvidia with anyone, really. So they were alienating AMD and Nvidia users alike: all those guys who kept an Nvidia card for PhysX and were boned by Nvidia when it removed support.

Meh, it's the best move Nvidia could make, but it can only really come across as failure: "we're letting it work with an AMD card present again because removing support just got us a shedload of bad press, so we'll stop doing it".

It's not like "it took us tonnes of work, but out of the goodness of our hearts we finally got it working when an AMD card is present; look how caring we are of our clients".

If PhysX were good, this would be great news. Honestly, at this stage the quicker it dies the better: as dozens and dozens of games have shown, overly complex, power-hungry effects that slow a game down pale in comparison to easy-to-implement, widely available, fast effects that work for everyone. Just Cause 2, Crysis, and the list goes on.

At this stage Havok is the standard; ported to OpenCL and accelerated on any Nvidia/AMD card, it would be the best outcome for EVERY gamer. It would work for every game and is already useful enough given its wide-scale use; accelerating it would only let developers increase the amount of things they can add into a game and, inevitably, get slightly more realistic.

I'm all for accuracy and realism when it doesn't kill performance or take away from the fun of a game. Movable objects and interactivity CAN make a game more fun, but I don't require it to be ultra-accurate at 10fps when I can have it work, be fun and not quite as accurate, at 60fps (and so can everyone else).

The real worry I have is that the Ageia guy will push useless technologies for AMD rather than push acceleration of the things we need. Time will tell.
 