• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Noobmatron - AMD/PhysX?

The events that followed and the conjectures you make from them are not the question here. Back in 2008, NVIDIA was ready to open PhysX up to AMD, and AMD gave them the finger.
 
The events that followed and the conjectures you make from them are not the question here. Back in 2008, NVIDIA was ready to open PhysX up to AMD, and AMD gave them the finger.

As it stands right now, it's Nvidia's PhysX that gets disabled when an AMD GPU is detected alongside the Nvidia card, and that's what counts. I'm not even saying AMD is some saintly company, as their BD charade was pitiful.
 
Which is exactly why AMD doesn't want anything to do with PhysX. Nvidia would still have ultimate control and could make it run at its best only on Nvidia hardware.
 
Is raven not allowed in the gfx section at all?

Or does he just choose not to post his (whatever's not in my system at the time is garbage) comments here?
 
Except a bit of googling would've shown you that this is old news that has also been talked about on these forums before.

http://www.bit-tech.net/custompc/news/602205/nvidia-offers-physx-support-to-amd--ati.html

AMD just likes to pretend to be altruistic, with their holier-than-thou attitude. There were other reports that followed this one up saying AMD was not interested in PhysX. They also complained a lot about PhysX and continue to do so. The funny thing is that AMD was thinking of buying Ageia before NVIDIA did, and now suddenly the grapes are sour and nobody should want them.

AMD has also been going on about releasing hardware physics for a while now, but has nothing to show for it thus far. Same thing with their 3D. They pay lip service to the glory of open technologies and yet fail to understand the key philosophy behind openness: keeping progress alive rather than stifling it.
NVIDIA is actually making progress with its tech, closed or not, while AMD keeps dithering.
Take CUDA, 3D and PhysX as examples. I use two of those, and my work has depended heavily on CUDA. If I had waited on AMD's vague promises, I'd have lost years to AMD's so-called open approach, which only serves to hold things back rather than serve the true purpose of open technology.

Thing is, for AMD to actually use it would require either

a) AMD including CUDA support
or
b) PhysX being ported to OpenCL

Frankly, I'd imagine b) is the only real option, and I don't imagine it's in AMD's interest to actually make that investment when it is investing in other technologies, as you say. It's just not cost-effective for them. Similarly, it's just not in Nvidia's interest to port it when it runs fine on CUDA.

Now, on AMD's own investment: yes, there isn't a true competitor in precisely the same space, but there is actual progress in the open standards, and it's kind of short-sighted to simply write AMD off because they haven't yet produced a direct competitor.


Anyway, the discussion in this thread isn't about implementation, more about Nvidia's decision to actively block people from using their own products when AMD products are present :)


EDIT: Just noticed the rig in your sig, blooming heck! :eek:
 
The main problem, again, is that I've yet to see any good PhysX effects in a game. It's usually a case of: take the three most bog-standard effects seen in most games, recode them to run like crap on a modern CPU but fast enough on a modern GPU, then re-add the effects to the game........... profit.

Mafia 2's effects are flat-out crap, as are most of the effects in most games around, but Mafia 2 made a mockery of it. Take the PhysX-only glass-shattering effect: the glass disappears, then reappears a split second later in completely identical explosions every time. In other words, it's a pre-done effect that isn't even timed accurately and is in NO WAY a real-time, ultra-realistic effect. Games had done this for a decade previously, yet Nvidia claims you should have your framerate reduced to see it, should spend more on Nvidia cards to see it, and that its ultra-realism is the reason to have PhysX.
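For what it's worth, the "identical every time" behaviour is exactly what you'd expect from a canned effect: if the fragment velocities come out of a random generator with a hard-coded seed, every playback is bit-for-bit the same. A toy sketch in C (purely illustrative, nothing to do with Mafia 2's actual code):

```c
#include <stddef.h>

/* Tiny linear congruential PRNG, so the sketch is self-contained. */
static unsigned lcg(unsigned *state)
{
    *state = *state * 1664525u + 1013904223u;
    return *state;
}

/* "Pre-done" shatter effect: fragment velocities come from a PRNG
 * seeded with a hard-coded constant, so every playback of the
 * explosion is identical -- no real-time simulation involved. */
void spawn_shatter(float out_vel[][2], size_t n_fragments)
{
    unsigned seed = 42;  /* fixed seed = canned effect */
    for (size_t i = 0; i < n_fragments; i++) {
        out_vel[i][0] = ((int)(lcg(&seed) % 2001u) - 1000) / 1000.0f;
        out_vel[i][1] = ((int)(lcg(&seed) % 2001u) - 1000) / 1000.0f;
    }
}
```

Call it twice and you get the exact same debris pattern both times; a real-time simulation would instead depend on where and how hard the glass was hit.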

It's a shame, a con and a complete joke, and the more people buy into it, the more games you'll get with STANDARD effects removed and features like AA missing.

If Nvidia does it enough, with enough games and enough features, I can see AMD going that route as well, and then it will get nasty.

The laughable thing is Nvidia harming NVIDIA users' performance just to get at AMD.

Tessellation in Crysis 2 you literally can't see. It's a complete joke: it hurts performance for AMD and Nvidia alike and offers no IQ increase. Completely nuts.
 
The main problem, again, is that I've yet to see any good PhysX effects in a game. It's usually a case of: take the three most bog-standard effects seen in most games, recode them to run like crap on a modern CPU but fast enough on a modern GPU, then re-add the effects to the game........... profit.

Mafia 2's effects are flat-out crap, as are most of the effects in most games around, but Mafia 2 made a mockery of it. Take the PhysX-only glass-shattering effect: the glass disappears, then reappears a split second later in completely identical explosions every time. In other words, it's a pre-done effect that isn't even timed accurately and is in NO WAY a real-time, ultra-realistic effect. Games had done this for a decade previously, yet Nvidia claims you should have your framerate reduced to see it, should spend more on Nvidia cards to see it, and that its ultra-realism is the reason to have PhysX...

I understand that Nvidia devs work with dev teams on implementing Nvidia tech (hence TWIMTBP), but surely at the end of the day it's still down to the devs themselves to implement the effects in the "ultra-realistic" way?

Perhaps this is a reflection of a feature which quite literally splits the market down the middle. If 100% of your users will be able to enjoy an added feature, then it's probably in your interest to implement it and implement it well, whereas if only 60-70% will (if that; I'm making up numbers here), then in terms of man-hours invested it's just not worth it in a restrictive development cycle, with a deadline looming, when there are far broader and (to be frank) more important things to work on.
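That trade-off is really just arithmetic. A back-of-envelope sketch, with every number invented for illustration: a feature is worth building roughly when (users reached) x (value per reached user) exceeds the development cost.

```c
#include <stdbool.h>

/* Back-of-envelope check: is a feature worth the man-hours?
 * All inputs are made-up illustration values, not real data. */
bool feature_worth_it(double user_base, double reach_fraction,
                      double value_per_user, double dev_cost)
{
    return user_base * reach_fraction * value_per_user > dev_cost;
}
```

With, say, 100,000 buyers, one pound of value per user who sees the feature, and a 40,000-pound implementation cost, a feature all users can enjoy clears the bar (100,000 > 40,000) while one only 30% can enjoy does not (30,000 < 40,000).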
 
when there are far broader and (to be frank) more important things to work on.

Like a working DX11 mode in BAA!

I'm all for PhysX, I'd rather have it than not, and I liked the BAC effects, but when it cripples performance, as dm said, it starts to become a bother.

I can also remember Nvidia stating that 2x 260s (one as a dedicated PhysX card) were a minimum requirement for Mafia 2. What a joke; it ran fine for me with a 9800GT dedicated to PhysX!
 
Like a working DX11 mode in BAA!...

I hear that! :D

Although it seems as though it was a last-minute addition to add value to the PC version. Nice idea, it just seems to have taken longer than they anticipated. They could've happily released it with just DX9, though, so I appreciate the extra effort :)

Anyway off topic!

@OP: Just get whichever gives the best performance per pound at the time you buy. PhysX shouldn't be a big deal in deciding which one you want :)
 
I've got a GTX260 on PhysX duties in my machine, alongside a 6850.

Most of the time, it sits there unused. I only did it 1) because I could, and 2) because I had the GTX260 not doing anything else.

I certainly wouldn't be looking to buy any hardware specifically for it!
 
xsistor is MR NVIDIA.

Yes, Nvidia really wanted AMD users with a dedicated Nvidia GPU to use PhysX; they even created the infamous "time bomb" in the driver:

'Versions 186 and newer of the ForceWare drivers disable PhysX hardware acceleration if a GPU from a different manufacturer, such as AMD, is present in the system.[14] Representatives at Nvidia stated to customers that the decision was made due to development expenses, and for quality assurance and business reasons.[15] This decision has caused a backlash from the community that led to the creation of a community patch for Windows 7, circumventing the GPU check in Nvidia's updated drivers. To counter this patch, Nvidia implemented a time bomb in driver versions 196 and 197 that slowed down hardware accelerated PhysX and reversed the gravity,[16] but an updated version of the patch removed all unwanted effects.'

http://en.wikipedia.org/wiki/PhysX
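The GPU check that the community patch worked around presumably amounts to something like the sketch below. The PCI vendor IDs are real (0x10DE is NVIDIA, 0x1002 is AMD/ATI), but the function and its logic are hypothetical, for illustration only, and are in no way Nvidia's actual driver code:

```c
#include <stdbool.h>
#include <stddef.h>

#define VENDOR_NVIDIA 0x10DEu  /* PCI vendor ID: NVIDIA */
#define VENDOR_AMD    0x1002u  /* PCI vendor ID: AMD/ATI */

/* Hypothetical sketch of the driver's check: allow GPU-accelerated
 * PhysX only if every display adapter in the system reports the
 * NVIDIA vendor ID. A single foreign (e.g. AMD) GPU disables it. */
bool physx_hw_allowed(const unsigned *vendor_ids, size_t n)
{
    for (size_t i = 0; i < n; i++)
        if (vendor_ids[i] != VENDOR_NVIDIA)
            return false;  /* non-NVIDIA GPU present: PhysX off */
    return n > 0;          /* need at least one NVIDIA GPU */
}
```

Under that logic a lone GTX card passes, but the very setup being discussed in this thread (an Nvidia card dedicated to PhysX next to an AMD card) fails the check.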

Wearing those glasses is having a detrimental effect on your mental health, mate! :p

raven mkII

If anything, you guys are the butthurt AMD fanbois. I stated what has been reported and backed it up. What conclusions you draw from that and later developments is conjecture. The fact remains that AMD has been rabid in its attacks on NVIDIA and PhysX, and hypocritically so to anyone who knows the history.
 
Total nonsense, mate. The Nvidia PR man's not telling porkies; I have it on good authority that they edited out:
'it's the gospel truth, honest, cough, cough!'

If you believe that, then you'll believe anything xsistor (Nvidia's new super-duper forum infiltrator) tells you. :D
 
PhysX is just Nvidia's way of trying to monopolise the graphics card market.
Everything it does can be done on the CPU; there is no reason whatsoever why their slow crap can't run on modern CPUs. I have seen better smoke effects in other games that don't require PhysX hardware to run.

That's my take on the subject. The reason Nvidia will not allow AMD hardware to run their crap is that they want people to buy their marketing crap instead.
 
Total nonsense, mate. The Nvidia PR man's not telling porkies; I have it on good authority that they edited out:
'it's the gospel truth, honest, cough, cough!'

If you believe that, then you'll believe anything xsistor (Nvidia's new super-duper forum infiltrator) tells you. :D


Of course saying I can't see the forest for the trees through my green tinted glasses in vibrant 3D is a cogent and objective argument overflowing with unassailable rhetoric and extraordinary logical insight. I tremble in its wake.
 
If anything, you guys are the butthurt AMD fanbois. I stated what has been reported and backed it up. What conclusions you draw from that and later developments is conjecture. The fact remains that AMD has been rabid in its attacks on NVIDIA and PhysX, and hypocritically so to anyone who knows the history.

But to be fair, talking about AMD's reasoning for not taking Nvidia up on this 'offer' is as much speculation and conjecture as Tommy's doing. I've tried to engage in discussion on this issue, but perhaps you aren't interested in actually talking about it? If so, feel free to continue asserting your opinion.


God, I forgot how much fun the GC subforum is :D
 
But to be fair, talking about AMD's reasoning for not taking Nvidia up on this 'offer' is as much speculation and conjecture as Tommy's doing. I've tried to engage in discussion on this issue, but perhaps you aren't interested in actually talking about it? If so, feel free to continue asserting your opinion.


God, I forgot how much fun the GC subforum is :D

Except I didn't talk about AMD's reasoning for it. I merely stated that NVIDIA offered and AMD declined. Saying that NVIDIA later crippled PhysX on systems with AMD cards so they can't have wanted AMD to use PhysX (contrary to what they said), etc., is all conjecture. If I were conjecturing, I could've said something like: "Nvidia got ****ed off that AMD snubbed them when they extended an olive branch etc etc..."

And no, I'm not really interested in guessing at each company's motives on this and talking about it, because we can never know. But at the same time I am annoyed at speculation that only casts AMD in a positive light, and outraged that those very people would then proceed to call me a fanboi (in not so many words). I only provided the link because some people wanted to see it, and I bow out of it at that point. I don't want to get drawn into a PhysX vs no-PhysX argument, as I don't even use PhysX. Not really into it.

I like NVIDIA because they offer very good products that I use. I have no brand loyalty. If AMD had a better or equal version of 3D Vision and CUDA, I'd have gone AMD, because I think 6950 CrossFire is a very good deal.

But they don't. I used a string of ATI cards from the Rage 128 Pro to the Radeon 9800 Pro back in the day, when ATI was better. But now I stick to NVIDIA because they make what I want. That doesn't mean I'll stick with them forever.
 
Saying that NVIDIA later crippled PhysX on systems with AMD cards so they can't have wanted AMD to use PhysX (contrary to what they said), etc., is all conjecture.

The problem is that it's not conjecture; it's a fact that PhysX is disabled on Nvidia hardware when an AMD card is present!

When you fail to acknowledge that fact and bury your head in the sand rather than accept it to be true, that's when you get grief and end up being labelled a fanboi!
 