Borderlands 2 PhysX can be forced to run on the CPU

It kinda is taken away. A lot of basic, bog-standard effects are removed (in Borderlands 2, for example) if you turn PhysX off, effects that have been in games for years, presumably to exaggerate the difference between PhysX on and PhysX off.

For example, the physics in Borderlands 2 with PhysX off are a lot more basic than the physics in Borderlands, which didn't have any hardware PhysX capabilities.

You can run PhysX on low, which runs it on the CPU as standard and gives you the basics you've come to expect in any modern game, with a very minimal performance hit. Medium/high settings are added extras for people who have hardware that supports it.

Saying things have been taken away is like saying something has been taken away when only the high-end cards get soft shadows & multiple dynamic lights. They add to the game, so why can't we have them on low settings? Simple answer: it's not supported by a lower-end graphics card. It's the same for PhysX, except instead of a lower-end graphics card it's a certain make. You have the choice when buying hardware if you want certain extras, just like any other in-game setting that may make you choose one card over another...
 
Guys/gals, I suggest you go and play Borderlands with PhysX on high and then PhysX off. You will see a massive difference, I kid you not. Anybody who says different is just in denial and must be blind. Seriously, PhysX in this game makes a big difference.
 
So it's not a checkbox feature, and yet I bet it is listed on every AMD graphics card box :rolleyes:

The point in saying "checkbox feature" is to show that it's just that, something to put on the box to get people excited. You know, kinda like how seriously low end graphics cards get stuffed with loads of slow RAM just so they can put it on the box.


You can run PhysX on low, which runs it on the CPU as standard and gives you the basics you've come to expect in any modern game, with a very minimal performance hit. Medium/high settings are added extras for people who have hardware that supports it.

I know you can, but it's arguably more of a performance hit than should really be required, considering other games manage those basics without hardware acceleration.

The whole issue here is the performance hit for physics effects that shouldn't require that sort of performance. There are multiple ways you can look at it, but it comes down to something being compromised for no real benefit to the end user.

Saying things have been taken away is like saying something has been taken away when only the high-end cards get soft shadows & multiple dynamic lights. They add to the game, so why can't we have them on low settings? Simple answer: it's not supported by a lower-end graphics card. It's the same for PhysX, except instead of a lower-end graphics card it's a certain make. You have the choice when buying hardware if you want certain extras, just like any other in-game setting that may make you choose one card over another...

It's not really comparable though, because the way they've got PhysX set up now is to make competing products look like they offer poor performance. That's the whole idea behind the approach nVidia are using, along with how they've hobbled CPU performance too.
 
Guys/gals, I suggest you go and play Borderlands with PhysX on high and then PhysX off. You will see a massive difference, I kid you not. Anybody who says different is just in denial and must be blind. Seriously, PhysX in this game makes a big difference.

No one's saying it doesn't make a difference, so how about you read the thread first?
 
Yes, you probably were, and those games you were playing obviously had multi-monitor support coded into them, but what is it about the fact you were playing on multiple monitors before Eyefinity was announced that makes my previous post completely untrue?

I was playing games on multi-monitor waaay back in, what, 2001 or so on nVidia, when they had spanning enabled in the GeForce drivers, which, being nVidia, they removed and made a "professional" feature a couple of years later, only available with Quadro drivers. No games supported multi-monitor then; it just worked with most games, other than the ones that hard-locked to a resolution or had very low limits on their max resolution.

Makes me laugh when people talk about nVidia Surround as being a less mature/developed technology than Eyefinity, when in fact it's a 7+ year older technology, and it was only Eyefinity that forced nVidia to bring it back to GeForce :S
 
I was playing games on multi-monitor waaay back in, what, 2001 or so on nVidia, when they had spanning enabled in the GeForce drivers, which, being nVidia, they removed and made a "professional" feature a couple of years later, only available with Quadro drivers. No games supported multi-monitor then; it just worked with most games, other than the ones that hard-locked to a resolution or had very low limits on their max resolution.

Makes me laugh when people talk about nVidia Surround as being a less mature/developed technology than Eyefinity, when in fact it's a 7+ year older technology, and it was only Eyefinity that forced nVidia to bring it back to GeForce :S

OK, sorry to sound stupid, but if no games supported multiple monitors, how were you playing games on multiple monitors back in 2001?

Also, the whole point of my earlier posts wasn't to say that any of these features were worse or better than the other side's counterparts, but literally to show that once both sides have a similar thing it becomes a non-issue.
 
I know you can, but it's arguably more of a performance hit than should really be required, considering other games manage those basics without hardware acceleration.

No they don't. The video of Battlefield 3 showed paper being shot and pieces of paper falling through the desk, then vanishing. Most physics in modern games is like this: it's all just graphical effects with no physics calculations at all. Off the top of my head I can't think of a single game that has the amount of physics Borderlands 2 uses on its low setting. These are real physics effects, not just graphical effects designed to look like physics. With the low setting on Borderlands 2 you get a very small performance hit (2-3 fps depending on CPU), but you also get some real physics calculations.
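
For anyone wondering what the difference between "graphical effects designed to look like physics" and real physics calculations looks like in code terms, here's a rough sketch (my own toy C++ example, not anything taken from Borderlands 2 or Battlefield 3). The canned version just plays back a scripted fall and lets the paper sink through the desk; the simulated version integrates gravity and treats the desk as an actual collision constraint, which is the per-object work that costs frame time:

```cpp
// Toy illustration (not from any real game): a "canned" paper effect
// versus a simulated one.
#include <cstdio>

struct Paper { float y, vy; };

// Cosmetic effect: scripted fall, no collision test, so the sheet
// happily sinks through the desk and gets deleted after a fixed time.
void cannedUpdate(Paper& p, float dt) {
    p.y -= 1.0f * dt;               // fixed, pre-authored fall speed
}

// Simulated effect: gravity is integrated and the desk surface is a
// real collision constraint, so the sheet comes to rest on it.
void simulatedUpdate(Paper& p, float dt, float deskHeight) {
    const float g = 9.81f;
    p.vy -= g * dt;                 // integrate acceleration
    p.y  += p.vy * dt;              // integrate velocity
    if (p.y < deskHeight) {         // resolve collision with the desk
        p.y  = deskHeight;
        p.vy = -p.vy * 0.2f;        // lose most of the energy on impact
    }
}

int main() {
    Paper canned{1.0f, 0.0f}, simmed{1.0f, 0.0f};
    for (int i = 0; i < 120; ++i) { // two seconds at 60 fps
        cannedUpdate(canned, 1.0f / 60.0f);
        simulatedUpdate(simmed, 1.0f / 60.0f, 0.75f);
    }
    std::printf("canned y=%.2f (through the desk), simulated y=%.2f (resting on it)\n",
                canned.y, simmed.y);
}
```

Scale the second function up to thousands of debris chunks and cloth vertices and it's easy to see where the frame time goes when all of it runs on the CPU.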

It's not really comparable though, because the way they've got PhysX set up now is to make competing products look like they offer poor performance. That's the whole idea behind the approach nVidia are using, along with how they've hobbled CPU performance too.

No, they have PhysX set up as an Nvidia feature. Being an Nvidia feature means of course it won't run well on other hardware. Complaining about that is the same as complaining that AMD don't offer Hyper-Threading when Intel do, then saying people shouldn't make programs that take advantage of it because the programs won't run as fast on CPUs without it.
 
OK, sorry to sound stupid, but if no games supported multiple monitors, how were you playing games on multiple monitors back in 2001?

Also, the whole point of my earlier posts wasn't to say that any of these features were worse or better than the other side's counterparts, but literally to show that once both sides have a similar thing it becomes a non-issue.

Games don't have to specifically support multi-monitor to work with multiple monitors; they just have to support a high enough resolution (spanning worked by making the game think it was rendering to one very large monitor instead of several). Obviously, if games are designed with multi-monitor in mind it works a bit more optimally, especially with things like bezel correction.
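
As a rough illustration of why spanning "just worked" (my own sketch, not driver code): the driver presents one wide virtual display, so all the game has to do is accept the big resolution, and the only thing it really needs to get right is widening the horizontal FOV instead of stretching the image, which is what proper multi-monitor support and bezel correction refine:

```cpp
// Rough illustration of driver spanning: the game is handed one wide
// virtual display, and a Hor+ style FOV adjustment keeps the extra
// width as extra view instead of a stretched image.
#include <cmath>
#include <cstdio>

int main() {
    const int monitors = 3, width = 1280, height = 1024; // e.g. 3 x 1280x1024
    const int spannedW = monitors * width;                // game sees 3840x1024

    // Keep the same vertical FOV and derive the horizontal FOV from the
    // new aspect ratio.
    const double pi      = 3.141592653589793;
    const double vFovDeg = 60.0;
    const double aspect  = double(spannedW) / double(height);
    const double hFovRad = 2.0 * std::atan(std::tan(vFovDeg * pi / 360.0) * aspect);

    std::printf("game is told the display is %dx%d, horizontal FOV ~%.0f degrees\n",
                spannedW, height, hFovRad * 180.0 / pi);
}
```

Games that hard-locked their resolution or clamped the FOV are exactly the ones that broke, which fits what was said above.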
 
No one's saying it doesn't make a difference, so how about you read the thread first?

CBA because most of it is waffle.
And PhysX on the CPU is rubbish compared to the GPU. If I run it on my CPU my fps is halved (3960K), even with a 690; it's terrible on the CPU. As soon as I put it on the GPU, fps shoots back up to a constant 60.
 
No, they have PhysX set up as an Nvidia feature. Being an Nvidia feature means of course it won't run well on other hardware. Complaining about that is the same as complaining that AMD don't offer Hyper-Threading when Intel do, then saying people shouldn't make programs that take advantage of it because the programs won't run as fast on CPUs without it.
Well said ;)
 
ATI hardware is more than capable of running physics calculations. The fact that it doesn't is down to the implementation of PhysX. Spoffle is of course correct about that.
No, they have PhysX set up as an Nvidia feature. Being an Nvidia feature means of course it won't run well on other hardware.
Same thing, different viewpoint. However, being an Nvidia "feature" does not mean it should run any slower unless it's intentionally hobbled.
PFS said:
CBA because most of it is waffle.
I would say you have contributed greatly to that steaming pile of waffle.
 
I wish AMD would give nVidia some cash so that both cards could use PhysX, and we might see more developers putting PhysX into their titles, so we can all benefit from better effects in games. Shame AMD are so skint and nVidia are the Nazis of the tech world (alongside Apple).
 
I wish AMD would give nVidia some cash so that both cards could use PhysX, and we might see more developers putting PhysX into their titles, so we can all benefit from better effects in games. Shame AMD are so skint and nVidia are the Nazis of the tech world (alongside Apple).

Why do AMD have to give nVidia cash? Intel own Havok and don't stop it running on an AMD CPU.
 
They don't need to pay Nvidia; they just need a comparable API/SDK, then games developers will code for both and it will become a non-issue, just like SLI/CrossFire, Eyefinity/Surround and even 3D Vision/HD3D.
 
Yeah, I cannot for the life of me understand why Nvidia will not give away something they paid for. What is the world coming to when a company starts behaving like that ;)
 
They don't need to pay Nvidia; they just need a comparable API/SDK, then games developers will code for both and it will become a non-issue, just like SLI/CrossFire, Eyefinity/Surround and even 3D Vision/HD3D.

It's a lot more complicated than that - unless the APIs exactly mimic each other's behavior, it would be a nightmare trying to keep things working as intended in-game.

It would be much better if someone implemented a hardware physics API on DirectCompute, but that's a lot of work.
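
To give an idea of what that would involve (purely my own illustration, not an existing API): the reason DirectCompute is the obvious home for it is that this sort of physics is massively parallel, with every particle or debris chunk updated independently. The sketch below is plain C++ standing in for a compute dispatch; in a DirectCompute/OpenCL version the loop body would become the kernel, one GPU thread per particle, and arguably most of the "lot of work" sits in the surrounding machinery (buffer management, synchronisation, keeping results consistent across vendors) rather than in the kernel itself:

```cpp
// Sketch of the data-parallel shape of GPU physics.  Plain C++ for
// illustration only: the loop body is what would become the compute
// kernel, the loop itself the dispatch, one GPU thread per particle.
#include <vector>
#include <cstdio>

struct Particle { float x, y, vx, vy; };

void stepAll(std::vector<Particle>& ps, float dt) {
    const float g = 9.81f;
    for (auto& p : ps) {            // each iteration is independent
        p.vy -= g * dt;
        p.x  += p.vx * dt;
        p.y  += p.vy * dt;
        if (p.y < 0.0f) {           // bounce off the floor
            p.y  = 0.0f;
            p.vy = -p.vy * 0.5f;
        }
    }
}

int main() {
    std::vector<Particle> debris(10000, Particle{0.0f, 5.0f, 1.0f, 0.0f});
    for (int frame = 0; frame < 60; ++frame)
        stepAll(debris, 1.0f / 60.0f);
    std::printf("first particle now at (%.2f, %.2f)\n", debris[0].x, debris[0].y);
}
```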
 
It's a lot more complicated than that - unless the APIs exactly mimic each other's behavior, it would be a nightmare trying to keep things working as intended in-game.

It would be much better if someone implemented a hardware physics API on DirectCompute, but that's a lot of work.

Yes, of course it's not actually that simple, but just as CrossFire/SLI has to be coded for and Eyefinity/Surround has to be coded for, if AMD had a working physics implementation then games would be coded for both and the whole thing wouldn't be an issue.

Or are you trying to say that if you code a game to use Eyefinity then it will automatically work with Surround? I don't think so somehow. There are differences, they may be slight, but that is why these companies have their own proprietary tech. It is only when both companies are using similar technology that these things are used by everyone, and then it becomes irrelevant which make of card you have.

Just imagine if some games were coded so they only worked with Eyefinity or CrossFire, while other games only worked with Surround/SLI.
 
To be fair, Bru is correct.
Companies could code game engines to support both PhysX and an API designed by AMD; then both sides would get it. As usual, though, just like a proper driver-level 3D implementation, AMD won't bother.
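
To put the "support both" idea in concrete terms, here's a hypothetical sketch of how an engine could hide the vendor SDKs behind one interface and pick a backend at startup, the same way engines already branch for SLI/CrossFire or Eyefinity/Surround. All the class and function names here are made up for illustration; neither vendor ships an API that looks like this:

```cpp
// Hypothetical engine-side abstraction: one physics interface, with the
// vendor-specific SDK hidden behind it and chosen at startup.
#include <memory>
#include <cstdio>

struct PhysicsBackend {
    virtual ~PhysicsBackend() = default;
    virtual void simulate(float dt) = 0;
};

// Would wrap NVIDIA's PhysX SDK on GeForce hardware.
struct PhysXBackend : PhysicsBackend {
    void simulate(float dt) override { std::printf("PhysX step %.3f\n", dt); }
};

// Would wrap a comparable AMD/vendor-neutral SDK, or a CPU fallback.
struct OtherBackend : PhysicsBackend {
    void simulate(float dt) override { std::printf("fallback step %.3f\n", dt); }
};

std::unique_ptr<PhysicsBackend> makeBackend(bool nvidiaGpuPresent) {
    if (nvidiaGpuPresent) return std::make_unique<PhysXBackend>();
    return std::make_unique<OtherBackend>();
}

int main() {
    auto physics = makeBackend(/*nvidiaGpuPresent=*/false);
    physics->simulate(1.0f / 60.0f);   // the game-side code is identical either way
}
```

The game calls the same simulate() whichever card is in the machine, which is why it only becomes a non-issue once both sides actually have something comparable to plug in.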
 
Nvidia paid $150m to buy Ageia, and in the 4 years since then we have seen only a few titles where PhysX can be considered worthwhile. I would argue it's been pretty bad value for money up to this point, especially considering the other costs for them in that time.

How much would it cost for AMD to make their own equivalent, and how many years would it take? Then they would have to convince developers to start coding for it as well.

It's not gonna happen.
 