
Mirror's Edge PhysX vid

Is it better to use a GTX 260 as a sole card and let it do both, or use it just for graphics and get a cheap nVidia card to do PhysX?

If the second option, what's the lowest-end card that can handle PhysX 100% maxed without frame drops?
 
Just put it on the main card. I tried PhysX on a PPU, an 8600GTS and on my main GPU at the time, an 8800 Ultra. The Ultra was the quickest by miles, and it was rendering the graphics as well. So in short, getting a dedicated card just for the physics isn't worth it now, as most high-end nVidia cards can handle it :)
 

Thanks. :)

Strange how a card dedicated to the job doesn't surpass even an 8800 Ultra doing both... :confused:

I guess it's down to driver infancy; I would certainly expect dedicated cards to perform better with the right software.

Anyone else been experimenting with various Physx hardware?
 
Yeah, I thought it was strange as well. I think you hit the nail on the head with drivers. I did my tests when GPU PhysX first came out, so I have no doubt the drivers will have improved. I'm just waiting for ATI to support PhysX... I think I could be waiting a long time :(
 
Does nVidia not have exclusivity on Physx?

I'm sure if they do then ATi will create an alternative.

Hopefully that wouldn't lead to developers having to choose which one to opt for...
 
No, I don't think they do. I'm sure I read somewhere that they've allowed ATI to see the code etc., so I think we're just waiting for ATI to write some drivers. I'm sure nVidia wouldn't make it easy for them, though.

And yes, having two physics systems would be lame
 
Looks good, but the PhysX effects are just tick-box features as usual.


New technology in that vein is always going to be like that for a few years though. Until the majority of people's PCs (and consoles, if it's multi-format, and most games are) can handle dedicated physics processing, developers aren't going to hang vast amounts of gameplay on effects that only 2% of their user base can use. It took years between the first 3DFX cards and the first acceleration-only games, and in that period 3D acceleration didn't really do anything at all except speed up the frame rate; certainly nothing for the gameplay.
 

It also took a unified API in the form of DirectX to bring acceleration to everybody. At the moment PhysX is Nvidia-only, and until we have an open accelerated physics API it will remain the niche feature it is now.
 
Exactly. Including it in games certainly isn't a bad thing, despite how much people may sniff at it because it 'doesn't do anything for the gameplay'.

Personally, I think anything that makes the world that bit more believable is doing something for the gameplay, but that's just me.
 
If you don't have PhysX, or if they hadn't put it in at all, it would just be like the console versions; if it looks fine like that, then what's the problem with it?
 
The point is, PhysX does nothing at all that any other physics engine can't do; it just does it more accurately and realistically... supposedly. The coding is real physics rather than estimation, that's it; you still have to design realistic effects.

The problem here is that it's impossible, I mean inarguably and literally impossible, to see the difference between realistic and estimated physics if they are both implemented well.

The gust of wind that blows up litter on the roof when the helicopter is close? It's default stuff: estimated or scripted physics can create a completely identical scene at a fraction of the processing cost in software, and you simply could not tell it wasn't real.

We've already seen, I think in one or both GRAWs, the PhysX acceleration adding more particles when explosions happen; however, they were badly designed and frankly didn't look realistic, and it was mostly the same every time.

What's the conclusion? Bad coding = poor-looking physics, no matter whether it's software or hardware, PhysX, Havok or any other engine.

Good coding = real-feeling physics, no matter the engine used.

Excessively accurate physics are pointless, because you can't possibly know where every particle would fall yourself, so the only difference between a well-scripted or estimated explosion and a realistic one is particles blowing off in random but slightly different paths. You simply can't tell the difference, you never will be able to tell the difference, and you'll never need to tell the difference.
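To illustrate the "random but slightly different paths" point: a scripted explosion can just perturb a designer-chosen speed and direction per particle, with no physics solver anywhere. A toy Python sketch (all the numbers and names are made up for illustration, this isn't any game's actual code):

```python
import math
import random

def scripted_explosion(n_particles: int, speed: float, spread: float, seed: int = 0):
    """'Estimated' explosion: each particle gets a randomised direction and a
    speed perturbed around a designer-chosen base value - no solver needed."""
    rng = random.Random(seed)
    particles = []
    for _ in range(n_particles):
        angle = rng.uniform(0.0, 2.0 * math.pi)           # random direction
        v = speed * rng.uniform(1.0 - spread, 1.0 + spread)  # +/- spread
        particles.append((v * math.cos(angle), v * math.sin(angle)))
    return particles

# 100 bits of debris flying off at roughly 12 units/s, varying by up to 30%
debris = scripted_explosion(100, speed=12.0, spread=0.3)
```

Every run with a different seed looks slightly different, which to the eye is exactly what a "real" simulation would give you.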

We already have games where cloth gets shot and rips, and signs fall down. If the cloth falls and ripples, that's enough; it doesn't matter if the ripples are 100% accurate, because you can't know how it would happen anyway. Even PhysX hardware, with the best engine ever written, still has LOTS of limits on it based on the coding of the game it's used in.
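For what it's worth, "falls and ripples convincingly" is cheap to fake in software. A minimal sketch of the Verlet integration trick commonly used for cloth points (hypothetical constants, and deliberately simplified - real cloth adds distance constraints between neighbouring points; this is not how PhysX itself works):

```python
GRAVITY = -9.8   # m/s^2, standard value, made-up for this sketch
DT = 1.0 / 60.0  # one frame at 60 fps

def verlet_step(pos, prev, damping=0.99):
    """Advance each cloth point: its velocity is implied by (pos - prev),
    so no explicit velocity state is stored. Returns (new, current)."""
    new = []
    for (x, y), (px, py) in zip(pos, prev):
        vx, vy = (x - px) * damping, (y - py) * damping
        new.append((x + vx, y + vy + GRAVITY * DT * DT))
    return new, pos  # current positions become the new 'previous'

# a horizontal row of five cloth points, starting at rest at height 2 m
points = [(i * 0.1, 2.0) for i in range(5)]
points, previous = verlet_step(points, points)
```

One step from rest just nudges every point down by GRAVITY * DT² metres; run it in a loop with a few constraint passes and you get a perfectly believable ripple.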

In real life, a gust of wind on the other side of the planet does affect how a leaf blows in your garden, eventually. The fact is the engine will always be limited by what it can take into account; it's simply grasping at straws, taking far too much effort, time and power to do something an estimation can perfectly mimic.

Now yes, the TECH DEMO of cloth rippling on PhysX's site isn't matched by any other engine out there. However, it's also not matched by any game with a PhysX engine, because the tech demo uses an incredibly powerful setup to do that one thing. If you want a game that only has one piece of cloth on screen and that's it, fine; if not, it's useless.

PhysX has always been and will always be pointless; realism of physics, and of anything else in games, is PURELY down to the coding and design of the game. The only reason PhysX hardware runs PhysX games faster is that it uses an overly complex engine for ultra-realism when that's simply not required. The fact that the PhysX games that are out offer nothing in comparison to other games with good physics engines should be proof enough. PhysX has been around for a good few years now and has done nothing at all; there still isn't a single game showing a smattering of useful stuff.


This game demo doesn't look bad, I never said that at all, though it's not my cup of tea. My point was that the "PhysX" parts of it make up 1% of everything you see, and that 1% is completely unimpressive: nothing remotely new, and nothing that affects gameplay, or even the look of the game, at all.


Think of it like this: one game has a body that drops off a roof. The physics engine estimation simply accelerates any falling object in the game at 9.8 m/s², while PhysX lists the body's mass and gravity, calculates the weight, etc., etc., and more accurately comes up with 9.82546 m/s². It is more accurate physics, but all you see is both bodies falling in, say, 4 seconds versus 4.0002345 seconds. Does either change gameplay? No. Does either seem more or less realistic? No. Is there any benefit whatsoever to the accuracy and complexity of the calculations done by the PhysX engine? No. What does it do? It adds time and a power cost, but nothing else. That's the only difference.
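Putting rough numbers on that comparison (the 80 m roof height is made up, the two gravity values are the ones from the post; the exact times will differ from the post's illustrative 4 seconds, but the point is the size of the gap):

```python
import math

def fall_time(height_m: float, g: float) -> float:
    """Time for a body to fall height_m from rest: t = sqrt(2h / g)."""
    return math.sqrt(2.0 * height_m / g)

h = 80.0                            # hypothetical roof height in metres
t_simple = fall_time(h, 9.8)        # the rough 'estimated' acceleration
t_precise = fall_time(h, 9.82546)   # the 'more accurate' PhysX-style value

print(t_simple, t_precise, abs(t_simple - t_precise))
```

The two fall times differ by a few milliseconds over an 80 m drop, which nobody watching could ever notice.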

Well-made estimations are completely indistinguishable from realistic ones. The only thing paramount to realism, immersion and detail in games is design; nothing more, nothing less.
 
Apparently the five-month delay is down to them putting PhysX in.

They're getting pretty hacked off over at Bit-Tech and to be honest I agree with them...

Quite frankly, why not just release it now so everyone can play it, then offer a patch for the Nvidia folk later? Remember Far Cry 1.3 adding HDR for Nvidia GeForce 6 hardware? A five-month delay is just ridiculous, and anyone who wanted to play it will have already done so on a console, especially if they own ATI hardware, because it won't make a blind bit of difference for them. As much as we love funky visuals, cool eye candy and real physics, we love playing games more.

http://www.bit-tech.net/news/2008/11/20/mirrors-edge-pc-delay-down-to-physx/1
 
Lots of sensible stuff

I pretty much agree with all of that. Where PhysX would come into its own is events that simply cannot be scripted well. What springs to mind is multiplayer games where physics interacts with other players. That would be almost impossible to script well, as there are too many variables to take into account, so you would end up with scripted events that didn't look right or that clipped through other players standing close by, etc.

Now, having that kind of physics in a game comes back to what someone else mentioned: everyone needs a PC that can handle it, otherwise you cannot implement it as a gameplay attribute without the risk of alienating a percentage of your customers.

Until we get to that stage, I think PhysX will be relegated to the "ohhh, look at that" category.
 
Until there is a single standard for physics, i.e. a DirectX for physics, I can't see it taking off.

Nvidia have said that AMD can use PhysX on their cards if they want, no problem; AMD just haven't bothered yet.

Intel bought Havok and killed off HavokFX (i.e. Havok for GPUs), and AMD have apparently worked on porting Havok to AMD CPUs.
 