Let Battle Commence

I'm not sure about that - I dare say games publishers wouldn't be too happy at being accused of massively optimising a game for one GPU architecture and, as a result, of nobbling, in this case, ATI users with a 'restricted' game.

I suspect this is why CUDA hasn't been seen in games much, at least not to any really noticeable effect - it might be good for Nvidia users, but it would be bad for the publishers, who really don't care what hardware their game runs on as long as it sells and they can recoup development costs. They can't do that by locking out half of their potential audience!

OpenCL/DirectCompute offer a way around that by being hardware agnostic - less of a problem for publishers and, hopefully, for games coders [I know a couple - sounds like a nasty job], who don't have to attend a vendor-sponsored training course to get stuff working.

FWIW I'm not one of these people who think that CUDA is evil evil evil - it was, rather like the Ford Model T, the one that brought GPGPU to mainstream attention and made it practically usable. Being first, however, doesn't make it worthier than any other solution. Unless you are a Ford man, natch.

The wider adoption that OpenCL will naturally get as a result of not being vendor-locked should accelerate its development; after all, everyone wants their code to run faster, and now they don't have to buy specialised hardware to do it - with OpenCL, any recent GPU can run GPGPU tasks. This is not a bad thing in the slightest.
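
To show what I mean by hardware agnostic: here's a minimal OpenCL host-side sketch in C (error handling trimmed for brevity) that lists every GPU from every vendor's platform on the machine. Nothing in it is Nvidia- or ATI-specific, which is the whole point - the same binary picks up an Nvidia card through Nvidia's platform and an ATI card through ATI's.

    /* List every GPU on the machine, regardless of vendor. */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint n_platforms = 0;
        clGetPlatformIDs(8, platforms, &n_platforms);

        for (cl_uint p = 0; p < n_platforms; ++p) {
            cl_device_id devices[8];
            cl_uint n_devices = 0;
            /* Skip platforms (e.g. CPU-only ones) with no GPU devices. */
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                               8, devices, &n_devices) != CL_SUCCESS)
                continue;

            for (cl_uint d = 0; d < n_devices; ++d) {
                char name[256];
                clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                                sizeof(name), name, NULL);
                printf("GPU: %s\n", name);
            }
        }
        return 0;
    }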

Look at Batman or Assassin's Creed.

edit: I don't think Nvidia would care what people think if it gave them an edge over ATI.
 
Am I the only one sitting here with my jaw wide open at this? That's a stunning difference.
Was intrigued by your comment, so just went to watch. Was not impressed - just a bunch of background effects, which is what you'd expect from optional PhysX. We won't get anything meaningful integrated into gameplay until there's an open standard that works on all major graphics hardware.
 
Was intrigued by your comment, so just went to watch. Was not impressed - just a bunch of background effects, which is what you'd expect from optional PhysX. We won't get anything meaningful integrated into gameplay until there's an open standard that works on all major graphics hardware.

End of story no matter how good the tech demos look.
 
This is interesting - I wasn't expecting anything from Nvidia until next year at least.

I will definitely wait to see what they have to offer before I buy my next graphics card.
 
Sure, but it does make anything I say a little more likely than purely wild, unsubstantiated speculation...

It's not just about performance increases (though if you could bring GPGPU properly to bear on some tasks you'd see more like 400%+ gains); it's about all the things you could implement without unplayable performance. AI especially is something that spends a lot of time sequentially cross-comparing static data - you could massively increase the quality and realism of things like dynamic way/path-finding without bringing performance to its knees.
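
To make that concrete, here's a toy OpenCL C kernel (all names mine, purely illustrative) for the sort of cross-compare I mean: each work-item handles one agent and brute-force scans every waypoint for the nearest one, so thousands of agents get evaluated in parallel instead of one after another on the CPU.

    /* One work-item per agent: find the nearest waypoint by brute force.
       The per-agent loops are independent, so the GPU runs them all at once. */
    __kernel void nearest_waypoint(__global const float2 *agent_pos,
                                   __global const float2 *waypoints,
                                   const int n_waypoints,
                                   __global int *nearest)
    {
        int agent = get_global_id(0);
        float2 p = agent_pos[agent];

        int best = 0;
        float best_d = FLT_MAX;
        for (int w = 0; w < n_waypoints; ++w) {
            float2 d = waypoints[w] - p;
            float dist2 = dot(d, d);   /* squared distance is enough to compare */
            if (dist2 < best_d) {
                best_d = dist2;
                best = w;
            }
        }
        nearest[agent] = best;
    }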

Are you sure you aren't a used car salesman?

Have you ever used CUDA? And what for?
 
The stats, though, are taken from collision testing against the kind of detail normally found in a game level... the core physics solver wouldn't be any different.

It's beside the point anyhow... PhysX lets the developer do a broad range of physics effects without having to spend a lot of time hand-optimising complex software routines for every specific case.
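
For anyone wondering what the 'core solver' boils down to, here's a toy C sketch - symplectic Euler integration for a few particles with gravity and a crude ground bounce. To be clear, this has nothing to do with PhysX's actual internals; the point is that middleware hides vastly more than this (broadphase, constraints, stacking, sleeping) behind a similarly small per-frame call.

    /* Toy "solver" step: integrate velocities and positions, then resolve
       the one collision case we handle (a ground plane at y = 0). */
    #include <stdio.h>

    #define N 4

    typedef struct { float x, y, vx, vy; } particle;

    static void step(particle *p, int n, float dt)
    {
        const float g = -9.81f;
        for (int i = 0; i < n; ++i) {
            p[i].vy += g * dt;          /* integrate acceleration */
            p[i].x  += p[i].vx * dt;    /* integrate velocity */
            p[i].y  += p[i].vy * dt;
            if (p[i].y < 0.0f) {        /* crude ground response */
                p[i].y  = 0.0f;
                p[i].vy *= -0.5f;       /* bounce, losing half the speed */
            }
        }
    }

    int main(void)
    {
        particle p[N] = { {0, 5, 1, 0}, {1, 4, -1, 0}, {2, 3, 0, 0}, {3, 6, 2, 1} };
        for (int frame = 0; frame < 120; ++frame)   /* two seconds at 60 fps */
            step(p, N, 1.0f / 60.0f);
        for (int i = 0; i < N; ++i)
            printf("particle %d: (%.2f, %.2f)\n", i, p[i].x, p[i].y);
        return 0;
    }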
 
Proprietary standards usually go the way of the dodo, like EAX and Glide. They didn't really bring anything to the table that couldn't be done in an open way. Granted, 3dfx and Creative didn't have the big bucks that Nvidia, Microsoft and Sony do - a proprietary standard can stick around if you get in early, or competitors don't exist, or they can be beaten down with a giant wad of cash.

I don't understand what Nvidia's strategy is. Do they want to gain mass acceptance and then start charging a royalty/commission every time PhysX is used? Maybe place a hefty charge on GPU vendors to implement PhysX on their own cards?

The proprietary nature of the standard means developers are segregated and have to choose which platform to develop for, which ultimately means higher costs, slower development and frustrated consumers.
 
Look at Batman or Assassin's Creed.

edit: I don't think Nvidia would care what people think if it gave them an edge over ATI.

There was nothing in Batman that couldn't be done visually with the usual tricks.

Scripted, repeated effects and the PhysX versions should both be options.

Flags, water, flames, breaking glass and particles are nothing new. The problem is that NV gets some of them taken out to make a more noticeable visual difference in favour of PhysX, as it's easier to notice something not being there at all than the same thing being there but moving more realistically.
 
It's more complex than that... a lot of developers have experience with Havok and would prefer hardware-accelerated Havok, and there are some games that are more suited to PhysX and some that work better with Havok... what we need is a common platform to run both of these on top of... unfortunately Nvidia tried to tie PhysX to running on top of CUDA when approaching other parties, which is why many told them to get lost...
 
There was nothing in Batman that couldn't be done visually with the usual tricks.

Scripted, repeated effects and the PhysX versions should both be options.

Flags, water, flames, breaking glass and particles are nothing new. The problem is that NV gets some of them taken out to make a more noticeable visual difference in favour of PhysX, as it's easier to notice something not being there at all than the same thing being there but moving more realistically.

I'm not saying that. What I am saying is that Nvidia and ATI will do almost anything to get an edge, be that with CUDA/PhysX or whatever.
 
It's more complex than that... a lot of developers have experience with Havok and would prefer hardware-accelerated Havok, and there are some games that are more suited to PhysX and some that work better with Havok... what we need is a common platform to run both of these on top of... unfortunately Nvidia tried to tie PhysX to running on top of CUDA when approaching other parties, which is why many told them to get lost...

Yes, but some never do, and that's more likely down to Nvidia giving them help, money, or both.
 
There was nothing in Batman that couldn't be done visually with the usual tricks.

Scripted, repeated effects and the PhysX versions should both be options.

Flags, water, flames, breaking glass and particles are nothing new. The problem is that NV gets some of them taken out to make a more noticeable visual difference in favour of PhysX, as it's easier to notice something not being there at all than the same thing being there but moving more realistically.

While I agree that most of the effects could be done in the "usual" fashion, and many of those effects can run at full speed on the CPU in the limited form used in Batman... I can see now why many of the effects are simply missing without hardware PhysX, as the developer would have had to spend a few more months coding alternative paths for each individual feature that, with PhysX, they could implement in a few lines of code...

Unfortunately the subtleties of the smoke effects, etc. seem to be lost on a lot of people - personally, the "usual" style of smoke looks cheap and nasty to me now that I've noticed it and can compare it to the PhysX smoke, which interacts with the environment and flows and reacts to objects passing through it.

Same with the cloth effects - proper smoke/fluid dynamics and cloth simulation can't currently be done on the CPU with usable performance.
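
For reference, this is roughly the per-cell work a grid-based smoke sim does every frame - a toy OpenCL C sketch of semi-Lagrangian advection of the smoke density (names mine, not from any shipping engine). One independent work-item per grid cell is exactly why this maps so well to a GPU and chokes a CPU at game-level grid resolutions.

    /* Advect smoke density through a velocity field on an n x n grid.
       Each work-item traces its own cell backwards and interpolates,
       with no dependency on any other cell's result this frame. */
    __kernel void advect(__global const float *density_in,
                         __global float *density_out,
                         __global const float2 *velocity,
                         const int n, const float dt)
    {
        int x = get_global_id(0);
        int y = get_global_id(1);
        if (x >= n || y >= n) return;
        int idx = y * n + x;

        /* Trace backwards to where this cell's density came from. */
        float2 pos = (float2)((float)x, (float)y) - dt * velocity[idx];

        /* Clamp to the grid, then bilinearly interpolate the old density. */
        pos = clamp(pos, (float2)(0.0f, 0.0f),
                         (float2)(n - 1.001f, n - 1.001f));
        int x0 = (int)pos.x, y0 = (int)pos.y;
        float fx = pos.x - (float)x0, fy = pos.y - (float)y0;

        float d00 = density_in[y0 * n + x0];
        float d10 = density_in[y0 * n + x0 + 1];
        float d01 = density_in[(y0 + 1) * n + x0];
        float d11 = density_in[(y0 + 1) * n + x0 + 1];

        density_out[idx] = mix(mix(d00, d10, fx), mix(d01, d11, fx), fy);
    }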
 
I'm not saying that. What I am saying is that Nvidia and ATI will do almost anything to get an edge, be that with CUDA/PhysX or whatever.

What edge has ATI implemented that is purely ATI's, needs specific coding, and stops it running on other hardware?
ATI hasn't tried to ram anything down coders' throats.
 
Battle lost, now this has arrived

Might as well change the title now.
Change it to "Battle lost, now this has arrived" :D

[image: 5870x2.jpg]
 