
ATI DirectX 10 Software Dev Kit released

Well, now we (the customers) have a complete choice:

1) Inbuilt hardware (aka SLI configs)
2) Separate hardware (a separate Ageia card)
3) Software (ATI, who believe their GPU is already powerful enough)

Wondering whether this might turn out to be something akin to sound cards. Years ago you had to buy a SoundBlaster - they were the only cards around at the time, I believe - then the option came about of having sound built into the mobo. Now that very few mobos come without sound (and some have two different varieties), more and more high-quality PCI cards - and maybe PCI-E at some point? - are coming out to improve again on the inbuilt market.

As stated elsewhere, I think the £180 or so is too much for a separate card right now, with so few games using it.

It's a plus point for potential investors in SLI: you could be getting an inbuilt physics card for no extra cost.

Software side - hmmm, maybe a little arrogant of ATI to believe they don't have anything to do (as I understand it, anyway), but all choice is good!
 
My understanding of the physics effects on Nvidia was that it was purely a graphical implementation rather than object-interaction physics.

PhysX does the object interaction, so it will still be useful once games get support.
 
mdjmcnally said:
My understanding of the physics effects on Nvidia was that it was purely a graphical implementation rather than object-interaction physics.

PhysX does the object interaction, so it will still be useful once games get support.


Sorry, I am probably being stupid but I can't see the difference :o

Any PhysX interaction would be graphical, as it's a "real" effect even if you are talking about invisible forces like gravity (or The Force :D), surely?
 
mdjmcnally said:
My understanding of the physics effects on Nvidia was that it was purely a graphical implementation rather than object-interaction physics.

PhysX does the object interaction, so it will still be useful once games get support.

There isn't such a thing as a 'graphical implementation.' Graphics are graphics and physics are physics. Nvidia want to do physics calculations on the second card of an SLI setup.
 
PhysX seems to me to be a bit of a Catch 22 situation.

Without developer support, there will be little incentive to buy one. And without a significant number of people buying them, there will be little incentive to develop for them (unless the work involved is trivial - which I suspect it isn't).
 
G.O.A.T said:
There isn't such a thing as a 'graphical implementation.' Graphics are graphics and physics are physics. Nvidia want to do physics calculations on the second card of an SLI setup.
Sorry, but mdjmcnally is right. Nvidia's implementation is purely show. Debris and/or any other graphical objects projected from any physics calculation done via Nvidia's solution will be unable to interact with you, or the environment around it, ie it is purely for show.

PhysX on the other hand is a true interactive solution. Any object projected off of something via the PhysX solution has the ability to affect anything else in its path.

So imagine a wall collapsing. On Nvidia's solution, each brick would just fall to the ground regardless of what was in its way. On the PhysX solution, every brick has the ability to hit and bounce off every other brick (or anything else) tumbling down.
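The distinction can be sketched in a few lines. This is a hypothetical 1-D toy, nothing to do with either vendor's actual SDK (function names and numbers are made up): the "effects-only" brick falls regardless of what's in its way, while the interactive one collides with the floor and bounces.

```python
def effects_only_fall(height, steps=10, dv=1.0):
    """'Eye candy' physics: the brick just falls; nothing blocks it."""
    y, v = height, 0.0
    for _ in range(steps):
        v += dv   # accelerate downwards each step
        y -= v    # move; no collision check at all
    return y      # can end up far below the floor

def interactive_fall(height, floor=0.0, steps=10, dv=1.0, restitution=0.5):
    """Interactive physics: the brick collides with the floor and bounces."""
    y, v = height, 0.0
    for _ in range(steps):
        v += dv
        y -= v
        if y < floor:             # collision detected
            y = floor             # clamp to the surface
            v = -v * restitution  # bounce back, losing some energy
    return y                      # never below the floor

print(effects_only_fall(10.0))  # negative: sank straight through the floor
print(interactive_fall(10.0))   # at or above the floor
```

The gameplay difference is in that `if` branch: without it, the simulation can never feed anything back into the game world.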
 
My thought: physics and graphics are both maths-based problems. Nvidia/ATI and Ageia cards will do exactly the same things but with a different SDK used by developers. Both solutions require an extra card to perform the maths etc.

I don't see why the Ageia one will allow you to interact with stuff and the Nvidia one won't - they should both be the same, i.e. like AMD and Intel.
 
Goksly said:
My thought: physics and graphics are both maths-based problems. Nvidia/ATI and Ageia cards will do exactly the same things but with a different SDK used by developers. Both solutions require an extra card to perform the maths etc.
Not strictly true. Both require heavy use of FP maths and both share very similar architectures, but that does not mean they are identical.

The dedicated silicon of PhysX gives you the extra grunt to have interactive physics. The dedicated-for-another-purpose silicon of ATI/Nvidia will currently give you "pretty" physics.
 
BubbySoup said:
Sorry, but mdjmcnally is right. Nvidia's implementation is purely show. Debris and/or any other graphical objects projected from any physics calculation done via Nvidia's solution will be unable to interact with you, or the environment around it, ie it is purely for show.

PhysX on the other hand is a true interactive solution. Any object projected off of something via the PhysX solution has the ability to affect anything else in its path.

So imagine a wall collapsing. On Nvidia's solution, each brick would just fall to the ground regardless of what was in its way. On the PhysX solution, every brick has the ability to hit and bounce off every other brick (or anything else) tumbling down.


I want to make clear that I am not arguing with your post from the outset, more trying to understand the point of Nvidia doing this.

If, as you suggest, there is no interaction between falling bricks, it makes the game or whatever unplayable or unrealistic (which, after all, is the whole point).

If a brick just continues to fall whether something is in the way or not, then either you get overlapping objects - which could cause a game engine to crash, as it's an impossibility for one object to pass through another - or they "fake" the physics part and do it randomly. And remember, I am sure Nvidia are doing this to get more people to convert to SLI; if anything it would turn people OFF SLI, as it wouldn't be as real as other implementations, wouldn't you say?

I am the first one to say I know nothing about software development; I am just trying to see why Nvidia would state something was equal when even a layperson would notice the difference.
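On the overlapping-objects worry: engines don't generally crash on interpenetration; a common approach is to detect the overlap each frame and push the bodies apart. A made-up 1-D sketch of that idea (not from any actual engine or SDK):

```python
def resolve_overlap_1d(a_pos, a_half, b_pos, b_half):
    """Separate two 1-D boxes (centre, half-width) if they interpenetrate."""
    gap = abs(b_pos - a_pos) - (a_half + b_half)
    if gap >= 0:
        return a_pos, b_pos           # not touching - nothing to do
    push = -gap / 2.0                 # each box moves half the overlap
    if a_pos <= b_pos:
        return a_pos - push, b_pos + push
    return a_pos + push, b_pos - push

# Two unit-half-width boxes overlapping by 0.5 get pushed to exact contact.
a, b = resolve_overlap_1d(0.0, 1.0, 1.5, 1.0)
print(a, b)  # distance between centres now equals the sum of half-widths
```

So an effects-only system could simply skip this step for debris (letting it clip through scenery), while an interactive one runs it for everything - which is where the extra compute goes.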
 
Erm, I still fail to see the point of buying another expensive graphics card to do physics calculations. When a gfx card is made to do gfx, it's surely not going to have as much power as something that's been built from the ground up for that task, is it? Ageia has already got a lot of support from games developers, and it's growing every day!

once again - http://www.theinquirer.net/?article=30434
 
Wow, The Inquirer - authoritative source! :p

I personally think PhysX is a gimmick in the short term. Until it gains significant ground, it's very much a pointless "nice to have" card, which will just sit there twiddling its thumbs with the current generation of games.

Maybe next we'll have a separate card that just deals with lighting, and another that deals with showing realistic water, etc. Physics is admittedly a big part of games, but Havok already does an awesome job at it.

EDIT: Also, if Havok really does cost a blanket $250k (which I would dispute) and PhysX costs zero (even with support??), then I'm sure Havok would lower their prices to remain competitive.
 