Graphics card physics

In defense of this whole physics-on-a-GPU thing that's coming out: I recently had the pleasure of seeing some demos by NVIDIA at an IGDA (International Game Developers Association) meeting. Of course, I'm sure they were running on an extremely nice PC (SLI 7900 GTXs, if I remember right), but the demos were very impressive - even more so than what I've seen of Ageia's offerings.

A number of objects were onscreen, interacting and colliding with each other in real time - I'd hazard a guess at over a thousand actual objects. Chess pieces, for example: lots of chess pieces spawned from the 'sky' and piled up into a huge mass. Sure, it slowed down after a while, but that's only because he kept dumping more and more chess pieces onto the pile! :D

What I'm trying to say is that the technology and software already exist today, and it can be pretty remarkable to see in action. Right now it might only be the rather well-off gamer who chooses SLI or CrossFire, but I imagine it'll appeal to more people as we see other uses for it beyond just extreme-resolution gaming.
 
Hope this wasn't covered and I missed it, but here's some info.

ATI announces GPU-based physics acceleration plans
by Geoff Gasior - 09:58 am, June 6, 2006

TAIPEI, TAIWAN — ATI used the first day of Computex to announce its strategy for GPU-based physics processing. Radeon X1000 series graphics processors will be capable of accelerating the Havok FX physics API as a part of what ATI is calling a "boundless gaming" experience. GPU-based physics acceleration is nothing new, of course; NVIDIA announced its support of Havok FX back in March. However, ATI says its approach is far superior to that of NVIDIA, in part because ATI's implementation can support three graphics cards in a single system.

ATI had a demo system running a pair of Radeon X1900s in CrossFire with a third X1900 card dedicated solely to physics processing. This configuration was appropriately referred to as the "meat stack," and while it produced silky frame rates in a number of demos, it's not the only Radeon configuration that will support GPU physics. In addition to supporting three-card configs, ATI will also allow a pair of its graphics cards to split rendering and physics between them. The graphics card dedicated to physics doesn't even need to match the other graphics card(s) in the system; for example, it's possible to pair a high-end Radeon X1900 XTX crunching graphics with a more affordable Radeon X1600 series card handling physics. In fact, ATI had a demo system set up with a pair of Radeon X1900s in CrossFire and a Radeon X1600 XT accelerating the Havok FX physics API.
With support for three-card configurations and no need to match cards used for graphics and physics, ATI looks to have the most flexible Havok FX acceleration implementation. ATI also claims to have a significant performance advantage when it comes to GPU-based physics acceleration, citing the Radeon X1000 series' ample shader processing power, efficient dynamic branching, and fine-grained threading. Of course, the first games to use Havok FX aren't expected until later this year. Havok FX isn't exactly comparable to what Ageia's doing with hardware physics acceleration, either; Havok FX is limited to "effects physics" that don't affect gameplay, while Ageia's PhysX PPU has no such limitation.
http://techreport.com/onearticle.x/10117
 
"Of course, the first games to use Havok FX aren't expected until later this year. Havok FX isn't exactly comparable to what Ageia's doing with hardware physics acceleration, either; Havok FX is limited to "effects physics" that don't affect gameplay, while Ageia's PhysX PPU has no such limitation". - http://techreport.com/onearticle.x/10117

So this Havok FX + ATI solution is 'eye-candy physics effects' only, and not real reactive/interactive physics that affect gameplay. Doesn't look so good to me - imagine buying into this (two- or three-card) frenzy and then realising that the Ageia PPU that got so heavily dissed by those without one is actually the better solution (which I still think it is). Hmmmmm, if I'm going to pay for extra physics then I expect those physics to affect gameplay, don't you think? I reckon ATI has just found the answer to their problem of not shifting half as many X1600 cards as they thought they would - "I know lads, let's just tell 'em they'll be great for processing physics, 'cos no-one wants one for graphics, that's for sure". To quote the late Bill Hicks: "those who work in Marketing and Advertising should just go kill themselves".
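For anyone unclear on the distinction being argued about here, a toy sketch might help. This is not real Havok FX or PhysX code (their actual APIs are C++ SDKs); the classes and functions below are made up purely to illustrate the idea: "effects physics" only moves visuals around and never touches game state, while "gameplay physics" changes state that the game logic can react to.

```python
class Crate:
    """A hypothetical game object whose state the game logic cares about."""
    def __init__(self, x):
        self.x = x
        self.destroyed = False

def effects_physics(particles, dt):
    """Eye-candy only: advance debris particles under gravity.
    Returns new positions; no game object is ever modified, so nothing
    here can influence the outcome of the game."""
    return [(x + vx * dt, y + vy * dt - 0.5 * 9.8 * dt * dt)
            for (x, y), (vx, vy) in particles]

def gameplay_physics(crates, explosion_x, radius):
    """Reactive: crates inside the blast radius are destroyed, changing
    game state that later logic (scoring, pathing, AI) can query."""
    for crate in crates:
        if abs(crate.x - explosion_x) <= radius:
            crate.destroyed = True
    return [c for c in crates if not c.destroyed]
```

The point of the article's caveat is that a Havok FX card could only ever run the first kind of function, whereas Ageia's PPU was pitched as handling the second kind too.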
 
dEl_fUEGo said:
"Of course, the first games to use Havok FX aren't expected until later this year. Havok FX isn't exactly comparable to what Ageia's doing with hardware physics acceleration, either; Havok FX is limited to "effects physics" that don't affect gameplay, while Ageia's PhysX PPU has no such limitation". - http://techreport.com/onearticle.x/10117

So this Havok FX + ATI solution is 'eye-candy physics effects' only, and not real reactive/interactive physics that affect gameplay. Doesn't look so good to me - imagine buying into this (two- or three-card) frenzy and then realising that the Ageia PPU that got so heavily dissed by those without one is actually the better solution (which I still think it is). Hmmmmm, if I'm going to pay for extra physics then I expect those physics to affect gameplay, don't you think? I reckon ATI has just found the answer to their problem of not shifting half as many X1600 cards as they thought they would - "I know lads, let's just tell 'em they'll be great for processing physics, 'cos no-one wants one for graphics, that's for sure". To quote the late Bill Hicks: "those who work in Marketing and Advertising should just go kill themselves".

Agreed, and it's not likely to exist in this form for long anyway... hopefully the implementation will die out before it gets past the early adopters. Surely it's in the card makers' interests to have "the first dual-GPU card with an integrated PPU on the market"? Imagine the kudos they would receive, and surely that solution would be cheaper to produce and more efficient from a power-consumption perspective?

(LOL - for some strange reason the classic circus music theme came to mind :) )
 
lay-z-boy said:
in the future we'll have quad or octal cores, wtf are they going to be doing?!

so in the future we will have:

2/4*x1900xtx in xfire

1/2*x1900xtx for physics

1*x1900xtx for sound

1*x1900xtx for ai/emotions

2000 billion watt psu

:rolleyes: :rolleyes:

i think we need more PCI slots gov', seems 7 isn't enough anymore. :eek:

haha! Made me laugh. I'm sure it will get to something stupid like that if they have their way.
 