AGEIA respond to ATI physics... not with benchmarks, but with excuses

AGEIA (who are fighting to stay alive in the physics market) are trying to fob off ATI's physics solution with excuses about minor technicalities, rather than trying to prove genuine physics performance (and visual physics) advantages.



It's a shame there were no benchmarks to show off each company's physics options.



Also, AGEIA stating that no games are currently available for the ATI solution is just a cheap way of deflecting attention from the actual physics performance of the two options.



Here is the link to an article about it: http://www.dailytech.com/article.aspx?newsid=2772



You don't have to be a genius to know 3 ATI cards will outperform the PhysX solution in physics calculations.



IMO, if you use the analogy from the article, the third ATI card is not just another wheel, it's a completely new engine with the raw horsepower to do 110% more than a standard physics card.



Yes, a third graphics card does seem unreasonable in terms of cost (then again, you do pay around 80% of a graphics card's cost just for a dedicated PPU solution like PhysX), but with the performance issues currently embedded within PhysX, ATI's solution looks quite attractive.


I also can't wait until HavokFX comes out with their solution, which will really put pressure on AGEIA.
 
I have to question whether gfx cards really have more physics power than a dedicated PhysX card.

On another note, has anyone tried the new CellFactor demo that enables cloth in software mode? That shows PhysX is actually doing something at least, as it starts to resemble the 3DMark06 CPU tests on my system :p
 
n3x said:
I have to question whether gfx cards really have more physics power than a dedicated PhysX card.

For sure, just take a look at the raw specifications; you can see that it will outperform the PhysX solution with just raw horsepower.

We just need some benchmarks from AGEIA to prove the ATI solution the wrong choice... which we will not get... hence the current poor excuses.

ATI physics will perform better than the PhysX solution, just like how Conroe will perform better than the FX-62 ;) (joking) /me *hears the drums of war* :)

n3x said:
On another note, has anyone tried the new CellFactor demo that enables cloth in software mode? That shows PhysX is actually doing something at least, as it starts to resemble the 3DMark06 CPU tests on my system :p

Cool, so they sorted it out? About time! Are we still looking at Maltesers and M&M's on the screen, or is there actually fluid physics movement (minus lag/fps drops)?

*edit* Ahhh, just read that you ran it in software mode... still waiting for PhysX to do something then? :p
 
“You don't have to be a genius to know 3 ATI cards will outperform the PhysX solution in physics calculations.”
I have to ask why it would outperform? I have seen nothing to prove it's going to be slower or faster. It's far too early to tell if it's going to outperform or not. That, and with no game support, it's not going to matter. ATI need to get some developer support ASAP.



“For sure, just take a look at the raw specifications; you can see that it will outperform the PhysX solution with just raw horsepower.”
Raw horsepower means next to nothing. There are plenty of cases where the card with more raw power is the slower card. It's not raw power that matters, it's how you use that power and how efficiently you use it.

Raw power in itself is not enough evidence that ATI will be faster. There have been too many times in the past where the card with more raw power has turned out slower than the card with less raw power but more efficient use of it.




“We just need some benchmarks from AGEIA to prove the ATI solution the wrong choice... which we will not get... hence the current poor excuses.”
That's true, we do need Ageia to prove they have the better solution. But shouldn't ATI also be providing us with fair benchmarks to prove they are better?
 
I believe that users on this forum completely miss the point with PhysX and/or ATI/nVidia 'physics'. Basically, you're not comparing apples with apples. PhysX is a 'proper' implementation of physics with objects that you can interact with (through improved collision detection, for example). ATI/nVidia are dealing entirely with 'effect physics', for example making explosions look better. I know which one I'd prefer, and it's not the 3rd graphics card... Yes, PhysX is in its infancy, but to compare frame rates of games with/without PhysX AIBs is entirely missing the point - it's there to simulate a real physical environment.

For example, how frustrating is it, when driving a tank in BF2, that you can be stopped by a white picket fence, or that you can't drive over and crush a Humvee? PhysX is perfect for dealing with these situations realistically. The fence would splinter and crumple, and there'd be a long, tank-shaped groove right through the centre of the Humvee! ATI/nVidia's solution would mean you could blow up that car or fence and it'd look better as it did it, but it wouldn't change your gameplay experience in the slightest. Microsoft are rumoured to be developing a physics API to plug into DX10, so hopefully the API to interface with PhysX cards and the like will then be standardised, and we can see real improvements in the 'performance' of the cards and an increase in the number of games that support it.

Simon
 
“PhysX is a 'proper' implementation of physics with objects that you can interact with (through improved collision detection, for example). ATI/nVidia are dealing entirely with 'effect physics', for example making explosions look better. I know which one I'd prefer, and it's not the 3rd graphics card...”

I believe only Nvidia have the non-interactive physics. ATI, on the other hand, can do fully interactive physics like AGEIA.

The only difference between Ageia and ATI should be speed, which is currently unknown. Nvidia, on the other hand, have the basic physics, which are not much use.
 
Another thing you are forgetting is that you don't NEED a 3rd card for physics.
You can have a CrossFire setup and decide that you don't want to use that 2nd card for graphics in a game; instead you want to use it for physics. Then for the next game you might decide that you would rather have the extra graphics power, and so enable CrossFire for that game.
This to me sounds like a great idea! Not to mention these cards can be cheaper than the Ageia solution (even the 1900XT is now only £50 more!).
Does anyone know if the ATI solution is proprietary or, like the nVidia option, Havok-based?
 
Pottsey said:
“PhysX is a 'proper' implementation of physics with objects that you can interact with (through improved collision detection, for example). ATI/nVidia are dealing entirely with 'effect physics', for example making explosions look better. I know which one I'd prefer, and it's not the 3rd graphics card...”

I believe only Nvidia have the non-interactive physics. ATI, on the other hand, can do fully interactive physics like AGEIA.

The only difference between Ageia and ATI should be speed, which is currently unknown. Nvidia, on the other hand, have the basic physics, which are not much use.

LMAO! Fanboyism!
No, Nvidia's and ATI's are the same; Nvidia's demo was literally an alpha demo, not even a beta of what the card can do. The only difference between ATI's and Nvidia's is that ATI have done it through a third card and Nvidia have done it through normal SLI.
 
no_1_dave said:
LMAO! Fanboyism!
No, Nvidia's and ATI's are the same; Nvidia's demo was literally an alpha demo, not even a beta of what the card can do. The only difference between ATI's and Nvidia's is that ATI have done it through a third card and Nvidia have done it through normal SLI.

Don't argue too much; the last thread Pottsey and I were debating in got deleted when a few others got a little OTT in their replies to him.

His comment about ATI and Ageia is correct: on paper ATI have much more powerful cards (Ageia = 100 GFLOPS, ATI 1900XTX = 350 GFLOPS), but we'll see which offers the better solution when they start being useful. Ageia use their own engine, whereas Nvidia and ATI will be using the Havok engine.

Personally I don't think Ageia have a chance against the big boys, and I'm not a big fan of their implementation.
 
“LMAO! Fanboyism!
No, Nvidia's and ATI's are the same; Nvidia's demo was literally an alpha demo, not even a beta of what the card can do. The only difference between ATI's and Nvidia's is that ATI have done it through a third card and Nvidia have done it through normal SLI.”

First of all, ATI don't have to do it through a 3rd card; it works with just 2 ATI cards. Secondly, it's not fanboyism. Everything I have read that's been released on Nvidia's solution suggests it's one-way, non-interactive. Now, it could well have been updated and I missed something, and if I did please post a link. But the last I read, and I quote from Nvidia's press release, “that will provide the GPU with a one-way transfer of critical information that will allow Debris Primitives to respond to game play objects and large-scale world definitions” and “based on the direction and intensity of a force (e.g. brick and stone structure blown apart by a cannon blast)”.
One-way means the physics for the explosion is worked out from the direction and intensity of the force applied. But as it's one-way, the explosion will cause no further physics. You will not get the bricks bouncing off a car door window and smashing the window; that requires a two-way transfer of critical physics information. In other words, Nvidia's physics are non-interactive: you get fancy explosions, but not objects colliding with and changing other objects.

EDIT: Another quote about Nvidia's solution: "The data will stay on the graphics processor, a transfer to the CPU 'will not be required,' according to the company." So the CPU can never see the physics data and affect gameplay based on the physics. All you get are graphical physics effects, not gameplay physics effects. ATI, on the other hand, can feed the physics back to the CPU, so you get gameplay physics.

EDIT2: For example, you cannot have a situation where I shoot a box and the box falls onto a player, killing them, as the CPU has no idea what physics have been applied. Nvidia's solution is more of a hybrid setup: you get the CPU doing the interactive physics and the GPU doing the non-interactive physics like explosions. Think of the Ghost Recon explosions with the PPU; those are the limit of what Nvidia's solution can do.
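EDIT3: To make the one-way vs two-way point concrete, here is a rough C++ sketch of the difference (purely my own illustration - the structs and function names are made up and are nothing to do with Ageia's, ATI's or Nvidia's actual code):

```cpp
// Made-up illustration of "effect physics" vs "gameplay physics".
#include <cstdio>
#include <vector>

struct Box { float height; float velocity; bool landed; };

// One-way "effect physics": positions get computed and drawn, but nothing is
// handed back to the game logic, so the result can never change gameplay.
void effectPhysicsStep(std::vector<Box>& debris, float dt) {
    for (Box& b : debris) {
        b.velocity -= 9.81f * dt;        // gravity
        b.height   += b.velocity * dt;   // integrate position
        // renderOnly(b);                // drawn, never read back by the game
    }
}

// Two-way "gameplay physics": the same integration, but the results are read
// back so the game can react to them (here: a falling box lands on a player).
int gameplayPhysicsStep(std::vector<Box>& debris, float dt, float playerHeight) {
    int hits = 0;
    for (Box& b : debris) {
        b.velocity -= 9.81f * dt;
        b.height   += b.velocity * dt;
        if (!b.landed && b.height <= playerHeight) {
            b.landed = true;             // result fed back into game logic...
            ++hits;                      // ...so damage, AI reactions etc. can happen
            b.height = playerHeight;
            b.velocity = 0.0f;
        }
    }
    return hits;
}

int main() {
    std::vector<Box> boxes{{5.0f, 0.0f, false}, {2.0f, 0.0f, false}};
    for (int frame = 0; frame < 120; ++frame)
        if (int hits = gameplayPhysicsStep(boxes, 1.0f / 60.0f, 0.0f))
            std::printf("frame %d: %d box(es) landed on the player\n", frame, hits);
    return 0;
}
```

The read-back in the second function is the "two-way transfer of critical physics information" that a one-way solution skips, and it is the only way the game logic can ever react to what the physics did.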



“Does anyone know if the ATI solution is proprietary or, like the nVidia option, Havok-based?”
I am 70 to 80% sure ATI's solution is Havok-based.
 
Pottsey said:
“Does anyone know if the ATI solution is proprietary or, like the nVidia option, Havok-based?”
I am 70 to 80% sure ATI's solution is Havok-based.
So would I be right in saying that, in response to Ageia claiming there are no developers planning on using the ATI physics technique, in reality they don't need to say it, as any (?) Havok-based game will automatically get a speed boost from ATI/nVidia?
 
You gotta give this guy from Ageia credit though, he's gotta make a living; he's in the business of selling his cards, he's not gonna rubbish 'em. You don't see Nvidia or ATI saying our cards are crap, do you?
 
“technique, in reality they don't need to say it, as any (?) Havok-based game will automatically get a speed boost from ATI/nVidia?”
I don't believe so. Just like old Ageia games don't benefit from the PPU, it's the same for Havok: the game has to be coded to recognise that the GPU can do physics. There is a chance Havok planned ahead and some of the recent Havok games benefit, but I wouldn't bet on all the old ones working. There are now 2 confirmed games that support GPU Havok physics, but it's late and I don't have time to dig up the names until tomorrow.




“You gotta give this guy from Ageia credit though, he's gotta make a living; he's in the business of selling his cards, he's not gonna rubbish 'em. You don't see Nvidia or ATI saying our cards are crap, do you?”
Was that directed at me? If so, please stop it with the silly comments. The last thread got closed due to silly stuff like that. I am just trying to have a decent conversation about hardware physics and the different ways of doing it.

If you have a problem with my facts, go over them with me instead of making comments like that. I am not making facts up. Nvidia's physics are far more basic than Ageia's or ATI's. If you don't believe me, find something that says otherwise. Go read Nvidia's own press, where they say they have one-way physics.
 
"If you have a problem with my facts go over them with me instead of making comments like that. I am not making facts up. Nvidia’s physics are far more basic then Ageia or ATI. If you don’t believe me find something that says otherwise. Go read Nvida own press where they say they have 1 way physics"
Eh i don't have a problem, no one rubbishes their own products do they. :confused:

Ive never seen Gibbo post, "please dont buy form here were utter *****, we'll probably just fleece yer" etc.... have you.
 
At the end of the day, can Ageia compete with ATI and Nvidia? Not just in terms of developing better hardware, but in getting developer support for their product. Most big games are optimised for either ATI or Nvidia; are Ageia big enough to change this?
 
Ageia need to go bankrupt and learn the hard way that if they are gonna pull out a crap implementation of physics, then they deserve to get slapped for it.

I mean, how could they even put this product on the market? I'm sure people normally test their own final product before mass manufacturing and retailing.
Ageia are planning to go bankrupt and make as much cash as they can out of the fools who bought the cards.
 
Jack Bauer said:
At the end of the day, can Ageia compete with ATI and Nvidia? Not just in terms of developing better hardware, but in getting developer support for their product. Most big games are optimised for either ATI or Nvidia; are Ageia big enough to change this?

I have to agree with you, it will be hard work for Ageia to compete with ATi/nVidia, but I feel their solutions (ATi/nVidia) are just a gimmick to compete with Ageia and won't come to much. I mean, why waste the power of another gfx card on physics? I'd rather SLI/CrossFire it... :rolleyes:

The Ageia solution is still in the early stages, and until more software developers decide to support it, it's not going to get off the ground. I remember when 3dfx released the first Voodoo cards: not many games supported them and sales were hit... After a while, developers realised the potential and sales of 3dfx Voodoo cards went through the roof... :)

On another note on physics in games, what about dual-core CPUs? Surely one core could run the game and the second core could run the physics. That's another option open to developers? :confused:
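Something along these lines is all it would take in principle. Just a rough sketch of my own (made-up example, not from any real engine), with the game loop on one core and the physics stepping away on a second thread:

```cpp
// Made-up sketch of "game on one core, physics on the other".
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

std::atomic<bool>  running{true};
std::atomic<float> boxHeight{10.0f};   // shared state, kept deliberately trivial

// Physics thread: fixed ~60 Hz step, ideally scheduled on the second core.
void physicsThread() {
    float velocity = 0.0f;
    const float dt = 1.0f / 60.0f;
    while (running) {
        velocity -= 9.81f * dt;                      // gravity
        float h = boxHeight.load() + velocity * dt;  // integrate position
        if (h < 0.0f) { h = 0.0f; velocity = 0.0f; } // hit the floor
        boxHeight.store(h);
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}

int main() {
    std::thread physics(physicsThread);          // "2nd core": physics
    for (int frame = 0; frame < 60; ++frame) {   // "1st core": game/render loop
        std::printf("frame %d: box at %.2f\n", frame, boxHeight.load());
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    running = false;
    physics.join();
    return 0;
}
```

A real engine would obviously have to be far more careful about how the two threads share state than a single atomic value, but that's the basic split.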
 
“Eh, I don't have a problem; no one rubbishes their own products, do they.”
But I have proven time and time again that I don't work for Ageia. I have said bad things about Ageia, so will you finally admit I don't work for them, or is it just convenient for you to ignore all my facts and accuse me of working for them instead of having a decent conversation?

It seems a lot of people around here, instead of admitting they got something wrong, or when they don't want to hear something, choose to accuse people of working for the company in question, and then they write off everything that person says as invalid even when it's true.




“Ageia need to go bankrupt and learn the hard way that if they are gonna pull out a crap implementation of physics, then they deserve to get slapped for it.
I mean, how could they even put this product on the market? I'm sure people normally test their own final product before mass manufacturing and retailing.”

How do you know it's Ageia who have made a bad implementation? It's too early to tell. I mean, it could be Ageia, but we don't know for sure. What if they have a great implementation and it's the early software that's at fault? Shouldn't you then be telling the software developers to pull their product, not Ageia? As Ageia themselves say, the first few games hardly use any of the PPU's power and were only patched to support it. We need to see a game made from scratch to support it.

Am I the only one who thinks we need to find out where the problem lies before writing off the PPU? This is no different from the first 3D cards: the first games were rubbish, then six months later the decent games came out and the cards became useful. I hope the Ageia PPU will be the same.
 