Nvidia buys Ageia

I sincerely doubt Nvidia will be after the hardware that Ageia developed. I'd put my money on them gunning for the software engineers / all-round physics nerds and the accompanying SDK.

If I were to bet further, I'd say they'd like to design the SDK to utilise their GPUs for physics calculations. One more thing to hammer poor old AMD with.

Suddenly Triple SLI could start to make sense.
 
Or perhaps potsey kidnapped their families and held them to ransom to make them buy Ageia... could happen, lol.
Is there a particular reason you’re ignoring my fair question and just making fun of me instead?

I don’t have a problem with people joking around a bit, but it would be nice if you at least answered the question directed at you at the same time as the joking.
 
I think the motives for this move are fairly obvious.

PhysX is a good technology, but the hardware was caught in a catch-22: games developers are reluctant to put a lot of effort into it because most users don't own PhysX hardware, and users are reluctant to splash out on the hardware because games don't support it. There weren't enough brave pioneer users to break the cycle, myself included. I'd think twice before installing a PhysX card even if it came free with my cornflakes.

I wouldn't expect Nvidia to add anything to their hardware. After all, GPUs are computationally pretty awesome for this sort of work; we all know that research labs are using them as a cheap form of supercomputing.

I suspect Nvidia began implementing an efficient physics runtime of their own and soon realised they were reinventing the wheel, i.e. their physics API was shaping up to be very similar to Ageia's, perhaps to the point where plagiarism accusations would start, the lawyers would step in, development would cease and everybody would lose. In that situation it makes sense to acquire the pioneering product. There is no mainstream competing technology, so the choice was pretty clear cut.

So I'd expect to see several of those GPU stream processors running Ageia physics code before too long.
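
Purely as an illustration (this isn't Ageia's SDK or anything Nvidia has announced, and every name in it is made up for the example), here's the kind of embarrassingly parallel per-particle work that maps neatly onto GPU stream processors: a toy CUDA kernel that nudges a million particles forward by one frame.

```cuda
// Toy sketch only: a made-up kernel, not Ageia/PhysX code.
#include <cstdio>
#include <cuda_runtime.h>

// Advance each particle by one timestep under gravity.
// Every particle is handled by its own GPU thread.
__global__ void integrate(float3 *pos, float3 *vel, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    vel[i].y -= 9.81f * dt;        // gravity
    pos[i].x += vel[i].x * dt;     // simple Euler integration
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

int main()
{
    const int   n  = 1 << 20;        // ~1 million particles
    const float dt = 1.0f / 60.0f;   // one 60 Hz frame

    float3 *pos = nullptr, *vel = nullptr;
    cudaMalloc((void **)&pos, n * sizeof(float3));
    cudaMalloc((void **)&vel, n * sizeof(float3));
    cudaMemset(pos, 0, n * sizeof(float3));
    cudaMemset(vel, 0, n * sizeof(float3));

    // 256 threads per block; enough blocks to cover every particle.
    integrate<<<(n + 255) / 256, 256>>>(pos, vel, n, dt);
    cudaDeviceSynchronize();

    cudaFree(pos);
    cudaFree(vel);
    printf("Integrated %d particles for one frame.\n", n);
    return 0;
}
```

Each particle is independent of the others, so thousands of stream processors can chew through them simultaneously, which is exactly why offloading this class of physics work to the GPU looks so attractive.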

Nvidia rescued the sinking ship. A decent ship, but a sinking one nonetheless. We'll all benefit.
 
“Is there a particular reason you’re ignoring my fair question and just making fun of me instead? I don’t have a problem with people joking around a bit, but it would be nice if you at least answered the question directed at you at the same time as the joking.”

Where is the question you asked me? I've searched through and I see no question.
 
If Nvidia start selling PPUs as a separate product rather than integrating them into gfx cards, then it's a waste of time and hardly anybody will buy them; they'd just be milking people for more money if they did sell them.
 
“Where is the question you asked me? I've searched through and I see no question.”
You said “UT3 and they screwed that one right up”.
I asked how it was a screw-up. UT is one of the few games they got right, in my mind. It was GRAW 1 they screwed up, among others.
 
Sorry, I honestly didn't see it.

Well, the fact that the PPU-only maps were unplayable is a good enough reason for me to see it as a failure. Unless recent patches have changed that?
 
“Well, the fact that the PPU-only maps were unplayable is a good enough reason for me to see it as a failure. Unless recent patches have changed that?”
It was never unplayable with the final game and final maps. http://www.driverheaven.net/reviews/ageia/flashers/utlighthouse.jpg

The only time it was unplayable was in that first review, which used beta maps and very old drivers. But I had a right go at that review back then, didn't I?
 
I made a dash back to my original source, bit-tech, to find they had discovered a hardware fault.

So you can scratch what I said.

Update - 21st Dec 2007
We've since spoken to Ageia, who have informed us that there may be a problem with our hardware configuration which results in the low FPS we encountered. You can read our thoughts about it here, or Ageia's response here. We are currently investigating the matter further.

Update - 18th Jan 2008
We have recently discovered that faulty hardware may have caused unreliable performance in the PhysX review of Unreal Tournament 3. As such, we later revisited the game with working hardware, the results of which you can read here.
 
Very interesting. I thought it was the beta maps and drivers they used. Didn’t realise they had a hardware fault.

Shocked that Ageia did something about it and spoke to them. I thought Ageia were useless in that respect.
 
Well, I'd been thinking the PPU was terrible in UT3 because I had no reason to doubt bit-tech, who IMO are one of the best review sites around. It seems they only recently found out it was a hardware fault.
 