
My Test of GRAW running with and without PhysX

Wow lowrider007, henceforth you shall be known as the man who sealed the fate of discrete physics accelerators by single-handedly killing off Ageia... maybe you are on Havok's payroll now :p
 
Cyber-Mav said:
When the first 3D accelerators came onto the market, like the Voodoo cards etc., it was an obvious win-win situation.

When I saw the first 3D cards, they made Quake have glowy special effects. I could do without it, and did, for several more years.

Quake 3 was the first game I couldn't play because of having no 3D card. Driver was the reason I bought a GeForce; it developed artifacts within 6 months :/
 
Sumanji said:
Wow lowrider007, henceforth you shall be known as the man who sealed the fate of discrete physics accelerators by single-handedly killing off Ageia... maybe you are on Havok's payroll now :p


tbh I was actually thinking about this last night and couldn't help feeling a little guilty over the current situation... Havok, if you're reading this, WHERE'S MY CHECK :mad:
 
Cyber-Mav said:
When the first 3D accelerators came onto the market, like the Voodoo cards etc., it was an obvious win-win situation.

Well, if we look at some of the earliest 3D accelerators, such as the S3 Virge, and indeed the Verite/PowerVR, things weren't quite so clear-cut.

In fact, even on Voodoo Graphics you could get situations where it ran slower than software rendering (providing you had a fast CPU), due to the inability to run at resolutions below 512x384.
 
I think that there is room for improvement in the Nvidia and ATI drivers for GRAW: 37 fps without AFR, and 80-90 fps with AFR, on the ATI 1900 in Crossfire with all details at max at 1280x1024.
Mind you, the 10 fps drop will mean that anyone using the Ageia PPU will need Crossfire at a bare minimum to play some of the titles at decent frame rates.
The next official release of a game boasting Ageia compatibility will seal the success or failure of the hardware; I do hope it works out.
 
raja said:
I think that there is room for improvement in the Nvidia and ATI drivers for GRAW: 37 fps without AFR, and 80-90 fps with AFR, on the ATI 1900 in Crossfire with all details at max at 1280x1024.
Mind you, the 10 fps drop will mean that anyone using the Ageia PPU will need Crossfire at a bare minimum to play some of the titles at decent frame rates.
The next official release of a game boasting Ageia compatibility will seal the success or failure of the hardware; I do hope it works out.
Chained Voodoo cards didn't work out, but now look at the current technology.
 
If you are going to compare these cards to the introduction of hardware-accelerated 3D, then you should bear a few things in mind.

First off, there was an immediate and obvious benefit to using hardware-accelerated 3D. The difference between it and software rendering was the proverbial 'chalk and cheese'. It was clear where the future lay.

Also, although the Voodoo cards performed better for rendering GLQuake and Quake 2, it was painfully obvious that a single-card solution (hello, ATI Rage Pro) was vastly preferable, albeit still slower. And so the writing was on the wall: a single-card solution was the way forward.

Now then, let's see what we have with these new cards. Arguably they provide no tangible benefit at the moment, adding very limited eye candy at the cost of performance. As for the future, I've yet to see anyone from Havok or Ageia offer up any examples of where these cards might provide a tangible benefit in games further down the road.

Now this brings us to the crux of the problem: there's a big performance hit with these cards, and I see only one way to do something about that. Bring the chip onto the card alongside the GPU, but this in itself begs the question: what's the point? You see, if all these cards do is add extra eye candy then there is no future for them. The extra eye candy can more easily come from the GPU...

To me it just looks like these companies are trying to create a market where none exists. If someone can provide an example of how these cards can bring benefit to future games and do things that the GPU cannot, then I'm sure we'd all love to hear it, but there's a noticeable lack of any such info.
 
Cenuij said:
To me it just looks like these companies are trying to create a market where none exists.

I must admit I was somewhat surprised at the amount of hype surrounding these add-on boards over recent months; a "physics accelerator" never seemed that obvious a step forward to me. You may as well offload almost anything that the CPU does... an "AI accelerator", for example.

The biggest surprise, of course, was the cost. £200 is a ridiculous amount of money to expect people to shell out for something which does relatively little; at least with graphics and sound cards you have something which affects the WHOLE of an output stream, ALL of the time.

I really can't see a long-term future for separate PhysX add-on cards... incorporating them into sound cards would be a good move, maybe even graphics cards.
 
HangTime said:
£200 is a ridiculous amount of money to expect people to shell out for something which does relatively little.

I think you've hit the nail on the head with that statement. It seems to be reinventing the wheel somewhat, doesn't it?
 
The card does seem a little pointless at the moment.

I wonder how Andy Gibson is trying to promote this card to get more sales. Would be interesting, seeing as it's currently a decelerator card :p
 
It's not quite that bad, since it does add some special effects; if it cost £20 to do that you'd be all right with it, but people expect rather more for so much money.

Personally I don't even care that much about the graphics; if it really were a physics card you'd see no direct effects, you'd just experience much better gameplay and feel in the game - then I'd be tempted :)


Hrm, BF2 dune buggies with GT4 physics would be pretty good, along with handling that varies according to damage.
 
Fulcrum said:
Hrm, BF2 dune buggies with GT4 physics would be pretty good, along with handling that varies according to damage.

That's what I thought the whole idea of a physics card was: to enhance the existing in-game physics, taking the workload off the processor and increasing framerates... not just using the card to add extra eye candy.
 
I'm sure it will eventually... GRAW was just a big-name title that they gave paper support to so they could slap a big PhysX logo on it; I thought this was common knowledge?!

Suman
 
Sumanji said:
I'm sure it will eventually... GRAW was just a big-name title that they gave paper support to so they could slap a big PhysX logo on it; I thought this was common knowledge?!

Suman

Yeah they sort of forgot to let us know that bit :)

Anyway, looking at the results from the tests, we can see that, even after a publicised update to the driver, there are no improvements.

I hope PhysX are not trying to dig themselves into a hole... it's such an early time in the PPU world, but they're not doing their best to manage things at the moment. It's pretty bad of them to imply that the beta driver will solve the problems with the PPU; it really needs more thorough testing before release... even for a beta.

Anyway, as I said in my earlier posts, it is still early days in the PPU world; let's hope things improve... and fast!
 
Well, OcUK have them in stock now, but 256MB versions will be out from Asus by the end of the month. Plus, from reading the results people are getting with the few games that do support it, frame rates suffer big time with PhysX turned on. Personally I am going to wait for three things to happen before I consider buying one:

1. More games come out which support it,
2. The framerate issues are solved,
3. Prices drop dramatically.

Incidentally, what does the fan on these PhysX cards sound like? Does it speed up when the card is being used, or is it at 100% all the time?
 
Muzy said:
Guys, is the PPU worth buying, or should I wait for more games to support this technology?

I think the general consensus on this technology is that it is far too young. If I were you I would wait a few years to buy a card, because at the moment it's pretty pointless.
 
Totally not worth it imo. I'd like to see more games getting destructible environments, but atm no games do, and it's an awful lot of money to pay in the hope you can take advantage of it in the future.

I read UT2007 is getting destructible environments, but a year after that's out the graphics cards alone will do it. The last physics-based effect I heard of was ragdoll (in the UT2004 engine), and that's part of 3DMark03, a graphics power test.
 
Cyber-Mav said:
The card does seem a little pointless at the moment.

I wonder how Andy Gibson is trying to promote this card to get more sales. Would be interesting, seeing as it's currently a decelerator card :p
Probably with a thread saying "OMG NEVER REPEATED FOR DEFINITE THIS TIME - PhysX only £182.95!!! (Previously sold for £184.95).. While stocks last!!" ;)

I hope that doesn't count as me being too sassy. :)
 