My Test of GRAW running with and without PhysX

Hey this is good, try and get some official feedback...that would be awesome!!!

Also try and get the guy to come on to this thread and post :) That would be a bonus!
 
To me, a lot of the "extra" effects you get in GRAW because you have a PhysX card can easily be done on a standard system anyway. I've used the CryEngine from the makers of Far Cry, and using the Sandbox level editor I've been able to produce some cool physics interactions. True, the CPU struggles a bit once you go over 100 objects interacting at once, but the key is to stagger the interactions.

I've made a level with several thousand barrels that all have physical properties, by making sure that no more than 100 or so are interacting heavily at once (it's basically stacked barrels that go into a domino effect and push the next set of barrels over).
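If it helps, here's a toy Python sketch of that staggering idea (made-up code, nothing from the actual Sandbox editor's API): bodies "sleep" for free until something moving comes near, so 1000 barrels never means 1000 active simulations.

Code:
# Toy version of the "keep most of them idle" trick: bodies sleep
# until a moving neighbour reaches them, so only a small set is
# simulated each frame even though thousands of objects exist.
class Barrel:
    def __init__(self, x):
        self.x = x           # 1D position, to keep the sketch simple
        self.vx = 0.0
        self.asleep = True   # sleeping bodies cost nothing per frame

def step(barrels, dt=1.0/60.0, wake_radius=1.0):
    active = [b for b in barrels if not b.asleep]
    for b in active:
        b.x += b.vx * dt
        b.vx *= 0.98                      # friction, so bodies settle
        # Linear scan just for the sketch; a real engine would use
        # spatial partitioning here.
        for other in barrels:
            if other.asleep and abs(other.x - b.x) < wake_radius:
                other.asleep = False      # the "domino" push
                other.vx = b.vx * 0.9     # crude momentum transfer
        if abs(b.vx) < 0.01:
            b.asleep = True               # settled: back to sleep
    return len(active)

barrels = [Barrel(float(i)) for i in range(1000)]
barrels[0].asleep, barrels[0].vx = False, 5.0   # knock the first one over

for frame in range(600):
    n = step(barrels)
    if frame % 120 == 0:
        print(f"frame {frame}: {n} active out of {len(barrels)}")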

Also, about the slow-moving debris in GRAW: this also happens in the CryEngine, but only when the CPU struggles with the number of physical objects (they'll travel in slow-mo, only to speed up towards the end of the interaction), so what does that tell you about the "power" of this card?

All this seems dubious to me.

Edit: I must admit, though, that the Cell Factor demo would slow to a crawl using the CryEngine and the CPU to calculate physics, so maybe it is just the GRAW implementation.
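For what it's worth, that slow-mo behaviour is exactly what you'd expect from a physics loop that caps its fixed substeps per rendered frame - a rough sketch (illustrative only, real engines differ in the details):

Code:
# Why an overloaded physics engine looks like slow motion: a fixed
# timestep loop with a cap on substeps per rendered frame. When the
# scene is too heavy, simulated time falls behind wall time.
FIXED_DT = 1.0 / 60.0    # physics always steps at 60Hz
MAX_SUBSTEPS = 2         # budget: at most 2 physics steps per frame

def run_frame(frame_time, acc, sim_time):
    acc += frame_time
    steps = 0
    while acc >= FIXED_DT and steps < MAX_SUBSTEPS:
        sim_time += FIXED_DT   # one physics step of simulated time
        acc -= FIXED_DT
        steps += 1
    return min(acc, FIXED_DT), sim_time  # drop time we couldn't simulate

wall, sim, acc = 0.0, 0.0, 0.0
for _ in range(150):         # heavy scene: rendering manages only 15fps
    wall += 1.0 / 15.0
    acc, sim = run_frame(1.0 / 15.0, acc, sim)

print(f"wall: {wall:.1f}s, simulated: {sim:.1f}s")
# -> wall: 10.0s, simulated: 5.0s - the world runs at half speed,
#    which is exactly the floaty slow-mo debris effect.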
 
Perhaps it is being restricted by the rather lacklustre (shared) 133MB/s of PCI bandwidth? Has anyone tested to see if having a PCI SoundBlaster or something installed at the same time could be too much for the old interface? :confused:
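Some very rough numbers on that theory (the per-object payload and object count are pure guesses, just to get an order of magnitude):

Code:
# Back-of-envelope check on the shared-PCI-bus theory.
PCI_PEAK = 133e6       # bytes/s, theoretical, shared with other PCI cards
BYTES_PER_OBJECT = 64  # assumed: position, orientation, velocity, flags
OBJECTS = 15000        # assumed size of a big GRAW debris explosion
FPS = 60

traffic = BYTES_PER_OBJECT * OBJECTS * FPS  # PPU -> CPU results, per second
print(f"{traffic/1e6:.0f}MB/s = {100*traffic/PCI_PEAK:.0f}% of the PCI bus")
# -> ~58MB/s, around 43% of a bus the card shares with any PCI sound or
#    network card, and real PCI rarely gets near its theoretical peak.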
 
G.O.A.T said:
Perhaps it is being restricted by the rather lacklustre (shared) 133MB/s of PCI bandwidth? Has anyone tested to see if having a PCI SoundBlaster or something installed at the same time could be too much for the old interface? :confused:

Yeah, I tried to say something similar in my previous posts in this thread... but no one can know for sure without something that can track the communication between the CPU/GPU/PPU.

The only way we could really tell is to benchmark it in a PCI-X slot and see if the extra bandwidth makes any difference to the speed.
 
G.O.A.T said:
Perhaps it is being restricted by the rather lacklustre (shared) 133MB/s of PCI bandwidth? Has anyone tested to see if having a PCI SoundBlaster or something installed at the same time could be too much for the old interface? :confused:

But at the same time, there's no space on most boards for the PhysX card to go in a PCI-E slot if you've got SLI, which it sounds like you might need to render all of the new eye candy :(
 
lowrider007 said:
Should I give them my number?

If they want to communicate with you, you should tell them how and when to get in touch...

I'm curious as to whether they think you are doing something wrong with regards to installation (i.e. they haven't found any frame rate drop) or whether they just want to keep you sweet, having purchased a £200 paperweight with a propensity to devalue itself if it is introduced to static.

If either of these is the case, the guy has zero social skills - you do not simply announce that you wish to phone someone and expectantly ask for their number. You explain your position to the stranger, you explain why it might be of benefit for you two to be in contact, and then you politely invite them to have a conversation with you about it...
 
I think there has been a rushed release of hardware; we're seeing a lot of this, even from hardware old-timers. I'm actually positive about the role of physics in games and would be pleased if this new hardware addition can supply more realism and immersion. It was a poor decision on Ageia's part to release the hardware without waiting for good software implementations. I hope they can recover from this, but I think a lot of damage has been done to consumers' perception of the need for the hardware. Some of the pre-release videos of upcoming games did look great, but the GRAW implementation is very lacklustre. I'll sit this one out for a while to see what happens. Ageia have certainly managed to deter all but the least shrewd from making a purchase until something conclusive is released to justify the cost.
 
May as well get a dual-socket AMD board, fit it with an Opteron 275, then plug an FPGA Opteron accelerator into the second socket and reprogram it with physics functionality.

That way you have physics on the HT bus rather than PCI, PCI-X or PCI-E...

It seems good enough for Cray to use it for the second CPU...
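Rough theoretical peaks for the buses being thrown around in this thread (the HT figure assumes a 1GHz, 16-bit HyperTransport link, as on Opterons of that era):

Code:
# Theoretical peak bandwidth of the buses mentioned above
# (per direction where applicable; all figures are best-case).
buses = {
    "PCI 32-bit/33MHz (shared!)":   133,    # MB/s
    "PCI-E x1 (1.0)":               250,    # MB/s per direction
    "PCI-X 133":                    1066,   # MB/s
    "HyperTransport 16-bit/1GHz":   4000,   # MB/s per direction
}
for name, mb_s in sorted(buses.items(), key=lambda kv: kv[1]):
    print(f"{name:30s} {mb_s:5d} MB/s ({mb_s/133:.0f}x plain PCI)")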
 
There's a lot of angst here over a single game not running as expected on a piece of new hardware.

The game isn't the best-coded example anyway; there were numerous problems with it running multiplayer at a recent LAN event.

It seems easy to blame the hardware if the frame rate drops when the hardware is used, but it could be several things. No doubt we'll see in-depth articles appearing on Tom's Hardware or AnandTech before long, giving their thoughts and hopefully feedback from game developers / Ageia peeps.

But referencing the PC Perspective article, http://www.pcper.com/article.php?aid=245, it would appear that Cellfactor will provide a better demonstration than Ghost Recon. I don't want to sound churlish, but just trashing the PhysX card for any reason you can think of seems kinda juvenile right now.
 
andyf said:
There's a lot of angst here over a single game not running as expected on a piece of new hardware.

The game isn't the best-coded example anyway; there were numerous problems with it running multiplayer at a recent LAN event.

It seems easy to blame the hardware if the frame rate drops when the hardware is used, but it could be several things. No doubt we'll see in-depth articles appearing on Tom's Hardware or AnandTech before long, giving their thoughts and hopefully feedback from game developers / Ageia peeps.

But referencing the PC Perspective article, http://www.pcper.com/article.php?aid=245, it would appear that Cellfactor will provide a better demonstration than Ghost Recon. I don't want to sound churlish, but just trashing the PhysX card for any reason you can think of seems kinda juvenile right now.

Agreed, however the perceived line between genuine observation and outright slating seems somewhat blurred here. In this day and age you'd hope that while a new product would be hyped and marketed, it would also be released with genuine developer support. Surely enough mistakes have been made by previous product launches for them not to be repeated?

From an outside perspective it would seem that the PhysX release was premature, and Ageia are slowly becoming a victim of their own initial marketing success.

The new technology should be lauded, the implementation and initial introduction to the market should not.
 
LoadsaMoney said:
That's criminal. :(

Ripoff of the century, and they will still sell in droves. :(


Aye lad, the saying "a fool and his money are easily parted" comes to mind.

It doesn't even improve the looks of the game that much.

For the cash you're spending on it, why not get a 360?
 
Lanz said:
Havok Sounds Off On Ghost Recon AGEIA Physics

http://www.firingsquad.com/news/newsarticle.asp?searchid=10096

And there's a link to this very thread :)

Yep, it seems OcUK has a lot of eyes on it - this thread in particular.

I can understand Havok's contention in this issue; it seems like their product is doing most of the work yet is getting glossed over...

Would love to see AGEIA's comments on the slowdown...
 
cleanbluesky said:
Yep, it seems OcUK has a lot of eyes on it - this thread in particular.

I can understand Havok's contention in this issue; it seems like their product is doing most of the work yet is getting glossed over...

Would love to see AGEIA's comments on the slowdown...

Agreed, especially following on from Havok's statement:

"Multiple direct tests on the game by using NVIDIA’s and ATI GPUs indicate the GPU has room to spare and in fact, if the PPU is factored out of the game, that the particle content generated by the PPU can easily be drawn at full game speeds by the GPU."

Food for thought...
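A quick sanity check on that claim (the particle count and GPU figure are guesses, and fill rate rather than vertex rate is the real cost for particles, but the order of magnitude holds):

Code:
# Sanity-checking Havok's "GPU has room to spare" claim with
# made-up but generous numbers.
particles = 20000            # a big debris/particle burst
verts_per_particle = 4       # drawn as textured quads
fps = 60

vertex_load = particles * verts_per_particle * fps
gpu_peak = 600e6             # a 2006 mid-range GPU pushes hundreds
                             # of millions of vertices per second

print(f"particle load: {vertex_load/1e6:.1f}M verts/s "
      f"({100*vertex_load/gpu_peak:.1f}% of GPU peak)")
# -> 4.8M verts/s, under 1% of peak: drawing the particles is cheap,
#    it's generating/updating them that the PPU is supposed to help with.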
 
SteveOBHave said:
Agreed, especially following on from Havok's statement:

"Multiple direct tests on the game by using NVIDIA’s and ATI GPUs indicate the GPU has room to spare and in fact, if the PPU is factored out of the game, that the particle content generated by the PPU can easily be drawn at full game speeds by the GPU."

Food for thought...

What source/website etc. did you get that information from?

On the note of the GPU having room to spare, were they specific about which GPUs?
 