Graphics card physics

Wonder if the physics is still limited to 16/32-bit floating-point calculations rather than 64-bit.

Reason I say that is that the people who actually program GPUs have always said this precision limit is the sticking point for non-game-visual physics.
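If you want to see the sort of thing I mean, here's a rough C++ sketch (purely illustrative, nothing GPU-specific): accumulate millions of tiny timesteps in 32-bit floats alongside 64-bit doubles and the single-precision total visibly drifts.

Code:
// Illustrative only: 32-bit vs 64-bit accumulation drift over many
// tiny physics timesteps. Nothing here runs on a GPU.
#include <cstdio>

int main() {
    const int steps = 10000000;         // lots of tiny sub-steps
    const float  dtf = 1.0f / 6000.0f;  // 32-bit timestep
    const double dtd = 1.0 / 6000.0;    // 64-bit timestep

    float  totalF = 0.0f;
    double totalD = 0.0;
    for (int i = 0; i < steps; ++i) {
        totalF += dtf;                  // rounds on every addition
        totalD += dtd;
    }

    // Exact answer is 10000000/6000 = 1666.666...; the float drifts visibly.
    std::printf("32-bit: %f\n64-bit: %f\n", totalF, totalD);
    return 0;
}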
 
Ageia = complete fad that will hopefully die out quicker than the vegetarian lion (990BC-899BC).

On-board PPU for the win.
 
Richdog said:
Ageia = complete fad that will hopefully die out quicker than the vegetarian lion (990BC-899BC).
On-board PPU for the win.
LoadsaMoney said:
Yup, what is the point of Ageia PhysX cards when you can get PPUs already on yer graphics card.

From Article said:
WE HEAR THAT the red team has prepared a live demo of some fancy physics on Crossfire cards.

We suspect you'll need at least two cards to show off the real power and assist the CPU in performing the cool-looking stuff.

The solutions on the table from Ageia, Nvidia and ATI all seem to require an extra add-on board, and if you think either ATI or Nvidia will be putting a PPU on their graphics cards any time soon, I think you will be bitterly disappointed. Look at the size of the cards... where can you put an extra chip, which will require extra cooling etc.? Don't expect them to be onboard until they start using dual-core GPUs.
 
But utilising a whole second card for physics is a good idea.

Tbh, if Nvidia's solution works, I'd buy me a second 7800GT.
 
Goksly said:
The solutions on the table from Ageia, Nvidia and ATI all seem to require an extra add-on board, and if you think either ATI or Nvidia will be putting a PPU on their graphics cards any time soon, I think you will be bitterly disappointed. Look at the size of the cards... where can you put an extra chip, which will require extra cooling etc.? Don't expect them to be onboard until they start using dual-core GPUs.

er... can you provide links to back up this statement?

I have done a cursory Google search and found nothing to back this up. What I think you will see is a push from both companies to work on their Crossfire/SLI systems and utilise one of the GPUs for a similar purpose... sell more video cards... A dangerous game though, as they would run the risk of pricing themselves out of the market and falling for the 'artificial' market created by Ageia. If all of a sudden you had to pay another £300-odd for another VGA card to get decent physics and gameplay reproduction, a £180 offering from Ageia might be a sweeter deal. I would be surprised if both Nvidia and ATI went down this track...
 
Erm... suspect you'll need 2x cards to show off the power = Xfire.

Where are they going to put a GPU and a PPU on one card, and what about cooling etc.? Erm... the 7950 GX2 is 2x GPUs on one card. If they can get 2x GPUs on one card, can they not get 1x GPU and 1x PPU on there as well? And wouldn't 2x GPUs need more cooling than 1x GPU + 1x PPU anyway? :confused:
 
SteveOBHave: The first part is covered by the OP's link. The second part is an observation.

Money: well, technically they aren't one card... but I suppose that's a decent enough answer :p But if you look at it from a frame rate point of view, having the 2nd GPU calculate physics will result in a theoretical 50% frame rate drop (minus whatever strain the PPU takes off the CPU, allowing it to process more frames).

But er, yeah... I don't see the difference between the Nvidia/ATI solution and the Ageia one. Both require an extra chip and board (whether it's connected to the mobo or not), and both are going to have to put a lot of R&D into these methods - do you think they are going to bundle that for free? I wouldn't be surprised if they just undercut Ageia by £10 or something.

Not too sure why you guys are dissing the Ageia way and bigging up the unproven methods. I have little doubt in my mind that if the Nvidia one etc. were to run GRAW, the effects and outcome would be identical. People are confusing the PPU with the software; it's the latter that creates the visual representation of the data the former produces. A very big mix-up.
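To show what I mean by that split, here's a rough C++ sketch (all the names are made up, not from Ageia's or anyone else's actual SDK): the game only consumes the positions a physics backend hands back, so whether a CPU, GPU or PPU did the sums shouldn't change what ends up on screen, only how fast.

Code:
// Hypothetical sketch of the PPU-vs-software split; none of these names
// come from a real SDK. The game consumes body transforms and never
// cares which device produced them.
#include <cstdio>
#include <vector>

struct BodyState { float x, y, z; };

// Abstract backend: could be implemented on a CPU core, a GPU, or a PPU.
class PhysicsBackend {
public:
    virtual ~PhysicsBackend() = default;
    virtual void step(float dt, std::vector<BodyState>& bodies) = 0;
};

// Reference CPU implementation: trivial gravity integration.
class CpuBackend : public PhysicsBackend {
public:
    void step(float dt, std::vector<BodyState>& bodies) override {
        for (auto& b : bodies)
            b.y -= 9.81f * dt;   // simplistic free fall
    }
};

// The "software" half: turns whatever data the backend produced into visuals.
void render(const std::vector<BodyState>& bodies) {
    for (const auto& b : bodies)
        std::printf("draw body at (%.2f, %.2f, %.2f)\n", b.x, b.y, b.z);
}

int main() {
    CpuBackend backend;                    // swap in a GPU/PPU backend here
    std::vector<BodyState> bodies{{0, 10, 0}, {1, 5, 0}};
    backend.step(1.0f / 60.0f, bodies);    // hardware-agnostic physics step
    render(bodies);                        // identical visuals either way
    return 0;
}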
 
Goksly said:
SteveOBHave: The first part is covered by the OP's link. The second part is an observation.

Money: well, technically they aren't one card... but I suppose that's a decent enough answer :p But if you look at it from a frame rate point of view, having the 2nd GPU calculate physics will result in a theoretical 50% frame rate drop (minus whatever strain the PPU takes off the CPU, allowing it to process more frames).

But er, yeah... I don't see the difference between the Nvidia/ATI solution and the Ageia one. Both require an extra chip and board (whether it's connected to the mobo or not), and both are going to have to put a lot of R&D into these methods - do you think they are going to bundle that for free? I wouldn't be surprised if they just undercut Ageia by £10 or something.

Not too sure why you guys are dissing the Ageia way and bigging up the unproven methods. I have little doubt in my mind that if the Nvidia one etc. were to run GRAW, the effects and outcome would be identical. People are confusing the PPU with the software; it's the latter that creates the visual representation of the data the former produces. A very big mix-up.

Not at all m8, we are well aware that part of the failing is in the implementation of the PhysX engine within the game. However, we are also aware that a fair amount of the onus should fall squarely on Ageia's shoulders: that of releasing hardware that has little or no apparent supported application. Surely if you release a product, you do it on your own terms, not based on someone else's - i.e. GRAW or Cell Factor.

Regardless of what the PhysX card is capable of, we have not seen decent results, and as such draw our conclusions from the vehicles supplied.

As for a difference between the Ageia implementation and the ATI/Nvidia implementation, I also see very little, and see them as equally poor from an investment-vs-rewards standpoint. Either way I'll be aggravated by the requirement to spend another £200-£300 to keep my machine on par.

Something that Ageia have to remember is that perception is reality, and all we perceive is a lacklustre offering.
 
It's a bit too early to be slating Ageia just yet, considering it really is down to how game developers use Ageia's SDK, and we have only seen two examples of this, those being GRAW and CellFactor.

There were rumours that Microsoft were going to make a physics API for game developers to write their games around. I heard that rumour on here, though, and have no hard proof or links to back it up.

I doubt we will see actual PPUs on cards for quite a while yet, not until the likes of nVidia and ATi are producing GPUs on smaller processes and power consumption is down.
 
SteveOBHave: fair enough... I feel the same; I'd rather the calculations be carried out by the CPU, with dev teams using the full power of an extra core if there is one available... as soon as quad-core CPUs start to paint the town red - and with game devs saying that properly multi-threading games is very difficult - well, just send the physics calculations to the extra cores.
I'm not sticking up for Ageia (although it prolly looks like it :p); I just think that none of the stuff out there shows what it might be good at. UT2007 was the game they touted as showing huge differences, as that uses it for what it's good at and had it designed in from the ground up. That could be the make or break.

I think if CPU cores are not utilised, then external PPU cards (either glued to the GFX card or plugging into the mobo) are going to become all the rage... doh ;/
 
The maths involved in physics calculations is very similar to that involved in graphical calculations, so using a 2nd core for physics would run into the exact same problem as saying, "let's ditch our expensive graphics cards and use a 2nd/3rd/4th core for calculating all our graphical needs". It doesn't work, because CPUs are just no good compared to dedicated hardware.

The obvious problem with this analogy is that we can carry on gaming quite happily without any (or with poor) physics, whereas our games would not be so great without any graphics. The principle still stands though :)
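For anyone wondering why the maths is so similar, here's a rough sketch (simple point particles only, no collision detection, made-up code): every particle gets the same independent multiply-and-add, which is exactly the shape of work SIMD units and GPUs are built for and a general-purpose core grinds through one element at a time.

Code:
// Minimal sketch of why physics maths maps to graphics-style hardware:
// the same independent multiply-add runs over every particle, with no
// branching or cross-element dependencies. (Point particles only; real
// engines add collision detection, which is harder to parallelise.)
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

void integrate(std::vector<Particle>& ps, float dt) {
    for (auto& p : ps) {          // each iteration is independent:
        p.vy -= 9.81f * dt;       // gravity on the velocity...
        p.px += p.vx * dt;        // ...then position from velocity.
        p.py += p.vy * dt;        // Pure multiply-adds, ideal for
        p.pz += p.vz * dt;        // SIMD/GPU-style dedicated hardware.
    }
}

int main() {
    std::vector<Particle> ps(100000, {0, 100, 0, 1, 0, 0});
    for (int frame = 0; frame < 60; ++frame)
        integrate(ps, 1.0f / 60.0f);   // one second of simulation
    return 0;
}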
 
Aye, don't get me wrong... a purpose-built solution will always be better than something that was designed to do many tasks. What I was trying to get across is that multi-core CPUs are likely to become more affordable/popular than multi-GPU (or GPU + PPU) solutions. Game developers have been talking about how hard multi-threading games is, whereas physics calculations can already be separated out (as demonstrated by the feasibility of the PPU, or of a GPU acting as one) - so that's probably a good starting point. Having the CPU calculate the physics might mean the number of calculations is cut in half or more compared with the PPU... but that's still much more than just one CPU core could manage :P

Of course, with the PS3 and Xbox 360 both being multi-core platforms, maybe they will come up with a more efficient way to multi-thread games... hmmm :)
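Something like this is the starting point I mean - a rough C++ sketch (plain std::thread, nothing console-specific, names made up): the physics step runs on a spare core while the main thread gets on with rendering, syncing once per frame. A real engine would keep a persistent worker and double-buffer the state.

Code:
// Minimal sketch of running physics on a spare core while the main
// thread "renders". One worker per frame keeps it simple.
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Body { float y, vy; };

void physicsStep(std::vector<Body>& bodies, float dt) {
    for (auto& b : bodies) {
        b.vy -= 9.81f * dt;
        b.y  += b.vy * dt;
    }
}

int main() {
    std::vector<Body> bodies(1000, {100.0f, 0.0f});
    for (int frame = 0; frame < 3; ++frame) {
        // Hand the physics work to another core for this frame...
        std::thread physics(physicsStep, std::ref(bodies), 1.0f / 60.0f);

        // ...while the main thread gets on with drawing the previous
        // frame's state (stand-in for the real renderer).
        std::printf("frame %d: rendering...\n", frame);

        physics.join();   // sync point: physics ready for the next frame
    }
    std::printf("body 0 ended at y = %.2f\n", bodies[0].y);
    return 0;
}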
 
cleanbluesky said:
Might make upgrades more worthwhile... buy a new card and use the old one for physics rather than purchasing two X1900s
I can see it now:
"For <physics marketting name> to work, both cards must be of the same kind..." :p
Also, it would probably force a lot of people to have SLI/CF mobos... but by the time they finally roll it out, that could be standard, I guess :)
 
Goksly said:
I can see it now:
"For <physics marketting name> to work, both cards must be of the same kind..." :p
Also, it would probably force a lot of people to have SLI/CF mobos.... but by the time they finally roll it out that could be a standard i guess :)

...and thus pricing out the majority of new and mid-range PC gamers... Can't say as I am liking the way this trend is going... :-/
 
This makes me feel ill... and I think I'll blame Ageia for creating an artificial market... ready your already thin wallets, guys and gals.

The Register said:
By the end of the year, Cheng said, Havok expects to have realistic liquids, cloth and even hair effects made possible through its API and GPU acceleration. In the same timeframe, Cheng said he expects motherboards based on ATI's RD600 chipset to be shipping with room for three graphics cards.

ATI can burn in hell. :mad:

http://www.reghardware.co.uk/2006/06/06/ati_gpu_physics_pitch/
 
It isn't an artificial market though? The next step in gaming is going to be realism... Think about it - the graphics have improved loads, but in physics only the Havok engine has made a decent stab at making interaction realistic. I want to be able to shoot into a brick and see a chip taken out of it, not just a bmp stuck on there representing one.

Multi-core CPUs might be an answer; I'm not sure how bad they are at pure calculations compared to something like the Ageia card, but the sheer lack of talk on this point probably means they are a fair bit worse. Pricing will come down - more competition and all that.
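If anyone fancies putting a rough number on the 'pure calculations' side, here's a made-up micro-benchmark sketch - it only measures the CPU half, obviously; the PPU figure you'd have to take from whatever the vendors quote.

Code:
// Rough sketch for gauging raw CPU throughput on physics-style maths.
// Made-up micro-benchmark: times a batch of particle updates and prints
// a sustained-GFLOPS estimate for this machine's CPU only.
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    const int n = 1000000, steps = 100;
    const float dt   = 1.0f / 60.0f;
    const float g_dt = 9.81f * dt;          // gravity * timestep, pre-folded
    std::vector<float> pos(n, 0.0f), vel(n, 1.0f);

    auto t0 = std::chrono::steady_clock::now();
    for (int s = 0; s < steps; ++s)
        for (int i = 0; i < n; ++i) {
            vel[i] -= g_dt;                  // 1 flop
            pos[i] += vel[i] * dt;           // 2 flops
        }
    auto t1 = std::chrono::steady_clock::now();

    double secs  = std::chrono::duration<double>(t1 - t0).count();
    double flops = 3.0 * n * steps;          // ~3 flops per element per step
    std::printf("checksum %.1f, ~%.2f GFLOPS sustained\n",
                pos[0], flops / secs / 1e9); // checksum stops dead-code elim
    return 0;
}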
 
Goksly said:
It isn't an artificial market though? The next step in gaming is going to be realism... Think about it - the graphics have improved loads, but in physics only the Havok engine has made a decent stab at making interaction realistic. I want to be able to shoot into a brick and see a chip taken out of it, not just a bmp stuck on there representing one.

Multi-core CPUs might be an answer; I'm not sure how bad they are at pure calculations compared to something like the Ageia card, but the sheer lack of talk on this point probably means they are a fair bit worse. Pricing will come down - more competition and all that.

Well, it's not anymore. Until Ageia came along, the concept of purchasing an extra GPU card to get physics wouldn't have been considered, much as very few of us would consider splurging on a quad-SLI rig. It's gutting to think that as consumers we have fallen into the idea that we 'need' any of this - or, more accurately, been told we 'need' this.

I believe that it comes back to a previous point: ATI and Nvidia are happy to provide the vehicle for lazy coding so that they can sell more pointless high-end hardware to the dupes with more money than brains (nasty premonition of Steve spending his hard-earned cash on an extra vid card for physics ;) ).

ATI and Nvidia have seen the market created by Ageia and gone "We'll have a piece of that", especially when ATI are talking about using the X1600 chipsets, which price in at around £60-£100, thus pricing PhysX out of the market...
 