
Nvidia GPU PhysX almost here

“then silly ideas like this, will always be on the horizon”
If it boosts physics-heavy games (like the high-physics levels in UT) from 10fps up to playable framerates, it's not a silly idea. If the GPU can do something way faster than the CPU, why is it silly to do the job on the GPU?
 
Yes, why not give it a negative spin. All of that is great, don't get me wrong, but they are going to want you to get another gfx card to do it. As I said above, this is Nvidia here; no one can tell me they are just going to enable the physics in a future driver, a driver that you do not have to pay for, so you'd effectively be getting physics for nothing if you already have an 8/9 series etc. card, a card that you could have bought nigh on 2 years ago. Yeah, right. :D

very true. The age old saying applies here: "you get nothing for free in this world"

nVidia are simply not going to give us the PhysX technology for free. If they do give us this technology in the form of a driver, then it will be a separate driver that you have to pay for. Which then leads to piracy, since people will just get the driver from other sources.
 
sure a few people on here will try and put a negative spin on this :rolleyes:

I'm not being negative. I think it is a good thing, and I always want something for free.

Perhaps my upgrade from a GT to a GTS with the extra stream processors will come into its own. :D

The encoding is really impressive, and in my case it ought to encode at least 20 times faster than my dual-core AMD. Not sure I would pay for that though, even £25. If encoding currently takes an hour and will then take only 3 minutes, it won't bother me. I just leave my computer encoding overnight anyway.
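The speedup arithmetic in that claim is easy to check; a trivial sketch (the one-hour encode and the 20x figure are the poster's own numbers, not measurements):

```python
# Encoding-time arithmetic from the post above: a claimed ~20x GPU
# speedup turns a one-hour CPU encode into about three minutes.
cpu_minutes = 60          # the poster's example encode time on CPU
claimed_speedup = 20      # "at least 20 times faster" than the dual-core AMD

gpu_minutes = cpu_minutes / claimed_speedup
print(f"{cpu_minutes} min on CPU -> {gpu_minutes:.0f} min on GPU")
# -> 60 min on CPU -> 3 min on GPU
```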
 
“Yep you will need a second card if you want to keep your framerates up. There are only so many stream processors and that is what is being used for the physics.”
Surely that depends on the game. Freeing up the CPU by moving physics off it will, in some games, give a bigger FPS boost than the drop from running physics on the GPU.

GPUs are fundamentally dumb.

The CPU needs to orchestrate the individual processing: it has to instruct the GPU to initiate data transfers (on and off the GPU) and to perform the transforms between data formats.

In addition, each data transfer (e.g. a texture load) between GPU memory and main memory takes a DMA, which steals bandwidth from the other GPUs and the CPU cores.

For example, the data rate for on-card operations on an X1950XTX is around 120GB/sec, whereas PCI-E DMA transfers are 1-2GB/sec max (which is still a lot to take out of an 8GB/sec main memory bus!).

Now you can see the reason for trying to keep as much as possible in GPU memory (hence larger GPU memory sizes). This reduces the impact, but increases the cost.
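To put those figures in perspective, a back-of-envelope comparison (the 120GB/sec on-card rate and the PCI-E DMA rate are from the post above; the 64MB working-set size is a made-up example):

```python
# Why physics data should stay in GPU memory: compare touching a
# working set on-card vs shuffling it over the PCI-E bus each frame.
GPU_MEM_BW = 120e9        # X1950XTX on-card bandwidth, bytes/sec (from post)
PCIE_DMA_BW = 1.5e9       # mid-range PCI-E DMA rate, bytes/sec (from post)
WORKING_SET = 64 * 2**20  # 64MB of physics data (hypothetical)

on_gpu_ms = WORKING_SET / GPU_MEM_BW * 1000
over_bus_ms = WORKING_SET / PCIE_DMA_BW * 1000

print(f"on-card:  {on_gpu_ms:.2f} ms")
print(f"over bus: {over_bus_ms:.2f} ms ({over_bus_ms / on_gpu_ms:.0f}x slower)")
```

At 60fps the whole frame budget is about 16.7ms, so even one bus round trip of that size per frame is untenable, which is exactly why you want the data resident on the card.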

Sure, if they can do realistic smoke modelling around the 3D scene (i.e. fluid dynamics interacting with the mostly static game environment). Things like opening a door, for example, should be reflected in the smoke.
However, the bandwidths required for this exceed what current memory buses can handle. It's fine calculating on the CPU, but offloading the smoke particle data to main memory and then rendering it takes serious CPU (rather than GPU) performance.
So I doubt an extra GPU could do this without reducing the data volumes (e.g. halving the particle count and then interpolating the smoke particles during rendering).
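As a rough illustration of why the bus becomes the bottleneck (every figure here is an assumption for the sake of arithmetic, not a measurement):

```python
# Hypothetical smoke-simulation numbers showing how per-frame particle
# readback can exceed a practical PCI-E DMA rate.
PARTICLES = 1_000_000   # smoke particles (assumed)
BYTES_EACH = 32         # position, velocity, density, etc. (assumed)
FPS = 60
BUS_BW = 1.5e9          # PCI-E DMA rate, bytes/sec (from the earlier post)

traffic = PARTICLES * BYTES_EACH * FPS   # bytes/sec copied every frame
print(f"readback: {traffic / 1e9:.2f} GB/s vs {BUS_BW / 1e9:.1f} GB/s bus")

# Halving the particle count and interpolating at render time, as the
# post suggests, brings the traffic back under the bus limit.
print(f"halved + interpolated: {traffic / 2 / 1e9:.2f} GB/s")
```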

Also, things will have to be supported for CPU-based gamers (the main market for the games companies), so it's really just the frilly bits. Is that worth an extra £200-400?
 
“very true. The age old saying applies here: 'you get nothing for free in this world'

nVidia are simply not going to give us the PhysX technology for free. If they do give us this technology in the form of a driver, then it will be a separate driver that you have to pay for. Which then leads to piracy, since people will just get the driver from other sources.”

I think you will get it for free. It's like bringing out a driver with AA or AF: just because you can switch it on doesn't mean it's usable at playable framerates, and that's sure to encourage you to upgrade your card :p

Say you get some killer games in the next year which make full use of physics, and your 9800 GTX handles it fine with physics off, but switched on in the driver it grinds to a halt. Isn't that going to make you buy the 9900 GTX which can do both (surmising here)?
 
It won't be free though. Nvidia have bought Ageia, i.e. spent money; they ain't going to give us what they bought for possibly millions in just an updated driver for nowt. I just can't see it. :)
 
“Also things will have to be supported for CPU based gamers (the main market for the games companies) so it's really just the frilly bits.. is that worth an extra £200-400??”

Won't be that much. Get your 9800 GTX or 9900 GTX for the main graphics and have an 8800 GTS or less for the particles. Look at the 8800 GTS results from the demo: 16 times quicker than a quad-core Intel. So say in 6 months' time you can put an 8800 GT in your system for £50, or just keep your old card rather than selling it when you upgrade.

If the games are out there and the results are worth it I will be tempted to keep my old card just for the physics.

Plus, with high-end cards with lots of stream processors, one card ought to be sufficient; well, at least until they release the patch for Crysis, that is.
 
“It wont be free though, Nvidia have bought Ageia, i.e spent money, they aint going to give us, what they bought for possibly millions, just in an updated driver for nowt, i just can't see it. :)”

Makes an Nvidia card more desirable to buy over an ATI card?

Isn't that what they spend millions on research for in the first place?
 
They're simply positioning themselves against Intel and its fabled Larrabee. They bought Ageia to get into physics systems just as Intel bought Havok, and they also bought the mental ray raytracing engine to position themselves on that front.
 
It will almost definitely be free. The physics just run on the CUDA pipeline that's applicable to all 8 and 9 series cards, and it won't have a huge performance hit on most games (maybe Crysis). Potentially it could be a very good companion for SLI: in SLI setups it's rare for the GPUs to be running above 70-80% usage, giving quite a bit of free processing to physics and better utilising the two cards you have bought than SLI for rendering alone would.
 
Yer won't be using yer old GTS with your 9800 GTX etc. if they do it via SLi though, will yer. ;)

If I remember, ATI's Havok was going to be done with Crossfire: if the game used Havok physics, it would use your 2nd card as the sole physics card, and to get both Crossfire and the physics you had to be using 3 cards (i.e. your 2 in Crossfire, with the 3rd being used solely for the Havok physics). Now, what's the chance of Nvidia going this route? Think about it: they are going to get a lot more £'s by making you spend £300 x3 on the exact same cards (as SLi can't mix and match like Crossfire, don't forget) than they would by people just buying years-old cards for £50 that are well out of production and using them to get the physics. :p

If they just sling the physics into an updated driver (which they've paid millions for) which you download for nowt, and make it run fantastic so there is no need for anything else (no second card, nothing), then I'm going to go out and buy a hat. :D
 
My guess is that they will have a 9x00 physics card and a 9x00 graphics card.

Whack some **** code in the physics card that only lets it do physics (for about 2 weeks, until someone hacks it) and vice versa.

And either the physics card or the graphics card will be priced higher than the other, and once they get hacked it will make Nvidia look like proper tools once again.
 
“Won't be that much. Get your 9800 GTX or 9900 GTX for the main graphics and have a 8800 GTS or less for the particles. Look at the 8800 GTS results from the demo - 16 times quicker than a quad core Intel. So say in 6 months time you can put a 8800 GT in your system for £50 or just keep it and not sell it when you upgrade.”

However, online games (by their nature) have to be fair to all who play, so they can't add additional content that affects gameplay and actually puts a section of their playing market at an online disadvantage (e.g. additional killer robots etc.).

True, it may be 16 times as quick on a tailor-made demonstration. Once inside an actual game engine the gain may not be as large.

Don't get me wrong: if it gives a fantastic feel to a game, great, but in the end it all boils down to gameplay.
 
“Yer won't be using yer old GTS with your 9800 GTX etc... if they do it via SLi though will yer. ;)”

Why on earth would you want to? If you mean the G92 GTS and the 9800 GTX, then that is possible if you want to fiddle (modified BIOS, etc.) and would probably work well enough if you clocked them the same; if you're talking about the old GTS, the performance difference would cause some hideous frametime instability.
 
Why on earth would you want to? Because you would get the physics cheaper than by buying another 9800 GTX. :confused:

You could use a 9800 GTX as yer main card and buy a cheap 8800 GT to do your physics (instead of another £200+ 9800 GTX). But you can't, as you can't pair an 8800 GT with a 9800 GTX; SLi is same-exact-cards-only crap. So how are you gonna use your 9800 GTX as your main card with your cheapo next-to-nowt 8800 GT doing the physics? You're not, are you. :p
 
Wonder if something like an 8600/8500/8400 would work alongside something like an 8800GTS?

Surely they would provide decent physics processing, especially when you compare their technical spec to the old PhysX cards, which were none too powerful in comparison! That way you could have dedicated physics processing in another PCI-E slot for as little as £20.

gt
 
Won't be possible though, will it. You can't put an 8400 alongside your 9800 GTX or whatever; it has to be another 9800 GTX, as only CrossFireX is mix and match. So unless they allow you to use an old card as a physics card without connecting it to your other card via SLi, you're screwed, and tbh I can't see them doing it like that, as that won't net them the most money. They'll do it via SLi so you have to buy another one of the exact same cards at another £200-£300 a pop, imo. This is Nvidia. :)
 
Well from a hardware level there's nothing stopping you running two PCI-E video cards - they are, after all, just another hardware component.

The limitation is probably the nVidia drivers, Windows, or a combination, probably born out of a marketing/accountant business model and a desire to keep it technically simple.
 
“Won't be possible though will it, you can't put an 8400 alongside your 9800 GTX or whatever, it has to be another 9800 GTX.”
Of course that would be the case if you were using SLi, but I'm talking about them doing different jobs: one graphical, the other physics.

Not sure if they plan to do this but it would be effective if they decided 1) that they could and 2) that they would!

I'd happily add anything up to an 8600 alongside my GX2 if their driver system allowed it.

gt
 
Am I missing a point here though? Does the Ageia PhysX card out now need SLI to work?

Therefore, unless Nvidia makes it so, why would you need them to be in SLI to work?

Hang on...... it's Nvidia....they probably will then :(

I suppose it will just rely on next-gen cards with a high number of stream processors to carry out the physics calcs.

People who go SLI with two matching cards ought to have more than enough horsepower to do both.

People who don't upgrade (from 8 series) will have a choice like now with how much AA/AF to apply and whether to switch on physics.
 