So Carmack doesn't believe in PPUs...

JOHN CARMACK appears to have put another nail in the coffin of dedicated physics add-in cards by suggesting that he doesn't believe they really add anything to a system that multi-core CPUs can't also do.

Speaking to BootDaily, Carmack said, "I am not a believer in dedicated PPUs. Multiple CPU cores will be much more useful in general, but when GPUs finally get reasonably fine grained context switching and scheduling, some tasks will work well there."

Carmack, like Intel, Nvidia and DAAMIT, appears to think that using GPU and CPU capacity to handle this stuff is a much better idea than requiring an extra add-in card. Indeed, the upcoming Valve title Half-Life 2: Episode 2 and Crytek's Crysis both do advanced physics calculations across CPU cores, and Carmack's decision to go the same way may tip the scales.

In contrast, Unreal Engine 3 uses PPU hardware, but there are still scant details on how, exactly.

Currently, PPU owners can enjoy Ghost Recon 2 in its fancy-physics glory, but that's about it. How long can Ageia wait for UE3 to save the day?

http://www.theinquirer.net/default.aspx?article=41102

Another good reason not to waste your money ;)

/runs away
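As an aside, here's a minimal sketch (not from the article, just an illustration) of the "physics across CPU cores" idea: split a simple integration step over worker processes, one per core. The particle count, time step and collision handling are arbitrary example values.

```python
# Illustrative sketch only: spreading a simple physics integration step
# across CPU cores, in the spirit of doing physics on multi-core CPUs
# rather than on a dedicated PPU. All values are arbitrary examples.
from multiprocessing import Pool
import os

DT = 1.0 / 60.0      # one 60 Hz frame
GRAVITY = -9.81      # m/s^2

def integrate_chunk(chunk):
    """Semi-implicit Euler step for a list of (height, velocity) particles."""
    out = []
    for y, vy in chunk:
        vy += GRAVITY * DT
        y += vy * DT
        if y < 0.0:              # crude ground bounce
            y, vy = 0.0, -0.5 * vy
        out.append((y, vy))
    return out

if __name__ == "__main__":
    particles = [(float(i % 50), 0.0) for i in range(100_000)]
    workers = os.cpu_count() or 2
    size = (len(particles) + workers - 1) // workers
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with Pool(workers) as pool:
        particles = [p for c in pool.map(integrate_chunk, chunks) for p in c]
    print(f"Stepped {len(particles)} particles on {workers} cores")
```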
 
Physics cards will almost certainly go the way of add-on maths co-processors, for those of us old enough to remember those.


A more interesting question is: as cores get smaller and more of them are added to each CPU, how long before all the gfx are on the CPU, rather than a dedicated GPU?


M
 
Meridian said:
Physics cards will almost certainly go the way of add-on maths co-processors, for those of us old enough to remember those.


A more interesting question is: as cores get smaller and more of them are added to each CPU, how long before all the gfx are on the CPU, rather than a dedicated GPU?


M


The GPU-on-CPU idea won't work IMO. The memory bandwidth on graphics cards is around 10x that of system memory, and they can't get 1 or 2GB of it into a 2x2cm square.
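Rough back-of-the-envelope numbers behind that bandwidth point, just to illustrate the gap; the bus widths and clocks below are approximate, era-typical assumptions (an 8800 GTX-class card versus dual-channel DDR2-800), not figures from this thread.

```python
# Back-of-the-envelope memory bandwidth comparison, graphics card vs
# system RAM. Figures are approximate, era-typical assumptions used
# only to illustrate the gap described above.
def bandwidth_gb_s(bus_bits, transfers_per_s):
    return bus_bits / 8 * transfers_per_s / 1e9

gpu = bandwidth_gb_s(384, 1.8e9)     # ~384-bit bus, ~1800 MT/s GDDR3
system = bandwidth_gb_s(128, 800e6)  # dual-channel (2 x 64-bit) DDR2-800

print(f"GPU card: ~{gpu:.1f} GB/s")      # ~86 GB/s
print(f"System:   ~{system:.1f} GB/s")   # ~13 GB/s
print(f"Ratio:    ~{gpu / system:.1f}x")
```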


High-end CPUs and GPUs will always remain separate IMO; low-power stuff in laptops, possibly :)


PhysX was a gimmick from day one. A year after it came out and there's still NOTHING to warrant the £200.

I'll stick to my Mac Pro, thanks :)
 
Meridian said:
A more interesting question is: as cores get smaller and more of them are added to each CPU, how long before all the gfx are on the CPU, rather than a dedicated GPU?


M

Isn't that what Intel's 'Larrabee' is trying to do?

EDIT: I went to edit but it posted another reply instead, sorry :(.
 
Pretty sure both "Larrabee" and "Fusion" will just be low-powered solutions to begin with (like integrated graphics).

Interesting to see where it goes in the future though.
 
Well, those expecting something like an 8800 on a C2Q will be disappointed, unless Intel develops something that can withstand around double the heat output.

Think about it: an 8800 GTX + Q6600 would need a Tmax of around 175°C. Not gonna happen just yet, I don't think! :o :D
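To put the heat argument in rough numbers (the TDP figures below are approximate, commonly quoted values and are my assumptions, not specs from this thread), a combined part would have to dissipate both power budgets through one package:

```python
# Rough illustration of the combined heat problem for a hypothetical
# GPU-on-CPU part. TDP figures are approximate assumptions, not specs
# quoted in the thread.
GPU_TDP_W = 175   # roughly 8800 GTX-class board power
CPU_TDP_W = 105   # roughly Q6600 (B3 stepping) TDP

print(f"Combined power to dissipate: ~{GPU_TDP_W + CPU_TDP_W} W")
# ~280 W through a CPU-socket-sized package is far beyond what
# typical air coolers of the time were built to handle.
```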

*Hauls thread back on track*
 
Concorde Rules said:
The GPU-on-CPU idea won't work IMO. The memory bandwidth on graphics cards is around 10x that of system memory, and they can't get 1 or 2GB of it into a 2x2cm square.


High-end CPUs and GPUs will always remain separate IMO; low-power stuff in laptops, possibly :)


PhysX was a gimmick from day one. A year after it came out and there's still NOTHING to warrant the £200.

I'll stick to my Mac Pro, thanks :)

ALWAYS remain separate huh? Do you honestly believe that? I wonder how many people 10 years ago said you'll never get more than one core in a processor. It's inevitable it will happen, just a matter of time.
 
Tute said:
UE3 is going to be "judgement day" for the PhysX. :o

UT3 is exclusive to PS3, last I heard.

Check gametrailers.com and look for the Gears of War PC interview. The developer talks about that engine and also the UT3 engine. He says it's exclusive to PS3.
 
Meridian said:
Physics cards will almost certainly go the way of add-on maths co-processors, for those of us old enough to remember those.


A more interesting question is: as cores get smaller and more of them are added to each CPU, how long before all the gfx are on the CPU, rather than a dedicated GPU?


M

Remember what happened to the maths co-processor: it became a standard part of the CPU, but it is still there. After all, the 486 was basically a tweaked 386 + 387 with some improvements.

Intel said a Pentium would be enough, so why would you need expensive graphics chips and disk controllers? Well, they are still around, and way ahead of CPU power.

Personally I can see physics chips appearing on GPU cards in much the same way that video processing is now handled: the card consists of the basic GPU for 3D/2D rendering, a VPU section for decoding HD video, and likely a PPU section with dedicated silicon for processing physics.

If I were Ageia I would be looking at transitioning to being an API company, getting the API adopted by Microsoft and licensing the PPU designs so they could be incorporated into AMD's and Nvidia's GPUs rather than sold as an expensive add-on board.
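Purely as a hypothetical sketch of what "being an API company" could look like (every name here is invented for illustration; this is not Ageia's actual API): one physics interface with swappable backends, so the same game code runs whether the physics lands on CPU cores, a GPU or a dedicated PPU.

```python
# Hypothetical sketch of a backend-agnostic physics API. All names are
# invented for illustration; only a trivial CPU path is implemented.
from abc import ABC, abstractmethod

class PhysicsBackend(ABC):
    @abstractmethod
    def step(self, bodies, dt):
        """Advance the simulation by dt seconds."""

class CpuBackend(PhysicsBackend):
    def step(self, bodies, dt):
        for b in bodies:
            b["vy"] -= 9.81 * dt
            b["y"] += b["vy"] * dt
        return bodies

def pick_backend():
    # A real runtime would probe for a PPU or a capable GPU here and
    # fall back to the CPU; this sketch only knows the CPU path.
    return CpuBackend()

backend = pick_backend()
scene = [{"y": 10.0, "vy": 0.0} for _ in range(4)]
scene = backend.step(scene, 1.0 / 60.0)
print(scene[0])
```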
 
ergonomics said:
UT3 is exclusive to PS3, last I heard.

Check gametrailers.com and look for the Gears of War PC interview. The developer talks about that engine and also the UT3 engine. He says it's exclusive to PS3.
No it's not exclusive, PC and PS3 versions due November, 360 version January.

- Mark Rein
 
Gashman said:
ALWAYS remain seperate huh? do you honestly believe that, wonder how many people 10 years ago said youll never get more than one core in a processor, its inevitable it will happen, just a matter of time

He did say High End after all.

Mainstream to low-end graphics will integrate; in some ways they already have. After all, Intel has the biggest slice of the graphics market and doesn't make anything other than the graphics in its chipsets, and AMD and Nvidia also offer IGP chipsets in addition to their add-in GPUs. Will there be a benefit to moving the graphics from the chipset to the CPU? It is still going to need to go off-chip for memory. How big would the 965G be, and how much heat would it put out, if it incorporated 8800 GTX Ultra-class graphics rather than an entry-level graphics system?

Personally I cannot see the high-end GPU disappearing, as I can't see it becoming cost-effective to put 8800 GTX Ultras into every CPU. After all, AMD and Nvidia are already starting to use these for scientific calculations, the so-called GPGPU market; Tesla is just 8800 GPUs running different software.

General-purpose CPUs just aren't as efficient as dedicated GPUs, so you either have to bolt on lots more cores to make up for the inefficiency (and end up with a huge chip), or you end up putting silicon dedicated to graphics into the CPU, the same as we now do for FPUs and as AMD has done with the memory controller, the development of which is somewhat more stable.

High-end GPUs also use much faster memory than CPUs, so the high end would be throttled by the slower standard memory that the CPU uses. Whilst CPU memory speed is increasing, there is still a huge gap, and it is growing.
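A very rough peak-throughput comparison illustrates that efficiency gap; the figures below are approximate, era-typical assumptions (an 8800 GTX-class GPU against a 2.4 GHz quad-core), not measured numbers.

```python
# Very rough peak single-precision throughput comparison. All figures
# are approximate, era-typical assumptions, not measured results.
gpu_gflops = 128 * 1.35 * 2   # ~128 shaders x ~1.35 GHz x 2 flops (MAD)
cpu_gflops = 4 * 2.4 * 8      # 4 cores x 2.4 GHz x ~8 SSE flops/cycle

print(f"GPU peak: ~{gpu_gflops:.0f} GFLOPS")
print(f"CPU peak: ~{cpu_gflops:.0f} GFLOPS")
print(f"Ratio:    ~{gpu_gflops / cpu_gflops:.1f}x")
```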
 
Jihad said:
No it's not exclusive, PC and PS3 versions due November, 360 version January.

- Mark Rein

Hmmmm, interesting.

Well, the engine developer does imply it's PS3-exclusive in the way he speaks about it. Maybe it's me misunderstanding him, but you'd think it was from how he words it...
 
ergonomics said:
Hmmmm, interesting.

Well, the engine developer does imply it's PS3-exclusive in the way he speaks about it. Maybe it's me misunderstanding him, but you'd think it was from how he words it...
When it was announced as exclusive they only meant consoles, not PC.

He elaborated on it further in an interview: the PS3 has it exclusively this year (console-wise) and the 360 gets it in January.
 
Maybe he's realised that the £120 for a PhysX card could have been spent on something useful, like half the money for an 8800 GTS ;)
 