CPU vs GPU - why aren't CPUs as powerful?

As the title says really: using the GPU F@H client, it seems the GPU is hugely powerful, and I also hear something about CS4 utilizing GPU power to help along. So how come GPU technology can't be integrated into CPUs to make them quicker?
 
Most of the people in GD can't decide on what hairstyle or socks to go for, do you think this is the best place to discuss such a thing?! :D
 
GPUs are built for very specialized calculations, very useful for gfx and some maths, but woeful at the stuff your CPU is good at.

Just like your CPU is crap at gfx.


Like a tank and an F1 car: one will outdo the other on a track, but fail epically off-road.



Although they are working on GPGPU stuff in DX11, so it will be able to help with certain tasks.
 
Most of the people in GD can't decide on what hairstyle or socks to go for, do you think this is the best place to discuss such a thing?! :D


lol yeah, I did wonder that after seeing there's a bloomin' Big Brother topic in here!
Maybe I should have stuck it in General Hardware? Feel free, admins ;)
 
GPUs are built for very specialized calculations, very useful for gfx and some maths, but woeful at the stuff your CPU is good at.

Just like your CPU is crap at gfx.


Like a tank and an F1 car: one will outdo the other on a track, but fail epically off-road.



Although they are working on GPGPU stuff in DX11, so it will be able to help with certain tasks.

Hmmm, true. It would be interesting if they could use the GPU for those kinds of things alongside the CPU, like sharing the load - that would make a serious difference in computing power.
 
Hmmm, true. It would be interesting if they could use the GPU for those kinds of things alongside the CPU, like sharing the load - that would make a serious difference in computing power.

Depends: there are very few things you use your home PC for, other than gfx and folding, that need that kind of processing, so it wouldn't really help with anything.

Now, if you are a lab that uses very expensive supplementary maths processors, a GPGPU may be a lot cheaper than your current custom-built add-in cards.
 
As the title says really: using the GPU F@H client, it seems the GPU is hugely powerful, and I also hear something about CS4 utilizing GPU power to help along. So how come GPU technology can't be integrated into CPUs to make them quicker?

Heh, hard question and not possible to answer non-technically. GPUs are built for floating point operations (used a lot in gfx, as tefal says). The reason they come in handy is because the CPU's ALU deals with integer arithmetic and all the floating point operations can be offloaded to the FPU/GPU. GPU technology can of course be integrated into the CPU (which is exactly what integrated gfx chipsets do). The main problem here is memory: the CPU has register memory and cache memory (small amounts), but dedicated video memory is really needed, and that's what your video card provides. :p
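
If anyone wants to see what that memory point looks like in practice, here's a rough sketch using CUDA (purely illustrative, assumes you have the CUDA toolkit; the sizes and names are made up). The GPU works out of its own dedicated video memory, so anything you want it to crunch has to be copied across the bus first and the result copied back:

Code:
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

int main() {
    const size_t n = 1 << 20;                         // ~1 million floats (about 4 MB)
    float *host = (float*)malloc(n * sizeof(float));  // lives in system RAM, next to the CPU
    for (size_t i = 0; i < n; ++i) host[i] = 1.0f;

    float *dev = NULL;
    cudaMalloc((void**)&dev, n * sizeof(float));      // lives in the card's dedicated video memory
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);  // CPU -> GPU over the bus
    // ...a GPU kernel would do its work on 'dev' here...
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);  // GPU -> CPU back again

    cudaFree(dev);
    free(host);
    printf("done shuffling data between system RAM and video RAM\n");
    return 0;
}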

Hmmm, true. It would be interesting if they could use the GPU for those kinds of things alongside the CPU, like sharing the load - that would make a serious difference in computing power.

They do already. A supercomputer has been built using GPUs. What you're missing is that the circuitry on the GPU is not built for general-purpose tasks. The term you're kinda looking for is co-processor, anyhow.
 
A CPU is, at heart, a serial device.

A GPU is a massively parallel device.

A CPU is far more flexible than a GPU, but when a program CAN be adapted to work with a GPU (i.e. when it can be broken down into a massive number of parallel threads with relatively little communication between the threads), then the GPU will be many times faster. Relatively few applications lend themselves to such multi-threading, unfortunately.

A CPU gives over most of its transistors to control logic and caching, whereas a GPU uses almost all of its logic for raw calculation:


[Image: cpuvsgpuzk2.jpg (CPU vs GPU transistor allocation)]



To make an analogy: think of the CPU as a scalpel, and the GPU as a sledgehammer. You can break down a wall with a scalpel, but it will take a long time. However, you can never do brain surgery with a sledgehammer.
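
To put a bit of (hypothetical) code behind the scalpel/sledgehammer picture, here is a minimal CUDA sketch of the same job done both ways - the names and launch numbers are just for illustration. The CPU version is one core stepping through a loop; the GPU version launches one tiny thread per element and lets thousands of them run at once:

Code:
#include <cuda_runtime.h>

// CPU: a single core walks the array serially, one element at a time.
void add_cpu(const float *a, const float *b, float *c, int n) {
    for (int i = 0; i < n; ++i) c[i] = a[i] + b[i];
}

// GPU: every thread handles exactly one element; the hardware runs huge
// batches of these threads in parallel.
__global__ void add_gpu(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

// Launch (assuming a, b and c already sit in the card's video memory):
//   int threads = 256;
//   int blocks  = (n + threads - 1) / threads;
//   add_gpu<<<blocks, threads>>>(a, b, c, n);

This only pays off because each element is independent - exactly the "massive number of parallel threads with relatively little communication" case described above.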
 
Heh, hard question and not possible to answer non-technically. GPUs are built for floating point operations (used a lot in gfx, as tefal says). The reason they come in handy is because the CPU's ALU deals with integer arithmetic and all the floating point operations can be offloaded to the FPU/GPU

CPUs deal with floating point arithmetic too. The nature of the data is not the issue. The issue is that GPUs are designed for parallel operation, and CPUs are designed for highly flexible serial operations.
 
CPUs deal with floating point arithmetic too. The nature of the data is not the issue. The issue is that GPUs are designed for parallel operation, and CPUs are designed for highly flexible serial operations.

Yeah, I know they do. OK, fair enough, I stand corrected then :p Got to admit GPU design is one of the areas I don't know that well... totally forgot they were so parallel :D I'm still thinking of the olden-day separate FPUs!
 
The term you're kinda looking for is co-processor, anyhow.

That's the word! You probably guessed I'm not too technically clued up on this, hence asking :D

[Image: cpuvsgpuzk2.jpg (CPU vs GPU transistor allocation)]



To make an analogy: think of the CPU as a scalpel, and the GPU as a sledgehammer. You can break down a wall with a scalpel, but it will take a long time. However, you can never do brain surgery with a sledgehammer.

Cheers to you both though, very helpful and now I understand more about what the two different processors do.
 
A CPU is, at heart, a serial device.

A GPU is a massively parallel device.

A CPU is far more flexible than a GPU, but when a program CAN be adapted to work with a GPU (i.e. when it can be broken down into a massive number of parallel threads with relatively little communication between the threads), then the GPU will be many times faster. Relatively few applications lend themselves to such multi-threading, unfortunately.

A CPU gives over most of its transistors to control logic and caching, whereas a GPU uses almost all of its logic for raw calculation:


[Image: cpuvsgpuzk2.jpg (CPU vs GPU transistor allocation)]



To make an analogy: think of the CPU as a scalpel, and the GPU as a sledgehammer. You can break down a wall with a scalpel, but it will take a long time. However, you can never do brain surgery with a sledgehammer.
Wow. I knew the answer but that still made it sound better in my head!
 
Wow. I knew the answer but that still made it sound better in my head!

Thanks.

To be fair though, I've had practice at answering these questions. I've had to justify (or 'sell') stream computing to others in our department a few times now, and the first thing I always do is highlight the real-world differences between CPUs and GPUs.
 
You can't really compare the two.

If you mean "why are GPUs stressed more by most games?", the answer is that they are, effectively, younger. GPUs are less technologically developed than CPUs: while CPUs are a long way ahead of most software, GPUs are still neck and neck with it.
 
GPUs are massively parallel; CPUs are intrinsically only mildly parallel. Do some research into SIMD (which is what GPUs do) and MIMD (which CPUs are good at). Then check out vector machines etc. Oh, and check out Amdahl's law whilst you are at it, and then decide which is more powerful!
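
For anyone who does look up Amdahl's law: it puts a ceiling on the speedup you can get when only part of a program can run in parallel. A quick worked example (numbers picked purely for illustration): if a fraction P of the work is parallelisable across N cores,

speedup = 1 / ((1 - P) + P / N)

So with P = 0.9 and N = 128 GPU-style cores, speedup = 1 / (0.1 + 0.9/128) ≈ 9.3x. The 10% that stays serial caps the gain no matter how many cores you throw at it, which is why the CPU's flexibility still matters.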
 
GPUs are dedicated to just one thing, whereas a CPU can be used for many different tasks, from running games to encoding films.
 