CPU vs GPU

I still don't quite understand which are better to use.

Do the work units that CPUs get given differ from what the GPUs crunch?

I remember reading that the PS3 was only good at certain calculations, so it was limited in what it could crunch.

For general crunching, are GPUs more efficient than CPUs? I have a Q6600 at 3.2GHz and my PPD is around 1800. Would my budget gfx card, an ATI 3650, beat my quad core?

Thanks if anyone can clear this up.
 
The CPU is a much more general-purpose cruncher - I think you can configure a CPU to compute just about anything, it just takes quite a long time. Modern GPUs are highly parallel (a little similar to the PS3 as far as I know), so they are very fast for certain computational operations, but that is also a limitation. They also currently rely on the application being written in CUDA (or the ATI equivalent) - again, a limitation (for example, CUDA cannot currently handle some 'standard' programming statements, such as a switch()). This means that different work is currently assigned to each client.
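
To give a rough idea of what "highly parallel" means, here's a minimal CUDA sketch (nothing to do with the actual folding code, just an illustration I've knocked up) that adds two big arrays. Each element gets its own GPU thread and thousands of them execute at once, which is exactly the kind of regular, branch-free work a GPU flies through - and conversely, anything with lots of branching or serial dependencies is where the CPU's flexibility wins.

Code:
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread computes one element - thousands run at the same time.
__global__ void addVectors(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // global thread index
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Host (CPU) arrays
    float *ha = (float*)malloc(bytes);
    float *hb = (float*)malloc(bytes);
    float *hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device (GPU) arrays
    float *da, *db, *dc;
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    addVectors<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);  // waits for the kernel
    printf("hc[0] = %f\n", hc[0]);                      // prints 3.000000

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}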

I'm not sure which is more efficient - both are pretty power hungry. My guess is that CPUs are preferred for their versatility, but why waste the computational power of GPUs? I doubt you would get as many points from the ATI card as from the quad, but there is nothing to stop you running them together. If you have Vista, the GPU client uses very little of the processor's power (it takes a full core if you have XP), so you can run both at the same time. Try it out and see which gives you the best PPD. It should be fairly hassle free.
 