The ppd 18 folding PCs got you in early 2007

Associate · Joined 17 May 2006 · Posts 1,189 · Location: Manchester, UK
Found this old screen-grab of mine. I spent days setting these up and baby-sitting them.

ppd.jpg


Mainly 3 GHz Pentium 4 Prescotts and some 2.4 GHz Celerons.

The points I got from these in a month (around 50k) are what you can get in a day from a single GPU now.
 
Interesting :)

Those 3 GHz Pentiums probably did about 6 GFLOPS each (source), while a GTX 670 today can apparently reach 500 GFLOPS (source), which would make one GPU roughly four times faster than the entire lot above. The remaining factor of ~7 presumably comes from memory speed and architecture efficiency.
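As a back-of-the-envelope check, the figures quoted above (6 GFLOPS per P4, 500 GFLOPS per GPU, roughly 20 machines in the screenshot, and a ~30x month-vs-day points gap) can be run through like this; all the numbers are the thread's rough estimates, not measurements:

```python
# Rough estimates quoted in the thread, not measured values.
P4_GFLOPS = 6        # ~3 GHz Pentium 4 Prescott
GPU_GFLOPS = 500     # GTX 670 figure quoted above
NUM_P4S = 20         # machines in the screenshot (approx.)

farm_gflops = NUM_P4S * P4_GFLOPS        # 120 GFLOPS for the whole farm
gpu_vs_farm = GPU_GFLOPS / farm_gflops   # ~4x the entire farm
gpu_vs_one_p4 = GPU_GFLOPS / P4_GFLOPS   # ~83x a single P4

# A month of farm points equalling a day of GPU points is a ~30x gap;
# raw FLOPS explain ~4x of it, leaving a residual of ~7 for memory
# speed and architecture efficiency.
points_ratio = 30
residual = points_ratio / gpu_vs_farm

print(f"farm: {farm_gflops} GFLOPS, GPU vs farm: {gpu_vs_farm:.1f}x, "
      f"GPU vs one P4: {gpu_vs_one_p4:.0f}x, residual factor: {residual:.1f}")
```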
 
No easy way to deal with this, unless one constantly rescores hardware.

It is a bit galling when you know that a few years ago you had a little farm running and your capacity was maxed out, but the returns from all that have now been wiped out by today's comparatively minor expenditure. Was the contribution made to science a few years ago, at the hardware and energy expense of that time, any less meritorious than today's hardware and energy use? Many of the modern WUs couldn't be attempted on older hardware, but if we hadn't done all that old work, would today's WUs even exist?

The distributed computing projects seem to continue to benchmark their WUs, and thus the points awarded, against a fixed base. Would a sliding benchmark have been better, perhaps regraded every 6 months and based on the RRP of components and their predicted energy consumption, rather than simple computing power?
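A minimal sketch of what that sliding benchmark might look like: points are divided by the throughput of a reference machine that is re-chosen every six months, so credit reflects effort relative to contemporary hardware. Every name and number here is an illustrative assumption, not any project's real scoring system:

```python
from datetime import date

# Hypothetical: (period start, GFLOPS of a typical mid-range machine then)
REFERENCE_GFLOPS = [
    (date(2007, 1, 1), 6),      # ~Pentium 4 era
    (date(2009, 1, 1), 50),
    (date(2012, 1, 1), 500),    # ~GTX 670 era
]

def reference_for(when: date) -> float:
    """Return the reference GFLOPS in force on a given date."""
    ref = REFERENCE_GFLOPS[0][1]
    for start, gflops in REFERENCE_GFLOPS:
        if when >= start:
            ref = gflops
    return ref

def sliding_points(raw_gflop_days: float, when: date) -> float:
    """Credit = work done divided by what one reference machine could do."""
    return raw_gflop_days / reference_for(when)

# A day's work from the 2007 farm (20 reference-class machines) scores 20,
# while a day from one reference-class GPU in 2012 scores 1: credit tracks
# effort relative to the era's hardware rather than raw FLOPS.
farm_2007 = sliding_points(120, date(2007, 3, 1))   # 120 / 6   = 20.0
gpu_2012 = sliding_points(500, date(2012, 6, 1))    # 500 / 500 = 1.0
```

Under this scheme the old farm's contribution keeps its standing, but as the thread notes below, everyone's PPD on existing equipment would drop at each regrade.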
 
But that would mean your ppd would drop on your existing equipment every six months and that would suck too.
If you scored it on core-days, older hardware would benefit, but there would be no advantage in OCing or upgrading.
It's just how it is: hardware improvements are going exponential, and so are the scores. I bet if you asked someone who ran a Cray 2 what he would have thought of my $2000 desktop beating the pants off it, he would have been a bit grumpy :D
 
Not saying the points inflation can be avoided, but it is galling to see how the early contributions are lost. There kind of needs to be an "overall contribution" measure, but let's face it, Stanford cares about science, not our individual statistics, and that's not going to change.
 
A further example of points inflation.

That was my first thought too, but actually you'd expect the points to go up as computing power increases. As I showed above, a single GPU today really is several times faster than 20 Pentium 4s combined (1 GPU ≈ 80 or so individual P4s!).

The points are just proportional to how powerful the machine is, exactly as it should be.
 
Yeah, stop whining, the points system works fine: the more you do, the more points you get. I've been doing this crunching malarkey for a looooong time and it all works out in the wash.
 