Eh? Those results prove what we said is true.

The reason they came out almost identical is that, as well as reducing the CPU clock to 1.5GHz, you also simultaneously increased the allowable CPU usage to 50%... that just counteracted the reduced clock speed...
But if I've understood correctly, this is exactly the point of dirtydog's argument. Assume he has a 3GHz CPU. He's saying that if he runs a task which requires 20% CPU (and he's not doing anything else that uses his processor significantly), then that task would take the same amount of real time to complete on a 1.5GHz CPU, the difference being that 40% of the CPU would be used instead of 20% (assuming the same efficiency). The reason the task hypothetically only uses 20% (or 40%) of the CPU is that it's limited by some other factor on the system.
Everyone else is arguing that the 3GHz CPU would complete the task in half the CPU time of the 1.5GHz one, which is obviously true, because it executes twice as many CPU cycles in the same time. But dirtydog is talking about real time, and saying that the 1.5GHz CPU would complete the task in the same real time by dedicating twice as many of its resources (CPU cycles) to it over an identical timeframe. His point is that a 1.5GHz CPU at 50% usage is doing the same amount of work in the same time as a 3GHz CPU at 25% usage. In other words, both CPUs are providing the task with the same number of CPU cycles per second (in this case, 0.75GHz worth of cycles); the 3GHz CPU just has more unused cycles. Thus, if you never actually max out the slower CPU (probably because the speed of the task is limited in some other way), the faster CPU will not run the task any faster. Basically, this is an elaborate way of describing an operation which is not CPU-limited and therefore may not benefit from a faster CPU. Right, DD?
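Just to put some numbers on that, here's a quick sketch of the arithmetic. The 0.75GHz figure comes from the paragraph above; the assumption is that the task is throttled by something other than the CPU, so it consumes a fixed number of cycles per second and only the idle headroom differs between the two chips. The variable names are just made up for illustration.

```python
# The task is assumed to need a fixed cycle budget per second because it is
# limited by something other than the CPU (disk, GPU, etc.).
task_demand_ghz = 0.75  # cycles per second the task actually consumes

for clock_ghz in (1.5, 3.0):
    usage = task_demand_ghz / clock_ghz * 100
    print(f"{clock_ghz} GHz CPU: task occupies {usage:.0f}% of it")

# 1.5 GHz CPU: task occupies 50% of it
# 3.0 GHz CPU: task occupies 25% of it
```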
I think that what DD is saying is true, provided that the slower CPU never hits 100%, i.e. that the task is never limited by CPU speed. But as AcidHell says, the Task Manager figures are averaged over a second or so. A second is ages for a CPU, and it's highly likely that the slower CPU would have hit 100% for some of the time in his My Computer test, despite only averaging 37-40%. When the slower CPU is at 100%, the quicker CPU can obviously complete the task in a shorter real time, since at that point real time = CPU time on the slower CPU, and the faster CPU simply has more CPU cycles available to give to the task. Or, looking at it in reverse, the task can make use of more CPU cycles in that timeframe than the slower CPU can provide. This is the key point. If that is not the case, then the faster CPU would not complete the task any more quickly. An obvious example would be a GPU-limited game. DD's example would work if the speed at which his PC opens My Computer is limited by the hard drive and not the CPU, whether it's running at 1.5GHz or 3GHz.
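To illustrate why the averaging matters, here's a rough sketch. The 0.6-gigacycle burst size is a number I've picked purely so that the slower CPU averages about 40% over a one-second window, roughly in line with the 37-40% reading mentioned above; it's not a measured figure.

```python
# Hypothetical CPU-bound burst arriving within one Task Manager sampling window.
# The burst size is an assumption chosen to reproduce a ~40% average on the
# slower CPU, not a real measurement.
burst_gigacycles = 0.6

for clock_ghz in (1.5, 3.0):
    busy_seconds = burst_gigacycles / clock_ghz   # time spent pegged at 100%
    average_usage = busy_seconds * 100            # usage averaged over 1 second
    print(f"{clock_ghz} GHz: burst finished in {busy_seconds:.1f}s, "
          f"reads as {average_usage:.0f}% average")

# 1.5 GHz: burst finished in 0.4s, reads as 40% average
# 3.0 GHz: burst finished in 0.2s, reads as 20% average
```

So neither reading looks "maxed out" in Task Manager, but the faster chip still finishes the CPU-bound burst 0.2s sooner - which is exactly the case where the 3GHz CPU buys you real time.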