I think that's where we differ, because I don't believe either is a good solution, especially if it leaves a consumer unable to see the screen in certain situations. Figures of 100% and 50% brightness have been mentioned here, but the reality is that few people use their phones at 100% brightness for any real period of time. Mine, for example, sits mostly at 40 to 50%, so if the solution on my phone was to reduce the brightness, you're looking at (pardon the pun) a reduction to say 25-30%. That's a big difference, and it stops me using the phone at all in any strong sunlight. That is not, IMO, a suitable solution for the battery issue. Neither is slowing the phone down. The answer, the only one, is a new battery. That, or design some batteries that actually last the distance, and Apple are far from the only guilty party there.
The best solution is simply a notification telling you that your battery is degrading. Then it's up to the user whether they can get away with reducing the brightness a little, and you're not forced into a situation where you can no longer see the screen. Even a small reduction in screen brightness is an advantage.
But the thing is, it's even worse if they literally halve the performance of your device without any warning. What if you have chargers at work? What if you're not concerned about battery life? The user is forced into halved performance, and he won't be able to play that game for 30 minutes on his commute home because it's now an unplayable, jerky mess.
Therefore the "screen dimming reminder" is pretty much the best option right now. The battery savings from halving the CPU speed are still pretty much unsubstantiated.
From what I understand, two things must happen in order to see considerable battery savings from reduced CPU cycles:
A) You have to be actually bottlenecking the CPU.
B) The software needs to be temporally sensitive, i.e. synced with real time (such as video and games), and thus have a built-in mechanism for compensating for insufficient CPU cycles, such as frame dropping.
That is to say, if the frame rate in a game is limited to 10 fps due to the bottleneck, then 20 frames per second will be dropped and never rendered, as opposed to a steady 30 fps with overhead available. But no one is going to play a jerky, bottlenecked game anyway. All other software is not time sensitive: the commands are simply queued, and the calculations will eventually be performed.
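To make the frame-dropping point concrete, here's a minimal back-of-the-envelope sketch (all numbers and the function name are my own hypothetical illustration, not anything from iOS): a real-time renderer targets 30 fps, and frames that can't be produced in time are dropped rather than queued.

```python
TARGET_FPS = 30

def frames_per_second(render_time):
    """Frames actually shown in one second, assuming each frame takes
    `render_time` seconds on the (possibly throttled) CPU, and frames
    that miss their slot are dropped, never queued up for later."""
    achievable = int(1.0 / render_time)   # frames the CPU can produce
    rendered = min(TARGET_FPS, achievable)
    dropped = TARGET_FPS - rendered       # work that simply never happens
    return rendered, dropped

# Full-speed CPU: 20 ms per frame -> all 30 frames rendered, none dropped.
print(frames_per_second(0.020))   # (30, 0)
# Bottlenecked CPU: 100 ms per frame -> 10 fps, 20 frames dropped.
print(frames_per_second(0.100))   # (10, 20)
```

This is exactly why only time-synced software can "save" work under throttling: the dropped frames are CPU cycles that are never spent, whereas a spreadsheet recalculation just takes longer and spends the same cycles.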
Whether it takes substantially less power to compute something in 2 seconds at 1 GHz than in 1 second at 2 GHz hasn't really been proven here.
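For what it's worth, the textbook CMOS dynamic-power model does suggest a difference, though it's an idealized sketch, not proof. Assuming dynamic power P ≈ C·V²·f and that voltage scales roughly linearly with frequency under DVFS (both standard simplifications; the constants below are made up), energy for a fixed workload scales like f², so halving the clock roughly quarters the dynamic energy:

```python
# Idealized CMOS dynamic-power model: P = C * V^2 * f.
# Assumption: under DVFS, V scales ~linearly with f, so for a fixed
# number of cycles, E = P * t is proportional to f^2.
# This ignores leakage/static power and "race to idle", which can
# shrink or even reverse the advantage on real hardware.

def dynamic_energy(freq_ghz, gigacycles, c=1.0, v_per_ghz=1.0):
    """Energy (arbitrary units) to run `gigacycles` of work at
    `freq_ghz`, assuming V = v_per_ghz * freq_ghz."""
    v = v_per_ghz * freq_ghz
    power = c * v**2 * freq_ghz        # P = C * V^2 * f
    time = gigacycles / freq_ghz       # seconds to finish the work
    return power * time

work = 2.0  # fixed workload: 2 gigacycles (hypothetical)
e_fast = dynamic_energy(2.0, work)  # finishes in 1 second at 2 GHz
e_slow = dynamic_energy(1.0, work)  # finishes in 2 seconds at 1 GHz
print(e_slow / e_fast)  # 0.25: about a quarter of the dynamic energy
```

So under this model the 2-seconds-at-1 GHz case wins on dynamic energy, but in practice static power drawn over the longer runtime eats into that, which is why the real-world savings remain debatable.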