I've got a little home server running Windows Server 2012. It spends a good chunk of its life (overnight and during the day) sitting at idle when I'm not using it. It's currently got an i3-2120T, one of the lower-power i3s, which copes OK. However, I've recently come across an i5-3470 for free, a chip that's quicker by quite a stretch. What I'm looking for is an idea of what the energy usage of the two chips would be just sitting at idle. I know most modern chips really bring the clocks etc. down when idle, and the i5, being newer, maybe more so. The TDP figures are 35W for the i3 and 77W for the i5. Now I realise that if they were running at 80-100% utilisation their whole life I'd be saving a chunk by sticking with the i3, though that's not the case. Any views, everyone?
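To put some rough numbers on it, below is a quick back-of-envelope sketch in Python for turning an idle power difference into a yearly cost. The idle wattages and electricity price in it are made-up placeholders, not measured figures for either chip (TDP is a full-load ceiling, not an idle figure), so you'd want to swap in real readings from a plug-in power meter.

```python
# Back-of-envelope estimate: annual cost difference between two idle power draws.
# The idle wattages and electricity price below are placeholder assumptions,
# not measured figures for the i3-2120T or i5-3470.

HOURS_PER_YEAR = 24 * 365

def annual_cost(idle_watts: float, price_per_kwh: float) -> float:
    """Return the yearly electricity cost of a constant idle draw."""
    kwh_per_year = idle_watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * price_per_kwh

if __name__ == "__main__":
    price = 0.30       # assumed tariff in currency per kWh -- substitute your own
    i3_idle = 15.0     # hypothetical whole-system idle draw with the i3, in watts
    i5_idle = 20.0     # hypothetical whole-system idle draw with the i5, in watts

    diff = annual_cost(i5_idle, price) - annual_cost(i3_idle, price)
    print(f"Estimated extra cost per year at idle: {diff:.2f}")
```

With those placeholder figures a 5W idle difference works out to roughly 44 kWh a year, i.e. only a few pounds, which is why the real question is what each system actually draws at idle rather than what the TDPs say.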