It won't ruin the CPU if you don't go too far. It's more theory than something that has been measured and proven; after all, the manufacturing processes used for the current generation of CPUs haven't been around long enough for anyone to measure degradation over time.
It is believed that as the voltage approaches and exceeds the maximum permissible voltage for the chip, the chip's structure begins to degrade at an atomic level. I've not seen any evidence of this on my own rigs, but it's quite widely reported that if you run voltages above, say, 1.5 V on Sandy Bridge for any length of time, previously achievable lower overclocks will no longer be stable at the voltages they once ran at, and more voltage will be required.
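If you want a record to compare against later (for instance, whether the same clocks now need more Vcore than they used to), something like the rough Python sketch below could log the core voltage reported by lm-sensors over time. The `sensors` utility being installed, the "Vcore" label, the log file name and the one-minute interval are all assumptions on my part; the label in particular varies from board to board, so adjust the pattern to match your own output.

```python
#!/usr/bin/env python3
"""Rough sketch: periodically log the CPU core voltage reported by lm-sensors.

Assumes a Linux box with the `sensors` utility installed and a motherboard
sensor chip that exposes a reading labelled something like "Vcore".
"""
import re
import subprocess
import time

# Adjust this to match whatever label your board reports for core voltage.
VCORE_PATTERN = re.compile(r"Vcore:\s*\+?([\d.]+)\s*V", re.IGNORECASE)
LOG_FILE = "vcore_log.csv"          # hypothetical output file
INTERVAL_SECONDS = 60               # sample once a minute

def read_vcore():
    """Return the first Vcore reading found in `sensors` output, or None."""
    output = subprocess.run(["sensors"], capture_output=True, text=True).stdout
    match = VCORE_PATTERN.search(output)
    return float(match.group(1)) if match else None

if __name__ == "__main__":
    with open(LOG_FILE, "a") as log:
        while True:
            vcore = read_vcore()
            if vcore is not None:
                # One line per sample: unix timestamp, voltage in volts.
                log.write(f"{time.time():.0f},{vcore:.3f}\n")
                log.flush()
            time.sleep(INTERVAL_SECONDS)
```

A CSV like that, kept alongside notes of which clocks and Vcore settings passed your stress tests, at least gives you something concrete to look back at rather than relying on memory.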
My advice is: don't go too extreme if you need the kit to run for long stretches at a time (i.e. 24/7). This is borne out, at least anecdotally, by the server chip ranges, i.e. the Opterons and Xeons. Both ranges run at lower voltages than their desktop counterparts, and overclocking is NOT supported out of the box. Maybe I'm reading too much into that, though...
At the end of the day, it's about risk. 65 nm chips contain much more silicon per transistor than 45 nm, 32 nm or 22 nm ones. It stands to reason that, as well as requiring higher voltages to operate, they are likely to be more durable, since there is simply more material that has to physically degrade before a transistor fails.
CPU degradation is a very widely discussed subject, but one I've not seen any solid articles on; there are, however, plenty of reported instances that hint it has happened.
Every CPU contains microscopic differences, and this is only really discussed at a batch level. With a lot of searching you can find people's experiences with a particular batch of processors; however, you will also have to take into account the other kit they run (the motherboard and PSU will be absolutely key), as these also affect the power delivered to the chip.