Heat is what degrades a CPU. The hotter it gets, the more the atoms in the metal interconnects vibrate and scatter electrons, so the resistance of the circuit rises; more resistance means you need more voltage to hold the same clocks, which in turn pushes the heat up even further.
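To put a rough number on that feedback loop: copper's resistance rises close to linearly with temperature, about 0.39% per degree C. Here is a minimal Python sketch of that relationship; the textbook temperature coefficient is real, but the 100 mOhm trace resistance is a made-up figure purely for illustration, not a real CPU measurement.

```python
# Linear temperature-coefficient model for copper resistance:
# R(T) = R0 * (1 + alpha * (T - T0))
# alpha is the standard textbook value for copper; R0 is a hypothetical trace.

ALPHA_COPPER = 0.00393   # per degree C, temperature coefficient of copper
T0 = 20.0                # reference temperature in C
R0 = 0.100               # resistance at T0 in ohms (illustrative only)

def resistance_at(temp_c: float) -> float:
    """Approximate resistance of the trace at the given temperature."""
    return R0 * (1 + ALPHA_COPPER * (temp_c - T0))

for t in (40, 60, 80, 100):
    r = resistance_at(t)
    print(f"{t:3d} C -> {r * 1000:.1f} mOhm  (+{(r / R0 - 1) * 100:.1f}% vs {T0:.0f} C)")
```

Run it and you see a trace at 100 C carries roughly 30% more resistance than at 20 C, which is where the "more resistance, more voltage, more heat" spiral comes from.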
The breakdown itself is electromigration: at high current densities the electrons flowing through the copper interconnects transfer enough momentum to knock metal atoms out of place, and heat makes those atoms much easier to dislodge. Over time the traces thin out and develop voids, their resistance climbs, and that climbing resistance is the degradation.
A superconductor has zero resistance. Cool certain materials toward absolute zero (-273 °C) and the thermal vibration of the lattice essentially stops; below a critical temperature the material superconducts. The holy grail of electronics is a superconductor at room temperature, and the first person to figure that out will be set for life, a very wealthy one.
When you have a relatively small piece of silicon, like a roughly 200 mm² die in a 14900K pulling 250 watts, it's very difficult to cool. 80 to 90 °C is a very high temperature; even though the chip is "rated" for it, it's not good, not at all. If you want a CPU or GPU to last for many years without degrading much, you want to keep it at around 60 °C, no more than 70 °C.
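To see why that's so hard to cool, here's a quick back-of-the-envelope power-density calculation. The 250 W and ~200 mm² figures are the ones from above (treated as approximate), and the stove-burner numbers are rough, typical values thrown in only for scale.

```python
# Back-of-the-envelope power density of a CPU die vs. an electric stove burner.
# All inputs are approximate; the burner figures are generic, not measured.

cpu_power_w = 250.0        # package power under heavy load
die_area_mm2 = 200.0       # approximate die area

cpu_density_w_mm2 = cpu_power_w / die_area_mm2
cpu_density_w_cm2 = cpu_density_w_mm2 * 100       # 1 cm^2 = 100 mm^2
print(f"CPU die: {cpu_density_w_mm2:.2f} W/mm^2 = {cpu_density_w_cm2:.0f} W/cm^2")

burner_power_w = 1800.0    # typical electric stove element
burner_area_cm2 = 314.0    # ~20 cm diameter element
burner_density_w_cm2 = burner_power_w / burner_area_cm2
print(f"Stove burner: {burner_density_w_cm2:.1f} W/cm^2")

print(f"The die runs at roughly {cpu_density_w_cm2 / burner_density_w_cm2:.0f}x "
      f"the power density of the burner")
```

Somewhere around 125 W/cm² versus ~6 W/cm² for the burner, i.e. on the order of 20x the power density, all of it to be pulled out through a thin heat spreader.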