Burn-in on TVs is totally different: burn-in isn't a loss of performance, it's a loss of image quality. With older CRTs it was caused by the phosphors losing their luminescence; with modern panels, OLED burn-in is luminance degradation of the organic emitters, while LCD "burn-in" is usually stuck pixels or image retention. Neither affects performance, as you don't suddenly notice the display running at 14 fps or whatever instead of what it's rated for.
And CPU degradation again doesn't result in a loss of performance; it results in crashes, because the software running on it gets back a result that doesn't make sense to it. Yes, fans dying and TIM going hard are also a thing, but that can happen whether you're mining on the card or not, and it's fairly easily fixed by anyone who's capable of removing a GPU heatsink. But again, that's not a loss of performance, at least not directly: the silicon throttles because of thermals.
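If you want to check whether a slowdown is just thermal throttling rather than any kind of silicon wear, you can query the GPU directly. Here's a minimal sketch using nvidia-smi's CSV query output; the query fields are real nvidia-smi options as far as I know, but the sample line in the usage note below is made up for illustration:

```python
import subprocess

def throttle_report(csv_line=None):
    # Query current SM clock, core temperature, and whether the driver's
    # software thermal slowdown is active. If csv_line is given, parse that
    # instead of calling nvidia-smi (handy for testing without a GPU).
    if csv_line is None:
        csv_line = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=clocks.sm,temperature.gpu,"
             "clocks_throttle_reasons.sw_thermal_slowdown",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
    sm_clock, temp, thermal = [field.strip() for field in csv_line.split(",")]
    return {
        "sm_clock_mhz": int(sm_clock),
        "temp_c": int(temp),
        "thermal_throttle": thermal == "Active",
    }
```

For example, `throttle_report("1410, 93, Active")` would report thermal throttling at 93 C; if that's what you see, the fix is the heatsink and TIM, not a new card.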
CPUs/GPUs/silicon don't lose performance simply from being used. They may need more voltage to maintain the same clock speeds, until the increased voltage eventually breaks something inside, resulting in errors. But transistors and the traces connecting them simply don't work like that: they're nothing more than wires and switches, and wires and switches don't lose performance, they either work or they don't. You may need more force (voltage) to make them work, but they don't become less performant.