Hahaha, well, each to their own, but AMD needed clock speeds, and for clock speeds they need voltage, and that generates too much heat. No way on earth would you design a card to run at a maximum of 110C imho; just because it can doesn't mean it should lol. If you did, then quite frankly it's a poor design when we can see Nvidia delivering better performance for less heat... but that's by the by. For me, 110C is way too high and dumps more heat into your loop or PC case... so if you wanna run stock clocks, just undervolt the bugger straight away... and have a happier case and world lol, and don't contribute to global warming hahaha
Interestingly, I'd wager that next-gen AMD won't go above 90C junction!!!
That's Junction Temperature, or "hot spot". The GPU as a whole is not running at 110C; rather, that's the hottest part of the GPU. Like all GPUs, different CUs are under different levels of load from one moment to the next, so different parts of the GPU run hotter than the rest at any given time.
AMD have sensors all over the GPU which measure that; those readings are the "Junction / Hot Spot" temperatures.
You get two temperature readouts on Navi: one is usually, say, 75C while the other is 100C. The 75C one is "Edge Temp" and the 100C one is the "Junction / Hot Spot" temperature. Edge Temp is simply the temperature of the GPU overall. With Nvidia you only get the Edge Temp sensor. Just like AMD's cards, while the overall GPU temperature is 75C, parts of it will be running at 100C+; the difference is Nvidia aren't telling you that.
Edit: there are 64 of these Junction Temp sensors.
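To picture how one die can report both 75C and 110C at once, here's a minimal sketch of the idea: the junction/hot-spot figure is just the hottest of the many on-die sensors, while the edge figure tracks the die overall (approximated here as an average; the actual readout comes from a specific sensor). The 64-sensor count is from the post above; every temperature value in the code is made up for illustration.

```python
import random

random.seed(1)

# Hypothetical readings from 64 on-die sensors, in C: most of the die
# sits around 70-80C, while a few heavily loaded CUs spike much higher.
sensors = [random.uniform(70.0, 80.0) for _ in range(60)] + [95.0, 98.0, 101.0, 104.0]

# Edge Temp approximated as the overall/average die temperature.
edge_temp = sum(sensors) / len(sensors)

# Junction / Hot Spot = the single hottest sensor on the die.
junction_temp = max(sensors)

print(f"Edge: {edge_temp:.0f}C, Junction: {junction_temp:.0f}C")
```

Same die, two very different numbers: a card showing 75C edge can legitimately show 100C+ junction, because the two readouts answer different questions.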