So I have a Ryzen 9 3950X on a be quiet! Dark Rock Pro 4 cooler & an MSI RX 6800 Gaming X on its stock cooler.
So I was playing Shadow of the Tomb Raider at stupid 4K settings & very nice it is indeed. Meanwhile, I had AMD Adrenalin (or whatever it's called) open out of curiosity. The RX 6800 would generally be at 65-100% usage depending on the scene, often 90%+. That gave a 70°C GPU temp & an 80°C hotspot when under 95%+ load. Power consumption was in the 150-200 W range.
Meanwhile, the CPU was always around 6% usage (16-core CPU; Tomb Raider just doesn't require heavy threading). Adrenalin said it was consuming 50-60 W, but the temp was 70°C? It wasn't even hitting top single-core clocks; it was around 3.2 GHz.
OK, so I ran a CPU benchmark, Cinebench R23, & the all-core temp was 75°C at 103 W. So it basically sits at almost the same temp regardless of the utilisation & watts being consumed.
So I have a question:
- Given that the CPU & GPU are both TSMC 7nm (I think? I could be wrong), how come the GPU, using roughly 4× as much power, is able to maintain similar temps? Especially given that I have one of the best mainstream air coolers on the CPU? (Rough numbers in the sketch after this list.)
- Or to put it another way: why is the CPU at 70°C when lightly threaded & not even hitting max clocks?
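My best guess is that it comes down to heat density rather than total heat, so here's a back-of-envelope sketch in Python. The die areas are approximate figures from public die-shot measurements (two Zen 2 CCDs of roughly 74 mm² each in the 3950X, Navi 21 at roughly 520 mm²), and the assumption that most of the CPU package power is dissipated in the CCDs is mine, so treat all of this as ballpark only:

```python
# Back-of-envelope heat flux comparison (all figures approximate).
# Die areas are rough numbers from public die-shot measurements;
# assuming most of the CPU package power is dissipated in the CCDs.

CCD_AREA_MM2 = 74.0                  # one Zen 2 chiplet (TSMC 7nm), approx.
CPU_COMPUTE_AREA = 2 * CCD_AREA_MM2  # 3950X has two CCDs (~148 mm²)
GPU_DIE_AREA_MM2 = 520.0             # Navi 21 (RX 6800), approx.

cpu_allcore_w = 103.0  # Cinebench R23 reading from above
cpu_game_w = 55.0      # mid-point of the 50-60 W in-game reading
gpu_game_w = 200.0     # upper end of the in-game GPU reading

print(f"CPU all-core: {cpu_allcore_w / CPU_COMPUTE_AREA:.2f} W/mm^2")
print(f"CPU in-game:  {cpu_game_w / CPU_COMPUTE_AREA:.2f} W/mm^2")
print(f"GPU in-game:  {gpu_game_w / GPU_DIE_AREA_MM2:.2f} W/mm^2")

# Output (approx.):
#   CPU all-core: 0.70 W/mm^2
#   CPU in-game:  0.37 W/mm^2
#   GPU in-game:  0.38 W/mm^2
# And a lightly threaded load concentrates its power into one or two
# cores of just a few mm² each, so the local flux under the hottest
# core is far higher than anything on the GPU die.
```

If those areas are in the right ballpark, the all-core load pushes nearly twice the heat flux through the silicon compared with the GPU, & a lightly threaded load concentrates its watts into a tiny patch of one chiplet. So the CPU hits high temps at low total watts simply because its heat comes out of a much smaller area than the GPU's.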
& then more of a statement:
It's crazy how we've got to the point where the number of cores being loaded makes basically no difference to CPU temps: heavy single-core loads at much lower wattages result in basically the same temps, give or take. This is down to how modern CPUs work (the heat is concentrated in small chiplets, & the boost algorithm will spend whatever thermal headroom it finds), but it takes a lot of mental adjustment to get used to.
I suppose what's happening is that the fan curve is holding the CPU at a steady temp for the lowest possible noise, something like the sketch below. I'm just amazed at how many more watts the GPU can deal with without becoming noisy as hell.
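For illustration, a minimal sketch of the kind of temp-to-duty fan curve a motherboard (or Adrenalin, for the GPU) applies. The breakpoints here are made up for illustration & aren't my actual BIOS settings:

```python
# Minimal fan-curve sketch: linear interpolation between breakpoints.
# Breakpoints are illustrative, not real BIOS/Adrenalin values.

CURVE = [(40, 25), (60, 40), (70, 55), (80, 100)]  # (temp °C, duty %)

def fan_duty(temp_c: float) -> float:
    """Return fan duty (%) for a given temperature via the curve."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two breakpoints.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_duty(70))  # 55.0 - a steady state like the one I'm seeing
```

With a shallow curve through the 60-70°C band, the fan settles at a modest duty & the temp parks around 70°C instead of ramping up & down, which would match what I'm seeing on both chips.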