Hi all,
Maybe someone more knowledgeable than me can explain, or point me toward resources on, how GPU utilization is reported. When I watch a video of someone running a benchmark, what does that number really mean?
For example, assume I see this readout:
FPS: 60
GPU%: 80
CPU%: 50
... and then later:
FPS: 60
GPU%: 100
CPU%: 50
I'd assume that the first area is easier to render than the second (maybe lighting, shadows, whatever...) and therefore the card is "underutilized".
But what does 80% mean?
Naively, one might reason: if the card's compute capacity is 10 TFLOP/s and the readout shows utilization over the last second (i.e. values update once per second), then 80% means the card executed 8 TFLOP of work in that second?
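To make that naive model concrete, here's the arithmetic I have in mind (all numbers hypothetical, just the strawman above, not how any vendor actually computes the figure):

```python
# Naive FLOP-based model of the "GPU%" number (hypothetical figures).
peak_tflops = 10.0       # assumed peak capacity: 10 TFLOP per second
executed_tflop = 8.0     # assumed work actually done during the window
window_s = 1.0           # readout updates once per second

utilization = executed_tflop / (peak_tflops * window_s)
print(f"GPU% = {utilization:.0%}")  # -> GPU% = 80%
```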
But what if the card has lowered its frequency to save power, or because the user has down-clocked it, or because it's thermal-throttling? Is that taken into account?
For example, this happens with CPUs (see this link from Microsoft TechNet): if a CPU has a single core with a max frequency of 2 GHz but it's running at 1 GHz and is fully busy at the time, it will register 100% utilization even though it's really at 50% (the clock is at half speed).
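For what it's worth, on NVIDIA cards one can at least read both the reported utilization and the current vs. maximum clocks through NVML (via the pynvml bindings), so a frequency-adjusted figure like the CPU example could be computed by hand. A minimal sketch, assuming an NVIDIA GPU at index 0 and the nvidia-ml-py package installed:

```python
# Sketch: compare NVML's reported utilization with a frequency-adjusted
# figure. Assumes an NVIDIA GPU at index 0 (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Per the NVML docs, utilization.gpu is the percent of time over the
# sample period during which a kernel was executing -- busy time,
# with no correction for clock speed.
util = pynvml.nvmlDeviceGetUtilizationRates(handle)

# Current vs. maximum graphics clock, in MHz.
cur_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
max_mhz = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)

# Frequency-adjusted figure, analogous to the 2 GHz / 1 GHz CPU example:
# 100% busy at half clock would come out as roughly 50%.
adjusted = util.gpu * cur_mhz / max_mhz
print(f"reported {util.gpu}% busy at {cur_mhz}/{max_mhz} MHz "
      f"-> ~{adjusted:.0f}% frequency-adjusted")

pynvml.nvmlShutdown()
```

If the two numbers diverge while the card is throttling, that would suggest the reported figure is pure busy-time, like the CPU case, but I'd be glad to hear from someone who knows how the counters actually work.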