So are we in agreement that at 70°C the frame rate would be the same as at 30°C?
This applies to both CPU and GPU; sorry for the confusion, I only used the GPU as an example.
For all intents and purposes, yes. But there's always someone like Jokester who'll go into geek mode and teasingly point out that, as with most things, reality is a bit more complicated: there are dozens of physical phenomena in play that even experts won't have a full grasp of, so you can never really get a straight black-and-white answer.
For example, since we're in geek mode now, it doesn't end with Jokester's description of how the cooler CPU gets its data out a fraction of a clock cycle quicker due to minutely faster pulse rises and falls.
Ordokai described two otherwise identical CPUs in 'environments' of 10 and 60 degrees, right? That implies the reference quartz crystals feeding the mobo clock generators are also at different temperatures. A crystal oscillator's output frequency varies slightly with temperature, and that variation feeds straight into the CPU clock generator. So even though they're set the same in the BIOS, the clocks of the two systems won't actually be identical - the cooler CPU might be running something like tens of ppm (parts per million) quicker, on the order of 0.001%, due to the thermal shift in the crystal frequency. Boards might use temperature-compensated oscillator circuits, but I wouldn't have thought so for such an uncritical application as a PC mobo, and mine appears to just have simple discrete 25MHz crystals. And even if they were compensated, the compensation wouldn't be perfect, etc, etc...
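To put rough numbers on that, here's a quick back-of-envelope in Python. The crystal frequency, CPU clock and the 20 ppm thermal offset are all just illustrative assumptions, not measurements from any real board:

    # Back-of-envelope: how far apart two "identical" clocks end up
    # if one crystal drifts by a few tens of ppm due to temperature.
    # All figures below are illustrative assumptions, not measurements.

    crystal_hz = 25_000_000      # typical discrete mobo reference crystal
    cpu_hz     = 4_000_000_000   # nominal CPU clock derived from it
    drift_ppm  = 20              # assumed thermal offset between the two crystals

    drift_fraction = drift_ppm / 1_000_000
    print(f"Relative offset:  {drift_fraction:.6%}")                     # ~0.002%
    print(f"Crystal offset:   {crystal_hz * drift_fraction:.0f} Hz")     # ~500 Hz
    print(f"CPU clock offset: {cpu_hz * drift_fraction / 1e3:.0f} kHz")  # ~80 kHz

So we're talking tens of kHz on a multi-GHz clock - real, but utterly invisible to a frame counter.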
So in actual fact the answer is they wouldn't really be the same, but you'd struggle to measure the difference! Leave the two CPUs running IBT (magically finding a way to ensure that running IBT doesn't shift their temperatures away from your thought-experiment values) and the cooler CPU would gain a few runs in a year or so.
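The "few runs in a year" figure stacks up on the same assumed numbers, taking a guess of roughly two and a half minutes per IBT run:

    # How much wall-clock time the cooler (faster-clocked) CPU gains in a year,
    # and what that means in extra IBT runs. Assumed figures again.

    seconds_per_year = 365 * 24 * 3600   # ~31.5 million seconds
    drift_ppm        = 20                # same assumed offset as above
    ibt_run_seconds  = 150               # assume ~2.5 minutes per IBT run

    gained = seconds_per_year * drift_ppm / 1_000_000
    print(f"Time gained per year: {gained / 60:.1f} minutes")        # ~10.5 minutes
    print(f"Extra IBT runs/year:  {gained / ibt_run_seconds:.1f}")   # ~4 runs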
Well, you did ask
