Relationship between Temperature and Performance

Is there a relationship between the temperature of hardware and its performance?

Obviously when the hardware goes above its stable temperature it will crash, putting performance at zero, but I'm on about when it is running.

Would a GPU running at 70°C be hindered by its temperature compared to a GPU at 40°C?

If not, then I can run my fans a little slower to decrease the noise and accept a slight temperature increase.
 
Insofar as higher temperature leads to higher resistance, power consumption will be in some proportion to temperature. I dunno about direct performance though.
 
What GPU do you have?

These days, you can expect normal operating temperatures between 70-90°C and still be absolutely fine.

Depends on your GPU though :p
 
On my GTX 275 I'm seeing temps of 74°C max when playing BC2, with the fan set on auto (50%). I'm adding a second card for SLI though, so I can probably expect an increase in temps, so I may start to use a manual fan setting.
 
Is there a relationship between the temperature of hardware and its performance?

Obviously when the hardware goes above its stable temperature it will crash, putting performance at zero, but I'm on about when it is running.

Would a GPU running at 70°C be hindered by its temperature compared to a GPU at 40°C?

If not, then I can run my fans a little slower to decrease the noise and accept a slight temperature increase.

For a while now I have found myself musing over the exact same question, only with regard to the CPU. My gut tells me that there might just be something to it beyond 'simple' stability issues due to high temps, but unfortunately I am ill-equipped to wrestle with the notion myself. A comment from someone more intimately familiar with electronics at the most basic level seems sorely needed.


- Ordokai
 
As temperature goes up, so does power consumption. So when overclocking and temps go up, more power is required to keep things working, which in turn produces even more heat. Keeping the temps down reduces this and aids stability, allowing a greater overclock; hence all the money spent on cooling. Going sub-zero is the way to go: much less energy required, less heat produced. Cool.
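To put some rough numbers on that feedback, here is a toy Python sketch. The figures are made up for illustration, not taken from any datasheet, and the "leakage doubles every 20°C" rule is only a ballpark assumption:

dynamic_w = 80.0        # dynamic power, roughly independent of temperature (assumed)
leak_at_40c_w = 15.0    # assumed leakage power at 40°C
doubling_c = 20.0       # assume leakage roughly doubles every 20°C (ballpark only)

for temp_c in (40, 60, 80):
    # exponential growth of leakage with temperature under the assumption above
    leak_w = leak_at_40c_w * 2 ** ((temp_c - 40) / doubling_c)
    print(f"{temp_c}°C: ~{dynamic_w + leak_w:.0f} W total "
          f"({leak_w:.0f} W of it leakage)")

The hotter the chip, the bigger the leakage slice, which is exactly the extra heat-begets-heat loop described above.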
 
What GPU do you have?

These days, you can expect normal operating temperatures between 70-90°C and still be absolutely fine.

Depends on your GPU though :p

A 4850, which I have heard is a hot card, yet I get temps of 50°C max when overclocked.

I'm looking to get a 5850 and mostly wondered whether Vapor X was viable or not, but it seems I'd be wasting my money.

I was also considering an aftermarket CPU cooler as my CPU gets toasty when overclocked, yet as my maximum temperature there is 70°C (which you state as normal) I think I'd once again be wasting money.

Thanks a lot.
 
As temperature goes up, so does power consumption. So when overclocking and temps go up, more power is required to keep things working, which in turn produces even more heat. Keeping the temps down reduces this and aids stability, allowing a greater overclock; hence all the money spent on cooling. Going sub-zero is the way to go: much less energy required, less heat produced. Cool.


All fine, good and understandable, but it is also beside the point as far as my musings go. The question seems to be:



"Is there anything intrinsic to cold environments that could/does make CPUs and GPUs work 'better'/'faster' in addition to expanded overclocking opportunities and indeed their subsequent performance gains"

Or to put it more bluntly: would the exact same CPU, running at the exact same clocks/settings with all other things being equal, perform any differently in a 10°C environment as opposed to a 60°C one?


- Ordokai
 
Or to put it more bluntly: will the exact same CPU, running at the exact same clocks/settings with all other things being equal, perform any differently in a 10°C environment as opposed to a 60°C one?


- Ordokai

Put perfectly.
 
If you're thinking along the lines of 'do I get less FPS when my hardware heats up', then no, not at all. Assuming you're not at such extreme temps that you are actually causing errors/damage, the CPU/GPU or whatever will function exactly as it does at lower temps.

In PC bits, lower temps give you potentially better performance by reducing parasitic resistances/capacitances, improving heat transfer, etc., which then allows you to clock things faster, and that is where your performance gain comes from.

The only sort of application where cooling would directly improve performance is where thermal noise is a problem, for signals say. That's why you get cryogenically cooled amplifiers in radio telescopes and such like.
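For anyone curious, thermal (Johnson) noise voltage goes as sqrt(4*k*T*R*bandwidth), so in those applications cooling really does buy you signal quality directly. A quick illustrative Python calculation, using an assumed 50 ohm source and 1 MHz bandwidth (both just example figures):

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
R = 50.0             # assumed source resistance, ohms
BW = 1e6             # assumed bandwidth, Hz

for temp_k in (290, 20):   # roughly room temperature vs. a cryogenic stage
    v_noise = math.sqrt(4 * K_B * temp_k * R * BW)
    print(f"{temp_k} K: ~{v_noise * 1e6:.2f} microvolts of thermal noise")

Dropping from 290 K to 20 K cuts the noise voltage by a factor of about sqrt(290/20), i.e. nearly 4x, which is why those amplifiers are worth chilling.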

So can you turn down your fans a little to reduce audible noise, let your CPU/GPU run hotter, and see no loss in performance? As long as the temps stay within spec so you don't cause damage, then yes.
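If you want to double-check that on your own machine before committing to slower fans, a rough sketch like this will log temperature against core clock while you game; if the clock column never drops as the temperature climbs, nothing is throttling and your frame rate should be unaffected. It assumes an NVIDIA card with nvidia-smi on the PATH (an ATI card like the 4850 would need its own vendor tool instead):

import subprocess, time

# Poll GPU temperature and shader clock once a second and print them side by side.
while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu,clocks.sm",
         "--format=csv,noheader,nounits"], text=True).strip()
    temp_c, clock_mhz = out.split(", ")
    print(f"{temp_c}°C  {clock_mhz} MHz")
    time.sleep(1)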
 
Agreed, the answer to your specific question is no, there would be no change in performance. There are, as already mentioned, plenty of advantages to running cooler.
 
The reason for the question wasn't practical at all; as far as conventional situations go, I suspected as much. I mused over it out of pure theoretical curiosity. It stands to reason, for example, that freezing and extremely hot environments would affect the interactions between differing factors within a system at a molecular and atomic level, so I mused over the implications those differing conditions might have as far as electronics go.

But, of course, I'm not particularly well versed in any of those fields so couldn't come up with much. Moreover, the question was a tad too specific to search for, so I was left wondering.

The only sort of application where cooling would directly improve performance is where thermal noise is a problem, for signals say. That's why you get cryogenically cooled amplifiers in radio telescopes and such like.

Quite interesting, cheers for that. Now, the only remaining loose end is the following comment made earlier:

"The colder CPU will be at most, substantially less than the time of one clock cycle quicker."
I would very much like to hear the reasoning behind said comment.


P.S. Hopefully I'm on the same page as the OP here, so as not to steal the thread so shamelessly :p


- Ordokai
 
I think the point is that you might get a quicker turnaround within a clock cycle due to there being less heat involved and therefore less resistance. However, clock speed is not dependent on resistance, so it would never be more than one clock cycle faster.

The main benefits of cooler temps are reliability and power saving. I must admit, I often undervolt and underclock my PC components to keep them cool and quiet. I do this far more often than I overclock stuff. Especially new stuff. I tend to overclock things more as they get older. If they get too hot or cause too much noise doing what I want then they get replaced.
 
Is there a relationship between the temperature of hardware and its performance?
Absolutely! :D

A lot of modern hardware will *throttle* its frequencies if thermal levels are breached... I've witnessed this first-hand on both processors and motherboards, but I haven't seen a GPU throttle yet; they just glitch or artifact if they get too hot! :p

Although some computer hardware can run over 100°C and not break, I personally prefer to keep things cool... the simple rule of thumb I use is literally my thumb: if I can't hold it to the hardware for at least 10 seconds without crying then it's running too hot! ;)
 
Throttling is a design addition though, obviously implemented to protect the hardware, but I now think the question was more about whether, above those temps, there is a physical effect that prevents the CPU from functioning properly. Or indeed below a reasonable temperature.
 
Computer hardware ticks over at a set clock rate; the only way it would go faster/slower is if the temperature affected the timing crystal used... which, due to the mechanics involved, would require drops close to absolute zero for there to be any change beyond a minuscule number of nanoseconds.
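To give a feel for how small that is: a garden-variety crystal oscillator is typically specced to stay within a few tens of ppm over its whole rated temperature range. Taking an assumed 30 ppm worst case and an assumed 3 GHz core clock (both just illustrative figures), a quick Python calculation:

base_clock_hz = 100e6    # assumed reference clock
multiplier = 30          # assumed CPU multiplier, giving a 3 GHz core clock
drift_ppm = 30           # assumed worst-case crystal drift over temperature

core_hz = base_clock_hz * multiplier
drift_hz = core_hz * drift_ppm / 1e6
print(f"core clock: {core_hz / 1e9:.1f} GHz")
print(f"worst-case drift: ~{drift_hz / 1e3:.0f} kHz ({drift_ppm / 1e4:.3f} %)")

That works out to roughly 90 kHz of movement on a 3 GHz clock, i.e. about 0.003%, which is far too small to notice in frame rates.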
 
I would very much like to hear the reasoning behind said comment.
As you reduce the temperature (or increase the voltage), the signal propagation speed through the CPU logic increases. What this means is that each clock cycle's work is completed quicker, but the CPU then sits doing nothing until the next clock starts (it's this time you're looking to minimise when you overclock). Because of this, no matter how long a calculation the CPU does, the output will be less than one clock cycle quicker on the last cycle, and when we're talking about CPUs taking 2x10^-10 seconds to complete each clock cycle, the difference is minuscule.

Also, modern CPUs use more power for a given voltage when colder, due to most of the power consumption coming from the copper interconnect. The benefit is that you can decrease the operating voltage and thus decrease the power consumption.
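To put the "less than one clock cycle quicker" bound into perspective, here is a small Python sketch assuming a 3 GHz part (just a round illustrative number):

clock_hz = 3e9                 # assumed 3 GHz CPU
cycle_s = 1 / clock_hz         # duration of a single clock cycle
print(f"one clock cycle: {cycle_s * 1e9:.2f} ns")   # ~0.33 ns

run_s = 600                    # a ten-minute workload, as an example
print(f"saving at most one cycle over that run is "
      f"{cycle_s / run_s * 100:.1e} % of the total time")

So however long the workload, the cold chip at the same clocks only finishes a fraction of a nanosecond earlier, which is completely invisible in practice.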
 
So are we in agreement that at 70°C the frame rate would be the same as at 30°C?

This applies to both CPU and GPU; sorry that I confused it by using just the GPU as an example.
 