GPU temps vs. CPU temps...

We're pretty much obsessed with keeping our CPU ([email protected] in my case) temps down to a 'sensible' level of around 60 Deg.C under load.

Now typical load temps for Nvidia 8800 GT* series cards seem to be accepted at around 75-80+ Deg.C.

Why is this? :confused:

I don't believe for a minute that Nvidia's silicon is 'better' than Intel's! :rolleyes:

Has anyone tried running their C2D at much higher temps than Intel recommend? If so, what was the end result? Meltdown? Lifespan measured in months?
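If anyone fancies checking that 75-80 figure on their own card rather than taking it from reviews, a rough Python sketch like this (just my own example, assuming an Nvidia driver with nvidia-smi on the PATH) will log the core temp every few seconds while a game or benchmark is looping:

```python
# Rough sketch: poll the GPU core temperature via nvidia-smi while a 3D load runs.
# Assumes nvidia-smi is installed with the driver; the sample count/interval are arbitrary.
import subprocess
import time

def gpu_temp_c():
    """Return the current GPU core temperature in degrees C as reported by nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])

if __name__ == "__main__":
    for _ in range(10):          # ten samples, five seconds apart
        print(f"GPU core: {gpu_temp_c()} C")
        time.sleep(5)
```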
 
I used 'we' in the sense of the overclocking community in general. :confused:

Didn't mean anything by it. Sorry if offence was caused...
 
I wondered too why the new nVidia GPUs get so hot; the previous series and the ATI cards used to run much cooler. 80C is very hot.
 
Just found something interesting on Intel's website...

The C2D T2700 mobile processor (2.33GHz) has a Thermal Specification of....

100 Degrees C :eek: @ 1.3v

Hmm, makes you think doesn't it?

Now imagine the clock speeds you could get from your E6*** if you were to allow it to run at 100 Deg.C... :D

I don't think I'll be worrying too much about my CPU temps anymore :p
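For anyone curious how close their own chip sits to that sort of limit, a quick Python sketch along these lines (assuming Linux with lm-sensors exposing "coretemp" and the psutil module, and simply using the quoted 100 Deg.C figure as the limit rather than reading it from the chip) will print the headroom per core:

```python
# Rough sketch: compare current CPU core readings against a 100 C thermal spec.
# The 100 C value is just the T2700 figure quoted above, used here as an example limit.
import psutil

THERMAL_SPEC_C = 100  # example limit, not read from the CPU itself

readings = psutil.sensors_temperatures().get("coretemp", [])
for core in readings:
    headroom = THERMAL_SPEC_C - core.current
    print(f"{core.label or 'core'}: {core.current:.0f} C "
          f"({headroom:.0f} C below the {THERMAL_SPEC_C} C spec)")
```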
 