We're pretty much obsessed with keeping our CPU (a Core 2 Duo in my case) temps down to a 'sensible' level of around 60°C under load.
Now typical load temps for Nvidia 8800 GT* series cards seem to be accepted at around 75-80+°C.
Why is this?
I don't believe for a minute that Nvidia's silicon is 'better' than Intel's!
Has anyone tried running their C2D at much higher temps than Intel recommend? If so, what was the end result? Meltdown? Lifespan measured in months?