Does burning in a CPU actually work?

Big.Wayne said:
I think it's the other way round :cool:

That is, a CPU should perform its very best the moment it is installed, and then slowly degrade over time due to electromigration.

Perhaps it's because people learn about their system/mobo/etc. and get better overclocks over time because they have fine-tuned all their settings and improved their cooling?

Hmm, maybe it's also connected to the new-car theory, i.e. a car's engine is meant to work better after you've done a certain number of miles.

CPU (or memory) burn-in is a nice idea, but I don't think it's real, so in effect I'm saying a new CPU will overclock better than one which has been 'burnt in' :D

I agree. It sounds more likely that electronic components work within their optimum range straight out of the box. It's not like a car engine that needs running in... in my opinion.
 
I find it hard to believe that a CPU will become 'looser' after a while and therefore enable better overclocking, but my old Opteron 146 would only do 2.6GHz initially, and after a week or so I was able to take it to 3.0GHz without any problems.

My Conroe, on the other hand, could be clocked to 3.6GHz out of the box, so I'm undecided about the concept really.
 
I don't think it's a matter of electromigration or anything as complex as that. IMO it's simply the thermal compounds between the core and the IHS, and between the IHS and the HSF, bedding in properly, giving better temps over time and allowing very slightly higher clocks.

I was once a believer in 'burning in' a CPU but never really saw much benefit that couldn't be attributed to lower ambient temps. If 'burning in' really worked then theoretically a CPU should continue to get better until it pops; sadly this isn't the case. :(
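
To put rough numbers on the bedding-in idea, here's a quick Python sketch of the usual lumped model T_core = T_ambient + P * theta (all figures below are invented for illustration, not measurements):

    # Illustrative only: lumped thermal model, T_core = T_ambient + P * theta.
    # Power and resistances are made-up ballpark figures for a CPU of this era.
    def core_temp(t_ambient_c, power_w, theta_c_per_w):
        """Steady-state core temperature for a total thermal resistance theta."""
        return t_ambient_c + power_w * theta_c_per_w

    # theta = core->IHS paste + IHS->HSF paste + heatsink->air
    print(core_temp(25.0, 90.0, 0.15 + 0.10 + 0.30))  # fresh paste: 74.5 C
    print(core_temp(25.0, 90.0, 0.12 + 0.08 + 0.30))  # bedded in:   70.0 C

A few degrees either way is the scale we're talking about - enough to nudge a borderline overclock, not transform it.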
 
w3bbo said:
I don't think it's a matter of electromigration or anything as complex as that. IMO it's simply the thermal compounds between the core and the IHS, and between the IHS and the HSF, bedding in properly, giving better temps over time and allowing very slightly higher clocks.

I was once a believer in 'burning in' a CPU but never really saw much benefit that couldn't be attributed to lower ambient temps. If 'burning in' really worked then theoretically a CPU should continue to get better until it pops; sadly this isn't the case. :(

Sounds more like a reasonable explanation of why some people get better clocks over time.
 
Jokester said:
Well, my old FX57 ended up doing 3.6GHz for SuperPi 1MB runs after I'd had it a good few months, when originally it wouldn't go over 3.5GHz, and the only explanation I can come up with is that it was burnt in.

I've definitely had RAM that's burnt in, though; BH5 is, as far as I'm aware, well known for burning in over time, letting it hit higher frequencies than when it was first bought.

Jokester

Did you, for example, update the BIOS in that time? Or perhaps it was winter and therefore colder when you managed to overclock it further?

Can't say I've ever witnessed a successful CPU burn-in :)
 
easyrider said:
Some suggest it may have something to do with dopant stabilisation and the dielectric properties fully forming in the tiny in-circuit semiconductor junctions, capacitors and other components...


Thoughts?

I don't believe it's anything to do with the CPU.

Also, whilst current density in some parts of a CPU is very high, it's still not enough to cause strong electromigration - and really, that word gets used far too much.
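
For anyone curious, the standard lifetime model here is Black's equation (textbook material, not something from this thread):

    \mathrm{MTTF} = A \, J^{-n} \exp\!\left(\frac{E_a}{k T}\right)

where J is current density, n is usually quoted as around 2, E_a is an activation energy, k is Boltzmann's constant and T is absolute temperature. Lifetime falls off sharply with current density and temperature, which makes electromigration a years-long reliability concern, not something that would show up after a week of 'burn-in'.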

If you are pulling enough current through a junction to change dopant density, you have bigger problems.

If anything it's to do with the motherboard VRM. Large electrolytics take time for their dielectric to form properly; to get this you need to run the capacitor close to its rated voltage with reasonably high ripple currents. Running at higher-than-normal voltage under load may help this. This part isn't a guess; it is true.
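
As a back-of-envelope illustration (single-phase buck assumed, all values invented), the usual ripple estimate dV = dI/(8*f*C) + ESR*dI shows why the state of the capacitor bank matters so much:

    # Illustrative only: peak-to-peak output ripple for a single-phase buck VRM,
    # dV = dI/(8*f_sw*C) + ESR*dI. All values are invented ballpark figures.
    def buck_ripple_v(di_amps, f_sw_hz, c_farads, esr_ohms):
        """Capacitive ripple term plus the (usually dominant) ESR term."""
        return di_amps / (8.0 * f_sw_hz * c_farads) + esr_ohms * di_amps

    c_bank = 3 * 1800e-6  # three 1800uF electrolytics in parallel
    print(buck_ripple_v(10.0, 300e3, c_bank, 0.010))  # healthy bank: ~0.10 V
    print(buck_ripple_v(10.0, 300e3, c_bank, 0.030))  # higher ESR:   ~0.30 V

The ESR term dominates, so if forming the dielectric leaves the bank in better shape under load, the CPU simply sees cleaner voltage - nothing about the CPU itself has changed.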

I think w3bbo's explanation is also plausible.
 