
Do o/c CPUs actually degrade?

I am OK with it burning out after 3 years because I don't keep electronics for that long.

I thought this too; most of my rigs last about 18-24 months before being upgraded. Then came the 2500K, to which there has still not been a compelling follow-up since it launched... Assuming I can settle it down again, there still won't be a reason to swap it.

Given the near stagnation of single thread performance, I'm genuinely worried I'll have the same PC for another 5 years. How mundane is that :S
 
CPUs don't degrade.

Your overclock might get slower over time, but you won't notice, and by the time it's running at its rated speed it's fine.

Chips are stressed to the max to simulate years and years of usage.
 
Interesting... So in short, either volts or heat will hasten the degradation via electromigration?

That's what Black's equation says. It's quite interesting actually: in this simple model the mean time to failure (MTTF) goes as current density (read: voltage) raised to some power -n, where n is between 1 and 2, multiplied by exp(Ea/kT), so the lifetime falls off exponentially as temperature rises.

So to stick some numbers into this very simple model, raising the volts by 10% (assuming it makes the current density go up by the same amount, not sure of the validity of this) will cut the lifetime by 10-17%. Raising the temperature from 60 to 70 C (about 3% in kelvin) will cut the lifetime by about 65%. That seems to suggest heat is the main factor! But then again, raising the volts generates more heat anyway, so raising the voltage is bad on both fronts.

There are more advanced models that I'm sure Intel/AMD and others are using. And don't forget that we have no idea what MTTF these chips are designed to. If they're designed for very long lifetimes anyway (many years) then even knocking 65% off might still result in a long MTTF. I have no idea.
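Those back-of-the-envelope numbers can be sanity-checked in a few lines of Python. This is just the simple Black's-equation model, nothing more: the activation energy (0.9 eV here) and the assumption that current density scales with voltage are guesses for illustration, not vendor figures.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def mttf_ratio(j_ratio=1.0, t1_c=60.0, t2_c=60.0, n=2.0, ea_ev=0.9):
    """Relative MTTF from Black's equation, MTTF ∝ J^-n * exp(Ea/kT).

    j_ratio: new current density / old (crudely, the voltage ratio)
    t1_c, t2_c: old and new temperatures in Celsius
    ea_ev: assumed activation energy in eV (a guess, not a datasheet value)
    Returns MTTF(new) / MTTF(old)."""
    t1, t2 = t1_c + 273.15, t2_c + 273.15
    return j_ratio ** -n * math.exp((ea_ev / K_B) * (1.0 / t2 - 1.0 / t1))

# 10% more current density, n between 1 and 2: lifetime falls to ~91% or ~83%
print(mttf_ratio(j_ratio=1.1, n=1.0))  # ~0.91
print(mttf_ratio(j_ratio=1.1, n=2.0))  # ~0.83
# 60 C -> 70 C at Ea = 0.9 eV: lifetime falls to ~40%, i.e. a ~60% cut
print(mttf_ratio(t2_c=70.0))
```

How close the temperature result lands to the "65%" figure above depends entirely on the activation energy you plug in, which is exactly why the unknown design MTTF matters more than the percentages.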

CPUs don't degrade.

Your overclock might get slower over time, but you won't notice, and by the time it's running at its rated speed it's fine.

Chips are stressed to the max to simulate years and years of usage.

All ICs degrade, eventually to failure.
 
joey is right, they all degrade. Yes, as a normal user it either works or it doesn't, but the chip still degrades gradually. That's not to say the CPU is the only factor: as others have observed, it may be other components degrading faster than the CPU itself that reduce the max overclock.

Heat and voltage, as already mentioned, play a large part in the process (though what I've seen previously was about 50% per 10 degrees rather than 65%).

Dust doesn't impact a CPU's performance directly - it impacts a cooler's performance by reducing airflow, which in turn causes the CPU to run at a higher temperature, which may cause it to reach its thermal ceiling. Same for paste. And not being able to tell how degraded it is doesn't mean it is still as new.
 
They either work or they won't. A lot of factors can affect the performance of CPUs, like dust, old paste etc, but the CPU will not degrade. Unless you get a scope and measure, you will never be able to tell.

What a load of bull. There's more to it than a CPU either working or not.

My Phenom II 955BE would do 4GHz @ 1.315v when I first bought it. Two years of abuse later from massive amounts of voltage (well over AMD spec for Vcore, CPU-NB and VDIMM) and it needed closer to 1.5v to be fully stable at 4GHz.

Secondly, my 3570K engineering sample, again after lots of voltage and benching abuse, started needing more volts to hold a stable overclock, then started to BSOD randomly at stock, and eventually it would only just get into the BIOS. This happened very rapidly, over the space of around 3 days, although I suspect the 22nm chip didn't take the abuse so well, being a more fragile manufacturing node.
 
They either work or they won't. A lot of factors can affect the performance of CPUs, like dust, old paste etc, but the CPU will not degrade. Unless you get a scope and measure, you will never be able to tell.

Read these; they will explain a bit about degradation in integrated circuits and CPUs.

http://www.anandtech.com/show/2468/6

Transistor Aging

http://spectrum.ieee.org/semiconductors/processors/transistor-aging

http://electronicdesign.com/products/circuit-aging-new-phenomenon-soc-designs

Electromigration

http://en.wikipedia.org/wiki/Electromigration
 
I used to work for HP; I have vast knowledge of SMT. I have debugged for years and have a lot of experience with schematics. CPUs do NOT degrade.

You all keep googling for your answers, however.
 
What a load of bull. There's more to it than a CPU either working or not.

My Phenom II 955BE would do 4GHz @ 1.315v when I first bought it. Two years of abuse later from massive amounts of voltage (well over AMD spec for Vcore, CPU-NB and VDIMM) and it needed closer to 1.5v to be fully stable at 4GHz.

Secondly, my 3570K engineering sample, again after lots of voltage and benching abuse, started needing more volts to hold a stable overclock, then started to BSOD randomly at stock, and eventually it would only just get into the BIOS. This happened very rapidly, over the space of around 3 days, although I suspect the 22nm chip didn't take the abuse so well, being a more fragile manufacturing node.

lol, there you go, I've picked out your own admission.
 
So, you say they don't degrade, then you say that they do but only because of too many volts. I'm confused by what you're trying to say :(
CPUs 'normally' outlive the components around them, but they do degrade. Increasing voltages and temperatures speeds this process up.

Also, Purgatory & joeyjojo are not googling for answers but providing proof (they may have just googled the subject, no way of knowing) rather than saying "I'm an expert and I know better", which is the standard answer of everyone arguing on the internet when they can't back up a claim. It can be frustrating at times, but it's the only way of appearing reasoned in a debate where nobody's credentials can really be verified and wild claims abound.
 
Do you think an offset OC would be any safer, long-term? I believe it's called something different now, but my understanding of it on my Asus Z68 is that the voltage ramps up when turbo is engaged and sinks back down at idle, avoiding the issues associated with running at high voltages for extended periods of time.

Originally, I had a regular OC, with the voltage locked at 1.25v or whatever it was. I then switched to this offset method and I've been running without issue ever since (~2 years). Is this a better solution, or have I misunderstood?
 
So, you say they don't degrade, then you say that they do but only because of too many volts. I'm confused by what you're trying to say :(
CPUs 'normally' outlive the components around them, but they do degrade. Increasing voltages and temperatures speeds this process up.

Also, Purgatory & joeyjojo are not googling for answers but providing proof (they may have just googled the subject, no way of knowing) rather than saying "I'm an expert and I know better", which is the standard answer of everyone arguing on the internet when they can't back up a claim. It can be frustrating at times, but it's the only way of appearing reasoned in a debate where nobody's credentials can really be verified and wild claims abound.

If you run your/a CPU at its rated speed and voltage it will merrily work away until failure. If you overclock a little you may stay within the CPU's tolerances, but go over that (or abuse it as mentioned) and you are killing the CPU. The speed may slow over time after overclocking, but without tools like an oscilloscope there is no way to tell whether this is due to poor cooling, a dusty environment, mainboard issues etc. There are many factors that can slow a CPU down.

I am quite happy to accept that if you abuse your CPU it may not perform as it should. I'm also willing to accept that you might get the tiniest of tiny changes in frequency over time, but nothing that will change the rated speed - running a CPU the way it's intended will not degrade it.

I totally agree with your point about claims from people on the Internet; it's hard to know who to trust and who is telling the truth. I did a quick search of my own and there are many articles which agree with what I say, and others that say they do degrade. For example, Robert Redelmeier, who created CPU Burn-in, states they do not degrade and in fact said the same as me: "it works or it doesn't".

I think I've said everything I can without repeating myself, and sorry for the wall of text.
 
The speed a CPU is running at is dictated by the motherboard. The health of the CPU can perhaps be measured by the max obtainable clock speed at given volts (for now, assume stock volts). Over time, the max speed at steady volts drops. Eventually it drops to the point that stock speed fails; from an average user's perspective, the CPU is now broken.

The question Robert Redelmeier was asked was whether the clock speed of a processor changes over time, which it does not, as it doesn't pick its own clock speed. It either works at that speed or it does not.

The problem with this discussion is that some are talking about how the CPU performs ignoring the (arbitrary) stock speed, and others are talking about its ability to hit stock speed at stock volts, which is a yes/no scenario. On a forum like this I tend to think of performance separately from stock speeds, though if a non-computing friend asked whether their computer getting slower over time was the CPU, I'd give the same answer Robert gave.

Given the OP is asking why his CPU no longer runs at the overclocked speed it used to but still runs fine slower, I'm going to say 'yes, your CPU degraded' - by the definition used by LambChop it has not, as it'll still run at stock, but its max performance has become worse.

Mainboard issues are a possibility, but if temps are the same then it's not dust or old paste. Mobo VRMs can and do degrade and die, PSU consistency gets worse, etc. However, CPUs get worse in terms of their max performance too - which I call degrading, regardless of whether they can still run at stock. Their MTBF also decreases when run hot or at high volts, counting failure as no longer running at stock.
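To make the two definitions concrete, here's a toy model (every number invented for illustration, not real silicon behaviour): if the max stable clock at stock volts decays slowly, the chip "degrades" by the enthusiast definition long before it "breaks" by the stock-speed definition.

```python
STOCK_MHZ = 3300        # illustrative stock clock (roughly a 2500K's)
INITIAL_MAX_MHZ = 4600  # illustrative max stable clock when new

def max_stable_mhz(years, decay_per_year=0.02):
    """Toy decay model: max stable clock at fixed volts shrinks by a
    fixed percentage per year. Both the shape and the 2%/year rate
    are made up for illustration."""
    return INITIAL_MAX_MHZ * (1.0 - decay_per_year) ** years

for years in (0, 2, 5, 10):
    mhz = max_stable_mhz(years)
    degraded = mhz < INITIAL_MAX_MHZ  # enthusiast view: any headroom lost
    broken = mhz < STOCK_MHZ          # average-user view: won't hold stock
    print(f"{years}y: {mhz:.0f} MHz, degraded={degraded}, broken={broken}")
```

Even after 10 years of 2%/year decay the toy chip still clears stock, which is how "it works or it doesn't" and "it degrades" can both be true at once.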
 
Also, Purgatory & joeyjojo are not googling for answers but providing proof (they may have just googled the subject, no way of knowing) rather than saying "I'm an expert and I know better", which is the standard answer of everyone arguing on the internet when they can't back up a claim.

I should probably have said that I'm in no way an expert in ICs; my "credentials", I suppose, are a physics degree.

I am quite happy to accept that if you abuse your CPU it may not perform as it should. I'm also willing to accept that you might get the tiniest of tiny changes in frequency over time, but nothing that will change the rated speed - running a CPU the way it's intended will not degrade it.

Have a read of Purgatory's links, I don't think you understand much of this discussion.

Do you think an offset OC would be any safer, long-term? I believe it's called something different now, but my understanding of it on my Asus Z68 is that the voltage ramps up when turbo is engaged and sinks back down at idle, avoiding the issues associated with running at high voltages for extended periods of time.

Originally, I had a regular OC, with the voltage locked at 1.25v or whatever it was. I then switched to this offset method and I've been running without issue ever since (~2 years). Is this a better solution, or have I misunderstood?

Letting the chip reduce the clocks and voltage whenever possible makes sense, so yes, the offset method is the better of the two. Any increase in clocks/volts will reduce the lifetime of the CPU; you just need to find a compromise that works for you.
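A rough sketch of why offset mode helps, using the same crude Black's-equation reasoning from earlier in the thread: if wear rate scales like V^n, time spent idling at low voltage contributes very little. The 80/20 duty cycle and the 0.90 V idle voltage below are made-up illustrative numbers, not measurements.

```python
def relative_wear(profile, n=2.0):
    """profile: list of (fraction_of_time, volts) pairs.
    Returns a unitless wear rate (only meaningful for comparing
    profiles to each other), assuming wear per unit time ∝ V^n."""
    return sum(frac * volts ** n for frac, volts in profile)

fixed = [(1.0, 1.25)]                # voltage locked at 1.25 V 24/7
offset = [(0.8, 0.90), (0.2, 1.25)]  # idles at 0.90 V 80% of the time

print(relative_wear(offset) / relative_wear(fixed))  # ~0.61
```

In reality the idle savings should be bigger still, since dropping voltage also drops temperature, and the exponential temperature term punishes heat harder than the power-law term punishes volts.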
 
Hey guys,

Question in the title, really. Short version is that I've had a 2500K since a few months after they launched, running at 4.5GHz on a hair under 1.3v.

Only it doesn't seem to want to run at that anymore :( I'm getting increasingly frequent lockups while gaming. The weather's been warm of late, but temps are never above 75°C outside of torture tests.

Took the clock down to 4.4 yesterday, fingers x'd that'll be enough to stabilise it again, but it's worried me that I've pushed the old girl too hard for too long and that she might be on the brink of a slow slide down towards epic failure.

Wondered if anyone else has had similar experiences, and how it worked out in the end?

Cheers

Maybe drop the overclock to 4GHz-ish and the voltage with it? Might help prolong it rather than have it continue to degrade at the current speed. I could be talking nonsense.

But yeah, a 4GHz i5 is more than enough for gaming framerates.
 
I had a P-II X6 running at 4 to 4.2GHz over 2 or 3 years; early on in its life it spent hours and days on end being stressed. By the time it was 2 years old it wouldn't clock as high anymore and needed higher and higher volts to maintain speeds, before eventually it was difficult to keep stable full stop.

CPUs do degrade over time. If you overclock them, that time is less; if you overclock and hammer them a lot, that time is short - just a few years.
 
humbug, that's still a few years, so who cares if it dies after the 3-year warranty :D Just sell it off before then :)
 