i5 2500K - When Will It Die?

Simples - Intel haven't really had competition for 5 years and have been pushing 4 threads as mainstream.

Perhaps from this year, when 12 and 16 threads will become far more the norm, game designers will start using more CPU capability.

GPUs have improved massively over the last 7 years, which is where game designers have focused. CPUs, not so much.
 
For most games, the 2500K is still pretty great. There are still gains to be had, though I can only really notice it in some games (Battlefield 1 loves threads in multiplayer, so my main rig feels noticeably smoother even with the same GPU).
 
Psycho Sonny;30483681 said:
I have a 2500K @ stock (overclocking has never been my thing). What CPU uses the same socket and would be worth buying second hand now? Is there any?

My 2500K went to 4.2 with just a multiplier change. I didn't need to change or add voltage or anything else; it was solid for three years and is still going strong. Do it!
 
I replaced mine last January, but yes it's still a decent gaming CPU.

I gave mine to a buddy for €100 with a decent motherboard and 16GB of RAM. He got a steal.
 
I went from a 3570K to a 7600K and noticed a slight improvement, although that could be helped by the newer chipset and faster RAM, etc. The main difference I noticed is how much quieter this PC is compared to the last one.
 
Just waiting to see if the single-threaded performance of Ryzen is going to be better than the 2500k.

If it's going to be £300 for a 5% increase, then I'll stick with the 2500K. I don't need "moar cores!" :p
 
Still rocking a 2500K, on a P67 Gigabyte board I think. I have only ever upgraded the GPU. I've built dozens of better systems for other people and am getting a bit jealous now. I was going to drop some cash on a money-no-object PC, but I'm waiting for AMD to ruin Intel's day in March. I want to see what the 1700X and Vega are going to be like and how much Intel will drop their prices.

Running a 980 Ti atm and unfortunately only a 1080p monitor, so as you can imagine it runs everything just fine on ultra. I do sometimes notice my brother's 6700K runs Civ 6 so much better than mine, but for most games there is not much difference if they are not CPU bound. He only runs 290X CrossFire anyway (one died recently lol).

I'm proud of my 2500K; it's old and originally OC'd to 4.5GHz, but it ran at 4.2 for most of its life. I think it may be on the way out soon, as I've had to run it at 4.0 and now 3.8 to keep it stable. I think I bought it in 2011? Correct me if that's impossible, but I was a fairly early adopter.
 
There are some games, or parts of games, where a modern i7 will trounce a 2500K. Obviously this becomes more noticeable if you have really good GPU grunt to keep the frames high as well.

Just as an example, those grass areas in Crysis 3 - I went from a 2500K to a 3770K and the minimum FPS skyrocketed in those parts.

Also, obviously for rendering/editing a 2500K is very slow these days compared to modern i7s/i7-Es.

It is a great CPU and I used to have one, but it is by no means still up there with the best of them.
 
A new i5 vs a 2500K in The Witcher was up to 40 FPS quicker.

In games that use CPU horsepower you can see big gains. In BF1, a mate and I have the same GPU; he has an i5 2500K vs my 5820K. In 64-player multiplayer games my FPS doesn't move, while he drops down to 40-50 FPS in big fights. He's OC'd to 4.4.
 
The 2600K is a good and cheap upgrade for anyone with a 2500K. I made the switch a couple of years ago and it cost me about £20 after selling the 2500K.

The extra threads on the i7 make a big difference in games like BF1, where I see ~80% usage on all 8 threads.

I see no reason to upgrade for a long time yet. I'm not denying a new CPU would make a difference, but I have zero issues with my current setup.
 
Sampsy;30467061 said:
I'm constantly reading how higher temps and volts will reduce the lifespan of a CPU, and so many people put a lot of effort into lowering temps by a few degrees and volts by a few mV. But nobody seems to have a clue how much higher temps and voltage typically reduce CPU lifespan. As far as I can tell, nobody has conducted a long-term study. My anecdotal evidence seems to suggest it makes far less of a difference than people think it does.

The main reason IMO is that historically the vast majority of enthusiasts (who by their nature crave speed) wouldn't run a CPU for over 5 years; they would have upgraded by then because something significantly better came along after a couple of years (by significantly better I mean 50-100% better CPU performance and probably better chipset features like faster RAM/USB/graphics port, etc.). My attitude with overclocking has always been to hammer my CPU with excessive temps and fairly high voltages, because I don't care if it cuts the CPU lifespan from 10 years to 5 years or whatever, as I will have binned it by then.

It is only now we are seeing a viable case for running such old CPUs - I'm not quite on the 2500k bandwagon but not far behind (3570k).

Edit: It is worth noting that I think my CPU may be starting to show a bit of fatigue, as it used to be stable at 4.5GHz but I have had to drop to 4.4GHz now. Many years ago (in the 90s!) I saw a similar thing with a CPU that had suffered from its fan getting stuck while it was overclocked; basically it started needing more and more volts to stay stable, then a drop in speed, before finally packing it in.
 
Minstadave;30470251 said:
Plus the 2600K overclocks to a similar frequency. Most of the performance of the Kaby Lake chip seen in the benches is from the higher stock clock. Once both are running 4.8-5GHz the Sandy Bridge chip will be well ahead.

I glanced over the article but don't see a comparison of performance when both are overclocked; that's what really matters with K chips.

Yeah, I came away from that thoroughly underwhelmed; 6 years ago a dual-core chip might have been viable, but nowadays quad cores are actually getting utilised (even in games, albeit not all games), so I really don't see it being worth the asking price. They cost around £180, which is a lot considering my 3570K was under £150 nearly five years ago, and that's before you factor in presumably needing a new mobo. Indeed, even with modern chips I'd probably rather go with a 7600K for an extra £50, or if the budget is really tight then even a non-K quad if it can be boosted enough via turbo/BCLK.
 
People are kidding themselves if they think a Sandy Bridge can compete with a Skylake at 4.8GHz / Kaby Lake at 5.0GHz. The minimum FPS jumps massively, as well as the average FPS.

Of course you can continue using a 2500K for many years to come, just as there are some people still using Pentium 4s, claiming they run everything fine, etc. Each to his own, I guess.
 
Dave2150;30499435 said:
People are kidding themselves if they think a Sandy Bridge can compete with a Skylake at 4.8GHz / Kaby Lake at 5.0GHz. The minimum FPS jumps massively, as well as the average FPS.

Of course you can continue using a 2500K for many years to come, just as there are some people still using Pentium 4s, claiming they run everything fine, etc. Each to his own, I guess.

This depends heavily on the game, to be fair.

There are a few now that run noticeably better on a modern i7, though.
 
Dave2150;30499435 said:
People are kidding themselves if they think a Sandy Bridge can compete with a Skylake at 4.8GHz / Kaby Lake at 5.0GHz. The minimum FPS jumps massively, as well as the average FPS.

Of course you can continue using a 2500K for many years to come, just as there are some people still using Pentium 4s, claiming they run everything fine, etc. Each to his own, I guess.
You do realise that continuously entering threads about old CPUs just to say they are useless and can't compete is basically threadcrapping, right? We all know that in raw benchmarks Kaby Lake is faster than Nehalem and Sandy Bridge. That isn't the point and you know it, so I wish you'd give it a rest. :rolleyes:

The actual interesting points of discussion are mostly around how it's an unusual time because such old systems are still perfectly viable for modern applications so many years later, which has historically not been the case. Yes an i7-7700K @ 5 GHz is going to produce better gaming results than an i5-2500K @ 4.5 GHz in most cases but the point is how much better and for what cost? A new i7-7700K + Z270 + DDR4 rig costs rather a lot and at the moment there just aren't that many scenarios where such an upgrade is worth the money, particularly when that money could equally be spent on a new GPU, which'd produce a much greater jump in performance. This will continue to change over time to the point where Sandy Bridge becomes less and less viable but we're 6 years on and not there yet, which is pretty crazy. Of course if you already have a GTX 1080 and are looking for those elusive solid 16.6 ms frame times, then it's probably a good upgrade for you. For many people, it just isn't. Law of diminishing returns and all that. Let's not even start on the fact that more and more games are using more than 4 cores now and yet Intel still don't have a 6-core mainstream chip.

I don't use any Pentium 4 systems (never have actually) but my main PC at work is still a Core 2 Duo. When paired with an SSD it's surprisingly usable. It can't handle much in the way of video playback but there is a GPU for that. Any intensive processing is done on clustered servers and VMs. Wouldn't use it for gaming or anything though. :p
 
DragonQ;30499507 said:
You do realise that continuously entering threads about old CPUs just to say they are useless and can't compete is basically threadcrapping, right? We all know that in raw benchmarks Kaby Lake is faster than Nehalem and Sandy Bridge. That isn't the point and you know it, so I wish you'd give it a rest. :rolleyes:

The actual interesting points of discussion are mostly around how it's an unusual time because such old systems are still perfectly viable for modern applications so many years later, which has historically not been the case. Yes an i7-7700K @ 5 GHz is going to produce better gaming results than an i5-2500K @ 4.5 GHz in most cases but the point is how much better and for what cost? A new i7-7700K + Z270 + DDR4 rig costs rather a lot and at the moment there just aren't that many scenarios where such an upgrade is worth the money, particularly when that money could equally be spent on a new GPU, which'd produce a much greater jump in performance. This will continue to change over time to the point where Sandy Bridge becomes less and less viable but we're 6 years on and not there yet, which is pretty crazy. Of course if you already have a GTX 1080 and are looking for those elusive solid 16.6 ms frame times, then it's probably a good upgrade for you. For many people, it just isn't. Law of diminishing returns and all that. Let's not even start on the fact that more and more games are using more than 4 cores now and yet Intel still don't have a 6-core mainstream chip.

I don't use any Pentium 4 systems (never have actually) but my main PC at work is still a Core 2 Duo. When paired with an SSD it's surprisingly usable. It can't handle much in the way of video playback but there is a GPU for that. Any intensive processing is done on clustered servers and VMs. Wouldn't use it for gaming or anything though. :p

We're each entitled to our own opinion. There are no rules saying one opinion is banned and cannot be expressed, so it's best to refrain from demanding I stop posting my opinion in whichever thread I want to.

It's perfectly fine that some like to use old CPUs, such as X58 or Sandy Bridge CPUs. Many people like to use black-and-white TVs, listen to FM radio, use incandescent light bulbs, or walk everywhere. This is their own choice and there's nothing wrong with it.

It's also obvious that X58 and Sandy Bridge CPUs can still play every game out there, as can Pentium 4s, Core 2 Duos, etc. What must be realised is that they cannot provide anywhere close to the experience that a Kaby Lake @ 5.0GHz+ can, and it's silly to claim otherwise.
 