4 years on, do we have any major upgrades from the i5-2500K?

The 2500K @ 4.5GHz was under the Alpenfohn K2 before moving to the H100i, and the 2600K is now the same: 1.3V using an offset of +0.030.

I see your point, but it's 2015 and they're still releasing games which aren't optimised for multicore processors.

- Dying Light
- Watch Dogs

as examples off the top of my head; pretty sure there are more!

Dying Light is using 80% of my 4 cores (I have Hyper-Threading turned off at the moment). I would call that pretty well done. Sure, it has a few performance bugs, but they seem to be working on them.
 
That's good to know, as initial benchmarks and testing said otherwise. Maybe they've patched it!
 
I've yet to see any benchmarks showing a CPU at the level of a 2500K becoming a bottleneck in games before the graphics card in any meaningful way. When the following happens, I'll be upgrading...

Crysis 4 with a GTX 1080 and a 2500K - 30fps
Crysis 4 with a GTX 1080 and a 6500K - 60fps

But I just can't see it this console generation, 8 cores or not.

The CPU is responsible for game logic, physics and shunting data about from RAM, but the vast majority of the work is done by the GPU, which is why 2500Ks are still going so strong even with multiple generations of graphics cards released since.

Improved physics will hit the CPU, but even then we're seeing it offloaded to the GPU to an extent with PhysX etc., so who knows.
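
To put that split into rough code terms, here's a stripped-down, made-up sketch of a frame loop (Entity, updatePhysics and submitDrawCall are stand-ins I've invented, not any real engine or graphics API): the CPU runs the logic and physics and merely describes the draw work, and the GPU takes the heavy lifting from there.

```cpp
// Toy frame loop showing the CPU/GPU split described above. All names
// are hypothetical stand-ins, not a real engine or graphics API.
#include <cstdio>
#include <vector>

struct Entity { float x, y, vx, vy; };

// CPU-side work: game logic and physics integration.
void updatePhysics(std::vector<Entity>& entities, float dt) {
    for (auto& e : entities) {
        e.x += e.vx * dt;
        e.y += e.vy * dt;
    }
}

// Stand-in for handing a draw command to the GPU. In a real game this
// would go through D3D/OpenGL/Mantle; the CPU only describes the work,
// while the GPU does the actual heavy shading of millions of pixels.
void submitDrawCall(const Entity& e) {
    std::printf("draw at (%.2f, %.2f)\n", e.x, e.y);
}

int main() {
    std::vector<Entity> entities{{0, 0, 1, 1}, {5, 5, -1, 0}};
    const float dt = 1.0f / 60.0f;           // one frame at 60fps
    for (int frame = 0; frame < 3; ++frame) {
        updatePhysics(entities, dt);          // CPU: logic + physics
        for (const auto& e : entities)
            submitDrawCall(e);                // CPU tells the GPU what to do
    }
}
```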

I hope to still have this chip in 2020, tbh. Overclocked it to 4.6GHz stable but then put it back to stock because I realised there was no point lol. When games come out to challenge it I'll up it back to 4.2... then 4.6... then see about an upgrade :P
 
You guys with Sandy Bridge... I'm still rocking an i5 760! At 4GHz paired with a 290, it seems to do alright.

Also, wtf is with the walls of text from spoffle and Dave?? Grow up, kids!
 
In terms of gaming performance, there's nothing out there that's significantly faster than a 2500K, which is disappointing really. Other chips may be faster in the future once games that are capable of utilising 6 or 8 cores come out, but for the time being, no.

It really does depend on the game, the number of GPUs and the settings you're running it at.

For the most part, and for most people, though, it's definitely true.

I've yet to see any benchmarks showing a CPU at the level of a 2500K becoming a bottleneck in games before the graphics card in any meaningful way. When the following happens, I'll be upgrading...

Crysis 4 with a GTX 1080 and a 2500K - 30fps
Crysis 4 with a GTX 1080 and a 6500K - 60fps

But I just can't see it this console generation, 8 cores or not.

I think we will; however, a lot of benchmarks tend to ignore chips of the 2500K's age and use only the more recent ones.

But as above, it depends on the games. This new console generation has barely started; we've had the first wave of games, and the next is still in development. I expect we'll start to see changes somewhat soon.

The CPU is responsible for game logic, physics and shunting data about from RAM, but the vast majority of the work is done by the GPU, which is why 2500Ks are still going so strong even with multiple generations of graphics cards released since.

Improved physics will hit the CPU, but even then we're seeing it offloaded to the GPU to an extent with PhysX etc., so who knows.

It depends; the CPU tells the GPU what it needs to do (in simple terms). However, this is the point of Mantle and DX12: the idea is to reduce the CPU overhead to bring minimum frame rates up on PCs that are using weaker CPUs.
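
As a rough, totally synthetic illustration of why that per-call CPU overhead matters (fakeDriverOverhead is a stand-in I've made up for driver/API bookkeeping, not a real graphics call), compare submitting one "call" per object against a single batched "call" covering the same objects:

```cpp
// Toy model of draw-call overhead: each "call" costs the CPU a fixed
// amount of bookkeeping, so 10,000 tiny calls cost vastly more CPU time
// than one batched call. Cutting that per-call cost is the idea behind
// Mantle/DX12 mentioned above. Not a real graphics API.
#include <chrono>
#include <cstdio>

volatile int sink = 0;              // stops the compiler optimising the loop away

void fakeDriverOverhead() {         // made-up stand-in for per-call bookkeeping
    for (int i = 0; i < 1000; ++i) sink = sink + i;
}

int main() {
    using clock = std::chrono::steady_clock;
    const int objects = 10000;

    auto t0 = clock::now();
    for (int i = 0; i < objects; ++i)
        fakeDriverOverhead();       // one "draw call" per object
    auto t1 = clock::now();
    fakeDriverOverhead();           // one batched "draw call" for all objects
    auto t2 = clock::now();

    auto us = [](clock::time_point a, clock::time_point b) {
        return std::chrono::duration_cast<std::chrono::microseconds>(b - a).count();
    };
    std::printf("per-object calls: %lld us, batched: %lld us\n",
                (long long)us(t0, t1), (long long)us(t1, t2));
}
```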

With regards to PhysX, it really is an unfortunate situation. It's barely used at all. Very few games actually offload the physics to the GPU, and it's because nVidia are more interested in using it as a checkbox feature than actually putting some really good work into it.

Plus, the proprietary nature of it means very limited take-up and usage from developers anyway, as no developer can fully integrate GPU PhysX so that a PhysX-capable GPU is a minimum requirement without damaging sales, because anyone who doesn't have an nVidia GPU couldn't play it.

So devs are limited to using it to add tacked-on features that aren't game-changing.

I hope to still have this chip in 2020, tbh. Overclocked it to 4.6GHz stable but then put it back to stock because I realised there was no point lol. When games come out to challenge it I'll up it back to 4.2... then 4.6... then see about an upgrade :P

I suppose it's one of those things: even if it's bottlenecking, if the performance you're getting is satisfactory, then it doesn't matter all that much.

Wtf is with the walls of text from spoffle and Dave?? Grow up, kids!

Yeah, no. That's also offensive to kids; a lot of kids are actually happy to learn and take in information, and not completely reject it just because they don't like or understand it.

Dave didn't want to listen to how games are developed. I see people complain about consoles and blame them for PC games not being more advanced than they currently are, as well as moaning and whinging about "console ports".

So I addressed it and pointed out to him that games aren't made on consoles and then converted from the console version to run on the PC. The guy didn't want to know, and carried on insisting that his complete ignorance was how it was done.

This is something everyone should be bothered about, because when the majority of people think buggy games are because of consoles, the people who are responsible for shipping buggy games don't have to take responsibility, because very few people are calling them out on it.

So the more people realise how games are ACTUALLY developed, the better; they'll stop blaming consoles for crappy builds of games on PC and blame shoddy developers like Ubisoft for shipping crap.
 
To be honest, since Sandy Bridge the only real upgrades available have required a lot more spending too: either a £240 i7 CPU, or a move up to the X820K CPUs and all the extra expense involved with those.

Really, anyone who paid £170 for a 2500K won't find a worthwhile upgrade at that price, except maybe the odd second-hand Xeon with 8 threads. The 2500K is a great overclocker too.

To be honest, the main reason for someone to upgrade from Sandy Bridge at the minute would be to upgrade the motherboard.

If AMD can start to compete again, then maybe we'll see Intel offer 6/8 cores for the X670 range of CPUs.
 
If the OP is still following his thread, despite the bickering, then I would say that for a single-GPU setup there really is no point upgrading his 2500K. Some people have dual high-end cards and no issues.

I use a 2550K at 4.6GHz on a full loop with a 780, and I have never thought about upgrading other than because my stupid mobo's bottom slot only works if you have an Ivy Bridge CPU. But that is my fault and the manufacturer's.

When I was buying my PC I had a choice between this 2550K or a 3570K, and I still chose the 2550K because it was cheaper (£155 compared to £180 for the 3570K), didn't have the rubbish thermal paste and didn't have the risk of being a bad-overclocking chip.

By the way, you should be able to clock higher than 4.2GHz. Open a thread asking for OCing help in the overclocking section and they'll be able to guide you.

http://forums.overclockers.co.uk/forumdisplay.php?f=7
 
I just swapped from an i5 3570K to a 2700K for £30 total. Nice little upgrade. Maybe consider sourcing a 2600K/2700K; it's a small improvement, but in some games it's night and day.
 
Interesting thread for me; I also have a 2500K at 4.4GHz and a 280X (which is mostly fine at 1080p).

I feel a GPU upgrade will be more worthwhile than a CPU upgrade at the moment.
 
Still gaming on an i7 930, 5 years old this year. The mobo failed and got swapped out; the PSU failed and was swapped out. Upgraded to SSDs, upgraded from a GTX 470 to a 780, and upgraded the case from an Antec P182 to a Corsair 750D.

Had 2 monitors during that time, 3 mice, 3 mouse pads and 2 keyboards.

All that during the life of my i7 930, which is still going; maybe it'll be retired at Skylake.
 
Spoffle is actually pretty spot on. Whilst the terminology "console port" has become quite widespread and well used, it isn't accurate.

Games are put together as code. Hell, any computer software is put together as code. In my case it's modules, which are then all piped together before compiling.

So, a coder will have a massive set of code and then he compiles it and debugs it.

I have pretty much settled on calling it "console slop". Mostly because the games are primarily designed for the consoles, because that's where the money is. You can sell, for example, 40 million copies of a console game and only around 8 million copies of a PC game. So obviously the code is primarily written to be compiled for the architecture of the console.

At which point that code is then changed and optimised to run on a PC, or rather, not.

But they're not ports. They are written in code and then compiled using, believe it or not, a compiler. That code is then re-used in order to compile it for the PC.

The issue now, as it has always been, is getting that code right. With a console you have one set of hardware. So there's the standard GPU, whatever that may be, then the standard CPU, and so on.

With a PC? There are millions of different combinations of hardware, drivers, OSes and so on. That's why they usually run like crap. Well, unless you spend a good chunk of time making sure that code runs well anywhere (kinda like Valve do...), but time = money. Most devs want the payday but don't really want to put in any more work than they have to, hence you get buggy PC games.
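
A crude sketch of what "the same source, compiled per target" looks like in practice. GAME_CONSOLE_SDK is a macro I've made up to stand in for a real console SDK define; none of this comes from an actual engine:

```cpp
// One shared codebase; the compiler picks the platform path at build time.
// GAME_CONSOLE_SDK is hypothetical, standing in for a real console SDK macro.
#include <cstdio>

void createWindowAndDevice() {
#if defined(GAME_CONSOLE_SDK)
    // Console build: one fixed GPU/CPU, so this path can assume exact hardware.
    std::puts("console: fixed hardware, no device enumeration needed");
#elif defined(_WIN32)
    // PC build: must cope with countless GPU/driver/OS combinations.
    std::puts("pc: enumerate adapters, query capabilities, pick a device");
#else
    std::puts("other platform");
#endif
}

int main() {
    createWindowAndDevice();   // same code, recompiled per target, not "converted"
}
```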

The same goes for multiple-GPU support. I have an article somewhere where they interviewed the guy who wrote BioShock, and he said that if you intended to run the game with three or more GPUs they would literally need to rewrite the code to handle it, and at the same time it would not run properly on a single GPU. This would mean basically having to release and sell separate versions of the game to run on different hardware.

That means printing different covers, burning different DVDs, etc. It's all waaaay too time-consuming and expensive to do, and exactly how many copies for 3/4-way GPU systems would you actually sell?

How many people have been daft enough* to actually put more than two GPUs in their PC?

* Don't take offence at what I say, please. At the end of the day we all have our passions and our hobbies, but seriously, can you really see triple- and quad-SLI systems becoming mainstream and affordable?

You can add 3D to that too. Many games are not coded to work in 3D; it's all a bodge by Nvidia. We've had very few actual, real 3D games, mostly because hardly anyone uses it...
 
Wouldn't they just sell the same game to everyone and give the option of which version to install?

Bit like Glide and D3D.
 
They could, but it all takes time, and time is money. Good coders demand a king's ransom.

It's just not worth it for five gamers out of, say, a hundred thousand.

I did some testing with quadfire a few years ago, and the only game that actually scaled and worked was BFBC2.
 
You guys with Sandy Bridge... I'm still rocking an i5 760! At 4GHz paired with a 290, it seems to do alright.

Was running an i5 750 at 4GHz, with an 8800 GTX, then a 5770, then a 6950, then a 7950, and now an R9 290.

A 4790K bundle I spotted was too good to turn down, so I thought why not!!

Might even go CrossFire and really go OTT :)
 
Interesting thread for me; I also have a 2500K at 4.4GHz and a 280X (which is mostly fine at 1080p).

I feel a GPU upgrade will be more worthwhile than a CPU upgrade at the moment.

Running basically the same. Intel kinda shot themselves in the foot with the 2500K for, what was it, £170? For single-screen gamers there's still basically no reason to upgrade :D

Loling at the console port bickering, btw. Taking code originally designed for, and engineered around, a specific computing environment and then changing it so it'll recompile and run in another is basically the definition of a software port.
 
I'm seeing companies recommend i7s as CPUs for their games now. The new MK did yesterday, and AC Unity does too.

I posted about this a while back. Does an overclocked 2500K give the same performance as a 3770K at stock in a game where the devs suggest a 3770 at stock as the recommended CPU?

It'd be nice if someone could shed some light on it. I'm already running a 980, so I know I'm not bottlenecked GPU-wise.

You could argue that it's a good upgrade if the software uses all eight threads, because you could turn off HT in the games that don't need it.
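
If anyone wants a rough feel for whether extra threads actually buy anything, here's a tiny synthetic test you can compile and run (the work() loop is just made-up number crunching, nothing like a real game workload):

```cpp
// Splits a fixed amount of busywork across 1/2/4/8 threads and times each
// run. A well-threaded workload should scale; a single-threaded one won't.
// Synthetic example only, not a game benchmark.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

void work(long iterations) {
    volatile long acc = 0;                        // volatile keeps the loop alive
    for (long i = 0; i < iterations; ++i) acc = acc + i;
}

long long runWithThreads(int threads, long total) {
    auto t0 = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (int t = 0; t < threads; ++t)
        pool.emplace_back(work, total / threads); // split the work evenly
    for (auto& th : pool) th.join();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
}

int main() {
    const long total = 400000000L;
    for (int n : {1, 2, 4, 8})
        std::printf("%d thread(s): %lld ms\n", n, runWithThreads(n, total));
}
```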
 
Remember, the game requirements generally assume that a gaming PC will also have a variety of other programs running in the background: Skype, anti-virus, firewall, monitoring programs, a TV show/movie on a 2nd monitor, other VoIP programs (Mumble, TeamSpeak etc.), lots of browser tabs, YouTube videos and so on.

All these additional programs allow the i7s to extend their lead over the i5s; it's just a pity that reviewers' benchmarks don't take these factors into account.

Who realistically plays games these days with every single non-essential program or service shut down? Not many.
 
Thanks for the (somewhat impassioned!) replies, fellas. I'm going to stick with the 2500K and see what Skylake K brings. I am certain that I don't want to lose clock speed even if I gain cores, as some recent titles still seem to run everything on one core, so raw GHz is still a factor.
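
For anyone curious, that "raw GHz still matters" intuition can be put in rough numbers with Amdahl's law. This is a simplified model with an assumed parallel fraction, not a measurement of any particular game:

```latex
% Amdahl's law: overall speedup from n cores when a fraction p of the
% frame's CPU work can run in parallel.
S(n) = \frac{1}{(1 - p) + p/n}
% Assuming a mostly single-threaded game, say p = 0.2:
S(8) = \frac{1}{0.8 + 0.2/8} = \frac{1}{0.825} \approx 1.21
```

So under that assumption, going from 1 core to 8 buys roughly a 1.21x speedup, about the same as a 20% clock bump that speeds up everything, which is why clocks can still beat cores in poorly threaded games.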
 