Upgrade from 2 year old 2500k, WTF? Has PC performance Plateaued?!

Associate | Joined: 27 Jan 2013 | Posts: 4
So every 2 years about this time I upgrade to a new PC. It's my treat to myself, and I actually enjoy all the research of reading online tech sites deciding on the new hardware. PC hardware usually moves at a fast rate, and after 2 years my shiny new PC is significantly faster and better than the one it replaces.

However, here's the problem. 2 years ago I bought an i5 2500K (overclocked to 4.3GHz) with 8GB of DDR3 RAM. My limited research this weekend has suggested that PC performance has largely plateaued. It seems there are only marginal gains to be made in upgrading the CPU or changing the RAM (8GB of DDR3 still seems plenty today!). The only core PC part that has significantly evolved is the video card.

In the last 10 years of PC upgrades this has never happened before. I'm thinking I should just swap in a modern day video card, save a ton of money and skip an upgrade cycle. Come back in the year 2015 and see if there's anything out there worth upgrading to from a 2500k!

It's almost a disappointment to be honest!
 
Well, tbh the same is the case for the first-gen i5s and i7s.

I've not upgraded my X58-based system other than the GPU and SSD since 2008. It's a blessing tbh; all you need to upgrade is the GPU, then wait until games start demanding more grunt than your 2500K can provide. Which might very well be 2015 :D
 
Yup. Don't see anything troubling a 2500K, especially overclocked.

I went for a 2700K because the deal was too good (£209).
 
I would wait to see what the next round of consoles bring, but for the time being I can't see a 2500K needing to be replaced for a year or so.
 
Yup, same here with a 2600K.

It's getting on a bit now and it's still up there with the best that Intel is offering (talking maybe single-figure percentage points once it's wound up with a decent OC).

The fact that the 2x00Ks tend to clock faster and run cooler than the newest Ivy Bridge chips actually makes them in some ways superior to the current tech. Plenty of people are getting their Sandy Bridge CPUs to 4.5/4.6GHz, and some well beyond this.
 
Well, you could always grab a 3970x and get into video editing/3D rendering or good ol' epeen benchmarking if you really want to splash some cash and justify it :p
 
Lol, well demand-wise there's not much room to go anywhere, so Intel are now focusing on making smaller chips for tablets etc. :p
 
I would suggest hold out on the video card too. What are you running at the minute? You may as well wait for the next line of cards from Nvidia & AMD.
 
I think it's amazing that for the first time in my adult life a computer I bought a couple of years ago is not obsolete. Not only that, but spending a similar amount of money today would buy a largely similar-performing PC to the one it did 2 years ago, the obvious exceptions being video cards and the rise of SSDs. For those of us who have been building PCs for decades it's unprecedented that a CPU we bought 2 years ago is essentially as good as anything we could buy for similar money today.
 
I'm actually running a GTX 460, the original PC had a GTX 260. CPU performance may have slowed right down, but GPUs are still evolving quickly.

These days it looks like you can go through several generations of video cards before you need to upgrade the rest of the system. Even my old PCIe 2.0 x16 motherboard (which was old tech 2 years ago) seems fast enough not to limit current-gen video cards, and we know even my 2-year-old 2500K overclocked is not bottlenecking games.
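For anyone wanting to sanity-check that PCIe 2.0 x16 point, the back-of-envelope maths from the published spec figures works out like this (spec values, not a measurement of any particular board):

```python
# Per-direction bandwidth of a PCIe 2.0 x16 slot, from spec figures.
# PCIe 2.0 signals at 5 GT/s per lane with 8b/10b line encoding, so
# only 8 of every 10 bits on the wire are data.
transfers_per_s = 5_000_000_000          # 5 GT/s per lane
data_bits_per_s = transfers_per_s * 8 // 10  # strip 8b/10b overhead
lanes = 16

bytes_per_s = data_bits_per_s * lanes // 8   # bits -> bytes
print(bytes_per_s)  # 8000000000, i.e. 8 GB/s each direction
```

8 GB/s each way is comfortably more than cards of this era actually move over the bus, which is why an x16 slot on an older board rarely shows up as the bottleneck.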
 
That's what I would recommend doing. The problem is (hides beneath asbestos sheet) that AMD's ability to rival Intel is pretty much at an all-time low, and as such Intel has slowed down because it doesn't need to race. It's a shame tbh, speaking as a person who saw AMD's >100MHz 486s, the K6-era Pentium killers, the race to 1GHz, the Athlon XP, the Athlon 64, etc.

When IBM/Cyrix/IDT etc. pulled out and the CPU race became a two-horse affair, there was always a danger one horse would get so far behind that the one in front could just slack off, and sadly we're now there :(
 
Around the start of Windows 7, with the advent of the Core 2 Duos/Quads, 4GB of RAM and 100MB/s+ hard drives, we hit the point where hardware finally caught up with software in terms of what most people want from a computer: responsive enough for normal use, does most things fairly quickly, and can have multiple windows open.

With the last couple of generations, things have been refined and improved - but I'd agree, there's not as much of a driving force any more. Graphics cards will continue to get more powerful, but the effort just isn't being put in to improving games by THAT much.

There will always be advantages to be gained for encoding/rendering, or for gaming at bigger resolutions, but I have the feeling that until we get another Crysis game (ie, "Oh my god, my PC can't even run this at a playable frame rate, never mind on high at 60fps") there isn't going to be anything to push development forward. Gains will come in efficiency, and things will keep getting faster - but how much more do you really need your CPU to do?
 
Pretty much what Audigex said: software has improved over the years and so has hardware. Sure, we can carry on increasing the power of CPUs, but nothing is going to be that demanding in general use, so instead we're seeing the power delivered in smaller packages to increase efficiency, but also to make some powerful little devices.

That said, it's not like we're seeing no productivity increases with newer CPU ranges, just less of an impact that you'd notice, as we now have multiple cores, whereas in the past you'd have one core doing all the work (which creates a queue of work) rather than distributing it as happens now.

Reminds me of when, to do virus scans, play games or perhaps use Photoshop, I'd have to close down other applications so things didn't slow down. Not anymore :)
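That "queue of work" point is easy to picture in code. A minimal sketch, with a made-up prime-counting workload standing in for a virus scan or render: on one core the chunks wait in line behind each other, while a process pool hands them out to every core at once.

```python
# Sketch: one core queueing work vs. several cores sharing it.
# Hypothetical workload: counting primes in ranges of integers.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division."""
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        i = 2
        while i * i <= n:
            if n % i == 0:
                return False
            i += 1
        return True
    return sum(1 for n in range(lo, hi) if is_prime(n))

def serial(chunks):
    # One core: each chunk waits in line behind the previous one.
    return sum(count_primes(c) for c in chunks)

def parallel(chunks):
    # Multiple cores: chunks are distributed and run side by side.
    with ProcessPoolExecutor() as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    chunks = [(i, i + 25_000) for i in range(2, 100_002, 25_000)]
    # Same answer either way; on a multi-core CPU the parallel
    # version just finishes sooner.
    print(serial(chunks), parallel(chunks))
```

On a quad-core like the 2500K you'd expect the pooled version to finish roughly 4x faster on a workload like this; the point is simply that the work no longer queues behind a single core.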
 
I know people who still game with the i7 920 with ease; coupled with a wad of RAM and a decent GPU like the 7850, 7870 or GTX 670, it will be more than capable for a good year, if not more.
 