
From an i5 2500K to an i7 6700K: my impressions

Was the 2500K hitting 100% in games? If not, then I don't understand the performance boost seen or perceived. How could minimum FPS be higher unless the old CPU was maxing out? (I am asking here, not making a statement.)

Yesterday I made a thread here in the CPU sub-forum about this very subject. My i7 860 does not hit 100% in games, and because of that I am wondering what the point would be in buying a 6600K.
 
Was the 2500K hitting 100% in games? If not, then I don't understand the performance boost seen or perceived. How could minimum FPS be higher unless the old CPU was maxing out? (I am asking here, not making a statement.)

It's not that simple. CPUs are complex, and individual components within the CPU could be the bottleneck.

If one core / thread maxes out for a few seconds, it's a bottleneck. The memory controller is on the CPU, so if memory bandwidth maxes out it can be a CPU bottleneck without any of the cores being at 100%. Game code threading can be complex too: one thread can be waiting for the output of another, which might also depend on memory bandwidth or on something the GPU is doing. In such cases you could effectively be CPU bottlenecked but well under 100% utilisation. It's more like a complex multi-man juggling act than a simple relay-race handover.
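
To make that concrete, here is a minimal monitoring sketch, assuming Python with the third-party psutil package (my choice, nothing mentioned above). It logs per-core usage and flags samples where one core is pegged even though the average looks comfortable; the 90% / 70% thresholds are arbitrary illustrations.

```python
import psutil  # third-party: pip install psutil

def sample_cores(seconds=30):
    """Print average vs hottest-core CPU usage once per second."""
    for _ in range(seconds):
        # one reading per logical core over a 1-second window
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        average = sum(per_core) / len(per_core)
        hottest = max(per_core)
        note = "  <-- one core pegged: likely CPU-limited" if hottest >= 90 and average < 70 else ""
        print(f"avg {average:5.1f}%  hottest core {hottest:5.1f}%{note}")

if __name__ == "__main__":
    sample_cores()
```

Run something like that alongside a game and you may well see one core sitting near 100% while the overall utilisation figure most people watch stays around 50-60%.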
 
I went from an i5 3570K @ 4.2GHz to an i7 5820K. In games like BF the difference was night and day in big 64-player multiplayer matches, and even when recording it was better than the i5.

Some people just don't notice differences as much as others. :p
 
Argh, every time I read a thread like this I wonder whether I should make the jump.

Currently running an i7 3770 (non-K variant) with 16GB RAM and a GTX 980 Ti, and I do experience some stutter in some games, Fallout 4 and Rise of the Tomb Raider being a couple of recent examples, so I keep wondering whether an upgrade to a Skylake i7 K processor, or even X99, would see any benefit(?)

Kinda thread-hijacking here as I didn't want to create another thread about upgrading :D

I know what I am about to say is hard to believe, and it does make me think there must be another factor somewhere, although I can't think what, but I actually upgraded from an i5 6600K Skylake to an i7 6700K and my average FPS in Fallout 4 went up from 38 to 67!! That's with a Fury X at 3440x1440. At the same time as installing the chip I also fitted an H100i GT cooler and a Samsung PM951 256GB M.2 PCIe 3.0 x4 NVMe SSD, which is the slightly older one with 1,000MB/s read and 280MB/s write. I did move Fallout 4 onto the M.2 drive, but it was on a Samsung 850 Evo SSD before, so I don't think that accounts for the performance increase. I know neither the chip nor the drive upgrade on their own, or even both combined, should have given this kind of performance boost, but it did!! What else could it have been?

In contrast, the FPS in the ROTR benchmark increased by only half a frame. I did see a bit of a boost in a couple of other games: Shadow of Mordor gave me an extra 3 FPS and Grid Autosport an extra 6 FPS.
 
I went from an i5-750 processor to an i7-3770K and in the first few game benchmarks I ran I noticed no difference at all :D. I will be upgrading again soon, so it will be good to see if there is any difference this time around. Obviously there are other non-gaming benefits to upgrading...
 
I was getting slowdowns in some games with my i5 and a 980 Ti where GPU usage would drop, and now with the 6700K all those slowdowns are gone, so yeah, it is a noticeable upgrade.
Too bad I can't get my 6700K past 4.5GHz stable.
Stable 4.5GHz needs 1.4V; not sure whether to blame the chip or the motherboard.
 
So is it worth me going from my current 2700K @ 4.5GHz to a 6700K?

I've started to stream on Twitch a bit more, and I play DayZ and Arma 3 BR.

Sorry to slightly hijack the thread :)
 
So is it worth me going from my current 2700K @ 4.5GHz to a 6700K?

I've started to stream on Twitch a bit more, and I play DayZ and Arma 3 BR.

Sorry to slightly hijack the thread :)

Basically: NO. For gaming there is no point upgrading from a Sandy Bridge at 4.5GHz to anything out there. Anyone who suggests otherwise is lying.
 
Basically: NO. For gaming there is no point upgrading from a Sandy Bridge at 4.5GHz to anything out there. Anyone who suggests otherwise is lying.

Ok but what about video rendering and streaming? I also use a third monitor plus an HDTV hooked up to the iGPU on the motherboard.
 
It's not that simple. CPUs are complex, and individual components within the CPU could be the bottleneck.

If one core / thread maxes out for a few seconds, it's a bottleneck. The memory controller is on the CPU, so if memory bandwidth maxes out it can be a CPU bottleneck without any of the cores being at 100%. Game code threading can be complex too: one thread can be waiting for the output of another, which might also depend on memory bandwidth or on something the GPU is doing. In such cases you could effectively be CPU bottlenecked but well under 100% utilisation. It's more like a complex multi-man juggling act than a simple relay-race handover.

Thanks. So given all of that, is there any point in upgrading from a 4.0GHz i7 860 to a 6600K (which I will overclock too)?
 
Ok but what about video rendering and streaming? I also use a third monitor plus an HDTV hooked up to the iGPU on the motherboard.

lol, my rendering time is a third of what it was. Trust me, if you stream or render videos it will be vastly different.
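
For anyone curious why core count matters so much here, the sketch below is a toy stand-in for an encoder, not a real one: it just times the same CPU-bound workload with different numbers of worker processes. The chunk count and the fake per-chunk work are arbitrary, and real encoders won't scale perfectly, but the shape of the result is the point.

```python
import time
from multiprocessing import Pool

def fake_encode(chunk):
    # Stand-in for encoding one chunk of video: pure CPU-bound work.
    total = 0
    for i in range(2_000_000):
        total += (i * chunk) % 7
    return total

def run(workers, chunks=48):
    """Time the whole 'render' with a given number of worker processes."""
    start = time.perf_counter()
    with Pool(processes=workers) as pool:
        pool.map(fake_encode, range(chunks))
    return time.perf_counter() - start

if __name__ == "__main__":
    for workers in (4, 8, 12):  # roughly 3570K threads vs 6700K vs 5820K
        print(f"{workers:2d} workers: {run(workers):6.2f}s")
```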
 
Thanks. So given all of that, is there any point in upgrading from a 4.0GHz i7 860 to a 6600K (which I will overclock too)?

Well, there was quite a big jump in IPC with Sandy Bridge and further improvements since then.

Whether it's worth it all depends on what's actually bottlenecking you.
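
As a rough rule of thumb (my own back-of-the-envelope, not a benchmark): single-thread performance scales roughly with IPC times clock, so you can sketch the upgrade like this. The IPC uplift figure below is a hypothetical placeholder, not a measured number.

```python
def relative_single_thread(ipc_uplift, old_clock_ghz, new_clock_ghz):
    """Return new/old single-thread throughput, treating perf ~ IPC * clock."""
    return (1.0 + ipc_uplift) * new_clock_ghz / old_clock_ghz

# e.g. an i7 860 @ 4.0GHz to a 6600K @ 4.5GHz, assuming (hypothetically)
# a combined ~40% IPC uplift across the intervening generations:
print(f"~{relative_single_thread(0.40, 4.0, 4.5):.2f}x single-thread")  # ~1.58x
```

Whether that translates into frames depends entirely on whether the CPU was the limit in the first place.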
 
Well, there was quite a big jump in IPC with Sandy Bridge and further improvements since then.

Whether it's worth it all depends on what's actually bottlenecking you.

Remember the thread I made last week that you helpfully responded to? The highest CPU core usage in Fallout 4 was 87% and the GPU was at 100%. It's hard to say from those figures whether the CPU is a bottleneck.
 
I have had a similar experience upgrading from an i7 860 to an i5 6600K. I gained a frankly unbelievable performance increase in games using the same GPU. I made a thread about it, but very few people seemed to care. Maybe I wrote it badly and didn't get my point across well:
https://forums.overclockers.co.uk/showthread.php?t=18736047

It confirms that even if your CPU isn't at 100% utilisation, you can still be bottlenecked by an older one. I will never wait that long (six years) before upgrading again. A new CPU every three years for me now.
 