CPU upgrade performing worse in game benchmark

tl;dr / BOMBSHELL UPDATE:

I upgraded from 2500K to 2600K, everything else the same, 2600K benchmark (AC Origins) significantly less than expected and notably lower than 2500K!

Changed back to 2500K and tested, 2500K benchmark's now far worse than it was before taking it out for the 2600K!

During / after putting in the 2600K, something has decimated CPU performance.

Update of update: With a different crappy board, less RAM and slightly lower speed, the 2600K is performing way better and as expected... in Origins... CPU-Z and Cinebench r15 and 20 are similar... whatever...

==============================


Playing the CPU-intensive AC Origins on a 2500K @ 4.6GHz and the benchmark score was 6634 with an average FPS of 57.

Now I have a 2600K, but the results are worse. For the same overclock as the 2500K, the FPS is 51 or 52.

Is it just the game being crap or is this a sign something's wrong? To top it off, there's a 2018 video of someone with the same CPU, GPU and graphics settings getting 69 FPS / an 8063 score.

BIOS is up to date and already cleared CMOS.

Update #1: Turning off hyperthreading on the 2600K improved it to 62 FPS and a 7361 score.

Update #2: It's not Spectre / Meltdown (benchmarks identical).
 
You didn't do a memory upgrade at the same time, right?

What about other benches, are they too low as well?

Nope.

What can I compare it to? Origins is the only thing I have for comparison directly with my 2500k and that YT video.

80 degrees is insane for just gaming, something is wrong, get those temps under control

AFAIK the CPU doesn't throttle at 80°C, so that's not the problem.
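One rough way to sanity-check the throttling question, rather than assume it, is to log the reported clock and overall load while the benchmark runs and see whether the clock ever sags. A minimal sketch, assuming Python with the third-party psutil package is available; the 4600MHz target is just a stand-in for the overclock, and psutil's frequency reading on Windows can be less responsive than tools like HWiNFO:

```python
# Rough throttle check: print reported CPU clock and load once a second
# while the benchmark runs (Ctrl+C to stop). Needs: pip install psutil
import psutil

TARGET_MHZ = 4600  # stand-in for whatever clock the overclock should hold

while True:
    load = psutil.cpu_percent(interval=1)  # overall load over the last second
    freq = psutil.cpu_freq()               # reported current/min/max clock, or None
    if freq is None:
        print(f"clock unavailable  {load:5.1f}% load")
        continue
    note = "  <-- below target" if freq.current < TARGET_MHZ * 0.95 else ""
    print(f"{freq.current:7.0f} MHz  {load:5.1f}% load{note}")
```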
 
3DMark - There was a 2600k review that showed Time Spy scores. The review's CPU score at stock was 3242, mine's 2893.

Shadow of the Tomb Raider
- The vid on YouTube is probably using a 144Hz monitor, I'm just using 60, so naturally I'm gonna get a lower score.

But if someone on a 144Hz monitor always stays above 60 FPS, shouldn't I always be at 60 FPS with no dips? Mine stayed at around 62 FPS, but in the couple of places it dipped it was 10 FPS lower than his, while his never dropped below 60.

cpubenchmark.net and the benchmark on CPU-Z are a couple I can think of

On stock without turbo boost I got 324.1 for single thread; the reference for the 2600K is 345. For multi-thread I got 1607.8 against a reference of 1686. With turbo to 3.8GHz it beats the reference decently.

That's not really an upgrade

Erm, the 2500K is 4c/4t, the 2600K is 4c/8t, it's an upgrade. But what does that have to do with the problem?
 
One solution online is to clean reinstall Windows... I really wanna avoid that though, doesn't seem worth it at all. That sounds insane, is that really a viable possibility?

Sounds like an unstable overclock - just because your 2500K would do 4.6GHz, doesn't mean your 2600K will (it could just be a bad chip).
Start at stock speeds and go from there - in truly multithreaded/core-limited programs you should still see an improvement.



You can't compare to someone else's system - they may have a different motherboard, different RAM, different BIOS and Driver versions, a different Antivirus running in the background - there are too many variables.



At the same clock speed a 2600K with hyperthreading disabled should perform almost identically to your 2500K (the only difference being a 2600k has 8MB cache vs 6MB). Something is wrong or you aren't comparing apples to apples if you are seeing that much improvement.



Depending on how you are measuring that temperature, it could be too high and throttling anyway.
Above 1.4v seems excessive anyway.



As above - stop comparing to youtube. Compare your 2500k and 2600k yourself in all of the benchmarks mentioned so far.
If you want to compare with others, use 3DMark's online leaderboard - there are thousands of 2500k and 2600k results rather than focusing on a single youtuber.



That it's not really much of an upgrade.

- How should I make sure I'm comparing apples to apples? Should I set them both to 3.4GHz (2600k's base speed, as 2500k is 3.3), turbo boost off? What voltage should be good? Will too much make it unstable?

- I know you can get differences, but regardless of mobo, and with them having less RAM and older drivers, if they're getting FPS in the 70s and 80s as it runs, shouldn't I expect 60 FPS most of the time with only the odd dip at worst?

- Ok, I'll do the tests with hyperthreading off and on. The improvement I mentioned was on the 2600K, comparing HT off to HT on. (There's a quick way to confirm the HT state from inside Windows sketched just after this list.)

- Can get 4.5GHz on the 2500k below 1.4v. I wanted to make sure the 2600k had enough juice to briefly see how it compared to the 2500k (e.g. if 2500k @ 4.8GHz @ 1.4v wouldn't even reach desktop but the 2600K does fine).

- I don't think I can search by speed / overclock on the 3DMark leaderboard, so it isn't too helpful. I can see the top 1000 results for a CPU / GPU combo and roughly where my CPU score sits, but without clicking through hundreds of entries all I can conclude is that some people have 2600Ks with a better CPU score than mine, and I'm sure there are others with worse.

- I'm not sure what he hopes to gain by dropping the bombshell that a 2500K to a 2600K is "not really an upgrade"... Actually I am, but compensating aside it's off-topic. Suffice to say your reply is helpful and on-topic, his 5-word 'contribution' is not.
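As an aside on the HT on/off testing above, one quick way to confirm the BIOS toggle actually took effect is to compare the physical core count against the hardware threads the OS reports. A minimal sketch, again assuming Python with the third-party psutil package:

```python
# Confirm whether hyperthreading is currently exposed to the OS.
import psutil

cores = psutil.cpu_count(logical=False)   # physical cores
threads = psutil.cpu_count(logical=True)  # hardware threads the OS sees

print(f"{cores} cores / {threads} threads")
print("Hyperthreading looks", "ON" if cores and threads and threads > cores else "OFF")
```

On a 2600K that should read 4 cores / 8 threads with HT on and 4 / 4 with it off.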
 
Looking at hwbot.org and using Cinebench R20, I got 1668 score @ 4.5GHz and on the site someone got 1676 marks, and I found another 2600K 4.5GHz result on Google that was 1688.

I'm noting the 3DMark Time Spy results on the site now and will try it at those speeds. Apparently Cinebench R15 is better for this CPU too so will try that and compare.

I was just about to post that you should disable HT, then noticed your update. I have long maintained that unless a particular program or game is maxing out all the cores, you will generally see the same or worse results with HT on, all other things being equal. (I know of one game that is the exception.)

So for you to see an improvement you will have to be playing a game like BFV or encoding/transcoding with x264, where the program will saturate your cores.

You are also making a rod for your own back by taking one video of somebody running a 2600K and getting a particular score and expecting to achieve the same.

There are so many variables to consider as to why their score is higher than what you are getting.
  • Do you know how many processes they had running in the background when doing the test?
  • Do you know what speed memory they were using and what the timings were? etc.
Unless your system matches theirs in every way, trying to achieve what they've got is pretty futile, especially when that one video is all you have for comparison.

At the same clock speed then a 2600K is only an upgrade on the 2500K if the program/game you are using was continually maxing out all the cores on the 2500K.

For what it's worth, the 2500K was at 100% for games like Origins and videos comparing 2500K v 2600K v 3770K on several games always seemed to show a bump between 4 threads and 8 threads (it's why I was keeping an eye out for such an upgrade). So it was surprising to see worse performance, at least in this one game.
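One way to check that with numbers rather than eyeballing Task Manager is to log per-thread load while the Origins benchmark runs and count how many threads sit near 100%. A rough sketch, assuming Python with the third-party psutil package; the 90% threshold and one-second interval are arbitrary:

```python
# Log per-thread load once a second during the benchmark (Ctrl+C to stop)
# and count how many threads are close to saturated.
import psutil

while True:
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    busy = sum(1 for p in per_core if p > 90)
    print(" ".join(f"{p:5.1f}" for p in per_core), f"| {busy}/{len(per_core)} near max")
```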
 
I have the 2500k back in... and this is utterly bizarre... tried the Origins benchmark three times...

2500K @ 4.5 or 4.6GHz before installing 2600k - Frames score: 6634, FPS: 57, CPU: 16ms, GPU: 17ms

Just now @ 4.5GHz after reinstalling 2500K - Frames score: 4984 / 5253, FPS: 43 / 44, CPU: 23ms / 22ms, GPU: 20ms / 19ms

WTF? I undid the 2004 Windows 10 update but that wasn't it. Maybe the mobo's gone funny... why does this have to be complicated?...
 
Luckily I bought a CPU + mobo combo... JFC, I'm going to have to change the entire motherboard, aren't I? All this from just popping out a CPU and putting a new one in.

Someone elsewhere is going on about 'spectre' and 'meltdown', he's saying the patch is now applied to both CPUs...?

There's something funny with that bench, most of your other results make sense, so I'd just ignore it myself.

How can I ignore something going wrong? Somehow after removing my 2500K the CPU area is underperforming. I get not comparing it to some YT video, but the 2600K AND the 2500K are now performing far worse than the 2500K originally!

After you cleared the CMOS did you set your RAM to the same speed and timings as before?

I've never touched the RAM settings so it'll be the same.
 
For the 2600K most of the results you shared were fine, weren't they? Cinebench, Passmark, CPU-Z?

I don't know what constitutes fine, or what kind of range, but trying to wade through all the results to compare, I suppose they're at the low end.

But we know for sure that the 2600K is performing worse than the 2500K was, and now that 2500K is performing far worse than it was originally.

I'm looking into whatever this meltdown and spectre patch is about that has possibly applied during the upgrade.
 
You don't know that in Cinebench, Passmark or CPU-Z, only in one bench that keeps throwing out random results. Am I rite?

I don't know what constitutes random. When comparing the results on hwbot it seems it's perhaps at the low end.

Just tried the inSpectre thing, disabled both, restarted, and benchmark was identical to before disabling. So now it's a case of deciding whether I'm clean reinstalling windows or reinstalling the whole motherboard... or cry myself to sleep...
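For what it's worth, my understanding is that inSpectre works by setting the mitigation override values Microsoft documents under the Memory Management key, so one way to confirm the toggle actually stuck across the reboot is to read those values back (per Microsoft's guidance, 3 in both disables the Spectre v2 and Meltdown mitigations). A minimal Windows-only sketch using the standard-library winreg module:

```python
# Read back the Spectre/Meltdown mitigation override values that tools like
# inSpectre flip; run before and after toggling to confirm the change stuck.
import winreg

KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    for name in ("FeatureSettingsOverride", "FeatureSettingsOverrideMask"):
        try:
            value, _ = winreg.QueryValueEx(key, name)
            print(f"{name} = {value}")
        except FileNotFoundError:
            print(f"{name} not set (OS defaults in effect)")
```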
 
I'd just pop the 2600K back in, turn on HT and play some games. If Windows is whacking it with the nerf bat, you can fix that later. It IS a faster CPU, there's no doubt about that, funny business can't undo this fact.

Except it has. Something has decimated CPU performance across the board since that 2600K went in. Whatever I put in now is going to be running at 75%.

What are you expecting to happen between now and later that will enable me to fix it?
 
It boils down to me not trusting a benchmark that pops out results as consistent as my throwing a dart at a dartboard. If your gaming experience is decent then for one thing you'll have some fresh energy to tackle this, but for another it'll confirm there's not something majorly broken, especially if you do find that overall smoothness/responsiveness has improved in the scenarios you upgraded the CPU for.

But its benchmarks have been consistent.

2500K @ 4.5 / 4.6GHz before 2600K installation: 57 FPS
2600K @ 4.5: 51
2500K now @ 4.5: 44
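As a rough back-of-envelope (nothing more), those runs work out to roughly an 11% and a 23% deficit against the original 2500K result, which is well beyond the few-FPS wobble you'd expect between runs:

```python
# Back-of-envelope: how far each result sits below the original 2500K baseline.
baseline = 57  # 2500K @ 4.5-4.6GHz before the swap

for label, fps in (("2600K @ 4.5GHz", 51), ("2500K back in @ 4.5GHz", 44)):
    drop = (baseline - fps) / baseline * 100
    print(f"{label}: {fps} FPS, {drop:.0f}% down on {baseline} FPS")
```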
 
That's exactly what I mean: it's one benchmark and those results are illogical, whereas from what I can gather the other benchmarks you used suggest it does perform in the same ballpark as a 2600K should. You're also basing the performance off one prior run, and we're not even 100% sure what the clock speed was, so how do we know the other settings were consistent, or even that the benchmark was the same version? If you trust it enough to give yourself a headache, that's up to you; you know a lot more about the circumstances than I do, and I've never even run this benchmark, but at a minimum I'd expect my scores to stay the same between runs.

Why have you made those assumptions? One prior run? Not sure what the clock speed was? (Are you expecting 100MHz to make a 10 FPS difference?)

I'm betting it's unlikely that Ubisoft decided yesterday to stealthily go back to an old game and change its benchmark.

Sure the scores fluctuate a bit, but this isn't worrying about a difference of a few FPS. It's gone from consistently getting, say, 57 to now consistently getting 48 with the exact same CPU, with a bigger drop in FPS at certain points than before, every single time.
 
Call it a guess :D 100MHz, no, but there could be something you forgot, especially if the overclock has been in place a long time. Difficult to say what that could be; maybe the overclock not hitting all the cores, or something random like the GPU overclock turning itself off cos it thought the computer crashed when you restarted with a new CPU.

Pretty unlikely, but it's possible. It could also be that e.g. the quality settings were different, since I don't know what is configurable in this benchmark, or whether changing your configuration triggered an automatic re-detection of settings, especially as the benchmark is embedded in the game.

True, I just don't like benchmarks that fluctuate like this, I don't care if it is 5% or 25%.

Like I said though, you have a better knowledge of the circumstances than me, if you trust it enough to continue delving, who am I? Just some random geezer who doesn't like game benches.

The only thing in the bios I ever changed was the multiplier and voltage, and the fan control profile. The overclock had only been in place a few weeks and I never overclocked anything else. Everything's exactly the same, except for that Windows 10 2004 update, but reverting that did nothing.
 
Yup, it was the board!

So lucky I bought a CPU / mobo combo rather than just the CPU. At a slightly lower speed, bent pin, wrecked GPU socket, reused thermal paste, less RAM and all, the 2600K is now performing as expected in Origins!

The CPU-Z, Cinebench R20 multi-thread and R15 benches are pretty much the same as the other mobo (guessing <100MHz and a little extra RAM won't make a big difference). So... whatever... not gonna try the 2500k nor my original board again to find out...
 