
HardOCP compare a 2600K with a 7700K

Soldato · Joined: 1 Apr 2014 · Posts: 19,423 · Location: Aberdeen
Article here. The results aren't pretty for Intel. TLDR: there's little point in upgrading the CPU if you can overclock your old CPU.
 
So if you've got a stable overclock on a K-series chip from Sandy Bridge onwards, it's not worth the upgrade.

Interesting finish to the article:

AMD, it is your turn. We think we know where you are in terms of IPC; you had best get your pricing structure in line. Don't get greedy, deliver a solid non-beta platform, expand on core-width and chipset functionality going forward, and you are going to win a lot of us enthusiasts back. You get me close to parity with my Haswell, and I am building a new Ryzen system just on enthusiast principle alone.
 
The low settings are there to remove the GPU as a bottleneck. At higher settings the two CPUs would be much closer in performance.
Exactly, a hypothetical and entirely unhelpful comparison. The usual productivity benchmarks already show raw performance differences anyway.
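
For anyone new to the argument, here's the rough logic behind testing at daft resolutions, as a toy model with numbers I've made up myself (they're not from the article): a frame can't be delivered until both the CPU and GPU have done their share, so the frame rate is roughly set by whichever is slower.

```python
# Toy model only - the frame times below are invented to show the shape of the
# effect, not measured on either CPU.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate FPS when the slower of the CPU and GPU work limits each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_2600k, cpu_7700k = 8.0, 5.5      # hypothetical ms of CPU work per frame
gpu_lowres, gpu_1440p = 2.0, 12.0    # hypothetical ms of GPU work per frame

# At 640x480 the GPU term is tiny, so the CPU gap shows; at 1440p the GPU dominates
# and both chips land on the same number.
print(f"640x480: {fps(cpu_2600k, gpu_lowres):.0f} vs {fps(cpu_7700k, gpu_lowres):.0f} fps")
print(f"1440p:   {fps(cpu_2600k, gpu_1440p):.0f} vs {fps(cpu_7700k, gpu_1440p):.0f} fps")
```

Whether that tells you anything useful about how the games actually play at 1440p is, of course, the whole argument.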
 
There are ways to show CPU limitation in games without resorting to legacy resolutions like that. This is sadly the state of journalism in this industry...
 
Unless you're playing Bioshock Infinite at 640x480 with Very Low settings, apparently :rolleyes:. Does dg run this website? ;)

Aha, people either just get it or they don't.

If you think a 2600K is close to a 7700K/6700K you're living in a dream :p

What pains me is the benchmarks and games that get chosen, and how they run them.
 
The differences are much greater than what is shown, especially regarding minimum fps and frame time consistency (which are far more important attributes than whatever HardOCP are showing).

Going from the stock i5-3570K clock speed to a 4.5GHz OC makes games far more fluid for me, especially in more recent titles such as "The Division", and not the relics they were benchmarking.
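
For what it's worth, by minimums and frame time consistency I mean the sort of numbers you'd pull out of a per-frame capture (FRAPS, PresentMon or similar). A minimal sketch with invented frame times, just to show the metrics rather than any real benchmark data:

```python
import statistics

# Invented per-frame times in milliseconds - a couple of spikes thrown in on purpose.
frame_times_ms = [16.7, 16.9, 17.1, 16.8, 33.5, 16.6, 17.0, 45.2, 16.7, 16.9]

avg_fps = 1000.0 / statistics.mean(frame_times_ms)

# "1% low": average FPS over the slowest 1% of frames (just the single worst frame
# here because the sample is tiny) - this is what a struggling CPU drags down.
worst_first = sorted(frame_times_ms, reverse=True)
slowest = worst_first[:max(1, len(worst_first) // 100)]
low_1pct_fps = 1000.0 / statistics.mean(slowest)

# Frame time consistency: how much the frame-to-frame pacing wobbles.
jitter_ms = statistics.stdev(frame_times_ms)

print(f"avg {avg_fps:.1f} fps, 1% low {low_1pct_fps:.1f} fps, jitter {jitter_ms:.1f} ms")
```

The average can look respectable while the 1% low tells the real story, which is exactly the complaint being made here.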
 
Exactly, a hypothetical and entirely unhelpful comparison. The usual productivity benchmarks already show raw performance differences anyway.

They're trying to create a best-case scenario for Kaby Lake and it's still showing uninspiring gains.

It would be nice to see some more modern games, at normal resolutions and with minimum FPS figures, to get a real-world feel for the difference.
 
Going from the stock i5-3570K clock speed to a 4.5GHz OC makes games far more fluid for me, especially in more recent titles such as "The Division", and not the relics they were benchmarking.

If you had read the article you would have spotted that they are overclocking the 2600K.
 
Pointless waste of time to test multiplatform shooters at unrealistic settings.
PC-exclusive RTS games such as StarCraft or Age of Empires 3 would have been far more interesting for CPU testing.

I doubt even 5GHz Skylake or its identical rebrand "Cabbage Lake" is fast enough to prevent 0 FPS freezing in large online battles.
 
Christ on a bike, I do wish that site would change its colour scheme; high-contrast white text on a black background sends my eyes funny.
 
If you had read the article you would have spotted that they are overclocking the 2600K.

I was using my OC as an example of the improvement in framerate consistency. I'm sure games would feel smoother again if I could get it to 5GHz, and even at that frequency it would be slower than an equivalent Kaby Lake at 4.5GHz.

It's that side of things the article doesn't even attempt to cover.
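
The back-of-the-envelope version of the 5GHz-vs-Kaby-Lake point, with the IPC figure treated as a pure placeholder rather than a measurement:

```python
# Illustration only: in a purely CPU-bound game the best a straight overclock can
# do is scale frame rate with clock; IPC and memory don't change.
ivy_current_oc, ivy_max_oc = 4.5, 5.0        # GHz on the i5-3570K
clock_gain = ivy_max_oc / ivy_current_oc - 1
print(f"4.5 -> 5.0GHz buys at most ~{clock_gain * 100:.0f}% in CPU-bound scenes")

# Kaby Lake at 4.5GHz still comes out ahead if its per-clock advantage over Ivy
# Bridge is bigger than that ~11%. The uplift below is a placeholder assumption -
# plug in whatever figure you trust.
assumed_ipc_uplift = 1.20
kbl_lead = assumed_ipc_uplift * ivy_current_oc / ivy_max_oc - 1
print(f"KBL @ 4.5GHz vs IVB @ 5.0GHz: ~{kbl_lead * 100:.0f}% ahead on the assumed IPC")
```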
 
I was really interested to read that, given that it's more or less the conundrum that I'm running through in my head.

But 640x480? Seriously?

I know that using high resolutions isn't as CPU-focused, but let's face it, it's what we're using. I'm playing at 2560x1440 with a modern graphics card. It would have been nice to know if the CPU would make any significant difference.

Hopefully that will be addressed since they're hanging onto it for a bit longer.
 
I was really interested to read that, given that it's more or less the conundrum that I'm running through in my head.

But 640x480? Seriously?

I know that using high resolutions isn't as CPU-focused, but let's face it, it's what we're using. I'm playing at 2560x1440 with a modern graphics card. It would have been nice to know if the CPU would make any significant difference.

Hopefully that will be addressed since they're hanging onto it for a bit longer.


Kyle and the HardOCP boys like to do their own thing; they're nice guys but a bit hot-headed. Not too long ago they were setting off a storm about how Broadwell-E degrades at an accelerated rate compared with Haswell-E, which is not true. They didn't have the evidence to back claims like that up: no scope captures, or any real facts. Controversy is their thing.

Anyone can be a hardware journo these days. It doesn't take a genius to work this one out. Showing maximum results at 640x480 is about as pointless as it gets for people wanting real answers.
 
If you're serious about rendering you should be getting one of the many-cored Xeon CPUs.

Depends on the editing software used. Adobe Premiere hardly uses more than 4 cores efficiently, so a highly clocked 6700K or 7700K will easily beat a 22-core Xeon E5-2699 v4. Linus did a great video showing his mistake in using Xeons in their workflow; not that the Xeons were terrible, but a machine costing a fraction of the price could easily beat a top-of-the-range Xeon setup.
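
The arithmetic behind that is basically Amdahl's law. A rough sketch with illustrative clocks and scaling fractions of my own (not measurements of Premiere or of either chip, and ignoring per-core IPC differences):

```python
# If the application only scales to ~4 threads, both chips get the same parallel
# speedup and the higher-clocked quad wins; a renderer that genuinely scales across
# every core flips the result. All numbers are illustrative assumptions.
def relative_throughput(clock_ghz: float, cores: int, parallel_frac: float, app_threads: int) -> float:
    n = min(cores, app_threads)
    amdahl_speedup = 1.0 / ((1.0 - parallel_frac) + parallel_frac / n)
    return clock_ghz * amdahl_speedup

quad_clock, xeon_clock, xeon_cores = 4.5, 2.8, 22   # assumed sustained clocks

# Editing workload that only uses ~4 threads well:
print("editor, 7700K-ish:     ", relative_throughput(quad_clock, 4, 0.75, 4))
print("editor, E5-2699 v4-ish:", relative_throughput(xeon_clock, xeon_cores, 0.75, 4))

# Near-perfectly parallel renderer using every thread it can get:
print("render, 7700K-ish:     ", relative_throughput(quad_clock, 4, 0.98, 64))
print("render, E5-2699 v4-ish:", relative_throughput(xeon_clock, xeon_cores, 0.98, 64))
```

Which is why the Xeon advice above and this post aren't really in conflict: it hinges entirely on how far the software actually scales.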
 