
Intel Core i7-11700K beats Ryzen 9 5950X by 8% in Geekbench 5 single-core benchmark

Soldato
Joined
15 Oct 2019
Posts
11,866
Location
UK
I think AMD would be more worried about the 11600K if it beats out the 5600X and comes in considerably cheaper; maybe they're holding back the 5600 non-X to compete, or will drop the 5600X by £100.

Also, this time B560 will have unlocked memory overclocking, which should give Intel a big advantage in the sub-£200 price range with chips like the 11400F.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
I think AMD would be more worried about the 11600K if it beats out the 5600X and comes in considerably cheaper; maybe they're holding back the 5600 non-X to compete, or will drop the 5600X by £100.

Also, this time B560 will have unlocked memory overclocking, which should give Intel a big advantage in the sub-£200 price range with chips like the 11400F.

How much improvement do we expect from 11600K over 10600K to even match the 5600X, when the latter is 50% faster in some apps than the former?

PassMark - AMD Ryzen 5 5600X - Price performance comparison (cpubenchmark.net)
PassMark - Intel Core i5-10600K @ 4.10GHz - Price performance comparison (cpubenchmark.net)
 
Associate
Joined
4 Feb 2009
Posts
1,372
I applaud Intel's attempt to be relevant.

But I'll be damned if my next CPU has less than 12 cores. Till they can compete there, they can go hang.
 
Associate
Joined
14 Nov 2005
Posts
1,557
I suspect a few of them have requested it gets deleted already; quite a sad state of affairs.

The quieter majority will just buy the fastest gaming CPU when it's available, which will be Rocket Lake ;) Can't play Cinebench as a game, and the vast majority have no use for multi-thread performance beyond 8 cores.
You say it WILL be Rocket Lake; have you anything to substantiate this, or are you just basing it on a leaked Geekbench result? Also, AMD may have to use TSMC, but at least they are not stuck in 2014 producing 14nm parts. As games move forward they will require more and more cores/threads, so even if this chip does end up winning some gaming benchmarks it will not provide longevity as new titles are released.
 
Associate
Joined
14 Nov 2005
Posts
1,557
Those 1080P games you mention, that love single-thread CPU performance, often only have 2-4 threads, i.e., they only use 2-4 cores of a CPU, regardless of how many cores the CPU has. The power consumption of a CPU is determined by how many of its cores/threads are at load. In this scenario, where an 8-core CPU is only using 2-4 cores, the CPU will be running at low levels of power consumption...

If we fire up Blender or Cinebench (applications that use all cores, threads etc.), then the CPU will be running at its maximum power target, consuming the most power.

I'm sure Rocket Lake will consume more power overall compared to Ryzen in games, though it will be running those games faster, which is the important point. As the Nvidia 3000 series has shown us, the masses don't care about power as long as it takes the performance crown.

Based on your sig I do not see a new Intel chip being any better for you than the 5000 series from AMD. Unless of course you run your LG and 3080 at 1080p, which would be totally stupid, but even then I doubt Intel will improve enough to show any real relevance. This thread is nothing more than an Intel fanboy thread trying to troll AMD users.
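The power-scaling point in the quoted post (few loaded cores means low draw, all-core loads mean maximum draw) can be sketched with a toy linear model. The base and per-core wattage figures below are invented for illustration only, not measurements of any real chip:

```python
# Toy model: package power grows with the number of fully loaded cores.
# BASE_W and PER_CORE_W are made-up illustrative numbers, not measurements.
BASE_W = 15       # assumed idle/uncore power in watts
PER_CORE_W = 18   # assumed extra watts per fully loaded core

def package_power(active_cores: int) -> int:
    """Estimate package power when `active_cores` cores are at full load."""
    return BASE_W + PER_CORE_W * active_cores

# A game loading 4 of 8 cores draws far less than an all-core Cinebench run:
print(package_power(4))  # game-like load on 4 cores -> 87
print(package_power(8))  # all-core render load -> 159
```

Real CPUs boost, throttle and share power budgets in far more complicated ways, but the basic shape (power tracks loaded cores, not installed cores) is the point being made above.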
 
Soldato
OP
Joined
31 Oct 2002
Posts
9,912
You say it WILL be Rocket Lake; have you anything to substantiate this, or are you just basing it on a leaked Geekbench result? Also, AMD may have to use TSMC, but at least they are not stuck in 2014 producing 14nm parts. As games move forward they will require more and more cores/threads, so even if this chip does end up winning some gaming benchmarks it will not provide longevity as new titles are released.

Nope, developers will develop for the common spec out there, which is usually determined by the current generation of consoles. The new consoles have 8 cores/16 threads, so it's entirely logical to expect all games to gradually require a similar CPU for the PC ports. This will take a few years though, so there's still some life left in 6-core/12-thread CPUs.
 
Soldato
OP
Joined
31 Oct 2002
Posts
9,912
Based on your sig I do not see a new Intel chip being any better for you than the 5000 series from AMD. Unless of course you run your LG and 3080 at 1080p, which would be totally stupid, but even then I doubt Intel will improve enough to show any real relevance. This thread is nothing more than an Intel fanboy thread trying to troll AMD users.

We're all free to spend our money as we please mate. I'll be buying the fastest CPU for gaming, which will be the 11700K/11900K. At 4K, I agree there'll be very little difference between a Ryzen 5000 and an 11-series CPU, though there will be a difference. For those 1% lows at 4K, an extra few FPS can make a difference.

I also think it's a good idea to get a CPU that supports AVX-512, as I'll likely keep this CPU for several years. I'm sure Intel will be pushing developers to implement AVX-512 code, as this will artificially cripple AMD. Nvidia do the same thing, with Gameworks, RTX, DLSS etc. Ugly, but usually the companies with the most money win.
 
Associate
Joined
14 Aug 2017
Posts
1,195
The new consoles have 8 cores, 16 threads, so it's entirely logical to expect all games to gradually require a similar CPU for the PC ports

Actually it's more logical IMHO that by the time developers have got to 8 cores/16 threads, they expand engines to be able to use an arbitrary number of cores. Reserve a couple for critical path stuff, farm out tasks to threads running on others as needed.
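The "reserve a couple of cores for critical path stuff, farm out tasks to the rest" idea above looks roughly like this as a minimal Python sketch, using only the standard library; `simulate_chunk` is an invented stand-in for a real slice of engine work:

```python
from concurrent.futures import ThreadPoolExecutor
import os

def simulate_chunk(chunk_id: int) -> int:
    """Stand-in for one slice of engine work (physics, AI, audio...)."""
    return sum(i * i for i in range(chunk_id * 100, (chunk_id + 1) * 100))

# Size the pool to whatever the machine offers, minus cores reserved
# for the engine's critical path (e.g. render and main game threads).
reserved = 2
workers = max(1, (os.cpu_count() or 4) - reserved)

with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(simulate_chunk, range(16)))

print(len(results))  # -> 16 completed work chunks
```

The key property is that the task list, not the code, scales with core count: the same 16 chunks spread over however many workers the pool was given, which is exactly how an engine avoids being hard-coded to 8 cores.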

I'll be buying the fastest CPU for gaming, which will be the 11700K/11900K

But it isn't at the moment, and we don't know for sure it will be on release. Right now it's AMD.

I also think it's a good idea to get a CPU that supports AVX-512

Games are unlikely to require it, given how patchy support is across the installed base of processors, and on many processors that do have it, it absolutely borks clock speed. It's more of an AI/HPC thing than a desktop thing.
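That patchiness is visible in the CPU flags the OS reports (on Linux, the `flags` line of `/proc/cpuinfo`). A minimal checker, run here against hard-coded example flag strings rather than a live machine, since the real list varies per chip:

```python
def has_flag(flags_line: str, flag: str) -> bool:
    """Check a space-separated CPU flags string for a specific feature."""
    return flag in flags_line.split()

# Illustrative flag strings, not read from real hardware:
avx512_chip = "fpu sse sse2 avx avx2 avx512f avx512vl"
avx2_chip = "fpu sse sse2 avx avx2"

print(has_flag(avx512_chip, "avx512f"))  # -> True
print(has_flag(avx2_chip, "avx512f"))    # -> False

# On Linux you could feed it the real thing:
# with open("/proc/cpuinfo") as f:
#     flags = next(line for line in f if line.startswith("flags"))
```

Any game shipping AVX-512 code would have to do a check like this at runtime and carry a fallback path for the (large) share of CPUs where it returns False, which is precisely why requiring it is unattractive to developers.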
 
Soldato
Joined
4 Feb 2006
Posts
3,223
You say it WILL be Rocket Lake; have you anything to substantiate this, or are you just basing it on a leaked Geekbench result? Also, AMD may have to use TSMC, but at least they are not stuck in 2014 producing 14nm parts. As games move forward they will require more and more cores/threads, so even if this chip does end up winning some gaming benchmarks it will not provide longevity as new titles are released.

The single-core benefit in games will only show up at low resolutions like 1080P or less. At 1440P and above the GPU will be the limiting factor unless you go for a top-end 3080 or 6800XT. I currently have a 3600 and, having compared with my brother's 5600X, I see no real reason to upgrade at all. The fps difference is negligible at 1440P 144Hz. I would get barely a couple of fps improvement for a rather expensive £300 upgrade. My next upgrade will be something that can give more than 20% at 1440P, and it looks like a GPU will be the better bet.
 
Soldato
Joined
18 Oct 2002
Posts
10,951
Location
Bristol
I applaud Intel's attempt to be relevant.

But I'll be damned if my next CPU has less than 12 cores. Till they can compete there, they can go hang.
And Intel's power consumption: 5GHz on 14nm is just hopeless. The best thing about my 5600X is its performance per watt.
 
Associate
Joined
14 Nov 2005
Posts
1,557
We're all free to spend our money as we please mate. I'll be buying the fastest CPU for gaming, which will be the 11700K/11900K. At 4K, I agree there'll be very little difference between a Ryzen 5000 and an 11-series CPU, though there will be a difference. For those 1% lows at 4K, an extra few FPS can make a difference.

I also think it's a good idea to get a CPU that supports AVX-512, as I'll likely keep this CPU for several years. I'm sure Intel will be pushing developers to implement AVX-512 code, as this will artificially cripple AMD. Nvidia do the same thing, with Gameworks, RTX, DLSS etc. Ugly, but usually the companies with the most money win.
You just contradicted yourself. In reply to my previous comment you said developers will develop for the common spec, which is console-based 8-core/16-thread. Yet above you say Intel will push them to use AVX-512 and Nvidia will do the same, even though both consoles use AMD hardware, so no AVX-512 and no DLSS.
 
Soldato
Joined
26 May 2014
Posts
2,959
The only significant increase seems to be in crypto performance, which is propping up an otherwise underwhelming score. What's even more odd is that it actually scores worse than my 10600K in the multi-threaded crypto test - 8269 vs 9017. Some heavy power or thermal throttling going on I guess, which is the norm for Intel CPUs these days.

I also think it's a good idea to get a CPU that supports AVX-512, as I'll likely keep this CPU for several years. I'm sure Intel will be pushing developers to implement AVX-512 code, as this will artificially cripple AMD.
AVX-512 has zero relevance for game development, and frankly almost zero outside of that as well. The tasks it aims at are far better done on a GPU, but Intel don't make (competent) GPUs so obviously are desperate to provide an alternative so people keep buying their chips. As for Intel forcing game developers to use it, that's nothing more than a fanboy fever dream, given their waning grasp on the enthusiast CPU market and the desire of developers to sell their games to as wide an audience as possible. The list of games which use even the original AVX instructions from 2011 is tiny, and shorter than the list of games that did use them but quickly patched them out in order to keep people with non-AVX CPUs as customers. Cyberpunk 2077 is the latest entry on that list, by the way, with AVX being removed in the first hotfix.
 
Soldato
OP
Joined
31 Oct 2002
Posts
9,912
You just contradicted yourself. In reply to my previous comment you said developers will develop for the common spec, which is console-based 8-core/16-thread. Yet above you say Intel will push them to use AVX-512 and Nvidia will do the same, even though both consoles use AMD hardware, so no AVX-512 and no DLSS.

I did no such thing, you just assumed I meant game developers. Games hardly use AVX at all; the only one I remember from age-old debates was Serious Sam 3. AVX is used more for productivity applications, examples of which include the following:
  1. Adobe Photoshop.
  2. Adobe After Effects.
  3. Adobe Premiere.
  4. Photoshop / After Effects / Premiere plug-ins.
  5. PeaZip / 7-Zip / WinRAR.
  6. x264 / FFmpeg (used by many desktop applications, for instance Firefox and VLC Player).
  7. Excel.
  8. MATLAB / Octave / Python (NumPy, SciPy, TensorFlow, etc.).
  9. Julia.
  10. Numeric libraries (used intensively in many applications).
I imagine that if Intel are successful in getting software developers (not game developers) to implement more AVX-512 instructions, it won't matter how many additional cores AMD have, as they'll simply not be able to use the more efficient, faster AVX path. Raja Koduri of Intel seems to believe it's a much-loved feature for the HPC community, AI community and data centre customers.
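The reason those applications benefit is data parallelism: one AVX-512 instruction operates on a whole 512-bit lane of packed values (16 single-precision floats) instead of one at a time. As a rough pure-Python sketch of that idea (the lane width is the only real AVX-512 fact here; everything else is illustrative):

```python
LANE = 16  # AVX-512 registers are 512 bits wide = 16 packed 32-bit floats

def add_simd_style(a, b):
    """Add two equal-length lists lane-by-lane, mimicking packed SIMD adds."""
    out = []
    for i in range(0, len(a), LANE):
        # Conceptually one 'instruction': the whole lane is combined in one step.
        out.extend(x + y for x, y in zip(a[i:i + LANE], b[i:i + LANE]))
    return out

a = list(range(32))
b = [10] * 32
print(add_simd_style(a, b)[:4])  # -> [10, 11, 12, 13]
```

In real code the per-lane step is a single hardware instruction rather than a Python loop, which is where the throughput advantage over a scalar (or narrower-vector) path comes from; libraries like NumPy and FFmpeg get this by dispatching to vectorised kernels at runtime.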

https://www.pcgamer.com/uk/intel-defends-avx-512-against-torvalds/

AVX isn't important to me currently, though as it comes free of charge it's nice to tick that box, and have a CPU that supports it. If Intel are successful in increasing AVX-512 support, it will pay off down the line.

I feel we're going off on tangents here, so I'll be withdrawing from this discussion. I'm just excited to pick up the fastest gaming CPU in a couple of weeks/months. Hope everyone is happy with the competition we now have between Intel and AMD; enjoy whichever CPU you go for :)
 
Associate
Joined
14 Nov 2005
Posts
1,557
I did no such thing, you just assumed I meant game developers. Games hardly use AVX at all; the only one I remember from age-old debates was Serious Sam 3. AVX is used more for productivity applications, examples of which include the following:
  1. Adobe Photoshop.
  2. Adobe After Effects.
  3. Adobe Premiere.
  4. Photoshop / After Effects / Premiere plug-ins.
  5. PeaZip / 7-Zip / WinRAR.
  6. x264 / FFmpeg (used by many desktop applications, for instance Firefox and VLC Player).
  7. Excel.
  8. MATLAB / Octave / Python (NumPy, SciPy, TensorFlow, etc.).
  9. Julia.
  10. Numeric libraries (used intensively in many applications).
I imagine that if Intel are successful in getting software developers (not game developers) to implement more AVX-512 instructions, it won't matter how many additional cores AMD have, as they'll simply not be able to use the more efficient, faster AVX path. Raja Koduri of Intel seems to believe it's a much-loved feature for the HPC community, AI community and data centre customers.

https://www.pcgamer.com/uk/intel-defends-avx-512-against-torvalds/

AVX isn't important to me currently, though as it comes free of charge it's nice to tick that box, and have a CPU that supports it. If Intel are successful in increasing AVX-512 support, it will pay off down the line.

I feel we're going off on tangents here, so I'll be withdrawing from this discussion. I'm just excited to pick up the fastest gaming CPU in a couple of weeks/months. Hope everyone is happy with the competition we now have between Intel and AMD; enjoy whichever CPU you go for :)
I reckon a lot of the use cases you mentioned above will see adoption of Apple silicon in the new iMac and Mac Pro.
 
Soldato
Joined
6 Feb 2019
Posts
17,831
I reckon a lot of the use cases you mentioned above will see adoption of Apple silicon in the new iMac and Mac Pro.

Bingo

Dave doesn't realise that Apple's M1 is busy eating his Intel lunch. I feel sorry for anyone who recently bought an Intel-based laptop/MacBook, because your device has been pwned by the M1.
 