Base clocks on Intel CPUs appear to be decreasing each generation. Why?

TDP is basically a guarantee that the CPU will consume no more power than, for example, 65 W or 125 W at its base clock speed. So, clock speeds above that point will require increasing amounts of power and voltage for each extra 100 MHz (or at least, that is my assumption).
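To put rough numbers on that assumption: dynamic power scales roughly with capacitance x voltage^2 x frequency, and higher clocks usually need higher voltage, so power climbs much faster than frequency. A toy Python sketch, where every constant and V/f point is invented purely for illustration:

```python
# Toy model: dynamic CPU power ~ C * V^2 * f.
# Every number here is invented for illustration, not a real Intel figure.

def dynamic_power(freq_ghz, voltage):
    C = 10.0  # arbitrary effective-capacitance constant
    return C * voltage ** 2 * freq_ghz

# Hypothetical V/f points: voltage has to rise with frequency.
for freq, volts in [(3.6, 1.00), (4.5, 1.15), (5.0, 1.30)]:
    print(f"{freq} GHz @ {volts} V -> {dynamic_power(freq, volts):.0f} (relative)")
```

On this made-up curve, 3.6 GHz to 5.0 GHz is about 39% more frequency for about 2.3x the power, which is why the last few hundred MHz are so expensive.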

Generally, the unlocked K chips have the highest base clocks.

For the 9th gen, there's the 8 core 9900KS, with a base clock of 4.0 GHz and a TDP of 127 W.

For the 10th gen, the 8 core 10700K has a base clock of 3.8 GHz, TDP = 125 W. Also, the 6 core 10600K has a base clock of 4.1 GHz, at the same TDP of 125 W.

The 11th gen was bad in a lot of ways, but it also had lower base clocks at a TDP of 125 W than the previous two generations.

For the 12th gen, the 8 P-core 12700K has a base clock of 3.6 GHz, with a TDP of 125 W. The higher spec 8 P-core 12900KS has an even lower base clock of 3.4 GHz at 150 W!
That is despite the more advanced, more transistor-dense 10nm fabrication process.

So, am I correct in thinking that the 13th generation is likely to have similarly low base clocks, for the same or a higher rated TDP?

Is the reason partly due to the generational increases in L3 cache (and L2 in the future)?
 
Because there are more cores, and if you strictly adhere to TDP, as some OEMs will do, the base clock is very low.

The same is true for AMD.
 
My impression is that Intel's 10nm process (used in the 12th and 13th generations) hasn't helped them to increase base clocks at a TDP of ~125 W.

I just found an article that suggests up to a 40% improvement in power usage, depending on the voltage, for Intel's 7nm EUV ('Intel 4') process.

Graph here:
[image: graph comparing frequency against normalised power for Intel 4 vs the current 10nm process]


The graph suggests that at 1.1 V, Intel's new 7nm EUV process will allow a base clock of ~3.65 GHz at a normalised power usage of ~3.9x. Intel's 10nm process appears to be limited to a base clock of around 3.3 GHz, at a normalised power usage of around ~4.6x. They can push it a bit higher, at the cost of efficiency.
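A quick back-of-envelope check of those readings (the numbers are just what I eyeballed off the graph, so treat them loosely):

```python
# Approximate points read off the graph at 1.1 V; my own readings,
# not official figures.
intel4_freq, intel4_power = 3.65, 3.9    # GHz, normalised power
intel10_freq, intel10_power = 3.30, 4.6  # current 10nm process

eff4 = intel4_freq / intel4_power    # GHz per unit of normalised power
eff10 = intel10_freq / intel10_power
print(f"Intel 4: {eff4:.2f}, 10nm: {eff10:.2f}")
print(f"Improvement at 1.1 V: ~{(eff4 / eff10 - 1) * 100:.0f}%")  # ~30%
```

So roughly a 30% frequency-per-watt gain at that voltage, which lines up with the article's "up to 40%, depending on the voltage" claim.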

Intel mentions optimisation of their new process at high voltages (1.3 V or higher), so it sounds like there could be a possibility of base clocks near 4.0 GHz, hopefully at a TDP of ~125 W.

Full article here:

Considering that the 8 core Ryzen 5800X has a base clock of 3.8 GHz, and the 12 core 5900X has a base clock of 3.7 GHz, both at a TDP of 105 W, I'd say it's pretty likely that the Zen 4 equivalents (on the new 5nm EUV process) will overtake these and run at a base clock of 4.0 GHz or more, at a similar TDP.

So, Intel will have some catching up to do in 2023.
 
My impression is that Intel's 10nm process (used in the 12th and 13th generations) hasn't helped them to increase base clocks at a TDP of ~125 W.

My understanding is that 11th and 12th gen cores use more power than Skylake; they made them a lot fatter, or something (way beyond my knowledge level :cry: ). But since they're also a lot faster, they're still more efficient in multi-threading, so beating Skylake's clocks wasn't necessary anyway.
 
I definitely don't think lower base clocks are a deliberate move from Intel. If people want lower idle clocks, they can set that themselves in the BIOS with relative ease.

Or, just buy a 12900 or 12700 instead; these have significantly lower base clocks and a TDP of just 65 W. The 'T' variants of the CPUs have even lower base clocks and TDPs of just 35 W.

As the graph above shows, Intel's most recent 10nm process starts to show increased power usage at around 3.3 GHz. So, I think this is the limitation. I would guess that they put the base clock up as high as it will go before a TDP target is reached, such as 125 W, or in the case of the 12900KS, 150 W.
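That guess is easy to sketch: step the clock up until the worst-case all-core power would blow the TDP budget. The power curve below is entirely invented; only its super-linear shape matters:

```python
# Hypothetical worst-case all-core power curve (made-up constants).
def all_core_power(freq_ghz):
    return 12.0 * freq_ghz ** 2.2  # watts, illustrative only

# Highest base clock that still fits under a given TDP target.
def max_base_clock(tdp_watts):
    freq = 1.0
    while all_core_power(freq + 0.1) <= tdp_watts:
        freq += 0.1
    return round(freq, 1)

print(max_base_clock(125))  # ~2.9 GHz on this toy curve
print(max_base_clock(150))  # ~3.1 GHz with a 12900KS-style budget
```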
 
I definitely don't think lower base clocks are a deliberate move from Intel. If people want lower idle clocks, they can set that themselves in the BIOS with relative ease.

As the graph above shows, Intel's most recent 10nm process starts to show increased power usage at around 3.3 GHz. So, I think this is the limitation. I would guess that they put the base clock up as high as it will go before a TDP target is reached, such as 125 W, or in the case of the 12900KS, 150 W.

The TDP limitation certainly seems like the main reason, but Intel do have a history of stagnation/sandbagging, so I don't think it would bother them at all to go backwards if performance otherwise improves generation by generation. The reason I'm fairly confident it is not just process but also architecture related is that 11th gen has substantially higher power consumption than 9th or 10th gen, even though it uses the same process.
 
It's interesting to compare Intel's last few generations to the Zen series. In general, AMD have managed to maintain or slightly increase base clocks since the first generation, though the first couple of generations did use the less advanced 14nm and 12nm processes.

It does look like Intel's 7nm EUV process will be a big deal for them: it will help them to achieve efficiency at clocks near 4 GHz, so desktop CPUs should struggle significantly less if overclocked to 5.0 GHz on all cores. Also, lower power devices and laptops might be able to run at 3.0-3.5 GHz at lower TDPs like 45-55 W.

Meteor Lake will almost certainly have more L2 and/or L3 cache, so they will need an improved process to offset the extra power consumption from that.

I think the E-cores probably only use another 20 or 30 watts. If you compare the i5-12600 to the 12600K, there's only a 33 watt difference in the turbo boost power rating, with the 12600 boosting slightly lower.
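The arithmetic behind that, using the published maximum turbo power figures (attributing the whole gap to the E-cores is my simplification, since the K chip also clocks slightly higher):

```python
# Published maximum turbo power (MTP) figures, watts.
mtp_12600k = 150  # i5-12600K: 6 P-cores + 4 E-cores
mtp_12600 = 117   # i5-12600: 6 P-cores, no E-cores

# Crude upper bound on the E-core cluster's share; the 12600K also
# boosts a little higher, so the real E-core figure is a bit lower.
print(f"E-core cluster: <= {mtp_12600k - mtp_12600} W")  # 33 W
```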
 
Intel base their TDP on the base clocks.

So for the 12900K, the advertised 125 watts is at 3.2 GHz, not 5.1 GHz.
From a marketing perspective it looks better to write 125 watts TDP on the box; 241 watts doesn't look as good.

It's completely arbitrary; they could write 95 watts on the box if they wanted to, which would be a true statement, at 2.8 GHz. The more cores they add, the lower that base clock needs to be if what they want to do is advertise it as a 125 watt CPU.
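A toy illustration of that trade-off, with an invented per-core power curve (only its shape matters):

```python
# What base clock can you print next to a given TDP as cores increase?
# The per-core power model is invented for illustration.

def per_core_power(freq_ghz):
    return 1.5 * freq_ghz ** 2.2  # hypothetical watts per core

def printable_base_clock(tdp_watts, cores, uncore_watts=15):
    budget = (tdp_watts - uncore_watts) / cores  # watts left per core
    freq = 1.0
    while per_core_power(freq + 0.1) <= budget:
        freq += 0.1
    return round(freq, 1)

for cores in (8, 12, 16):
    print(cores, "cores ->", printable_base_clock(125, cores), "GHz")
# 8 -> ~2.7, 12 -> ~2.2, 16 -> ~1.9 on this made-up curve
```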
 
Meteor Lake will almost certainly have more L2 and/or L3 cache, so they will need an improved process to offset the extra power consumption from that.

I think the E-cores probably only use another 20 or 30 watts. If you compare the i5-12600 to the 12600K, there's only a 33 watt difference in the turbo boost power rating, with the 12600 boosting slightly lower.

I don't think the cache adds as much power as you might think; a few MB is not much in the grand scheme of things. I'd say it is mainly the voltage, because if you compare an i7-12700K, i5-12600K, i5-12400 and i3-12100 all running at the same P-core clock, the difference in power consumption in most games is surprisingly small. As is the difference between a similarly clocked G6900, G7400 and i3-12100. AMD optimised their architecture for power efficiency, but I don't think Intel did (or if they did, it doesn't seem to have worked very well). Without 10nm, I'm inclined to believe Golden Cove would have been less efficient than 10th gen but, unlike 11th gen, better performing. I think you're correct about the E-cores, which is likely why they're going that way with the 13th gen CPUs.
 
Intel base their TDP on the base clocks.

So for the 12900K, the advertised 125 watts is at 3.2 GHz, not 5.1 GHz.
From a marketing perspective it looks better to write 125 watts TDP on the box; 241 watts doesn't look as good.

It's completely arbitrary; they could write 95 watts on the box if they wanted to, which would be a true statement, at 2.8 GHz. The more cores they add, the lower that base clock needs to be if what they want to do is advertise it as a 125 watt CPU.


I'm pretty sure a 12900K will boost higher than 3.2 GHz with its TDP limited to 125 W in the BIOS, so that doesn't make any sense. Some site several months ago tested the 12900K at various power limits, from 35 W all the way up to 240 W, and found only a 5-10% performance difference between 125 W and 240 W. That again doesn't line up with what you're saying, because how can a 12900K only improve by 5-10% going from 3.2 GHz to 5.1 GHz?
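Part of the answer is that power buys much less clock than you'd expect. If you assume all-core frequency scales roughly with the cube root of power (power ~ V^2 x f, with voltage rising roughly with frequency), which is purely my assumption here:

```python
# If power ~ freq^3, then freq ~ power^(1/3). An assumed relationship
# for illustration, not a measured curve.
headroom = (240 / 125) ** (1 / 3)
print(f"125 W -> 240 W buys ~{(headroom - 1) * 100:.0f}% more all-core clock")
# Roughly 24%, nowhere near the 59% gap between 3.2 and 5.1 GHz; and
# games rarely load all cores hard enough to hit the limit anyway.
```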
 
It's completely arbitrary; they could write 95 watts on the box if they wanted to, which would be a true statement, at 2.8 GHz. The more cores they add, the lower that base clock needs to be if what they want to do is advertise it as a 125 watt CPU.
I think the TDP is supposed to show that you can get power efficiency out of a CPU up to a certain clock speed. For example, with the 12900K, you can limit it to 125 W (PL1 and PL2) and still get around 99% (on average) of the CPU's performance, at least in games:

Limiting PL1 and PL2 to 190 W will apparently result in close to 100% performance...
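As an aside, on Linux you don't even need the BIOS for this; the kernel's intel-rapl powercap interface exposes the limits directly. A rough sketch (needs root, and the exact sysfs path can vary between machines):

```python
# Cap PL1 (the long-term package power limit) via Linux powercap.
# constraint_0 is the long-term constraint; values are in microwatts.
RAPL = "/sys/class/powercap/intel-rapl:0"  # package-0 domain

def set_pl1(watts):
    with open(f"{RAPL}/constraint_0_power_limit_uw", "w") as f:
        f.write(str(watts * 1_000_000))

set_pl1(125)  # hold long-term package power to 125 W
```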

I think Intel has muddied the waters a bit by putting the TDP of the 12900KS at 150 W though... They probably think no one will run this CPU at its base clock anyway.
 
I'm pretty sure a 12900K will boost higher than 3.2 GHz with its TDP limited to 125 W in the BIOS, so that doesn't make any sense. Some site several months ago tested the 12900K at various power limits, from 35 W all the way up to 240 W, and found only a 5-10% performance difference between 125 W and 240 W. That again doesn't line up with what you're saying, because how can a 12900K only improve by 5-10% going from 3.2 GHz to 5.1 GHz?

Boost clocks have to be disabled to make the CPU stay at base clocks, and base clocks would be calculated based on the shittiest samples. Depending on how the board is configured, the CPU can be pretty effective at managing its boost behaviour to optimise performance; usually you don't see much difference in performance outside of things like extended render runs (that use all the cores).
 
I think the TDP is supposed to show that you can get power efficiency out of a CPU up to a certain clock speed. For example, with the 12900K, you can limit it to 125 W (PL1 and PL2) and still get around 99% (on average) of the CPU's performance, at least in games:

I think Intel has muddied the waters a bit by putting the TDP of the 12900KS at 150 W though... They probably think no one will run this CPU at its base clock anyway.

From what I've seen, only the most demanding games and settings can get upper-end 12th gen CPUs to break their PL1; they're usually sub 100 watts in a typical gaming load, with midrange CPUs often running sub 70 watts.
 
I think based on the (quite limited) release of Tiger Lake 10nm desktop CPUs, we can conclude that Intel gained no advantage in base clock speed on 10nm compared to 14nm:

The 6 and 8 core unlocked chips both have a base clock of just 3.3 GHz.
 
I'm pretty sure a 12900K will boost higher than 3.2 GHz with its TDP limited to 125 W in the BIOS, so that doesn't make any sense. Some site several months ago tested the 12900K at various power limits, from 35 W all the way up to 240 W, and found only a 5-10% performance difference between 125 W and 240 W. That again doesn't line up with what you're saying, because how can a 12900K only improve by 5-10% going from 3.2 GHz to 5.1 GHz?

That TechPowerUp article shows huge differences in old fashioned CPU render software, H.264/H.265 encoding and Java compiles, but then very little from other productivity software; you might have just looked at the averages.

This thread's question is about the baseline advertised minimum guaranteed clock, which has to cover the worst case scenario. Most people will never see close to that figure unless they do long video edits.
 
My 5800X has this written on the box:

3.8 GHz Base
4.7 GHz Maximum Boost

It never drops below 4.6 GHz all core, no matter what I do to it.
The maximum boost is actually 4.85 GHz. In games it will hold that on all cores.

Some reviewers had a lot of criticism of Zen 2, complaining that it never actually reached the advertised boost clocks; they were usually short by 100 MHz.
Which I find ironic, as there are worse fake CPU specification advertisements than that, which they never talk about.
So this may be why they put "maximum boost clock 4.7 GHz" on these when they actually boost 150 MHz higher than that.
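If you want to check what your own chip actually holds, rather than trusting the box, sampling the per-core clocks takes a few lines. A Linux-only sketch:

```python
# Sample per-core clocks from /proc/cpuinfo (Linux) to see what the
# CPU actually sustains under load, versus the advertised figures.
import time

def core_clocks_mhz():
    with open("/proc/cpuinfo") as f:
        return [float(line.split(":")[1]) for line in f
                if line.startswith("cpu MHz")]

for _ in range(5):
    clocks = core_clocks_mhz()
    print(f"min {min(clocks):.0f} MHz, max {max(clocks):.0f} MHz")
    time.sleep(1)
```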
 
I might be talking rubbish, but I vaguely recall reading something about it being an energy/emissions thing, whereby having a lower 'headline' power consumption (at base clock) lets them hit targets, pay less tax or gain some other regulatory advantage.

My son's PC has an 11900F, which has a crazy low 2.5 GHz base clock for an i9 (consider that Intel had faster clock speeds 20 years ago), but it doesn't really run at that speed; it's usually boosting to at least double that.
 
The 13900K (i.e. the high end CPU) has a base clock of just 3.0 GHz, according to this:

The TDP is apparently 125 W:
[image: leaked 13th gen specification table]


That would be the lowest base clock for an unlocked mainstream desktop CPU (i3/i5/i7) in probably 10 years or more.

Maybe they will increase it a bit more before release?
 