Alder Lake IPC?

I thought Intel's 12th gen would be at least 10% ahead of the competition in terms of IPC. At first it looked like Intel's IPC claims were exaggerated once again, but now I'm not so sure.

Cinebench R15 IPC chart (CPUs clocked at 3.5GHz) here:
https://www.guru3d.com/articles_pages/core_i5_12600k_processor_review,7.html

UPDATE - Cinebench R20 IPC results (CPUs clocked at 4.0GHz) do seem to be much more favourable towards Alder Lake:
https://static.tweaktown.com/content/9/9/9974_34_intel-core-i9-12900k-alder-lake-cpu-review.png

So, these results indicate that Alder Lake's IPC is around 16.4% higher than Zen 3's, which is closer to what I thought Intel might achieve with the 12th gen.
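For anyone who wants to sanity-check percentages like that, here's a minimal sketch of the arithmetic: IPC is proportional to score divided by clock, so at equal clocks the IPC ratio is just the score ratio. The scores below are made-up placeholders, not numbers from the linked reviews.

[CODE]
# Minimal sketch: relative IPC from single-threaded scores at a fixed clock.
# IPC is proportional to score/clock, so at equal clocks the IPC ratio
# reduces to the score ratio. Scores here are invented placeholders.

def relative_ipc(score_a: float, score_b: float,
                 clock_a: float, clock_b: float) -> float:
    """How much higher CPU A's IPC is than CPU B's, as a percentage."""
    return ((score_a / clock_a) / (score_b / clock_b) - 1.0) * 100.0

# Hypothetical single-thread scores, both CPUs locked to 4.0GHz:
print(f"{relative_ipc(582, 500, 4.0, 4.0):.1f}% higher IPC")  # -> 16.4%
[/CODE]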
 
Before I even click your link, I'll take a $1,000 guess that your "IPC" figure is taken from a single application, instead of using a suite of 20 or 30 different applications that use different algorithms and instructions to measure IPC. Prove me wrong.

Cinebench R15 single-threaded, all CPUs clocked at 3.5GHz. I'm not saying Alder Lake isn't faster than Zen 3, but there isn't much in it. TechPowerUp's reviews showed 5-10% higher average framerates in games, depending on whether 720p or 4K resolution is set, comparing Alder Lake with the Ryzen 5800X (which seems to be holding up quite well).
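On the earlier point about using a suite rather than a single application: here's a rough sketch of how multi-benchmark comparisons are commonly combined, using a geometric mean so no single test dominates. The per-application ratios below are invented for illustration.

[CODE]
# Rough sketch: combining per-application performance ratios (CPU A vs
# CPU B at the same clock) with a geometric mean, so that one outlier
# benchmark cannot dominate the average. Ratios below are invented.
import math

def geomean_uplift(ratios: list[float]) -> float:
    """Geometric mean of per-app ratios, expressed as a % uplift."""
    gm = math.prod(ratios) ** (1.0 / len(ratios))
    return (gm - 1.0) * 100.0

# Hypothetical per-app ratios across a small suite:
ratios = [1.21, 1.05, 1.12, 0.98, 1.16]
print(f"{geomean_uplift(ratios):.1f}% average uplift")
[/CODE]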

It looks like Alder Lake's main advantages over CPUs like the 5800X come from higher clock speeds and, to a lesser extent, core count.

There are actually more convincing IPC results here:
https://www.youtube.com/watch?v=xvFZjo5PgG0
 
There are a few games that see noticeably higher 1% low framerates; examples here:
https://static.techspot.com/articles-info/2351/bench/WDL.png
https://static.techspot.com/articles-info/2351/bench/AoE4.png

Quite a useful 10 game average chart here:

[10-game average chart: Average.png]

So, around 8% higher 1% low framerates, comparing the 12900K (with DDR5-6000) to the 5900X.
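For anyone curious how "1% low" figures like these are typically derived (methods vary between reviewers; this is one common approach, and the frame times below are invented):

[CODE]
# One common way to compute "1% lows": sort per-frame times, take the
# slowest 1% of frames, and report the average framerate over just those
# frames. Reviewers differ on the exact method; frame times are invented.

def one_percent_low(frame_times_ms: list[float]) -> float:
    """Average FPS over the slowest 1% of frames."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # the slowest 1%
    return 1000.0 / (sum(worst[:n]) / n)

# Hypothetical capture: mostly ~7ms frames with occasional 15ms stutters.
times = [7.0] * 990 + [15.0] * 10
print(f"1% low: {one_percent_low(times):.0f} fps")  # ~67 fps vs ~141 avg
[/CODE]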
 
That IPC test done by Guru3D was done on Alder Lake's E-cores.
Haha, I think not; the result shows performance per core, per clock, that is still 21% higher than Comet Lake. Intel said the E-cores would be within 1% of Comet Lake cores in terms of IPC.

Why the heck would they run the test on one of the slower cores :rolleyes:

We are still looking at a CPU series that offers around 8% higher 1% low framerates on average (at 1080p), and up to 10% higher average framerates, vs Zen 3 CPUs.

Average framerates at 720p and 4K here:
https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-12th-gen/15.html
https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-12th-gen/18.html

At 4K, the average framerate is only about 2% higher than the 5800X.

Edit - I think it's worth pointing out that the load temperatures (tested in Blender) of the 12700K and 12600K at stock settings are lower than the 5800X's, so well done Intel.

But the 92 degrees Celsius stock-settings temperature of the 12900K is just beyond reason; I suppose you need a mega cooler. What's weird is that disabling the E-cores doesn't seem to lower temperatures. Link here:
https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-12th-gen/21.html
 
Not sure. The 5800X and 5600X did pretty well in games; in my opinion, AMD was ahead in gaming and overall performance until Alder Lake was released. Latency didn't seem to affect these CPUs that much in games, and they have more cache than Intel's 10th and 11th generations. If I recall correctly, though, they don't support RAM over 3600MHz at the moment, which I think will have to change for the next gen to compete with DDR5.
 
One of the few games that is actually predominantly CPU bound.
The thing is, it's quite easy to increase CPU utilization in games just by dialling down the resolution to 720p, or even lower. The same thing happens when DLSS is used: more CPU utilization. I've found this effect seems to be even greater in DirectX 12 games.
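A rough sketch of that logic, with an invented 10% tolerance rather than any rigorous rule: if dropping the resolution barely raises the framerate, the CPU is the limit; if it rises a lot, the GPU was the limit at the higher resolution.

[CODE]
# Rough sketch: classify the bottleneck by comparing framerates at a high
# and a low resolution. The 10% tolerance is an arbitrary, invented cutoff.

def likely_bottleneck(fps_high_res: float, fps_low_res: float,
                      tolerance: float = 0.10) -> str:
    """Guess whether the CPU or the GPU is the limiting factor."""
    if fps_low_res <= fps_high_res * (1.0 + tolerance):
        return "CPU-bound (framerate barely changes with resolution)"
    return "GPU-bound at the higher resolution"

print(likely_bottleneck(fps_high_res=88, fps_low_res=92))   # CPU-bound
print(likely_bottleneck(fps_high_res=60, fps_low_res=140))  # GPU-bound
[/CODE]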

Maybe someone could try this with AOE4?
 