
Intel Nova Lake (16th gen) on next gen platform/socket (LGA-1954)

What matters most is performance. Nvidia went the brute-force route with the 5090 - it sells really well despite having such a high power draw.

I'd prefer architectural innovation and increased performance per watt, like we used to get, though that seems unlikely until a major new architecture or process technology comes along.
Or maybe what sells mostly is performance?

Still, I do not like this brute-force approach at all.

At the end of the day these parts mostly come pre-overclocked or clocked to the max.

They are running far outside the underlying silicon process's "sweet spot". Often the last 10%, 5% or even 2% of extra "from stock" performance costs an extra 30% or 40% in power.
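The "sweet spot" claim can be sanity-checked with a toy model: dynamic CPU power scales roughly with frequency × voltage², and voltage has to rise steeply near a part's frequency limit. The clock speeds and voltages below are purely illustrative, not figures for any real chip.

```python
# Toy model: dynamic power ~ frequency * voltage^2 (normalised units).
def rel_power(f_ghz, volts, f0=4.8, v0=1.10):
    """Power relative to a hypothetical 'sweet spot' point (f0 GHz, v0 V)."""
    return (f_ghz * volts**2) / (f0 * v0**2)

# Hypothetical top-bin point: ~19% more frequency needs a big voltage bump.
perf_gain = 5.7 / 4.8 - 1               # ~0.19 -> ~19% more performance
power_cost = rel_power(5.7, 1.40) - 1   # ~0.92 -> ~92% more power
print(f"{perf_gain:.0%} more perf for {power_cost:.0%} more power")
```

With those made-up numbers, the last fifth of the clock speed nearly doubles the power, which is the shape of the trade-off the post is complaining about.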

This over-clocked performance is then used to justify their price.

It used to be that people would take parts and run them near the edge and suffer the consequences - now the manufacturers do this "for us" so they can justify their prices.

Problem is, once one manufacturer goes down this route, the others will follow.

And this during power shortages in many places - the idea of renewables was to replace fossil fuels, not to enable higher power usage while keeping the old stuff burning too!
 
Ironically, Intel CPUs are much more efficient at idle and very low loads than Ryzen CPUs. I have a 14900KS that sips power at idle/low loads - it uses less than my 9800X3D and 9950X3D systems at such loads!

I completely agree about the brute-force performance method being wasteful and ugly though - but there is no alternative if you want to enjoy games at 4K at 120-240 Hz on lovely OLED displays.
 
Will be interesting to see how well Intel's patchwork multi-die chips handle idle. I suspect they'll still be better than chiplet Ryzens. Plus, at least the new Intel chipsets seem to bring plenty of new features.

4K high refresh on OLED... Just lower the settings!
 
That idle advantage is true against AMD's multi-chip desktop designs, not its laptop parts.

The problem is the SoC chiplet (the I/O die): even at idle it pulls about 10 watts. It's the bit that keeps everything connected together, so it needs to be powered on and active at all times.

Now.... the thing about this argument: using 10 watts more while completely idle on a desktop is nothing compared to using up to about 100 watts more while actually in use.

No one cares about the former; it does not matter, the latter does. Intel enthusiasts like to point to it to distract from what actually matters. It's not ironic, and it's one reason why AMD dominate the DIY desktop market - where people make their own choice, they choose efficiency and performance over.... well, not that.

On the topic of Nova Lake, Intel are going for high core counts: 52 cores / 52 threads (no HT) vs AMD with 24 cores / 48 threads. There is no doubt in my mind that Intel will beat AMD in Cinebench, video encoding.... stuff like that.
That's all well and good, and it will earn some hype and kudos from reviewers who will use hyperbolic YouTube thumbnails like "Intel smashes AMD".
But when the Cinebench slides are all said and done, people will want to know what the real-world performance is for them and how much the CPU costs. It's going to be a very large chip made on the latest and greatest TSMC node, so it's going to be expensive for Intel to make, and that cost will be passed on to you. Intel will be hoping the initial hype and clickbait thumbnails will have people eating out of their hands for £700 CPUs; to that extent, it will not work.
Right now on the rain forest, 11 of the 12 best sellers are 6- and 8-core CPUs. Only a vanishingly small number of people care about Cinebench; what they care about is the best and most effective pairing with their GPU. Even those with vast amounts of money to burn buy 8-core CPUs, because they are the best CPUs for their RTX 5090.
If Intel cannot also be that, their fortunes will not change, and they have a lot of ground to make up against AMD, who are also going to launch a brand-new CPU around the same time.
 
No one cares about the former
Some do care; if you have 100+ office units it adds up. Most of the time PCs will be at idle-->low load, unless they're just powered up, used, then shut down. For me, it was a big shock how bad AMD are for idle-->low load. Also, if the 52-core is £700 or less, they will sell out fast. That said, the board and RAM prices may stop a lot of people making the jump.
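As a rough sketch of how idle draw adds up across an office fleet (hypothetical numbers: 100 PCs, an extra 10 W at idle, 2,000 powered-on and mostly-idle hours a year, £0.25/kWh):

```python
pcs = 100          # office units
extra_watts = 10   # extra idle draw per PC (hypothetical delta)
hours = 2000       # powered-on, mostly-idle hours per year
price = 0.25       # £ per kWh (hypothetical tariff)

extra_kwh = pcs * extra_watts * hours / 1000
extra_cost = extra_kwh * price
print(extra_kwh, extra_cost)  # 2000.0 kWh -> £500.0 per year
```

Not a fortune per machine, but a real line item across a fleet, which is the point being made.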
 
Did you only read the bit that you quoted?

You're taking that out of context; I specifically said the DIY desktop market. The corporate market is entirely different, with priorities that quite often have nothing to do with the hardware or its capabilities.
 
Not just office use - the Ryzen Z and 7000/8000 series mobile chips, for example, commonly used in handhelds, have decent power consumption for the performance when gaming. But if you are using the device for other purposes, like watching a bit of YouTube, they are 2-3 times less power efficient than an Intel CPU, due to the minimum 4-5 watt SoC/IO/memory-controller load even when idle.

And unless you are using a desktop PC with the CPU flat out for most of its powered-on time, the total power consumption over a longer period often balances out between CPUs, especially after PSU inefficiencies.
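The "balances out" claim is a duty-cycle question, which a back-of-envelope calculation makes concrete. Hypothetical deltas: CPU A idles 10 W below CPU B but draws 60 W more under load.

```python
idle_saving = 10    # W saved by A while idle (hypothetical)
load_penalty = 60   # W extra used by A under load (hypothetical)

def daily_delta_wh(load_hours, powered_hours=10.0):
    """Net daily energy for A minus B, in watt-hours (negative = A wins)."""
    return load_penalty * load_hours - idle_saving * (powered_hours - load_hours)

# Break-even load time per 10-hour powered-on day:
break_even = idle_saving * 10.0 / (idle_saving + load_penalty)
print(round(break_even, 2))  # ~1.43 hours of heavy load per day
```

With those made-up numbers, under about an hour and a half of heavy load per ten-hour day, the lower idle draw wins; beyond that, the load penalty dominates.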
 
I've got mine down to ~20-25 W (using the iGPU) when on YouTube (ECO 65 + power-saver mode); enabling EXPO/XMP instantly adds 10 W+. I don't know if it's the same for Intel with XMP?
 
I don't know if its the same for Intel with XMP

Depends on your config, and I suspect the same is true for AMD: the actual overhead of enabling XMP itself is pretty negligible, but if it means your RAM is running much faster than with it off, then that will make a power-consumption difference.
 
Now.... the thing about this argument using 10 watts more while completely idle on a desktop is nothing compared to using up to about 100 watts more while actually is use.

This is just blatantly incorrect. No, Intel does not use 100 watts more under load for the same performance as Ryzen, and that even applies to the 14900K compared to the 7950X. Even the 14900K does not use 100 watts more than the 9950X for a Cinebench 2024 multicore score of 2000.

For reference here is the power scaling:

[Attached chart: multicore performance vs. package-power limit, Core Ultra 285K vs Ryzen 9950X]

Actual source here: https://www.computerbase.de/artikel...-ultra-200s-285k-265k-245k-test.90019/seite-5

Intel's latest top-end CPU, the 285K, is more efficient at all power levels compared to the 9950X, and a lot more efficient at and below 100 watts. The 285K offers about 70% more performance than the 9950X at a power limit of 50 watts.
 
Do you run your £500 Intel CPU at 45 Watts?

The Core Ultra 200 series is decent in terms of power efficiency, and it ought to be, as it's built on a newer, more advanced TSMC node: 3nm vs 4nm for the Ryzen 9000 series. If it wasn't more power efficient, there really would be something seriously wrong with it.

Having said that, out of the box the 285K is no more efficient than the 9950X, despite being on a newer, better node. Your chart doesn't show exact scores, so I got those from somewhere else.

Gamers Nexus.

7-Zip Compression:
9950X: 978 MIPS/W - 100% (204 W)
285K: 1108 MIPS/W - 113% (162 W)
9950X3D: 1533 MIPS/W - 157% (124 W)

7-Zip Decompression:
285K: 1252 MIPS/W - 100% (162 W)
9950X: 1332 MIPS/W - 106% (204 W)
9950X3D: 1358 MIPS/W - 108% (204 W)

Gaming - Starfield:
265K: frames/W 100% (144 W)
285K: frames/W 110% (147 W)
9700X: frames/W 150% (80 W)
9950X3D: frames/W 170% (99 W)

So at best the 285K is 13% more efficient than the 9950X non-X3D, and that's ignoring the X3D, which blows them all out of the water.

I was talking about pairing with GPUs on the DIY market, i.e. gaming.

You're right, the 285K does not pull 100 watts more; it's not a 14900KS, which is what I had in mind, and that is the faster gaming CPU from Intel. I mean, if you're going to buy a gaming CPU from Intel for your high-end GPU (because reasons.....?), you would buy the 14900K - it's faster and cheaper than the 285K.

But it's still not good, is it? If you compare the 265K to the 9700X, the 265K pulls 64 watts more power (80% more) for 20% more FPS.
If you compare the 9950X3D to the 285K, the latter pulls 48 watts more power (48% more) while the 9950X3D pushes 20% more FPS.
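Some of those ratios can be cross-checked from the Gamers Nexus figures quoted above, taking the relative frames-per-watt percentages at face value and deriving implied FPS as efficiency × package power:

```python
# 7-Zip compression, MIPS per watt (quoted above):
ratio_7zip = 1108 / 978
print(round(ratio_7zip, 2))  # 1.13 -> 285K ~13% more efficient than 9950X

# Starfield: implied FPS = relative frames-per-watt * package watts.
fps_265k = 1.00 * 144    # baseline efficiency, 144 W
fps_9700x = 1.50 * 80    # 150% relative efficiency, 80 W
print(fps_265k / fps_9700x)   # 1.2 -> 265K ~20% more FPS...
print((144 - 80) / 80)        # 0.8 -> ...for ~80% more power
```

The 265K vs 9700X comparison checks out exactly against the quoted efficiency and wattage figures.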

Again: the 265K and 285K are built on TSMC 3nm; the 9700X and 9950X3D are built on the older TSMC 4nm.

Intel was only able to achieve this because they went from making the 14000 series on their own 7nm-class node to having the 200 series built on TSMC's 3nm node - and it's still no better, and in a lot of cases much worse, than AMD's older-node CPUs.




PS: if you're looking for a 45-watt CPU for rendering, you would buy a 45-watt CPU. There are different options from Intel and AMD - CPUs optimised for that power envelope - that would do a better job in that category than limiting a £500 CPU to 45 watts. The point ComputerBase are making is academic and completely pointless; it's trying to say something about these products that, while true, applies only in isolation. Think about it for more than 3 seconds and it just becomes apparent how idiotic it is.

Also: I've seen these YouTubers who try to combat the narrative about the Intel 200 series by clocking the most expensive RAM they can find to the absolute limits, unlocking the power limits on the CPU and then clocking that to the moon, just to push it past the 14900K in games... then point and say "lOok SeE itS actuUUUUaly a gOOd CpU, rEvieWers jUst haTe My-inTeL"

Yeah, now your CPU rig costs 3X as much as a 14900K, pulls even more power and will probably only last about 6 months before it fries itself. Point well made....
I think YouTubers who do this to CPUs and normalise it are partly to blame for a lot of 14000 series CPUs ending up fried: they push these CPUs to very unsafe levels, tell their viewers this is normal and proclaim Intel the best overclockers. Their viewers buy these chips, use these videos as a reference, and 8 months later the chip is cooked.
And I've been saying it since long before these chips started cooking themselves.

I've been around for a long time, and some of the voltages these people put into these CPUs make me wince..... In 35 years of doing this I have never had a CPU fry on me - not one, not ever. This is a very different generation; none of them have a clue what they are doing.
The rigs that I build and tweak run flawlessly for more than half a decade.

Yes, I actually like the 14900K more. Well, the 14700K.....
 
What does it matter which node they are on? When AMD was on the better node - which was the case for Zen 3, Zen 4 and part of Zen 5 - it was not talked about, but now suddenly Intel only has better efficiency because it's on a better node. Guess what - on the laptop side Intel is on 18A, which is also a better node than the one AMD is using now.

For measuring efficiency you have to power limit both chips to the exact same wattage and assess performance that way. The productivity and gaming power-consumption figures you provided are meaningless if not power limited.

I have power limited my 265K with XTU to 60 watts, and its performance is amazing for anything office, programming, productivity and multimedia related at that wattage. There is no point in buying a CPU that is capped at, let's say, 45 watts when you can just power limit a CPU that has a way higher ceiling.

You will also notice that power limiting the 14900K to, for example, 100 watts does not cause much performance loss.

Also, you actually do not need over 8000 MT/s RAM for Arrow Lake. Running 6400 MT/s, the 265K is equal to or slightly better than Zen 5 non-X3D. AMD only does better in gaming because of the X3D models.
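The "same wattage" comparison described above could be sketched like this: interpolate each CPU's performance-vs-package-power curve, then read both off at an identical power limit. The curves below are made-up sample points, not measurements of any real chip.

```python
def perf_at(watts, curve):
    """Linearly interpolate a sorted [(watts, score), ...] curve."""
    for (w0, s0), (w1, s1) in zip(curve, curve[1:]):
        if w0 <= watts <= w1:
            return s0 + (watts - w0) / (w1 - w0) * (s1 - s0)
    raise ValueError("power limit outside measured range")

# Hypothetical multicore scores measured at fixed package-power limits:
cpu_a = [(50, 900), (100, 1500), (150, 1850), (250, 2100)]
cpu_b = [(50, 700), (100, 1400), (150, 1900), (250, 2250)]

for limit in (75, 125):
    a, b = perf_at(limit, cpu_a), perf_at(limit, cpu_b)
    print(f"{limit} W: A/B = {a / b:.2f}")
```

Note how which chip "wins" can flip with the power limit in this toy data, which is exactly why a single out-of-the-box measurement doesn't settle an efficiency argument.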
 
The Core Ultra 7 on Nova Lake should be very interesting in terms of price/performance; I'll be keeping an eye on that one and might even change from my Zen 4 to it. It all depends on RAM, though - I don't know how Intel expect people to upgrade if it requires 8000-speed RAM, and the same goes for AMD really. Prices are going to have to come down just before release, and we don't even know if they will.

I don't mind keeping my current 6000 CL26 kit, but there will probably be decent gains from the extra supported speed.
 
The two kits of RAM I have bought lately, 6000C26 and 8000C38, were the same price (both G.Skill Trident Z Royal). I suspect they should drop in price by the tail end of the year, but I am not sure we will get close to how prices once were for a while.

Also, Asus boards tend to have memory profiles on them which are normally load-and-go (at the lower MT/s end, and depending on the CPU's IMC and whether you are running that memory IC).
 
Why would you suspect prices will drop? Come the end of the year, whatever oversupply still exists will be gone.
Prices will increase if anything. No one has made any changes this year that would affect the consumer RAM market in a positive way.
 