
Intel Core Ultra 9 285k 'Arrow Lake' Discussion/News ("15th gen") on LGA-1851

That is the thing - at moderate utilisation levels that fudge works, which isn't something most reviews represent, since they broadly test things like gaming and canned heavy-utilisation benchmarks. So if you are gaming for, say, 80% of the system's powered-on time, you will get very different results over a longer period than if the system is used for office work 70% of the time and gaming 30% of the time.
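As a rough illustration of that point (the helper function and every wattage below are made-up placeholders, not figures from this thread), the long-term average draw is just a duty-cycle-weighted sum of the per-state power:

```python
# Hypothetical duty-cycle example: long-term average CPU power is the
# time-weighted sum of the power drawn in each usage state.

def average_power(profile):
    """profile: list of (fraction_of_powered_on_time, watts) pairs."""
    assert abs(sum(f for f, _ in profile) - 1.0) < 1e-9
    return sum(f * w for f, w in profile)

# Mostly-gaming machine: 80% gaming at ~120 W, 20% light use at ~15 W.
gaming_heavy = average_power([(0.8, 120.0), (0.2, 15.0)])   # 99.0 W

# Mostly-office machine: 70% office at ~15 W, 30% gaming at ~120 W.
office_heavy = average_power([(0.7, 15.0), (0.3, 120.0)])   # 46.5 W

print(f"gaming-heavy: {gaming_heavy:.1f} W, office-heavy: {office_heavy:.1f} W")
```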
But the P+E core setup wasn't a 'ground up' design and has inherent problems. It's still a fudge.
 

Not saying it isn't a fudge - I'm no fan of the E cores at all - but for the context of what I'm talking about it works. And these days, with Windows 11, scheduling issues are pretty much an edge case.

EDIT: Actually Windows itself is often to blame for poor power consumption - it gets far too busy in the background when it thinks the user is idle. On the low-powered mini PCs I use for server-type tasks, Windows 10/11 uses almost double the power over a day compared to Windows 7 or Debian/Ubuntu :(
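For the Linux side of that kind of over-a-day comparison, a minimal sketch of logging average package power via the RAPL powercap interface (assuming an Intel CPU with the intel-rapl driver loaded; the sysfs path can vary per machine and usually needs root):

```python
# Minimal sketch: log average CPU package power on Linux via the RAPL
# powercap interface. Assumes an Intel CPU with the intel-rapl driver loaded
# (the sysfs path below can differ per machine) and root permissions.
import time

RAPL = "/sys/class/powercap/intel-rapl:0"        # package-0 power domain

def read_uj(name):
    with open(f"{RAPL}/{name}") as f:
        return int(f.read())

max_uj = read_uj("max_energy_range_uj")          # counter wraps at this value

start, t0 = read_uj("energy_uj"), time.time()
time.sleep(60)                                   # sample window; extend to hours/days
end, t1 = read_uj("energy_uj"), time.time()

delta_uj = (end - start) % max_uj                # handle counter wrap-around
print(f"average package power: {delta_uj / 1e6 / (t1 - t0):.1f} W over {t1 - t0:.0f} s")
```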
 
Windows vs Ubuntu on any platform - agreed.
 
gentlemen - we agree to differ

Thing is, people get an impression of the 14th gen from reviews etc., and sure, you can thrash them with Cinebench or whatever and see hundreds of watts used (I've got my fans set to silent, hence thermal throttling with a full stress test and a 286 W turbo limit). But in other situations, like my normal day-to-day driving with a web browser, Spotify, 3D modelling software, Visual Studio and so on, the CPU itself mostly sits somewhere between 2 W and a dozen or two dozen watts:

[screenshot]


(This is without any of the settings Jigger refers to as "gimped".)

The same thing on a 7800X3D will be between ~20 W and ~40 W, maybe a 12-16 W baseline with slower RAM and a bit of tuning - over double the average of the Intel system. Though that isn't the whole story, as whole-system draw at the wall is what counts, and that is generally much closer.
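To illustrate why the at-the-wall numbers end up much closer (every number and the helper below are assumptions for the example, not measurements), a fixed platform overhead plus PSU losses dilutes the CPU-only difference:

```python
# Hypothetical illustration: the same CPU-power gap matters much less once a
# fixed platform overhead (board, RAM, SSD, idling GPU) and PSU efficiency
# are included. Every number here is an assumption, not a measurement.

def wall_watts(cpu_w, platform_w=45.0, psu_efficiency=0.90):
    """Rough at-the-wall draw: CPU plus platform overhead, divided by PSU efficiency."""
    return (cpu_w + platform_w) / psu_efficiency

intel_cpu, amd_cpu = 12.0, 28.0                  # assumed light-use package averages
intel_wall, amd_wall = wall_watts(intel_cpu), wall_watts(amd_cpu)

print(f"CPU-only ratio:    {amd_cpu / intel_cpu:.2f}x")     # ~2.33x
print(f"At-the-wall ratio: {amd_wall / intel_wall:.2f}x")   # ~1.28x
```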
 
Achieved by a mix & match of cores - not for me, and again that situation does not apply to me.
I'll revisit Intel's power use once they update their uarch; till then they are not for me.
 

Was kind of my original point, though not aimed at you - the actual power profile is going to vary from person to person depending on what they want, and if you actually want to lower your power usage there is a lot more to it than the reviews tend to reveal.
 
Cyberpunk is a bit of an outlier, at least on AMD. It benefits greatly from SMT off, despite the developer's supposed 'fix' for SMT on AMD CPUs.

I'm not holding out any hope of the developer ever fixing it either, given their recent (lack of) effort with FSR3 in this title.

9950X Low Preset SMT ON


9950X Low Preset SMT Off
Extra try hard with Win10 there :cry:
Win 11 not stronk enough. :(

9950X Low Preset SMT ON


9950X Low Preset SMT Off


Win 10 SMT On 7% faster than Win 11 SMT On.
Win 10 SMT off 6.4% faster than Win 11 SMT Off.
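For anyone wondering how those deltas are derived, it's just the ratio of the average-fps results; the fps values and the helper below are placeholders, since the actual numbers are in the screenshots rather than reproduced here:

```python
# How the Win10-vs-Win11 deltas above are derived: the percentage difference
# between two average-fps results. The fps values here are placeholders,
# not the actual numbers from the benchmark screenshots.

def pct_faster(a_fps, b_fps):
    """How much faster result a is than result b, in percent."""
    return (a_fps / b_fps - 1.0) * 100.0

win10_smt_on, win11_smt_on = 214.0, 200.0        # hypothetical averages
print(f"Win 10 SMT On is {pct_faster(win10_smt_on, win11_smt_on):.1f}% "
      f"faster than Win 11 SMT On")              # 7.0%
```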
 
Was kind of my original point, though not aimed at you - the actual power profile is going to vary from person to person depending on what they want, and if you actually want to lower your power usage there is a lot more to it than the reviews tend to reveal.
I can live with Zen + Zen 'c', but I would steer clear of AMD if they mixed different cores. IMHO it's unnecessary complication for lack of engineering; there is already a huge amount of time & energy wasted on coherence as it is. Over-complication is never efficient.
 
Fine for gaming, but the 7800X3D is similar performance-wise to a 12700K for anything outside of gaming, and that is on average - in some things it is closer to Intel 11th gen (offset by better-than-average performance in the areas where the 3D cache can be applied). Personally I'd take higher power usage for more balanced performance.

I'm using a 7950X3D - it crushes a 12700K in MT performance as well as gaming performance and is much more power efficient. The 9950X3D will be its replacement, which will be the flagship for mixed workloads.

The 7950X3D is £578.99 @ OCUK and has been cheaper in the last few months I believe - really a bargain considering its performance.
 
I can live with Zen + Zen 'c', but I would steer clear of AMD if they mixed different cores. IMHO it's unnecessary complication for lack of engineering; there is already a huge amount of time & energy wasted on coherence as it is. Over-complication is never efficient.

I've had zero issues with the Ryzen Z1/7000U series with C cores so far, same as with E cores or split CCDs really - with Windows 11 any issues are edge cases.
 
Was kind of my original point, though not aimed at you - the actual power profile is going to vary from person to person depending on what they want, and if you actually want to lower your power usage there is a lot more to it than the reviews tend to reveal.
I went with AMD for the possible drop-in CPU upgrade in a few years. I was not concerned about power, as they are not far apart when tuned (power at the wall). I did the drop-in upgrade on AM4 and was thinking of doing it again to a 5950X, but the X370 chipset was just too old (PCIe Gen 3). The B650E-E should age well, as PCIe Gen 5 still isn't really being used; in a few years it probably will be, but I should be OK until AM5 is dead.
 
Win 11 not stronk enough. :(

9950X Low Preset SMT ON


9950X Low Preset SMT Off


Win 10 SMT On 7% faster than Win 11 SMT On.
Win 10 SMT off 6.4% faster than Win 11 SMT Off.

Do you game at 1080p low though? That is my question :p I still stand by past comments that benchmarking games with everything at low, especially at 1080p, is pointless as it doesn't represent gaming performance at the settings someone is actually playing at.

Would be interesting to see both 1080p at the RT Overdrive preset and RT Overdrive at 4K as well. I know the X3D will best the 12700KF at 1080p in both, as it's a CPU-biased resolution, but does it stack up at 4K too?

1080p low: [screenshot]

1080p RTOD: [screenshot]

4K RTOD: [screenshot]

And CPU utilisation during 1080p low: [screenshot]
 
According to some users, Intel's failing CPUs are down to propaganda from the media.

(Inserts "nothing to see" gif)

I haven't seen any of the reviews yet, but let me guess: more of the same, with bells on?
I don't think it's propaganda, but rather a smaller issue than the media made it out to be.
 
No, I don't game at 1080p low, but in order to become CPU-bound you need to run a lower resolution/preset. Even the 1080p high preset is not completely CPU-bound.

As soon as you enable any RT in this game you just become GPU-bound, even though RT can put more strain on the CPU. When GPU-bound the CPU makes very little difference, especially at 4K.

RT Overdrive is all about GPU strength, as shown below.

4K: [screenshot]
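One way to picture that CPU-bound vs GPU-bound behaviour is a toy model where the delivered frame rate is roughly the minimum of what the CPU and the GPU can each sustain (the helper and every throughput number below are made up for illustration, not benchmark data):

```python
# Toy bottleneck model: delivered fps is roughly capped by whichever of the
# CPU or GPU can sustain the fewest frames per second.
# All throughput numbers below are made up for illustration.

def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_slow, cpu_fast = 150.0, 260.0          # hypothetical CPU frame-rate limits

for scene, gpu_fps in [("1080p low", 400.0),
                       ("1080p RT Overdrive", 90.0),
                       ("4K RT Overdrive", 40.0)]:
    print(f"{scene:18s}  slower CPU: {delivered_fps(cpu_slow, gpu_fps):3.0f} fps"
          f"  faster CPU: {delivered_fps(cpu_fast, gpu_fps):3.0f} fps")

# At 1080p low the faster CPU shows a large lead; once RT makes the GPU the
# limit (especially at 4K), both CPUs deliver the same frame rate.
```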
 
Oh indeed, I'm fully aware of the reasons why RT/higher resolutions are less useful for demonstrating CPU reliance, and this was genuinely interesting. See, if I was looking at a platform upgrade and, as a 4K gamer, only saw the 1080p benches and paid no mind to the resolution, only to the massive fps gap between what I had and what I could have, then I'd be sorely disappointed when I'd paid for it all, built it and fired up a game, only to see a 14 fps average gain versus the 100+ fps seen in the 1080p results.

That's what I mean - so many review sites do this, and I understand why, as it's the only way to show the difference between CPUs meaningfully. A mere 15 fps in realistic numbers doesn't sell a CPU and platform upgrade to the average buyer, who may well be spending in excess of £500 to make that upgrade from their ~3 year old system.

Now also bear in mind I'm on DDR4 3600 as well, so memory-wise the gap would be smaller still by a small amount if I were on fast DDR5 - not that DDR5 gives a huge gain anyway, for the same reasons as above.

Which means the only worthwhile upgrade for gaming, even from a 12th gen platform POV, is a GPU upgrade, not a CPU one :p
 

Pretty sure HUB does 1080p, 1440p and 4K benchmarks, and yeah, the higher the resolution the more the GPU plays a part.
 