But you’re the only one talking misleading nonsense. Intel are just miles off the mark and that’s just the reality of it. Sorry.
Oh give over. Can you stop with this childish BS for at least one day?
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
"but P+E cores wasn't a 'ground up' design and has inherent problems. It's still a fudge."

That is the thing - at moderate utilisation levels that fudge works, which isn't something represented in most reviews, which broadly test things like gaming and canned heavy-utilisation benchmarks. So if you are gaming for, say, 80% of the system's powered-on time, you will get very different results over a longer time period than if a system is used for office work 70% of the time and gaming 30% of the time.
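To put rough numbers on that usage-mix point, here's a minimal sketch; every wattage in it is an assumed round figure for illustration, not a measurement:

```python
# Duty-cycle-weighted average CPU power for two usage profiles.
# All wattage figures below are illustrative assumptions, not measurements.

def average_power(profile):
    """profile: list of (fraction_of_powered_on_time, watts) pairs."""
    assert abs(sum(f for f, _ in profile) - 1.0) < 1e-9
    return sum(f * w for f, w in profile)

GAMING_W = 150   # assumed CPU draw while gaming
OFFICE_W = 15    # assumed CPU draw during light office work

heavy_gamer = [(0.8, GAMING_W), (0.2, OFFICE_W)]   # gaming 80% of the time
mixed_user  = [(0.3, GAMING_W), (0.7, OFFICE_W)]   # gaming 30%, office 70%

print(f"heavy gamer average: {average_power(heavy_gamer):.0f} W")  # 123 W
print(f"mixed user average:  {average_power(mixed_user):.0f} W")   # 56 W
```

Same hardware, but the mixed user's long-run average is less than half the heavy gamer's, which is exactly why a gaming-only review doesn't tell you much about a mostly-office machine.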
but P+E cores wasn't a 'ground up' design and has inherent problems. It's still a fudge.
"Windows vs Ubuntu on any platform, agreed."

Not saying it isn't a fudge, I'm no fan of the E cores at all, but for the context of what I'm talking about it works. And these days with Windows 11, scheduling issues are pretty much an edge case.
EDIT: Actually Windows itself is often to blame for poor power consumption; it gets far too busy in the background when it thinks the user is idle. On my low-powered mini PCs that I use for server-type tasks, Windows 10/11 uses almost double the power over a day compared to Windows 7 or Debian/Ubuntu!
Shame that new motherboards are getting ridiculously expensive.
gentlemen - we agree to differ
"That's more than I have paid for my Strix motherboards that I bought a few years ago."

The Asus prices are crazy: the Strix Z890-E GAMING is £200 more than the Strix X870E equivalent.
"achieved by a mix & match of cores - not for me. and again that situation does not apply to me."

Thing is, people get an impression of the 14th gen from reviews, etc., and sure, you can thrash them with Cinebench or whatever and see hundreds of watts used (I've got my fans set to silent, hence thermal throttling with a full stress test and a 286 watt turbo limit) - but in other situations, like my normal day-to-day driving with a web browser, Spotify, 3D modelling software, Visual Studio, etc., it is mostly sitting there between 2 and a dozen or two dozen watts for the CPU itself:
[screenshot: CPU power draw]
(This is without any of what Jigger is referring to as "gimped" settings)
Same thing on a 7800X3D will be between ~20 and ~40 watts, maybe a 12-16 watt baseline with slower RAM and a bit of tuning - over double the average of the Intel system. Though that isn't the whole story, as the whole-system draw at the wall is what counts, and that is generally much closer.
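On the at-the-wall point, a similar sketch shows why a 2x gap in CPU package power shrinks at the socket; again, the shared platform draw and PSU efficiency below are assumptions for illustration:

```python
# Why "double the CPU power" shrinks at the wall: the rest of the system
# (board, RAM, GPU, fans, PSU losses) is roughly common to both builds.
# All figures are illustrative assumptions, not measurements.

CPU_A_W = 15          # assumed average CPU package power, system A
CPU_B_W = 30          # assumed average CPU package power, system B (2x A)
PLATFORM_W = 60       # assumed shared platform draw (board, RAM, GPU idle, fans)
PSU_EFFICIENCY = 0.9  # assumed PSU efficiency at this load

for name, cpu_w in (("A", CPU_A_W), ("B", CPU_B_W)):
    wall_w = (cpu_w + PLATFORM_W) / PSU_EFFICIENCY
    print(f"system {name}: CPU {cpu_w} W -> wall {wall_w:.0f} W")

# CPU-only gap: 2.0x.  Wall gap: (30 + 60) / (15 + 60) = 1.2x.
```

The fixed platform overhead is common to both builds, so doubling the CPU package power only moves the wall figure by about 20% in this toy case.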
achieved by a mix & match of cores - not for me. and again that situation does not apply to me.
I'll revisit Intel's power use once they update their uarch; till then they are not for me.
Cyberpunk is a bit of an outlier, at least on AMD. It benefits greatly from SMT off, despite the developer's supposed 'fix' for SMT on AMD CPUs.
I'm not holding out any hope of the developer ever fixing it either, given their recent (lack of) effort with FSR3 in this title.
9950X Low Preset SMT ON
9950X Low Preset SMT Off
[benchmark screenshots]
Win 11 not stronk enough.

Extra try hard with Win10 there!
"I can live with Zen + Zen 'c' but I would steer clear of AMD if they mixed different cores. IMHO unnecessary complications for lack of engineering. There is already a huge amount of time & energy wasted on coherence as it is. Over-complication is never efficient."

Was kind of my original point, though not aimed at you - the actual power profile is going to depend person to person on what they want, and there is a lot more to it if you actually want to lower your power usage than the reviews tend to reveal.
Fine for gaming, but the 7800X3D is similar performance-wise to a 12700K for anything outside of gaming, and that is the average - in some stuff it is more contemporary with Intel 11th gen (offset by better-than-average performance in the areas the 3D cache can be applied to). Personally I'd take higher power usage for more balanced performance.
I can live with Zen + Zen 'c' but I would steer clear of AMD if they mixed different cores. IMHO unnecessary complications for lack of engineering. There is already a huge amount of time & energy wasted on coherence as it is. Over-complication is never efficient.
I went with AMD for the possible drop-in CPU upgrade in a few years. I was not concerned about power as they are not far apart when tuned (power at the wall). I did the drop-in upgrade on AM4 and was thinking of doing it again to a 5950X, but the X370 chipset was just too old (PCIe Gen 3). The B650E-E should age well as PCIe Gen 5 still isn't really being used; in a few years it probably will be, but I should be OK until AM5 is dead.
"I don't think it's propaganda but rather it's a smaller issue than the media played it out to be."

According to some users, Intel's failing CPUs are down to propaganda from the media.
(Inserts nothing to see gif)
I haven't seen any of the reviews yet, but let me guess: more of the same, with bells on?
No, I don't game at 1080p low, but in order to become CPU bound you need to run a lower resolution/preset. Even the 1080p high preset is not completely CPU bound.

Do you game at 1080p low though, that is my question.

I still stand by past comments that game benchmarks at low everything, at 1080p especially, are pointless as they don't represent gaming performance at what someone is actually playing at.
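The "run a lower resolution/preset to become CPU bound" reasoning can be modelled crudely as delivered fps ≈ min(CPU-limited fps, GPU-limited fps), with the GPU side falling as resolution and settings rise. A toy sketch, with all fps figures assumed for illustration rather than taken from any benchmark:

```python
# Toy bottleneck model: delivered fps ~ min(CPU-limited fps, GPU-limited fps).
# CPU-limited fps is roughly resolution-independent; GPU-limited fps is not.
# All fps figures are illustrative assumptions, not benchmark results.

CPU_FPS = {"old CPU": 110, "new CPU": 210}   # assumed CPU-limited fps
GPU_FPS = {"1080p low": 400, "4K RT": 70}    # assumed GPU-limited fps

for preset, gpu_fps in GPU_FPS.items():
    for cpu, cpu_fps in CPU_FPS.items():
        print(f"{preset:>10} / {cpu}: {min(cpu_fps, gpu_fps)} fps")
```

At 1080p low the GPU ceiling sits far above both CPUs, so the CPU gap shows in full (110 vs 210 fps here); at 4K RT the GPU ceiling sits below both, so the gap all but disappears.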
Would be interesting to see both 1080p at the RT Overdrive preset and RT Overdrive at 4K as well. I know the X3D will best the 12700KF at 1080p in both, as it's a CPU-biased resolution, but does it stack up at 4K too?
1080p low:
[screenshot]
1080p RTOD:
[screenshot]
4K RTOD:
[screenshot]
And CPU utilisation during 1080p low:
[screenshot]
Oh indeed, I'm fully aware of the reasons why RT/higher res are less useful for demonstrating CPU reliance, and this was genuinely interesting. See, if I was looking at a platform upgrade and, as a 4K gamer, only saw the 1080p benches and paid no mind to the resolution, only to the massive fps gap between what I had and what I could have, then I'd be sorely disappointed when I paid for it all, built it and fired up a game, only to see a 14 fps average gain vs the 100+ fps gap seen in the 1080p results.
That's what I mean: so many review sites do this, and I understand why, as it's the only way to show the difference between CPUs meaningfully. A mere 15 fps in realistic numbers doesn't sell a CPU and platform upgrade to the average buyer, who may well be spending in excess of £500 to make that upgrade from their ~3 year old system.
Now also bear in mind I'm on DDR4 3600 as well, so slower memory for me; the gap would be smaller still by a small amount if I were on fast DDR5, not that DDR5 gives a huge gain anyway, for the same reasons as above.
Which means the only acceptable upgrade for gaming, even from a 12th Gen platform POV, is a GPU upgrade, not a CPU!
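Putting the earlier numbers in cost-per-frame terms makes the same point; the £500 and the fps gains are the figures from the posts above, used as rough illustrative values:

```python
# Cost per extra frame for an assumed £500 platform upgrade.
# The fps gains are illustrative, echoing the discussion above.

UPGRADE_COST_GBP = 500
gains = {"1080p low": 100, "4K": 14}  # assumed average fps gained

for scenario, fps_gain in gains.items():
    print(f"{scenario}: £{UPGRADE_COST_GBP / fps_gain:.0f} per extra fps")
# 1080p low: £5 per fps; 4K: £36 per fps.
```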