
Intel Core Ultra 9 285k 'Arrow Lake' Discussion/News ("15th gen") on LGA-1851

These are first-world problems, but with electricity prices rising, saving power is actually needed to save a decent amount of money when you look at the cost over a whole year.
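The annual-cost point is easy to sanity-check with a bit of arithmetic. A minimal Python sketch, where the 25 W of extra idle draw, 8 hours/day of use and the 0.28/kWh unit price are assumptions for illustration, not figures from the thread:

```python
# Annual cost of a constant extra idle load.
# Assumed figures for illustration: 25 W extra idle draw,
# 8 hours of use per day, 0.28 (GBP) per kWh.
EXTRA_IDLE_WATTS = 25
HOURS_PER_DAY = 8
PRICE_PER_KWH = 0.28

kwh_per_year = EXTRA_IDLE_WATTS / 1000 * HOURS_PER_DAY * 365
cost_per_year = kwh_per_year * PRICE_PER_KWH
print(f"{kwh_per_year:.1f} kWh/year, costing {cost_per_year:.2f}/year")
# 73.0 kWh/year, costing 20.44/year
```

Tens of idle watts shaved off a machine that runs all day adds up to a visible line on the annual bill, which is the point being made.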

It's not just that, to be fair, as I like tuning systems to be power efficient at idle and low load. Tuning systems to be efficient at full load is easy in comparison.

Unfortunately I found out my specific RTX 4090 is not as efficient at idle (25-26 watts) as the measurements I have seen online (22 watts), though I've also seen 38 watts for MSI models.

I actually think separating by use case is best, although not as convenient: keep the 9800X3D for gaming and high-load workloads and use a laptop for everything else, as my laptop idles at 20 watts on the desktop with two 4K monitors, a docking station and a webcam.

Just one of your two 4K monitors will be pulling several times more power than your system. Some AIOs can pull 25 watts alone, and the parasitic losses of a high-end power supply under low load can be 10-25 watts.
 
Each of the 4K monitors draws 32 watts of power.

Yes, and for this exact reason I'm using an air cooler instead of an AIO.

EDIT: My power supply is also efficient at 10-100 watts according to Cybenetics and my own testing. It's a Corsair RM850x.
 

I doubt the monitors pull that little, TBH. Avoiding Intel and high-end GPUs will help lower power use. An 8700GE with an A620 board is a good option, and so is a 7800X3D on A620, but to really push the efficiency envelope you have to consider the total system and all the peripherals, as there are much greater gains to be made there.
 
The monitors draw 32 watts; I measured it with a power meter at the wall. The play is probably using my 265K system without the RTX 4090 for work and the 9800X3D + RTX 4090 system strictly for gaming, but that solution is not optimal.
 

What monitors are they? Your meter may display 32 watts, but it's almost certainly reading very low. There's significantly more to power use than the CPU, but a 7800X3D system can offer some silly performance for its power use. Outside of maybe working with Java for 10 hours a day I wouldn't even consider the Intel system.

A Minisforum UM790 Pro is probably the ultimate low power system for the money currently.
 
No, it's reading correctly. It's the LG 32GR93U, and HUB measured it at more or less the same. My 48-inch 4K TV measures in at 55 watts, so even 32 is kind of high for a 32-inch 4K monitor.

I do something similar to working with Java for 10 hours a day, though it's more like 8 hours.
 

It's almost certain the meter is off by around a factor of two. What is the meter's maximum rating?

Eizo FlexScan are some of the lowest power draw monitors I've found with good IQ. Not cheap, but they are many times more power efficient than most. The problem is that with Java the Intel system will use 2-3 times the power of some of the Zen parts. Eizo FlexScan monitors and something like a Ryzen 7945HX or 7800X3D will massively improve your power efficiency.
 
It's rated for 2% accuracy and a maximum of 3680 watts. By Java I don't mainly mean compiling, just programming, and at that Intel is about 30 watts more efficient.

I'll probably switch back to AMD in 2-3 weeks, just to be able to judge the difference better.
 

The issue with most of the readily available power meters is that nearly all of them are calibrated* at 50% of the maximum power rating and at 220-230 V, while most of the UK is well over 245 V. My supply is ~246 V, for instance, so right off the bat at 1840 watts the meter's 2% accuracy is more like 14%, and the further away from 1840 watts (50% of max) you measure, the less accurate the reading.
 
The UK voltage supply is 230 V and is allowed a tolerance of -6% to +10%.

Mine, for example, varies between 236 and 240 V.
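For reference, the stated tolerance band works out as follows; a quick sketch, with the bounds following directly from the -6%/+10% figures above:

```python
# UK nominal supply voltage with the stated -6% / +10% tolerance.
NOMINAL_V = 230.0
low_v = NOMINAL_V * (1 - 0.06)
high_v = NOMINAL_V * (1 + 0.10)
print(f"Permitted range: {low_v:.1f} V to {high_v:.1f} V")
# Permitted range: 216.2 V to 253.0 V
```

So 236-240 V sits comfortably inside the permitted band, and even ~246 V would still be within tolerance.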
 

The reality is that when we moved to 230 V we just changed the stickers on the switchgear. Most of the measurements I've made are ~245 V, but I've seen higher pretty regularly.

Plus or minus 2% of 1840 watts at 230 V, 50 Hz and with a power factor of 1 is a swing of 73.6 watts. That's in an ideal-world scenario with everything else on the building's circuit disconnected. In your case it could be a swing of 239.2 watts.

The meter is almost certainly reading low, but anything designed to handle 4000 watts will really struggle at the very low end of the scale.
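The underlying problem is that a cheap meter's quoted accuracy is often a percentage of full scale rather than of the reading, so the absolute error band is fixed and the relative error balloons at low loads. A rough sketch using the 3680 W maximum and 2% figure quoted earlier (treating the spec as full-scale accuracy is an assumption; it varies by model):

```python
# Fixed full-scale error band vs. relative error at different readings.
FULL_SCALE_W = 3680
FULL_SCALE_ACCURACY = 0.02   # "2%" interpreted as 2% of full scale

error_band_w = FULL_SCALE_W * FULL_SCALE_ACCURACY  # +/- 73.6 W at any load
for reading_w in (1840, 230, 32):
    pct_of_reading = error_band_w / reading_w * 100
    print(f"{reading_w:>5} W reading: +/-{error_band_w:.1f} W "
          f"({pct_of_reading:.0f}% of the reading)")
```

At a 32 W reading the fixed error band dwarfs the measurement itself, which is exactly the point about 4000 W-class meters struggling at the very low end of their scale.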
 

People have done studies on this, e.g. https://www.sciencedirect.com/science/article/pii/S2772671123000323. Quite a few off-the-shelf consumer energy monitors are within 3% accuracy across pretty much their entire range for constant loads; others can be way off. Generally, though, they are reasonably accurate at measuring constant idle loads, even down to single-digit wattages - what the consumer devices tend to struggle with is rapidly changing power use and inrush current. I've tested a Tapo energy-monitoring plug against more expensive power monitoring hardware, for example, and it wasn't out by any meaningful amount - though I found a post by someone who tested 10 of the same model and found that 8 were pretty close to accurate, but for some reason 2 were pretty terrible with no apparent rhyme or reason (possibly not calibrated [properly] when manufactured).
 

Methodology aside, if you read the conclusion, they are all pretty terrible.

Nothing under a few hundred quid would be calibrated, and it's unlikely anything under many hundreds would even have the means to be calibrated, which is what would be required to see accurate readings. Unless you cough up serious cash for test equipment and understand what you're doing, ballpark figures are the best you can achieve.
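Testing a consumer plug against reference hardware, as described above, comes down to comparing paired readings at several loads. A minimal sketch; all the readings below are made-up illustrative values, not measured data:

```python
# Percent error of a plug meter vs. a trusted reference at several loads.
# All readings below are made-up illustrative values.
paired_readings = [  # (reference_watts, plug_meter_watts)
    (5.2, 5.4),
    (32.0, 31.1),
    (120.0, 119.0),
]

for ref_w, meter_w in paired_readings:
    pct_err = abs(meter_w - ref_w) / ref_w * 100
    print(f"ref {ref_w:6.1f} W, meter {meter_w:6.1f} W: {pct_err:.1f}% error")

worst_pct = max(abs(m - r) / r * 100 for r, m in paired_readings)
print(f"worst case: {worst_pct:.1f}%")
```

Note how the same absolute offset produces a much larger percent error at the low-wattage end, consistent with the unit-to-unit variation the Tapo testing found.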
 

Intel Arrow Lake S/HX Refresh only to bring clock boost



As expected, Intel will just use this as something to launch this year. The next-gen part with a substantial performance increase comes with Nova Lake in 2026, which will be very exciting.

Arrow Lake/LGA 1851 is the dead-end platform we expected all those months/years ago and pales in comparison to Zen 5 X3D.
 
Absolutely embarrassing that this has the cream of the crop of mobos sitting on shelves, whilst AM5 mobos pale in comparison. I wonder if manufacturers have taken the hint yet and gone all-in on AM5.
 
Just remarked on this in the Zen 6 thread; there have to be dodgy dealings behind the likes of Z890 still getting the best mobos while they rot on shelves.
 
https://community.intel.com/t5/Blog...-PC-Optimization-Partnership-for/post/1706776 :eek:

"Intel and EA announced their full PC partnership for Battlefield 6 – the latest entry in the long running Battlefield series, launching in October 2025. The partnership includes significant collaboration between Intel and EA to optimize Battlefield 6 for Intel® Core™, Intel Core Ultra, and Intel Arc-powered PCs and handhelds, including support for Intel technologies such as XeSS 2."
 
I suspect that the thousand-dollar Core 2 Duo X-series parts weren't exactly flying off the shelves.

The difference is they were completely and totally unmatched by anyone at the time, so lots of people did pay a premium to get the very best. Can't say the same for the current lineup.
 