Mini/Micro PCs.

tl;dr:

How do the modern, Celeron J class micro PCs fare under normal desktop/office type use? The most I'm looking for is a fairly smooth 4K YouTube and browsing experience.

Do you have any recommendations for the sub-£300 region?

Basically, anything that can drive 2x HDMI 2.0 4K@60Hz displays and not chug when running a few YouTube windows.


----------------
I'm back again and still looking to reduce the power draw of my tech. Electricity goes up 30% again on 1st Oct.

The present situation, now with monitoring and graphs:

The 24/7 server consumes 50W or 70W depending on whether the disk pack has timed out and shut down or is awake*.

My main desktop/gaming PC draws between 95W and 108W idle. By "idle" I mean where the graph bottoms out on average. There are spikes when things light up the CPU or GPU, but generally it's sitting around 100W.

My work laptop on its docking station pulls about 20W, 8 hours a day.

When you add a bunch of other gadgets like monitors, routers, switches, etc., the total consumption when all are on is around 280W. Run a game and that jumps to 580W.

The elephant in the room is, as always, the gaming PC. 100W is too high. It's a waste. When I'm "working", I only use my personal desktop for, well, personal stuff, like email, FB, watching YouTube and doing internet searches that are blocked at work. (Like GitHub, believe it or not!)

I'm sure I can get this kind of basic desktop performance out of a 20W mini-pc.

* Keeping the USB disk pack in standby is proving difficult. The current issue is the software RAID waking the RAID1 pair VAULT to update it or something. It wakes up for about 5 minutes every 10!
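
For what it's worth, if I want to catch exactly when (and how often) the pack wakes, I could poll /proc/diskstats and log when the I/O counters move. A minimal sketch in Python; the device names are placeholders, not my actual config, so substitute the real members of the VAULT pair:

[code]
#!/usr/bin/env python3
# Log when given block devices see I/O, to help catch what keeps
# waking the pack. DEVICES are placeholder names - substitute the
# actual members of the RAID1 pair.
import time

DEVICES = ["sdb", "sdc"]

def io_counts():
    counts = {}
    with open("/proc/diskstats") as f:
        for line in f:
            fields = line.split()
            if fields[2] in DEVICES:
                # field 3 = reads completed, field 7 = writes completed
                counts[fields[2]] = (int(fields[3]), int(fields[7]))
    return counts

prev = io_counts()
while True:
    time.sleep(60)
    cur = io_counts()
    for dev in DEVICES:
        if cur.get(dev) != prev.get(dev):
            print(time.strftime("%H:%M:%S"), dev, "saw I/O:", cur[dev])
    prev = cur
[/code]

Cross-referencing those timestamps against mdadm activity should show whether it really is the RAID metadata updates doing the waking.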

 
A similar thought keeps crossing my mind.
I too run my work laptop & personal desktop throughout the day, and the 100W-110W idle from my desktop just to browse the web / connect to VDI / watch YouTube is a bit heavy...

However... I do hate having multiple PCs setup to do the same things...

I was planning to use the smart meter to see what the real cost difference is. So spend a day working as I do now, then the following day don't power on the personal PC, and see how much of a saving it is in actual £ terms.
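
For reference, the back-of-envelope version, assuming roughly 100W saved over an 8-hour working day and a unit price of about 34p/kWh (both assumptions, plug in your own figures):

[code]
# Rough saving from not running a ~100W-idle desktop during work hours.
# The 34p/kWh unit price is an assumption - use your own tariff.
watts_saved = 100
hours_per_day = 8
price_per_kwh = 0.34  # GBP

kwh_per_day = watts_saved * hours_per_day / 1000   # 0.8 kWh
daily = kwh_per_day * price_per_kwh                # ~GBP 0.27
print(f"per day:  £{daily:.2f}")
print(f"per year: £{daily * 260:.2f} (260 working days)")
[/code]

Around £70 a year on those numbers, so the smart meter test is still worth doing to see how reality compares.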
 
I bought a J4125-based mini PC recently (MeLE PCG35). I use it as a low-power way to sync files/storage pools, but it does OK with average desktop tasks, quite well in fact. I've not tried 4K YouTube on it, and I currently don't have it hooked up to a 4K display to try, but it handles 1080p without breaking a sweat and sits there using sub-2 watts idle (that includes having an additional Samsung 980 NVMe in there). Unfortunately, with Windows 10 or 11 on there, the average power consumption, while still very low, is a bit higher than with Windows 7 or Linux, where it barely sips power.

It struggles with gaming though: older titles are fine, but for anything remotely recent you're talking single-digit frame rates at 720p with medium settings.
 
The elephant in the room is, as always, the gaming PC. 100W is too high. It's a waste. When I'm "working", I only use my personal desktop for, well, personal stuff, like email, FB, watching YouTube and doing internet searches that are blocked at work. (Like GitHub, believe it or not!)
You can currently buy a 10th-gen Optiplex (i5-10500T, 6c/12t) on the bay for £280 with Dell warranty, which would be more than ample. They use about 9-11W at idle, and what you're using it for won't be kicking out much more power than that, as it's not intensive. (Or the 8500T version for £215.)

Other option: iPad/tablet/laptop? Get one that lasts a full day then charge it at night, which, depending on your electricity provider, will usually be cheaper. If you want to knock power consumption down: whilst 2 monitors is nice, you don't really need 2x 4K screens for what you're doing, do you? :p
 
I have considered the eWaste route, such as the Dell Optiplex SFFs and the USFFs that are now getting down to sensible prices. I already have 4 Optiplexes :)

Certainly the 2nd-6th gen versions seem to idle around 20W. Maybe the newer 8th-10th gen USFFs will idle at half that, do you think?

On the lower end of the spectrum, a Raspberry Pi wouldn't cut it. Maybe a Pi 4, but they are like hens' teeth.

I have an old laptop, but it's a little too old. Like a 2nd-gen i3 with 6GB RAM.

As to switching: I don't really mind switching monitor inputs, but switching/sharing a keyboard/mouse is not nice. So I suppose another £150 for a decent dual-monitor KVM... which could itself pull 3+W.
 
My Dell 3020m (4th-gen i3) is great for working from home/light usage. I'm sure with your budget, simply picking up a more recent version would cover your needs, that or anything similar from HP/Lenovo.
Have you seen the Project TinyMiniMicro series from STH on YouTube?
 
I have a 5600G which idles at 21W and manages 25W at the wall for 2K video playback. [PicoPSU, single NVMe drive, no case fans needed, but currently 32GB]

I've seen a lot of posts about running servers on 3200Gs/3400Gs and claiming mid-teens figures at the wall. Maybe they have better PSUs, on top of running just one stick of memory and no USB peripherals. Also, I think the 4-core AMD chips idle a bit lower than the 6-cores. I tried to pick up a 3400G on eBay, but two came with bent corner pins. What with a 3900X off Members Mart also arriving with bent corner pins, I've given up on second-hand AM4 CPUs for now.

My gaming/rendering machine has an idle of 80 watts, mostly due to the huge 30W idle power use of the 3090. Start doing anything in Windows and that's 90W. It is, however, very efficient at rendering, due to being able to complete work quicker and the acceleration of the workspace screen.

Overall, I find the idle power usage of modern PCs disappointing. I had a 2nd-gen Intel dual-core laptop that managed to drive a 1080p screen on 11 watts, and I had thought things were moving in the miniaturization direction, what with the burgeoning mobile/tablet era kicking in then.
 
[snip]

Overall, I find the idle power usage of modern PCs disappointing. I had a 2nd-gen Intel dual-core laptop that managed to drive a 1080p screen on 11 watts, and I had thought things were moving in the miniaturization direction, what with the burgeoning mobile/tablet era kicking in then.

The problem is mainly that standard ATX PSUs are very inefficient at low load and have high overheads (like 6-7 watts just for being turned on). Most OEM systems have 12VO type PSUs, so they can achieve much lower idle figures. If the motherboard has all the power features enabled and is designed for it, then you can get mid teens with most AMD APU and Intel IGP systems and a standard PSU, but otherwise it'll be in the 20s, possibly 30s with high-end memory.
 
I've seen a lot of posts about running servers on 3200Gs/3400Gs and claiming mid-teens figures at the wall. Maybe they have better PSUs, on top of running just one stick of memory and no USB peripherals. Also, I think the 4-core AMD chips idle a bit lower than the 6-cores. I tried to pick up a 3400G on eBay, but two came with bent corner pins. What with a 3900X off Members Mart also arriving with bent corner pins, I've given up on second-hand AM4 CPUs for now.
I'm running a 3400G in an HP EliteDesk Mini (USFF) as my Home Assistant server. It draws 18W at the wall all day long. It did spike up under load here and there to double or more, but having averaged the consumption over a month it's 18W too, so those must only be momentary bursts.

It does well. There's a ton of headroom for more server tasks, which would make the efficiency "better", as I'm sure most of the 18W is just idle system overheads.
 
The problem is mainly that standard ATX PSUs are very inefficient at low load and have high overheads (like 6-7 watts just for being turned on). Most OEM systems have 12VO type PSUs, so they can achieve much lower idle figures. If the motherboard has all the power features enabled and is designed for it, then you can get mid teens with most AMD APU and Intel IGP systems and a standard PSU, but otherwise it'll be in the 20s, possibly 30s with high-end memory.

Yep. They more or less all have the same slowly rising, gentle crest, gently falling efficiency curve. It's been a while since I checked, but I'm fairly sure at the lower end you can be looking at only 60% efficiency.

"Lower end" will of course then be relative to the MAX output and the peak efficiency output.

So my Corsair 850W PSU, while the PC is pulling only, say, 50W, will actually pull more like 90W out of the wall and emit 40W of heat. Sure, when it's gaming and pulling 450W, the PSU is in its efficiency plateau and only pulling 500W from the wall.

If I put a 300W PSU in it and it pulled 50W, it might only pull 70W from the wall. However, as soon as I fire up the 3080, the PSU faults and the system shuts down.
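
To put (made-up but curve-shaped) numbers on it, here's a toy model that interpolates efficiency from a few assumed load points and works out wall draw. The curve points are illustrative guesses, not measurements of any real unit:

[code]
# Toy PSU model: estimate wall draw from DC load via an assumed
# efficiency curve. Curve points are illustrative, not measured.
def wall_draw(dc_load_w, rated_w):
    # (fraction of rated load, efficiency) - a typical-ish shape
    curve = [(0.02, 0.55), (0.10, 0.80), (0.20, 0.88),
             (0.50, 0.92), (1.00, 0.88)]
    frac = dc_load_w / rated_w
    frac = max(curve[0][0], min(frac, curve[-1][0]))  # clamp to range
    for (f0, e0), (f1, e1) in zip(curve, curve[1:]):
        if f0 <= frac <= f1:
            eff = e0 + (e1 - e0) * (frac - f0) / (f1 - f0)
            break
    return dc_load_w / eff

for rated in (850, 300):
    print(f"{rated}W PSU at 50W load: ~{wall_draw(50, rated):.0f}W from the wall")
[/code]

The exact figures depend entirely on the real curve, but the shape is why the same 50W load costs noticeably more through an oversized PSU.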

This is why those small, corporate desktop eWaste boxes are so appealing. They are partly designed and built to make real power-saving impacts in businesses that run thousands of them 24/7.

If I had loads-a-money(tm), I'd buy a big fat dual-board, dual-PSU box with a built-in KVM. With prices of new gear where they are, that is still going to run me well over £1000, and that includes reusing everything from my current build!
I've priced up "new stuff" builds on the OC shop and I can't get much under £400. That's picking a 5600G, a micro-ATX board and 8GB RAM.
 
So I dug out my old laptop. Tried it, checked its specs, and it's an i3-32xxS something.

However, I remembered I still had the Optiplex 7010 (USFF) that used to run the living room telly, but it could not do 4K, so it got replaced with an Optiplex 990 (SFF) + GT 1030.

I got lucky. Not only is its spec better (an i5-3440, 8GB RAM), but the Intel HD graphics does actually support my 3440x1440 monitor.

It will take a bit of time to be sure, but it looks like it's going to work fine for anything but gaming or VR. The box itself consumes about 20W, so that's a net saving of around 90W during the day.

In fact, in the evenings, my energy monitor graph went "green", meaning less than 300W house-wide!
 
Spoke a little too soon. I had a little "chewiness", hard to explain, not like "desktop'ing through treacle", but something was off.

The current setup, with a DP output through an HDMI converter to an HDMI 2.0 port, gives me only 30Hz at 3440x1440.

That might not be fixable, but I have to try a direct DP-DP connection to the monitor. That will be a pain as the full DP port is used by the gaming PC :( Meaning I'll definitely need a KVM to switch back and forth. Currently I only need to move 1 USB cable and the audio jack if I want wired audio on the gaming PC.
 
Spoke a little too soon. I had a little "chewiness", hard to explain, not like "desktop'ing through treacle", but something was off.

The current setup, with a DP output through an HDMI converter to an HDMI 2.0 port, gives me only 30Hz at 3440x1440.

That might not be fixable, but I have to try a direct DP-DP connection to the monitor. That will be a pain as the full DP port is used by the gaming PC :( Meaning I'll definitely need a KVM to switch back and forth. Currently I only need to move 1 USB cable and the audio jack if I want wired audio on the gaming PC.
What's the DP to HDMI converter? Active or passive?

I've managed to get 2560x1440 at 50Hz over DP to HDMI passively, but had to go DP-DP for the full 95Hz my monitor supports. Your resolution being higher might be out of HDMI 1.4 range.
 
It's passive (cheap). I think you're right that the adapter only supports 1.4, which tops out at something just over full HD at 60Hz. The input port is HDMI 2.0, but it's not worth risking the cost of an active 2.0 converter.
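
Rough bandwidth numbers back that up. Assuming ~20% blanking overhead (a ballpark; real CVT-RB timings vary) against HDMI 1.4's roughly 8.16 Gbps of usable TMDS data rate:

[code]
# Rough check: does 3440x1440 fit HDMI 1.4's usable data rate?
# The 20% blanking overhead is a ballpark assumption.
HDMI14_GBPS = 8.16   # ~10.2 Gbps raw minus 8b/10b coding overhead
BLANKING = 1.20

def needed_gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp * BLANKING / 1e9

for hz in (30, 60):
    need = needed_gbps(3440, 1440, hz)
    verdict = "fits" if need <= HDMI14_GBPS else "does NOT fit"
    print(f"3440x1440@{hz}Hz: ~{need:.1f} Gbps -> {verdict}")
[/code]

That comes out as roughly 4.3 Gbps at 30Hz (fine) and 8.6 Gbps at 60Hz (just over the limit), which matches what I'm seeing.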

I did a bit of doodling with pen and paper and decided to upgrade one of the Dell eWaste machines. I will need to do a bit of moving SSDs about, but I think I can get all the Dell eWaste boxes where they should be without too much hassle. Windows 10 'should' be fine; all the boxes have licenses, I just need to swap product keys and hope that Windows 10 doesn't have a panic attack on a new (very similar) box. Linux won't care. I'll just need to check the disk UUIDs.
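
For the UUID check, something as simple as this does the job (it reads the same info as blkid, just by following the /dev/disk/by-uuid symlinks, no root needed):

[code]
# Map disk UUIDs to the devices they point at, handy for
# sanity-checking fstab/mdadm references before swapping drives.
import os

BY_UUID = "/dev/disk/by-uuid"
for uuid in sorted(os.listdir(BY_UUID)):
    dev = os.path.realpath(os.path.join(BY_UUID, uuid))
    print(uuid, "->", dev)
[/code]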

The upgrade was because... when I shut the gaming PC down, I realised I lost not just my gaming rig, but the bits and pieces of development environments on there. Like VSCode, PyCharm, Eclipse, MQTT Explorer, etc. etc. I don't mind spinning up newly installed Windows boxes, I can get it done in under an hour. But reinstalling and re-setting up dev environments is a right pain in the wazoo.

Solution: Move the dev environment to a VM. Run the VM on the 24/7 server, then it doesn't matter which "client" box I have booted, I can RDP/VNC to the dev VM and work on it from any room in the house!

Problem: The server only has an i5-3470 and 8GB of RAM. So launching a Windows 10 VM with 4GB of RAM overcommits the system, leaving too little for disk cache.

Plan: I bought an i7-4770 with 16GB RAM which will replace the server and should give me enough room for one or two 4GB VMs. The i5-3470 then upgrades my new "office" PC from an i3-32nn. This is a bonus, as the current one has a dead CMOS battery and can be a bear to boot after a power outage. The BIOS cannot enable the CPU graphics properly, so it presents the monitor with a VERY, VERY basic VESA framebuffer, which the Windows installer determines as 480x320 greyscale. Only 1 monitor I have supports that res. So to boot it I have to get an old 22" monitor out of the attic! AND a wired USB keyboard! I'll happily return that to collecting dust.

I have to check while I'm at it, as I have a suspicion the least-used one, in the living room, may also be a 3rd-gen i7. Might do a bit more shuffling around.

The i7-4770 has Intel HD 4600, I think, but I checked and it will do 4K 60Hz over DP at least. I know the others do 4K 30fps on HDMI, as they needed GT 1030s to get 4K 60fps. But I might get luckier with a direct DP-DP connection.
 
Upgrade the server with more RAM and deploy VDI with zero clients in every room :p

I did consider the "thin client"/"fat VM host" approach. To do that properly though, the gaming PC with the 5800X and 32GB of RAM would be called for. That would make the VDIs more powerful than the Dell eWaste clients. However, the current server idles at about 21W, not 120W.

The other issue is I believe NVidia have put their vGPU drivers behind a paywall now. So no GPU virtualisation without ££££.

However, I am using VDI for development... backed up by the Linux host underneath, of course. The 4770 being delivered supports Hyper-Threading and VT-x, so it should support nested hypervisors and can then run WSL (and Android Studio) on the same VM.

Although... I wonder what the options are for RDP from a smart TV. If I could get 4K remote desktop at 60Hz playing Netflix without an eWaste media centre, that might be nice. Somehow I doubt it; RDP surely will not perform under that kind of load.
 
Funny thing. I worked in one of the corporate mega giants that used these eWaste boxes as "thin clients".

The irony was, at the time, I recall the thin client had far more power than my VDI did! The thin client had a quad-core with 8GB of RAM. My remote VDI started with 2 cores and 2GB. It took me 3 months to slowly upgrade it to 8 cores and 8GB of RAM, when they finally told me no more was available. (Now they just give you one request, called MAX Specs, because anyone who cares about their performance always ends up there anyway.)

Oddly, the very next year I was working for a different corporate mega giant and they gave me my dev VDI straight up with 16 cores and 64GB of RAM. Interesting how different ones have different policies.
 
Oh... there is the option to just ship HDMI (and USB) around the house, Linus LTT style. All my smart switches support multimedia VLANs, but at 1Gb/s I would need to upgrade the relevant ones to 2.5Gb.
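
The arithmetic is brutal though: uncompressed 4K60 is way beyond even 2.5GbE, so any HDMI-over-IP kit is compressing, and the link speed is really about headroom for that compressed stream. The raw numbers:

[code]
# Uncompressed video bitrates vs common Ethernet links.
def gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

for name, (w, h, hz) in {"1080p60": (1920, 1080, 60),
                         "4K60": (3840, 2160, 60)}.items():
    print(f"{name}: ~{gbps(w, h, hz):.1f} Gbps uncompressed")
print("links: 1GbE = 1.0 Gbps, 2.5GbE = 2.5 Gbps")
[/code]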

Just sounds too expensive right now.
 