Suggest replacement for 2700 based server

I currently run a 24/7 Unraid server which is used mainly for Plex, SABnzbd, Sonarr and five Windows-based VMs. It has four 6TB drives (three for storage, one parity) and a 512GB SSD for cache. The CPU is a Ryzen 2700 coupled with 4x8GB of RAM.

I'm investigating replacing the CPU with a more power efficient model. The motherboard can be replaced if an Intel is the way to go.

The VMs are used to run a Java-based front end for a database; it's important that they can all run simultaneously with no slowdown. Plex should have some capability to transcode. These are the only prerequisites.

The question is, is there a suitable power efficient replacement?
 
Assuming that's an i7-2700K (I can think of four desktop CPUs that "2700" could cover) and you want to literally drop in another CPU, then no: the idle power draw of all the desktop CPUs within that generation is nearly identical.

I have mixed feelings on this one without some detail on your average CPU usage/workloads/power numbers at present. My initial thought is that if power saving is your motivation, then don't bother: you'd need to be running at max TDP 24/7 for a hell of a long time to "save" anything against the initial outlay on new parts. On the other hand, if you need more performance then it may make more sense, but surely you would have mentioned that as a priority, unless you're asking us to justify a choice you've already made.

What you have (if it is a 2700K) is a 4c/8t CPU @ 3.5GHz with a 95W TDP that scores around 8.6k in CPU Mark (say 4-5 simultaneous transcodes in Plex), with no iGPU.

If you go with a 9900K you get 8c/16t @ 3.6GHz with a 5GHz boost, a 95W TDP (that's at 3.6GHz, so add on for boost) and around 20.2k in CPU Mark, or 10+ simultaneous transcodes in Plex, plus another 10+ using the iGPU (requires a Plex Pass). So in real terms for CPU Mark, if you compare 4c/8t of each CPU, it's 8.6k vs 10.1k for a seven-generation gap.
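If you're wondering where those transcode estimates come from, here's a minimal sketch using the common rule of thumb of roughly 2,000 PassMark/CPU Mark points per 1080p software transcode (the scores are the ones quoted above; actual capacity depends on bitrate and codec):

```python
# Rough transcode-capacity estimate from CPU Mark scores.
# Assumes ~2,000 points per 1080p software transcode (rule of thumb only).
PASSMARK_PER_1080P_STREAM = 2000

scores = {"i7-2700K (assumed)": 8600, "i9-9900K": 20200}
for cpu, score in scores.items():
    streams = score / PASSMARK_PER_1080P_STREAM
    print(f"{cpu}: ~{streams:.0f} simultaneous 1080p transcodes")
```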

While something like a Ryzen is more power efficient, has better multi-core performance and is a lot cheaper than a 9900K, unless you are running the existing CPU flat out you will save bugger all power by changing, unless you have a specific task in mind that benefits from, for example, a hardware instruction set not present in the existing CPU. Intel are masters of power gating; idle cores use bugger all power.

So the question becomes: is this about power saving? If so, post some numbers and/or work out how many years it will take to recoup the hundreds of pounds you will spend. If it's about performance, then give us a better idea of which areas you would like to see gains in. Also, as you mention Plex, why not use a GPU to do the heavy lifting? A 950/1050/1650 will produce very acceptable results and take a large chunk of the overhead off your CPU if you use hardware transcoding.
 
It's a Ryzen 2700, and that pretty much answers the question based on what you've said. It also confirms my own suspicion: while the CPU isn't the most power-efficient out there, it isn't worth replacing at the moment.

FYI, I'd use hardware decoding if necessary (which is why I was entertaining using an Intel chip), but so far it hasn't proved necessary.
 
In that case, you have what is the most power-frugal 8c/16t CPU of its generation. To save even a small amount of power you'd need to spend £320-ish on a 3700X or better, and you'd likely only save a little. If you use a power meter you'd probably see 50W-ish draw (+/- depending on what you have in terms of GPU/RAM/drives) on what you have now; that's about 1.2kWh/day, or roughly 15.6p/day inc. VAT for me. Saving 10-20% of that (£5.69-£11.38/yr) doesn't seem like a great reason to spend £320 to me.
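For anyone wanting to sanity-check that sum, here's a minimal sketch of the arithmetic. The ~50W wall draw and ~13p/kWh unit price are the assumptions behind the figures above; swap in your own meter reading and tariff:

```python
# Sketch of the saving estimate: assumed ~50 W wall draw, ~13 p/kWh inc. VAT.
draw_watts = 50
unit_price_gbp_per_kwh = 0.13

kwh_per_day = draw_watts * 24 / 1000                          # ~1.2 kWh/day
cost_per_year = kwh_per_day * unit_price_gbp_per_kwh * 365    # ~£57/yr

for saving in (0.10, 0.20):                                   # assumed 10-20% reduction
    print(f"{saving:.0%} saving: about £{cost_per_year * saving:.2f}/yr")
```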
 
I use a TP-Link smart plug with a built-in meter. Over the past month the server averages 2.26kWh per day. I was looking to bring that figure down a bit, but given that the hardware I have is pretty minimal (for a server) I don't think that's going to be possible.
 
That's not its idle draw, but it's actually more useful. Some rough numbers:

3700X: £330
Used 2700 value: £130
Your current usage: 2.26kWh/day
My unit price: 13.713p/kWh inc. 5% VAT (ignore the standing charge; you pay it regardless).

So based on those, you currently spend 31p/day (rounded) or £113.15/yr. It's a £200 cost to change if you sell the 2700 and buy a 3700X. Savings-wise, if we assume 20%, that saves you £22.63/yr, so it would take you almost nine years just to break even; after nine years you'd be a grand total of £3.67 ahead. You'd get a better return shoving the money in a bank account paying 1% interest.
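If you want to rerun that with your own tariff or a different expected saving, here's a small sketch of the same sum (the 20% saving is an assumption; the rest are the figures above):

```python
# Payback sketch using the figures quoted above.
usage_kwh_per_day = 2.26          # smart-plug monthly average
unit_price_gbp = 0.13713          # 13.713 p/kWh inc. 5% VAT
upgrade_cost = 330 - 130          # 3700X price minus used 2700 value = £200
assumed_saving = 0.20             # assumed 20% reduction in consumption

cost_per_year = usage_kwh_per_day * unit_price_gbp * 365    # ~£113/yr
saving_per_year = cost_per_year * assumed_saving            # ~£22.6/yr
print(f"Break-even after roughly {upgrade_cost / saving_per_year:.1f} years")
```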

If you wait, the 3700X will get cheaper, especially when the 4xxx chips launch; your 2700 will be worth less, but not by enough to offset the reduction on the 3700X. If a 3700X ends up somewhere around what a 2700X costs now, then it makes more sense to upgrade.

Right now you pay relatively little to run what you do. £2.17 a week won't buy you a pint in many pubs or a Danish in a coffee shop, and generally people who spend hundreds on hardware/licences and need to run VMs in the way your post suggests aren't that focused on saving 6 pence a day; you can do that by spending a few minutes less in the shower or driving 2-3 miles less a week.

If you want more cores/CPU power, then you have options, but if you want better power efficiency, look at the way you work. Are all the VMs running off the SSD so the array isn't spun up? Things like passing a disk through to a VM work better than hitting the array directly, for both performance and power. Have you looked at core/memory allocation on the VMs? What about the BIOS and its power states? Have you tweaked the fan profile(s)? Also consider undervolting; you may be surprised at what you can achieve at zero cost with a little time invested.
 