Everything except serious gaming. I have an as-yet-unfinished gaming desktop build based around an i5-3550 and a 1GB Radeon 7850, but I need something cheaper to run for everyday use. My 10-year-old Dell laptop (used as a desktop) with Win XP is showing signs of its age and will need to be retired to backup duty when MS support for XP ends.
I thought about just buying a new laptop, but you've got to spend at least £350 to get something decent new, and as I had a Radeon 7770 GPU from the initial build of the gaming rig it just seemed stupid not to use it. So really I want to put together the best-performance, cheapest-running-cost build I can get at the same sort of price.
CAT-THE-FIFTH
Your calculations look good and I had come to a figure of around £12/year extra cost using the FX-6300. You'd also have to factor in the GPU power consumption on top, as for the day-to-day stuff I'd use the i5's or i3's integrated graphics instead of the 7770.
The monitor is a 35W max, VGA-only 5:4 Dell LCD which I use with my current laptop, and the PSU for the build will be the 500W unit that comes with a Coolermaster Elite 330U case/PSU package.
Let's call it £15 extra a year for the FX-6300, and I expect it would actually be more than that, as the "Active Idle" power figures I've seen suggest the AMD CPU uses twice as much power as an i5.
Now if I'm going to be using the rig for at least three years (I'd hope to be able to use it for much longer), and with electricity prices almost certain to rise, it is not unreasonable to think that the £50 initial price difference between the FX-6300 and i5-4440 will have evaporated. In other words, the higher running costs, small as they may seem, mean that over such a period the real cost ends up very similar. After this the i5 becomes increasingly cheaper in comparison.
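To show my working, here is the back-of-envelope sum. The wattage gap, daily hours and unit price are my own assumptions rather than measured figures:

```python
# Back-of-envelope comparison of FX-6300 vs i5-4440 running costs.
# All inputs are assumptions, not measurements.
extra_watts = 35        # assumed average extra draw of the FX-6300 rig at the wall (W)
hours_per_day = 8       # assumed daily use
pence_per_kwh = 15      # assumed electricity price

extra_kwh_per_year = extra_watts * hours_per_day * 365 / 1000
extra_cost_per_year = extra_kwh_per_year * pence_per_kwh / 100   # in £

price_gap = 50          # £ saved up front by choosing the FX-6300 over the i5-4440

print(f"Extra running cost: £{extra_cost_per_year:.2f} per year")
print(f"Years for the £{price_gap} price gap to evaporate: {price_gap / extra_cost_per_year:.1f}")
```

On those assumptions the gap closes in a little over three years; a bigger wattage gap or a higher unit price only shortens that.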
If you accept the figures, then why would anyone want to use the FX-6300 in a build like this? Similar cost over the minimum expected period of use, but the i5-4440 performs better in almost every benchmark.
The similarly priced FX-8350 performs significantly better only when overclocked, and even at stock it uses 41W more than the i5-4440 (125W against 84W), so it rules itself out without even having to do the calculations.
It is the i3-4340 that still interests me. Undoubtedly it is cheaper initially and over time too, but does it provide a significantly better overall build cost-to-performance ratio than the i5-4440?
Loads of people I know use the FX6300. All I see in your reasoning is a justification for a more expensive CPU upfront. So a more expensive CPU performs better than a cheaper one. For most people on a budget, getting an FX6300 and spending the extra on a better graphics card makes more sense. It costs you less in the long term.
My power consumption figures are based on gaming power consumption figures from reviews which tested actual games, not rubbish like LinX testing.
Moreover, what about games where the power consumption difference is smaller? It would take something like a decade to make the difference back.
Also, I am assuming the power consumption difference between an FX6300 and a Core i3, not a Core i5, and since a Core i5 will consume more power than a Core i3 the gap would be smaller still.
So it's probably less than £10 a year.
That is assuming you are using a 990FX motherboard, not a lower-end, more energy-efficient 970-based one.
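To put rough numbers on it (the 30W gaming gap and 15p/kWh unit price are my assumptions, not measured figures), the cost scales directly with how many hours a day you actually load the CPU:

```python
# Rough annual cost of a ~30W gaming power gap at various daily gaming hours.
# The 30W gap and 15p/kWh unit price are assumptions, not measured figures.
gap_watts = 30
pence_per_kwh = 15

for hours_per_day in (2, 4, 6):
    kwh_per_year = gap_watts * hours_per_day * 365 / 1000
    cost = kwh_per_year * pence_per_kwh / 100
    print(f"{hours_per_day} hours/day of gaming: about £{cost:.2f} a year")
```

Unless you are gaming six hours a day, every day, you stay under £10 a year.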
You do realise you could save more money by plonking on a jumper more often, not boiling too much water in your kettle, or simply not using the oven too much??
Moreover, you really need to look at usage habits too. Keeping your PC idling 24/7, or on for longer than needed, will add more to your power consumption over time.
Also, if you already have a main rig, why are you wasting money on another rig then??
Just stick with your Core i5 3350P rig and HD7850 1GB.
In fact, better still: sell your HD7770 and HD7850 1GB while they still have value. The 1GB of VRAM on both is a limitation already and will be more so over the next two years. Get a better card and use it with your Core i5 3350P.
Forget about secondary rigs to save power.
My Xeon E3 1220 and GTX660 based rig, which is a mini-ITX system, consumes around 50W to 70W at idle and low load, which matches an HD7850 2GB I had in it for a while. That is with an old Corsair HX520W.
There is no real point in getting another rig, since another Core i5 and an HD7770 will consume about the same.
You would not be saving anything.
Only a rig running on an IGP will save money, and even then it would have to be something like one of the AMD chips which have a decent IGP and quite decent low-load and idle power consumption.
An A6 3670K based rig I built with a £20 FSP 400W jobbie (not the most efficient at low loads) consumes between 35W and 47W at the wall when idling and when web browsing.
Even if you were to buy or build a rig where you got the idle and low-load power consumption down to 25W to 35W, it would only save you £12 to £21 a year.
So even if you only spent £300 to £400 on a secondary rig, it would take at least 15 years to get the money back, and that is with the rig on for 8 hours each day, every day of the year.
Even one of those Intel or AMD nettops, which probably consume 10W to 15W at the wall under low-load conditions, will only save you £19 to £28 a year. They cost around £200 if you include Windows, so again it will take at least 5 years to get the upfront purchase cost back in energy savings.
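As a rough sketch of where those payback figures come from (the watt savings, purchase prices and 15p/kWh rate are assumptions on my part):

```python
# Payback time for a lower-power secondary rig, assuming 8 hours a day of use
# and 15p/kWh. The watt savings and purchase prices are rough assumptions.
hours_per_day = 8
pence_per_kwh = 15

def annual_saving(watts_saved):
    return watts_saved * hours_per_day * 365 / 1000 * pence_per_kwh / 100  # £ per year

for label, watts_saved, cost in (("efficient secondary rig", 30, 350),
                                 ("nettop", 50, 200)):
    saving = annual_saving(watts_saved)
    print(f"{label}: saves about £{saving:.0f} a year, payback about {cost / saving:.0f} years")
```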
Also, switchable graphics is only available on some Intel motherboards, and those are pricier than normal anyway.
So again what is the point??
You are spending pounds to save pennies.
If it is worrying you that much, then just get a tablet.
Yes,a tablet.
They consume mere watts (even less than a watt) and NO desktop or normal laptop can match them.
They are portable too.
Edit!!
The Coolermaster Elite 500W is a horrible power supply. That is the PSU bundled with the 330 cases.
It's Solytech rubbish:
http://www.realhardtechx.com/index_archivos/Page364.htm
It does not even have 80+ certification, which is worrying, and it is under 80% efficient:
http://www.coolermaster.com/powersupply/office-home-elite-power/elite-power-v2-550w/
However, more importantly, it probably uses crap capacitors and a cheap cooling fan.
That is why I would ditch it.
That under-£20 FSP jobbie is more efficient and probably better made.
Use part of that £350 budget towards a better quality PSU.
I would sell both your current graphics cards. Get something with more VRAM which is a bit faster.
Pocket the rest of the cash and it will mean the electricity bill for your PC is paid for a few years.
Sorted.
IMHO of course.
Second Edit!!
I looked around for reviews comparing the FX6300 and a Haswell Core i5.
PCGamesHardware has some figures.
The difference between an FX6300 and a Core i5 4570 is around 15W to 45W at the wall, depending on the conditions, i.e. idle, video encoding, image editing and running Crysis 3.
The 45W figure is when running Crysis 3, which will use up to six threads. A more lightly threaded game will see even less of a difference.
The FX6300 consumes around 14W to 83W more at the wall than a Core i3 3220 under similar conditions, but is 25% faster in the game.
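Translating those at-the-wall gaps into money, again with an assumed 15p/kWh and an assumed few hours a day of that sort of load:

```python
# What those at-the-wall gaps roughly cost per year, assuming 3 hours a day
# of that workload and 15p/kWh (both figures are my assumptions).
hours_per_day = 3
pence_per_kwh = 15

for label, gap_watts in (("FX6300 vs Core i5 4570, smallest gap", 15),
                         ("FX6300 vs Core i5 4570, Crysis 3", 45),
                         ("FX6300 vs Core i3 3220, largest gap", 83)):
    cost = gap_watts * hours_per_day * 365 / 1000 * pence_per_kwh / 100
    print(f"{label}: about £{cost:.2f} a year")
```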
Funnily enough, it seems the 45W Core i5 4570T dual core (which runs up to 3.6GHz) consumes more power than the Core i3 3220. It's faster (only 11% slower than the FX6300), but the power consumption gap from the FX6300 under Crysis 3 is around 67W at the wall. It makes me wonder if the bog-standard Core i3, which has a higher TDP, will consume a bit more.
This is not surprising, as there seems to be no 2C GT2 die, only a 2C GT3 die. So it means the Haswell Core i3 CPUs are salvage parts, most likely of the 4C GT2 die, instead of custom parts like with IB.