Pro workstation build

Hey guys

Two years ago I asked for help with a workstation build, and you were all a great help - that machine (with some minor issues) has been working excellently. Now that our business has grown, it's time to add a second, FP64-optimized machine.

So here are my chosen parts:

My basket at Overclockers UK:

Total: £6,154.40
(includes shipping: £12.60)



Some points:

  • I need a 10Gbps network card. I think it will be around £200; nothing at OcUK. ASUS should freaking just put one on these motherboards.
  • M.2 SSD for OS/software. The SATA drives will be in RAID0 for workspace.
  • I need a good RAID card as well. I guess it will also be around £200-250; again, nothing at OcUK.
  • The PSU is a little overkill, but we might add another GPU down the road, so better to play it safe. It also doesn't hurt to have some headroom given this will be running almost 24/7 (see the rough power-budget sketch after this list).
  • No watercooling this time. This one will be shoved into a server room, so noise isn't an issue.
  • I didn't choose a fancy CPU cooler, just something that gets the job done well. These CPUs are 85W.
  • I wanted to keep it under £6000, but that doesn't seem possible without major sacrifices. Any ideas for trimming a little off the price?
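
Since the PSU bullet above is really a sizing argument, here is a rough power-budget sketch. Only the 85W CPU figure comes from the list; every other wattage and the headroom factor are assumptions for illustration, so swap in the real numbers for the chosen parts.

```python
# Rough PSU sizing arithmetic for the headroom argument above. Only the 85W
# CPU figure comes from the post; the rest are assumed ballpark wattages.
LOADS_W = {
    "2x Xeon (85W each)": 2 * 85,
    "FirePro W9100": 275,
    "board, RAM, drives, fans": 100,
}
FUTURE_GPU_W = 275   # possible second GPU down the road
HEADROOM = 1.3       # keep the PSU running well below its rated output

steady_state = sum(LOADS_W.values())
with_future_gpu = steady_state + FUTURE_GPU_W
print(f"Today:           ~{steady_state} W load, PSU target ~{steady_state * HEADROOM:.0f} W")
print(f"With second GPU: ~{with_future_gpu} W load, PSU target ~{with_future_gpu * HEADROOM:.0f} W")
```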

All thoughts and comments welcome! Thanks a lot.
 
I have two comments to start you off.

1. You should definitely be using ECC memory. I'm surprised and sorry nobody mentioned it last time.

2. That motherboard is the wrong tool for the job: it's expensive and missing features. For about the same price you could get, for example, a SuperMicro X10DRL-IT - dual socket, ECC memory, onboard RAID, and dual 10GBase-T ethernet (no M.2, though). Another similar example would be the ASUS Z10PE-D16 EEB, but I wouldn't recommend that as it's non-ECC.

Edit: 3. I'd be extra concerned about data integrity if I were you, so I would probably use ZFS for the main storage. Have a skim of this http://www.zdnet.com/article/zfs-data-integrity-tested/ and, for why ECC is essential, this: https://hardforum.com/threads/data-integrity-the-risks-of-not-using-ecc-with-zfs-a-study.1689724/
 
We have ZFS on our file servers. Is it necessary to have it on the workstation? From what I understand it's about data integrity and protection. Nothing will be permanently stored on this workstation. What's the benefit in this case?

Crucial has 16GB DDR4-2133 ECC modules. I guess those will be fine.
 
What are you going to be using this workstation for? What software will you be running? Your previous thread says computation and financial analysis and specifies Quadro-specific software, but you're now thinking of getting a FirePro. That does not compute. You might want to wait until Nvidia release their Pascal Tesla cards, or budget for a replacement when they do.

If it's computation-intensive and benefits from multiple cores, consider the many-cored Xeons, like the Xeon E5-2680 v4 or 2690 v4.

If the thing is going to be in a server room, get a rack-mountable case and a redundant PSU.
 
What are you going to be using this workstation for? What software will you be running? Your previous thread says computation and financial analysis and specifies Quadro-specific software, but you're now thinking of getting a FirePro. That does not compute. You might want to wait until Nvidia release their Pascal Tesla cards, or budget for a replacement when they do.

Tesla cards are out of our budget. Besides, the only Pascal Tesla announced is the P100, and it's not on the market yet (and won't be for many more months, maybe not even this year). Its price will be out of our budget too, and they haven't announced cheaper Pascal Teslas. Older Teslas also don't give as much performance in the sub-£2500 range.

This one doesn't necessarily need a Quadro/FirePro. I picked the FirePro W9100 for its high FP64 performance - it's the highest of any non-Tesla card out there. I guess we could have gone with GTX Titan Blacks like last time, but they're no longer on the market.


If it's computation-intensive and benefits from multiple cores, consider the many-cored Xeons, like the Xeon E5-2680 v4 or 2690 v4.

Ideally, yeah, but those are way too expensive for us. We have to keep the overall cost at around £6000 or below.

If the thing is going to be in a server room, get a rack-mountable case and a redundant PSU.

We're a small company. Our server room is just a normal room where we put the cases. :D
 
2. That motherboard is the wrong tool for the job: it's expensive and missing features. For about the same price you could get, for example, a SuperMicro X10DRL-IT - dual socket, ECC memory, onboard RAID, and dual 10GBase-T ethernet (no M.2, though). Another similar example would be the ASUS Z10PE-D16 EEB, but I wouldn't recommend that as it's non-ECC.

This one looks great, but its RAID doesn't seem any different from the ASUS one - they're both off the C612 chipset, so identical, and that will saturate quickly, which is why I need a dedicated RAID controller. It also doesn't have enough PCI-E for a RAID card, a PCI-E SSD (since it has no M.2), and the GPU, with the possibility of future expansion.

The SuperMicro X10DAX is good: it lacks 10Gbps networking and M.2 but has more PCI-E. It seems like all of these are heavily compromised products.

The Gigabyte MD80-TM0 seems decent as well: 10Gbps networking and M.2, but less PCI-E and more RAM slots than we need.
 
Ideally yeah but those are way too expensive for us. Have to keep overall cost below or around £6000.

If you check their prices you'll find they're not that expensive - OCUK want £1417 for the E5-2690 (assuming you can reclaim the VAT). So that's £1834 for two CPUs. If your software will use all the cores, you should consider it.

We're a small company. Our server room is a normal room where we put the cases in.:D

Floorspace is expensive - how much are you paying per square metre? You could take the opportunity to free some of it up by installing a rack: UPS at the bottom, servers in the middle, switch and patch panel at the top. A rack is a few hundred quid - it could easily repay its cost inside a year.
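
A very rough payback sketch for the rack idea. The rack cost, the amount of floorspace reclaimed and the rent figure below are pure assumptions for illustration; plug in your own numbers.

```python
# Rough payback arithmetic for the rack suggestion above. The rack cost and
# the value of the freed floorspace are illustrative assumptions only.
RACK_COST_GBP = 300.0             # "a few hundred quid"
FREED_AREA_M2 = 1.5               # floorspace you might reclaim
RENT_GBP_PER_M2_PER_YEAR = 250.0  # assumed commercial rent

annual_saving = FREED_AREA_M2 * RENT_GBP_PER_M2_PER_YEAR
payback_years = RACK_COST_GBP / annual_saving
print(f"Payback: {payback_years:.1f} years")  # under a year with these numbers
```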
 
If you check their prices you'll find they're not that expensive - OCUK want £1417 for the E5-2690 (assuming you can reclaim the VAT). So that's £1834 for two CPUs. If your software will use all the cores, you should consider it.

Our £6000 budget is for inc. VAT prices. We can reclaim the VAT, so our budget is effectively £5000 if we work with ex. VAT prices.

I think you got the prices wrong. The E5-2690 is £1699.99 inc. VAT for one CPU, so £3400 inc. VAT for two (or about £2833 ex. VAT). Not £1834. I don't know how you got that number.


Floorspace is expensive - how much are you paying per square metre? You could take the opportunity to free some of it up by installing a rack: UPS at the bottom, servers in the middle, switch and patch panel at the top. A rack is a few hundred quid - it could easily repay its cost inside a year.

Zero. The space is owned by our main investor. But racks are a good idea. I'll bring it up with others.
 
I think you got the prices wrong. The E5-2690 is £1699.99 inc. VAT for one CPU, so £3400 inc. VAT for two (or about £2833 ex. VAT). Not £1834. I don't know how you got that number.

Sorry, that was a typo - I meant £2834 - 2x the ex. VAT price.
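
For anyone following the arithmetic in this exchange, a quick sketch of the inc./ex. VAT conversion behind those figures, assuming the standard 20% UK rate:

```python
# VAT arithmetic behind the CPU prices quoted above, assuming 20% UK VAT.
VAT_RATE = 0.20

def ex_vat(inc_vat_price: float) -> float:
    """Convert a VAT-inclusive price to ex. VAT (divide by 1.2, not multiply by 0.8)."""
    return inc_vat_price / (1 + VAT_RATE)

single_inc = 1699.99            # one E5-2690, inc. VAT
single_ex = ex_vat(single_inc)  # ~£1416.66, i.e. the £1417 quoted earlier
pair_ex = 2 * single_ex         # ~£2833, i.e. the £2834 above
print(f"One CPU ex. VAT:  £{single_ex:.2f}")
print(f"Two CPUs ex. VAT: £{pair_ex:.2f}")
```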
 
We have ZFS on our file servers. Is it necessary to have it on the workstation? From what I understand it's about data integrity and protection. Nothing will be permanently stored on this workstation. What's the benefit in this case?

"Necessary" is up to you, but if I were running a lot of I/O and needed confidence in the results I'd go for it. Plus you wouldn't need to worry about the RAID capabilities of the mobo/another card.

This one doesn't necessarily need a Quadro/FirePro. I picked the FirePro W9100 for its high FP64 performance - it's the highest of any non-Tesla card out there. I guess we could have gone with GTX Titan Blacks like last time, but they're no longer on the market.

Depending on how many GFLOPS you want there might be better value options. For example:
  • W8100 has 20% less performance but costs nearly 40% less, still got over 2 TFLOPS, save a grand.
  • Titan X is better "value" still, 1400 GFLOPS, only £840. If 1400 isn't enough you could get two and beat a W9100, and they would cost £820 less.
  • Absolute best value is from 2, 3, or 4-way R9 390s. 1300, 1900, or 2600 GFLOPS for £500, £800, or £1000, respectively. As long as you have the slots, the power, and the parallelisation. I.e. spend the same as a Titan X on three 390s and get 500 GFLOPS more.

[attached chart: FP64 GFLOPS and value comparison of the cards above]
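
As a rough illustration of the value arithmetic behind the multi-card bullets above (the GFLOPS figures and prices are the ones quoted in this post, so treat them as ballpark rather than vendor specs):

```python
# GFLOPS-per-pound arithmetic for the multi-R9 390 options quoted above.
OPTIONS = {
    "2x R9 390": (1300, 500),    # (FP64 GFLOPS, price in GBP)
    "3x R9 390": (1900, 800),
    "4x R9 390": (2600, 1000),
}

def gflops_per_pound(gflops: float, price_gbp: float) -> float:
    """FP64 throughput per pound spent."""
    return gflops / price_gbp

for name, (gflops, price) in OPTIONS.items():
    print(f"{name}: {gflops_per_pound(gflops, price):.2f} GFLOPS/£")
```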


Edit: oops, wrong numbers for Titan X, those are old Titan performance numbers, soz.
 
If you're looking at that, do take the electricity cost into account. It adds up: four 390s will cost you 8p/hour. That doesn't sound like much, but it's about £700 a year assuming they're run flat out, and you need the beefy, expensive PSU to start with. You also need to look at heat much more carefully. A single W9100 will cost about a quarter of that to run.
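
To put that running-cost arithmetic in one place: the 8p/hour figure is the one quoted above, and the per-hour rate itself depends on your board power and electricity tariff, so adjust to taste.

```python
# Back-of-envelope annual electricity cost from a pounds-per-hour figure.
HOURS_PER_YEAR = 24 * 365

def annual_cost(pounds_per_hour: float, utilisation: float = 1.0) -> float:
    """Yearly electricity cost at a given average utilisation (0..1)."""
    return pounds_per_hour * HOURS_PER_YEAR * utilisation

print(f"4x R9 390, flat out:    £{annual_cost(0.08):.0f}/year")       # ~£700
print(f"Single W9100, flat out: £{annual_cost(0.08 / 4):.0f}/year")   # about a quarter of that
```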

You pays your money....
 
It looks pretty good.

To save some cash I would swap the case for a Phanteks Eclipse P400 (saves roughly £50) and the PSU for an XFX ProSeries 1050W 80+ Gold (saves roughly £60), and shop around for the FirePro card. You could save a lot of money on that.
 
The SM951 is the same M.2 drive as the 950 Pro - you could get a 512GB 951 for not much more than the cost of the 256GB 950 Pro (or save money by getting the 256GB SM951).
The only differences are that the 950 Pro comes with a black PCB, retail packaging and a 10-year warranty, versus 2 years and a green PCB on the 951.
You can get a PCI-E slot adaptor for M.2 drives if your mobo doesn't have a slot - I've got an adaptor but haven't tried it yet (I'm using the M.2 slot on the mobo).
 
I hope you're not going to attempt to boot that 950 Pro from that board's M.2 slot, as it doesn't work correctly, if at all. IIRC the board cannot boot from a PCIe device unless the drive has its own bootloader/firmware, like the Intel 750 NVMe drives. We build with those boards daily.

Make sure you get the v4 CPUs; you may have to wait a little while, but like for like they should be the same price with slightly better performance.

The board can be a little picky with memory at times, but Samsung/Crucial/Kingston usually isn't a problem as long as it's 2133MHz ECC registered.

I would also do a search for the Supermicro SNK-P0050AP4 coolers. They may only have 92mm fans, but they're fantastic coolers and extremely reliable.
 
"Necessary" is up to you, but if I were running a lot of I/O and needed confidence in the results I'd go for it. Plus you wouldn't need to worry about the RAID capabilities of the mobo/another card.



Depending on how many GFLOPS you want there might be better value options. For example:
  • W8100 has 20% less performance but costs nearly 40% less, still got over 2 TFLOPS, save a grand.
  • Titan X is better "value" still, 1400 GFLOPS, only £840. If 1400 isn't enough you could get two and beat a W9100, and they would cost £820 less.
  • Absolute best value is from 2, 3, or 4-way R9 390s. 1300, 1900, or 2600 GFLOPS for £500, £800, or £1000, respectively. As long as you have the slots, the power, and the parallelisation. I.e. spend the same as a Titan X on three 390s and get 500 GFLOPS more.

[attached chart: FP64 GFLOPS and value comparison of the cards above]

I'll look into ZFS further.

I'm pretty sure this chart isn't reliable. The Titan X doesn't have 1500 GFLOPS of FP64 - NVIDIA made sure of that when they set the card's FP64 rate to 1/32 of FP32, rather than 1/3 like the original Titan and Titan Black.
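
To make the ratio point concrete, a small sketch. The 1/32 and 1/3 rates are the ones mentioned above; the FP32 baselines (~6.1 TFLOPS for the Titan X, ~4.5 TFLOPS for the original Titan) are assumed figures pulled in for illustration.

```python
# FP64 throughput from an FP32 baseline and the card's FP64:FP32 ratio.
# The ratios are those discussed above; the FP32 baselines are assumptions.
def fp64_gflops(fp32_gflops: float, fp64_ratio: float) -> float:
    return fp32_gflops * fp64_ratio

print(f"Titan X (1/32 rate):       ~{fp64_gflops(6100, 1/32):.0f} GFLOPS")  # roughly 190
print(f"Original Titan (1/3 rate): ~{fp64_gflops(4500, 1/3):.0f} GFLOPS")   # roughly 1500
```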

The 390X value seems to be correct, though. Now that's interesting: around 750 GFLOPS each and much cheaper. We'd need four of them to get the same level of performance, but it would still cost a lot less. I guess I'll go that route. I'll post a revised basket.
 
Sorry, that was a typo - I meant £2834 - 2x the ex. VAT price.

Yeah, that's way out of our price range. It would leave only around £2200 for everything else.


If you're looking at that, do take the electricity cost into account. It adds up: four 390s will cost you 8p/hour. That doesn't sound like much, but it's about £700 a year assuming they're run flat out, and you need the beefy, expensive PSU to start with. You also need to look at heat much more carefully. A single W9100 will cost about a quarter of that to run.

You pays your money....

Interesting thought. I guess I'll have to find some middle ground. Maybe a W8100 and two 390Xs, and only send loads to the 390Xs once the W8100 is saturated. I'll have to do the calculations.
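
As a rough illustration of that overflow idea, a minimal sketch; the device names, per-card job capacities and the dispatch loop are hypothetical placeholders rather than a real API.

```python
# Illustrative overflow dispatch: keep the W8100 busy first and only spill
# jobs onto the 390Xs once it is saturated. All names/capacities are made up.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    max_jobs: int    # how many jobs this card can run at once (assumed)
    running: int = 0

    def has_capacity(self) -> bool:
        return self.running < self.max_jobs

# Primary card first, overflow cards after it.
DEVICES = [
    Device("W8100", max_jobs=4),
    Device("390X-0", max_jobs=2),
    Device("390X-1", max_jobs=2),
]

def dispatch(job_id: int) -> str:
    """Assign a job to the first device in priority order with spare capacity."""
    for dev in DEVICES:
        if dev.has_capacity():
            dev.running += 1
            return f"job {job_id} -> {dev.name}"
    return f"job {job_id} queued (all devices saturated)"

for job in range(9):
    print(dispatch(job))
```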

It looks pretty good.

To save some cash I would swap the case for a Phanteks Eclipse P400 (saves roughly £50) and the PSU for an XFX ProSeries 1050W 80+ Gold (saves roughly £60), and shop around for the FirePro card. You could save a lot of money on that.

I don't want to go cheap on the PSU. £60 isn't much, given that a dead PSU is just too much of an inconvenience. I'll get a backup PSU anyway, but still.

I hope you're not going to attempt to boot that 950 Pro from that board's M.2 slot, as it doesn't work correctly, if at all. IIRC the board cannot boot from a PCIe device unless the drive has its own bootloader/firmware, like the Intel 750 NVMe drives. We build with those boards daily.

Make sure you get the v4 CPUs; you may have to wait a little while, but like for like they should be the same price with slightly better performance.

The board can be a little picky with memory at times, but Samsung/Crucial/Kingston usually isn't a problem as long as it's 2133MHz ECC registered.

I would also do a search for the Supermicro SNK-P0050AP4 coolers. They may only have 92mm fans, but they're fantastic coolers and extremely reliable.

Excellent advice. Many many thanks.
 
I'm pretty sure this chart isn't reliable. The Titan X doesn't have 1500 GFLOPS of FP64 - NVIDIA made sure of that when they set the card's FP64 rate to 1/32 of FP32, rather than 1/3 like the original Titan and Titan Black.

The 390X value seems to be correct, though. Now that's interesting: around 750 GFLOPS each and much cheaper. We'd need four of them to get the same level of performance, but it would still cost a lot less. I guess I'll go that route. I'll post a revised basket.

Oops! You're right, that's the old Titan. The X has a miserable 192 GFLOPS as you're well aware.

Do note Quartz's point though, about power consumption.

Edit: speaking of old Titans, the last five that sold at auction on eBay went for £285 (± £16). That's almost 5 GFLOPS/£, for comparison with my table above - sensational value.
 
Oops! You're right, that's the old Titan. The X has a miserable 192 GFLOPS as you're well aware.

Do note Quartz's point though, about power consumption.

Edit: speaking of old Titans, the last five that sold at auction on eBay went for £285 (± £16). That's almost 5 GFLOPS/£, for comparison with my table above - sensational value.

Yeah I've been looking around now. Used Titans and Titan Blacks give the best value here.
 
You can get W8100s for under £1k these days inc. VAT, which puts it at about 2.2 FP64 GFLOPS/£ on your chart. The only thing is they're 8GB instead of the W9100's 16GB of memory.
 
You can get W8100s for under £1k these days inc. VAT, which puts it at about 2.2 FP64 GFLOPS/£ on your chart. The only thing is they're 8GB instead of the W9100's 16GB of memory.

Great point. I'm going to have to look at the logs to see how memory-intensive the tasks are.
 