Home Server Board

Associate · Joined 3 Jan 2007 · Posts: 462 · Location: London
Apologies for the double post (this also in Hardware>Motherboards) but I forgot this section was here :)

Right, I'd go into detail, but it would start as a sob story about my worst fortnight with hardware in my life, and probably descend into a full-blown rant until I got to the part about nVidia and their shoddy chipsets, at which point something noisy and probably messy would happen.
Suffice to say I find myself in need of a motherboard.
Needs:
S775/Core2 support
ICH10R southbridge

Would be nice:
Integrated graphics so I don't have to pranny about with finding a new card, so I've been looking at the G45 chipset.
£120 max. but this can be adjusted for the perfect bit of hardware

A server-class motherboard would be best as it is to be used in a 24/7 home server. I've seen things close, like the Supermicro C2SEA, but that needs DDR3, and the Asus P5Q-EM, but that isn't a server-class mobo with its non-solid-state caps and limited expansion.
Any suggestions?

Also, any experience with using the RAID function on an ICH10R, or would the software RAID from Server 2003 do me fine?
 
If you are seriously considering a server-class board for the reliability, then surely you aren't going to rely on the ICH RAID or software RAID.

I use it in my games rig where I just RAID0 some drives, but I wouldn't rely on it for anything I wanted to keep.

For the time being, yes. I can afford £100-150 on a reasonable board, but can't spring £300+ for a decent RAID card on top at the mo as I've just bought two other mobos (fortnight of hardware hell, remember?) :D
What I want is a solid core. A mobo that'll stay up and stable for weeks at a time, and aside from the odd driver update I might do, still be reliable in 2-3 years time. After I've got that, then I'll sort out dedicated RAID later on in the year.
Having said that, how reliable is dedicated hardware RAID anyway? At least with software RAID, if the board dies I can drop the drives in another S2K3 machine and recover. If a £400 Adaptec card dies and it's out of warranty, what do I do then?
 
"In many ways, you'd do better to buy two identical cheap boards and if one dies simply swap them over."

That was the plan two years ago. Unfortunately, I picked two nVidia 680i SLI boards; one for the server with a RAID5 array, and one for the gaming rig for mad overclocks. The list of issues with this quite frankly 'beta-at-best' chipset and weekends filled with harsh language, multiple reinstalls, lost arrays, and RMAs mean that this is their last gasp. After I've recovered my current array, I'm getting shot of both boards.
I do take the point that reliability is what I need, rather than a fully fledged server board. I have no need any more for PCI-X slots, and don't have the cash or the need for dual processors and multiple banks of memory; I just figured that a 'proper' server board would be built to be the most reliable. In that vein, I'm thinking the Supermicro C2SEA is a good halfway house: a board from a company with a good rep for reliability, a very stable mobo according to the reviews, and low cost, with just the small penalty of demanding DDR3. Opinions?
 
Eriedor:
I have to agree with the others here: At work we play with VMs every now and again and get by fine on some pretty slow C2Ds with less than 4GB RAM. Unless you're running several concurrently I doubt you'll need the kind of spec you're looking at.

Skidilliplop:
Good points, well made, but for the time being it's the initial set-up costs that mean I can't go with hardware RAID. Seems to me that hardware RAID is quicker and more reliable, but has larger initial costs (to the tune of £100s extra), and when the hardware fails you need either a spare card on standby, time to wait for an RMA (assuming it's still under warranty), or cash for a replacement card; and if the model you were using is no longer available, what do you do then? Software RAID is more prone to breaking and a lot slower, but at least with a Server 2K3 RAID, if the machine breaks you can drop the drives in another machine and rebuild from there.
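For what it's worth, that "drop the drives in another machine" recovery is pretty painless on Server 2003: a dynamic-disk software RAID set moved to a new box shows up as "Foreign" and can be imported from diskpart. A rough sketch (the disk number below is made up — check `list disk` on your own machine first):

```
rem On the rescue machine, after attaching the old drives:
diskpart

rem Inside diskpart, find the disks marked "Foreign"
list disk

rem Select any one disk from the foreign group...
select disk 1

rem ...and import the whole dynamic disk group (the RAID set comes with it)
import
```

The same thing can be done through Disk Management (right-click the foreign disk, Import Foreign Disks), which is probably the saner route if you're not comfortable in diskpart.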
Out of interest, does anyone know if the ICH10R is proper hardware RAID, or is it this 'hardware-assisted software RAID'?
 
Ta. Probably won't bother with it then. After spending two years fighting with nVidia's RAID, it just seems to add an extra layer to the whole shebang, and an extra layer is an extra thing to break. Entirely software or entirely hardware RAID, or nothing, I think.
Was keen on the Supermicro C2SEA, as all the official reviews are good, but after reading the user reviews at Newegg I'm not too sure...
 
Ordinarily, I'd say no, but considering I had/have (unknown until mobo replaced) a RAID5 array on my server syncing to a RAID5 array on my NAS and between hardware failures still managed to lose both at the same time, I'd say I'd prefer to have them rather than not. If in spite of the hardware and redundancy I've thrown at my data I can still lose the lot, I'd say moving from two syncing arrays to two syncing JBODs will only increase the frequency of the losses so I'll stick with RAID.
 
Well... Basically, I had a 4x500GB RAID5 array in my server connected to the nVidia chipset's RAID controller. I then had 3x750 discs in a Thecus NAS, again RAID5'd. Then (home server on a budget, remember) a robocopy script copying everything from the server to the NAS every Monday night. Well, one of the HDs in the NAS failed (Seagate firmware problem), and when I pulled it I also noticed another of the HDs was actually the wrong model, albeit a 750 too, so I ended up sending them both back to where I bought them, hence no more NAS for the time being. I figured after a year of 24/7 flawless operation, the server would stay up for a week or two without it. A week later, the 680i board in my PVR box died with the 'double dash of death', so no server board backup (getting nervous now, but what are the chances of two boards failing before the NAS is up again, right?). Then, a week later, with the NAS still not back up cos Seagate were dragging their feet with the firmware fix for RMA'd drives, the server 680i failed with the same double dash error. So, in the space of two weeks: no NAS backup, no backup mobo to swap out if it all goes to hell, and then no server board :) Ta-daa!!!
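For anyone wanting to set up something similar, the Monday-night sync above only needs a one-line robocopy job plus a scheduled task. Paths, share names and the log location here are invented for illustration:

```
rem backup.cmd - mirror the server array to the NAS share (hypothetical paths)
robocopy D:\data \\thecus\backup /MIR /R:2 /W:10 /NP /LOG:C:\logs\backup.log

rem Schedule it for Monday nights (run once, as an administrator)
schtasks /create /tn "NAS backup" /tr C:\scripts\backup.cmd /sc weekly /d MON /st 23:00
```

Worth noting that /MIR deletes anything on the target that's gone from the source, so a deletion or corruption on the server gets faithfully mirrored on the next run — one reason a weekly sync isn't quite a true backup.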

I've read lots of reviews now and looked at loads of Supermicro and Tyan boards, and the user reviews on Newegg are pretty damning of the lot. Considering what you guys say too, I'm now thinking maybe a decent desktop board (Asus P5Q-E?) is reliable enough for me while being expandable. Software RAID will have to do for a month or two until I can scrape together the readies for an Adaptec 31205 or 3805. You then reckon leaving the NAS as JBOD, right?
 