Anyone thinking of Racking folders

Yup, that's them. It looks like quite a few of the big boys are in that datacentre, so it's definitely a reseller. Going to go check out the setup; given what I do I think I'll be a reasonable judge (well, actually a bloody finicky one), but I'm not putting up a mission-critical BCP site for a multinational here :)
 
Do let us know how you get on; it's the only provider I've found with even remotely decent rates. Everything else is completely out of my price range, even when clubbing together with a friend.
 
Just to point out, those prices are ex VAT, so it becomes £300/month or £3,000/year.
Having said that, it seems a very good price overall. If we were to club together for a whole rack or something, I'd be interested in a few U's (if that's the right term :p) once I have some hardware to put in them.
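For what it's worth, here's the VAT arithmetic. This is just my inference: the thread only quotes the VAT-inclusive figures, and I'm assuming the standard 20% UK rate (it has been different in the past), so the ex-VAT prices below are back-calculated, not taken from the provider's listing.

```python
# Quick VAT check. Assumption (not stated in the thread): 20% UK VAT rate.
VAT_RATE = 0.20

inc_vat_month = 300.0   # £/month including VAT, as quoted above
inc_vat_year = 3000.0   # £/year including VAT

ex_vat_month = inc_vat_month / (1 + VAT_RATE)   # implied listed ex-VAT price
ex_vat_year = inc_vat_year / (1 + VAT_RATE)

print(f"£{ex_vat_month:.2f}/month ex VAT -> £{inc_vat_month:.2f}/month inc VAT")
print(f"£{ex_vat_year:.2f}/year ex VAT -> £{inc_vat_year:.2f}/year inc VAT")
```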
 
The pricing is remarkably close to the electricity cost of running the computers at home, assuming 4A of power means about 900W. Thanks for the thread; it's additional motivation to start building things in rack-compatible cases from this point on. I've got nothing to put in one of the racks yet though, and won't have for at least a few months.
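A back-of-envelope sketch of that 4A-to-900W figure and the home electricity cost it implies. The 230V mains voltage and the £0.12/kWh tariff are my own illustrative assumptions, not numbers from the thread; plug in your actual tariff to compare properly.

```python
# Is 4 A really ~900 W, and what does that draw cost at home?
# Assumptions (mine, not the thread's): 230 V UK mains, continuous draw,
# and an illustrative tariff of £0.12/kWh.
VOLTAGE_V = 230
CURRENT_A = 4
TARIFF_GBP_PER_KWH = 0.12
HOURS_PER_MONTH = 24 * 30

power_w = VOLTAGE_V * CURRENT_A                 # 920 W, i.e. "about 900 W"
energy_kwh = power_w / 1000 * HOURS_PER_MONTH   # energy used per month
cost_gbp = energy_kwh * TARIFF_GBP_PER_KWH      # monthly cost at this tariff

print(f"{power_w} W -> {energy_kwh:.0f} kWh/month -> £{cost_gbp:.2f}/month")
```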
 
Even if you can find a way of doing this economically, and have your hardware all set up suitable for racking, how easy would it be to physically get your machines to the colocation point? When they fall over, which they will, can you get at them conveniently to fix the fault, hardware as well as "simpler" software reconfigs?
 
Even if you can find a way of doing this economically, and have your hardware all set up suitable for racking, how easy would it be to physically get your machines to the colocation point? When they fall over, which they will, can you get at them conveniently to fix the fault, hardware as well as "simpler" software reconfigs?

You normally get 15 minutes per day of on-site assistance. Not sure if it's the same there. So at least you could ask someone to reboot.
 
how easy would it be to physically get your machines to the colocation point?

Depends where you live in relation to the datacentre, surely?

When they fall over, which they will, can you get at them conveniently to fix the fault, hardware as well as "simpler" software reconfigs?

From the link above, they offer "free reboots" in working hours, and 10 mins/day of "remote hands", which to me sounds like they'll fix issues, do resets, etc. for you. Also, I'd assume you'd set up something to remote in, via RDP, VNC, or whatever takes your fancy; I certainly wouldn't just leave a server there running blind with no way to access it from home. As for hardware, they say "build room access available", which I'd take to mean somewhere to work on the machines. If not, just head down, remove the server, fix whatever's wrong, and pop it back in. I can't imagine they'd just lock your server away and not allow you access to it.

Different providers will offer different terms so the points you've raised are definitely valid, and you'd have to check with the provider before committing really.

Colocation of servers isn't for everyone but if you don't have room at home, or electricity is expensive, or the datacentre can provide a better environment in terms of reliability and cooling for example, it can certainly make sense.
 
With regards to remote management, we've never had any issues with Linux servers having just SSH access, and if need be they offer free reboots during the day, so it's unlikely to be an issue. As biffa said, it's not mission-critical stuff, so if there's an issue during the night, just wait till working hours.

Luckily for us, all the HP servers have iLO2 cards and the Dells have Intel Remote Management Modules, so we're able to perform cold reboots and emulate pressing the power button on the servers if they lock up. We also have BIOS access and remote KVM, which is very useful, especially if RDP or VNC locks up (which it does).
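To make that concrete, here's roughly what out-of-band power control looks like with the generic ipmitool client. A sketch only: iLO2 and the Dell/Intel modules each have their own web interfaces and vendor tools as well, and the BMC address, username and password below are placeholders I've made up, not real details from anyone's setup.

```shell
# Out-of-band management sketch via IPMI-over-LAN (ipmitool).
# 192.0.2.10 / admin / secret are placeholder BMC details.

# Check whether the box is powered on, independent of the OS:
ipmitool -I lanplus -H 192.0.2.10 -U admin -P secret chassis power status

# Emulate the power button / force a cold reboot if the OS is hung:
ipmitool -I lanplus -H 192.0.2.10 -U admin -P secret chassis power cycle

# Read temperature and fan sensors without touching the OS:
ipmitool -I lanplus -H 192.0.2.10 -U admin -P secret sdr list
```

The point is that all of this talks to the management card, not the operating system, so it keeps working even when RDP, VNC or SSH are dead.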

With regards to hardware faults, it looks like they have a build room, so it's a case of driving down there and doing the work yourself, or paying for remote hands. If it's just swapping out a failed graphics card, I'm sure you could fit that into the 15 minutes free that you get each day, provided you courier the part to them first.

Again, this is all assumption here; provided the place checks out OK, I'd certainly be interested in sharing some space. The reason it's cheaper is that it's an economy service: for example, this place only offers 5 minutes' worth of UPS in the event of a power outage, whereas other DCs would offer full redundancy. Not really an issue for me personally, as it's not mission-critical stuff.
 
On a closely related point, how do data centers feel about water cooling? There's the risk of a 4U box leaking and damaging everything below it, and conceivably of getting water on the incoming power cable. So on balance I expect resistance.
 
On a closely related point, how do data centers feel about water cooling? There's the risk of a 4U box leaking and damaging everything below it, and conceivably of getting water on the incoming power cable. So on balance I expect resistance.

If you take a full rack, it's your responsibility what you put inside it. Really, though, there's no need to water-cool a PC in a rack: with front-to-back airflow and a nice 14-18°C cold-side temperature, you should have the coolest-running machine you've had in a while!
 