Home render farm

Associate · Joined 27 Nov 2008 · Posts: 5
Hi

Just wanted to ask people's advice about a home render farm for visual effects and animation. I'm upgrading my setup to be faster and more space efficient.

At the moment I have an i7-3930K desktop with 32GB as my workstation, 3 × i7-3770K desktops with 16GB each, and an HP XE 8400 8-core Xeon with 16GB RAM.

I kinda gathered this setup in a rush for an animation project which needed a lot of rendering. Now I'm trying to organise everything a bit better.

I just bought a QNAP TS-869 Pro 8-bay NAS for my storage, and now I'd like to organise my render setup into a smaller space, i.e. a rack-mounted setup. It has to be cost effective, ideally reusing the hardware I already have and adding to it; or if it's more cost effective to sell what I have and start again, then that's fine too. I don't really want to spend more than another £1000 on render nodes. I've noticed some decent Dell PowerEdge units:
DELL PowerEdge C6100 4 Node 8x Xeon Quad Core 2.26GHz 32GB DDR3 Cloud Server VTx


But I'm aware I'd be running these off a domestic power supply, and these things can be thirsty. Ideally I also need to keep the noise and heat at reasonable levels, as I'll be in the room with these machines.

If anyone has any tips or ideas on setups and hardware I'd really appreciate it; I think I made the wrong choices last time around!

Thanks in advance

Marc
 
V-Ray in which package? Max? Most of my exposure is with V-Ray and some lesser-known renderers for Revit and SketchUp. V-Ray, Arnold and Clarisse iFX are all natively multi-threaded CPU renderers; only V-Ray RT (which only works in Autodesk 3ds Max, iirc?) allows for GPU acceleration. All that means is you want lots of CPU GHz and lots of CPU cores.

Given what you've said about not wanting to spend more and using the kit you already have:
Look into setting up a render farm (render slaves) by networking all the existing machines together. The implementation varies between each of the aforementioned products, but there's lots of documentation for setting up a V-Ray render farm in Max; I'd say that's the easiest way forward.
Keep an eye on the hardware utilisation of each of your render nodes; you might be able to unlock a lot of additional performance easily, for instance by installing more RAM. Network and disk usually aren't a bottleneck, but don't hold me to that. :)
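To make "keep an eye on utilisation" a bit more concrete, here's a rough rule-of-thumb sketch in Python. The thresholds are my own guesses, not renderer-specific; feed it whatever utilisation figures Task Manager or your monitoring tool reports:

```python
def probable_bottleneck(cpu_pct, ram_pct, disk_wait_pct=0.0):
    """Given utilisation samples from a render node, guess the limiting resource.
    Thresholds are rough rules of thumb (assumptions), not renderer-specific."""
    if ram_pct > 90:
        return "ram"      # likely swapping: more RAM may unlock easy performance
    if disk_wait_pct > 20:
        return "disk"     # waiting on storage: check the NAS / local scratch
    if cpu_pct > 85:
        return "cpu"      # fully loaded: only more cores/GHz will help
    return "unknown"      # under-utilised: check network or job scheduling

# e.g. a node pegged at 95% CPU with RAM headroom is genuinely CPU-bound:
print(probable_bottleneck(95, 40))  # -> cpu
```

A node that sits well under 100% CPU during a render is the interesting case: something else (RAM, disk, the job scheduler) is holding it back, and fixing that is much cheaper than buying more nodes.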

In the future:
Take a look at online rendering services which charge per GHz-hour (GHzH). If they fit into your workflow they can be a really good solution for small studios. Internet connectivity is often the biggest issue, especially when working with texture-rich assets.
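As a back-of-the-envelope example of how GHzH pricing works: time a test render locally, multiply by your box's aggregate GHz to size the job, then by the farm's rate. The £0.02/GHzH rate below is made up for illustration; check a real farm's pricing:

```python
def render_cost(frames, mins_per_frame_local, local_aggregate_ghz, price_per_ghzh):
    """Estimate an online farm's bill from a local test render.
    Job size in GHz-hours = local render hours * the local box's aggregate GHz
    (cores * clock), assuming the farm scales roughly linearly."""
    hours_local = frames * mins_per_frame_local / 60.0
    ghz_hours = hours_local * local_aggregate_ghz
    return ghz_hours * price_per_ghzh

# 250 frames at 6 min/frame on a 6-core 3.2GHz box (19.2 GHz aggregate),
# at a hypothetical £0.02 per GHzH:
cost = render_cost(250, 6, 6 * 3.2, 0.02)
print(f"£{cost:.2f}")  # -> £9.60
```

The useful part isn't the exact number, it's that you can compare the quote against the electricity and hardware cost of rendering the same job at home.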

If that doesn't work out, the Dell VRTX is designed for small businesses to virtualise their services without taking up a ton of space, and it's fairly quiet. It's a good solution if it does what you need and not much more. Besides that, you could have a look at the Dell C-series, but I think they're out of your price range at the moment.

All of this is completely finger-in-the-air without knowing your usage, but you shouldn't need to throw away your existing kit too soon.
 
I'm racking my brains trying to remember the manufacturer, but years ago a company I worked for supported an animation studio, and those guys had two or three of these 1U boxes which were essentially blade servers: two individual servers in one rack U. It was better on power and space than two separate servers but had all the advantages of being two boxes (e.g. for being tasked with different renders, or for when things were RAM-bound; the RAM-to-CPU ratio seemed to be a big performance-tuning factor).

If I remember I'll post back :)
 
Supermicro have tons of products like this. If you can find a reseller to support it then it's definitely an option. The largest non-technical issue with render farms is balancing cost with utilisation: like having a car which can do 200mph, most of the time you don't need the speed, but it's great to have when you do.
 
Strangely enough, OcUK will sell Supermicro kit B2B. Otherwise Workstation Specialists (Derby) are friendly and know their stuff.

Rendering is usually CPU-bound. Have you considered a number of overclocked systems? I'm trying to justify watercooling a quarter rack full of single-socket systems as a home compute cluster, but that would be an expensive toy.

Edit: Supermicro sell a 3U box containing eight single-socket LGA 2011 systems for about £3k, without RAM or processors. Serious power-to-volume-to-price ratio. It would be crazy loud though (the main reason I mention water above).
 
I don't want to overclock the render nodes, as they'll be hot enough. I have my 3930K watercooled and overclocked to 4.4GHz by Yoyo Tech London, but I found it unstable over long periods, especially when simming for several days. I find it very frustrating losing work, and have since removed the overclock even though I paid a premium for it.
 
Heat, Cooling, Power and in your case, Noise, are going to be your worst enemies.

Before you go and blow a ton of money on more hardware, take a look at the software side of things and let that guide you towards the hardware. It's no good loading up on dozens of cores if the software doesn't scale that well; or maybe you can get away with lower clocks and more cores, which saves on power, etc.
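As a toy model of that cores-vs-clocks trade-off: if your renderer scales close to linearly, throughput is roughly cores × GHz, discounted by some efficiency factor. The linear scaling and the 0.9 factor below are assumptions you'd want to verify against your actual scenes:

```python
def frames_per_hour(cores, ghz, efficiency=0.9, frames_per_core_ghz_hour=1.0):
    """Crude throughput model: assume rendering scales ~linearly with
    cores * GHz, discounted by a parallel-efficiency factor (assumption,
    tune per renderer). frames_per_core_ghz_hour is an arbitrary unit."""
    return cores * ghz * efficiency * frames_per_core_ghz_hour

fast_few  = frames_per_hour(cores=6,  ghz=4.4)   # one overclocked hexacore
slow_many = frames_per_hour(cores=16, ghz=2.26)  # e.g. one C6100 node pair
# slow_many wins on raw throughput here despite the much lower clocks
```

The catch is everything the model leaves out: per-frame RAM needs (each node must hold the whole scene), licence costs per node, and power draw, which is why measuring your own jobs first matters.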

There's no rush. Do some research on what you want to achieve, what other people have done and what's the generally agreed best practice.
 