PC build for Animation use - Advice needed please

Why is everyone suggesting gaming cards for 3ds Max? You should be looking at something like a Quadro FX 1800 (that should be fine), especially considering mental ray is now coded to use CUDA (not in Max yet).

OK, I do this type of work for a living, and for £1000-1500 you can go for something along the lines of:
i7 920 - quads are needed now
Case, PSU, CPU Cooler, motherboard of choice etc
12GB CAS 7 RAM - personally, more RAM is better (you could probably get away with 6GB); CAS latency doesn't make any real difference, but I always go for the faster stuff. RAM is used heavily in modelling and rendering, especially with large scenes/models and image sizes.
2x1.5TB drives - storage etc
An SSD - as a scratch disk/temp work area to speed up accessing materials etc.; slow access here drags render times out.
Optical
Quadro FX 1800 - designed for CAD work in Max etc.; you will see a difference while doing the work, but not while rendering (at present)
Windows 7 x64.

If you need the money for a screen, knock out the SSD and 6GB of the RAM, and then buy the 23" Samsung F2380 monitor for about 240 quid :)

That should be £1500 or thereabouts, and to be honest that will last her all the way through uni - no need for upgrades except maybe RAM. But this is NOT a gaming machine; this is a 3ds Max/Photoshop etc. orientated machine.
 
A minor thing to add to lsg1r's spec, which is by far the most sensible in this thread so far.

The 920 should be competently overclocked, probably to around 3.6GHz. At such modest speeds it's possible to have considerable faith in stability; I'm thrashing mine at 4GHz and can't get it to fall over, but I'm on water and I can't get 4.2GHz stable. The faster the better, as long as you can be absolutely sure it isn't going to get any calculations wrong that it would get right at stock.

Following on from this, 12GB at CAS 7 is optimistic. The IMC isn't going to like it, though it is possible. I'm using 1600MHz CAS 8 stuff; it'll do 1600MHz at CAS 8 at stock, but when overclocked I can't yet get it above 1333MHz CAS 8. I don't suggest spending any extra money on the RAM to get lower latencies - as above, they contribute little, and I believe they will have a ruinous effect on CPU clock frequency.

This is a build for someone else, iirc. My missus couldn't overclock a computer, so if I made one for her I'd have to clock it myself. Either the person building this needs to spend a week or so (potentially much longer if new to X58) overclocking the new build, or needs to get OcUK to do it, in which case 12GB isn't available anyway.
 
Sorry, I'm going to have to open up the old debate - Quadro cards just aren't worth the investment for the return. Maybe in a CAD package, but for 3D, nah. I've got both a Quadro 4500 and a 9800 GT in similar workstations, and orbiting 1 million polys, the 9800 GT manages 60fps, the 4500 50fps. With gaming cards at the speeds they run these days the CPU is the bottleneck, not the GPU. As long as you research your chosen app to make sure about driver compatibility, there's just no need to get anything more.
 
Get the i7 for sure, dude - you can overclock the nuts off it. There are plenty of guides on the net on overclocking, and there are many people here who can help.

I would build along these lines:

i7 920
Patriot 12800 6GB DDR3 (I would recommend 12GB if you can afford it)
Gigabyte EX58-UD5
Nvidia GTX 260
Crucial M225 64GB SSD (For OS and Apps)
Samsung 1TB F3 (I would put two of these in RAID 1 and use them for all your data)
Corsair H50 Cooler
Corsair Power Supply
 
The SPECviewperf numbers are no doubt impressive, though I'm sure that Nvidia go to great lengths to make sure their drivers are optimised for that test. I just wonder how they apply to real-world application usage. I gave an example of personal experience with the cards I've owned, that's all.

The point I'm trying to make (though maybe not very succinctly) is that a gaming card will do everything needed for an app like Max or Softimage. If I can get 50fps with a million polys, why would I need to spend money on a Quadro when that money could be put to better use elsewhere?
 
I'm with Matt on this. I've tested many Quadro models over the years and I haven't used a single one that's been worth the cash over a regular gaming card. Until 3ds Max utilises the GPU properly (by at least supporting DX10/11), it still comes down to cost.

Fact is, you might get say 50fps from an 8800 in one of your scenes, and maybe 55fps or 60fps with a Quadro. Is it really worth shelling out probably four times the cash for? Not for me it isn't.
 
Interesting. Are these in situations where you are CPU limited? It's difficult to imagine the benchmarks are completely independent of reality, as people put considerable effort into them and they look quite convincing while they're running, even down to seeing the difference in frame rate with different configurations. If you're CPU limited, a Quadro isn't going to help at all. However, if they offered nothing over gaming cards under any circumstances, they wouldn't sell.

The driver optimisation is definitely present. An 8800 GT running Quadro drivers performs significantly better than the same 8800 GT, at the same clocks, running GeForce drivers. That's from my experience, and from benchmarks on my hardware. I can also offer that increasing GPU clocks made very little difference to benchmarks with the Q9550 at 3.4GHz, but improved them almost linearly with the Q9550 at 3.8GHz. In gaming terms there's no way an 8800 GT could be holding a Q9550 back, hence my question regarding CPU/GPU bottlenecks in CAD.

Again looking forward to your replies, as I'm yet to have a play with a real quadro.
 
Certainly CPU limitation comes into it, especially when running an animation package: as soon as you scrub frames or press play, it's all about the CPU, and the GPU is a very minor player. With AutoCAD or SolidWorks, the focus on the GPU is much greater.

I don't doubt that any of the most recent Quadros would have good frame rates for viewport work, and be reliable (driver certification is probably one of the biggest benefits of a Quadro, so you can be sure it will play nicely with your particular app), but if a gaming card is good enough for the rather limited job it has with Max/Maya/Softimage, it can be a hefty price to pay for that peace of mind.

The last Quadro I used was an FX 5600 at work, on a very heavy scene with multiple characters and a cityscape behind. When I took the scene I was working on home, I could still orbit and do everything I needed to finish the job on the 8800 GTX I had at the time. Though in that instance the Quadro had the edge on frame rates, the 8800 GTX still managed a good enough job, and for a fraction of the cost.
 
Listen to the Animators!

As another poster said, there's some very good, but also some genuinely terrible, advice from respondents on this thread who seem to be treating it like yet another gaming rig question, without any apparent understanding of how animation software differs from games.
Without naming names, whoever said that animation doesn't require much CPU power gets a special Dunce award and a clip round the ear - for presenting the most wrong answer possible as his only advice. I have worked as an animator, in both 2D and 3D, for a number of years (using LightWave 3D, 3ds Max, After Effects and Premiere, among others), and I can say that the CPU is by far the most important component of any animation/multimedia rig.

The other animators who responded have it right: practically any gaming-grade GPU these days will be more than enough power. The GPU is typically used only in the preview window of 3D software packages, and you would have to create a pretty demanding scene to tax it. Because the preview window is only really used for positioning keyframes etc., you can quite happily suffer drops down to 15-20fps with it still remaining usable for work. Not saying you'd want it that low, but it's just an illustration that a focus on the GPU here is wrong.

Advanced user interfaces, 2D/3D rendering, encoding, just about any other multimedia production task you care to mention - it's all done in software on the CPU. CPU and RAM come before all else. I'd go for an i7 without hesitation, then 6-8GB of RAM, and as good a graphics card as your budget allows after that. The speed of the RAM is also just as important as the quantity.

Also, I might be on shaky ground saying this on an Overclockers forum, but for heaven's sake, if she's using this rig for important work, don't overclock it. Yes, you may feel you can do it safely, but if it goes wrong in the middle of her course and she knows about the bleeding-edge tweaks you've been performing, you aren't going to be flavour of the month, to put it lightly.

Another thing that hasn't been considered here yet: dual monitors. Go round any animation/editing studio in existence today and you won't see a single-monitor system. This is because all multimedia production software has extensive and complex user interfaces that will eat up all the screen space you can throw at 'em and then some. I would say, in my experience, that it's more useful to have two smaller monitors with a higher combined resolution than it is to have one large widescreen. There is also a lot to be said for the easy compartmentalisation this affords... preview on one screen, tool palettes on another etc. becomes very important on a project. As well, don't skimp on the mouse! That thing's going to be used for hours, and to make single-pixel adjustments on value sliders etc. Listy, your most recent spec above looks to be almost perfect really; if you can just squeeze another monitor in there, it will be a dream animation rig.

For editing, get her to consider the features of Adobe After Effects instead of Premiere. It's a full-blown professional 2D editing/compositing suite, a great complement to a 3D package, and has enough editing features to be used as an editor in place of Premiere. After Effects is used to produce a fair proportion of the TV ads you see on screen today. Remember Adobe have academic deals on their software too. Good luck to your girlfriend on her course - computer animation is great fun...

Just my 2p, but listen to the posters who have actually been animators, 'cos the gamers, for all their good intentions, don't seem to fully get this one.
 
Good posts.

As far as overclocking being risky goes - I agree. This is why I suggest that the OP buys a pre-overclocked OcUK bundle, as it will be under warranty with OcUK if anything goes wrong.

Spot-on re graphics cards and Quadros. Until Quadros/FireGL offer something truly worth the money, gaming cards are absolutely fine and recommended for practically any CGI-based use.
 
Earlier I accidentally posted that RAM isn't as important; by this I mean that 4GB seems adequate enough, at least for my project work, as it's never 100% used when rendering, but it comes close when doing extensive work in Photoshop (lots of docs open at once)...

I tested two rigs, one with DDR2 and one with DDR3, each with a CPU at the same speed as mine (Q9550), and rendering times were not altered by the speed of the RAM; the CPU alone seemed to make the difference.

Hope that's cleared my side up.
 
Aye, it depends on the scene/project. If your RAM is slow or you don't have enough for your scene, render times will increase, because the renderer will have to start using your HDD instead, which is slower. RAM speed and size will directly affect how long it takes a scene to "translate" (the process that happens before any real rendering begins), so the actual render may appear to take just as long, but in reality, over several frames (i.e. animation) you will notice a huge difference on complex scenes.
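
To put some very rough numbers on that (the figures below are made up purely for illustration, not measured from any real scene):

Code:
# Hypothetical example: per-frame scene "translation" overhead when the scene
# no longer fits in RAM and spills to the HDD. All numbers are assumptions.
frames = 500                # length of the animation in frames (assumed)
translate_in_ram_s = 20     # per-frame translation when the scene fits in RAM (assumed)
translate_swapping_s = 90   # per-frame translation when it hits the HDD (assumed)

extra_hours = frames * (translate_swapping_s - translate_in_ram_s) / 3600
print(f"Extra time over the sequence: {extra_hours:.1f} hours")  # roughly 9.7 hours

So a slowdown you would barely notice on a single still can add hours across a full animation.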

Generally speaking, animation is so intensive render-time-wise that you will inevitably have to keep your scenes relatively simple anyway, so it might well be a moot point. Again, it all depends on your scenes.

cheers,
 
I'm going to stick with my view that a Quadro is the best option (not saying a GeForce wouldn't work). It's not just about the fps - as everyone says, rendering is CPU intensive at the moment* - the quality of the viewport is also important. GeForce GPUs are not designed to work with lots of lines (i.e. wireframes) and the quality does suffer (I've used both as well). The drivers are also better optimised for 3ds Max than GeForce drivers are; JonJ678's benchmark thread can prove this. You also get customer support if there is an issue, whereas you wouldn't with gaming cards.

I will say that even I wouldn't go any higher than a Quadro FX 3800 myself, as I personally don't think it's worth the money beyond that.

The only explanation I can see for those out there saying a GeForce is good enough is that we obviously have differing requirements in our work, as I would never go back to a GeForce after using Quadros (and ATI FirePro, for that matter).

* Mental ray, which is owned by Nvidia, has just had a new version of the standalone software released; this is coded to work with CUDA and will more than likely transfer over to 3ds Max at its next release. Now, I would put money on CUDA being disabled on non-Quadro GPUs to ensure that Nvidia can keep the higher prices etc. Jon's BIOS hack may work with it, but there are issues with newer cards on that front.
 
One thing I forgot to mention is that if you decide to go with 12GB you might find it a bit tricky to get an OC at 4GHz or above, but it is easily done with 6GB.
 
Put things in context

l33tz0r, seriously man, why are you - or anyone else - even talking about getting this guy to overclock his GF's animating rig to over 4GHz? Are you going to provide the phone support and Kleenex when she phones up two hours before a project deadline saying her machine's locked up right near the end of a 12-hour render? Stability is key when animating and rendering.

Overclocking: yeah, cool when it's your rig, you're doing it for fun and nothing's too 'mission critical', but you should think about how appropriate it is when giving others advice. I don't mean to be too hard on you, as I'm sure you're giving your advice in good faith, but once you've lost hours and hours of work through system crashes (as I have on occasion), you'll realise there is value in playing things more conservatively at times.
 
Actually very true - there have been times where I've had to complete a lot of programming work in Visual Studio and suddenly got a BSOD whilst compiling the project.

It causes corruption and it pi**es you off
 
The whole worry over your PC crashing if you overclock isn't that serious if it's overclocked properly. Whilst rendering in 3ds Max it is also wise practice to render an animation frame by frame as JPEGs...

You then stitch them together in another program like After Effects once each frame is done, so a crash only ever costs you the frame that was in progress.
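
If you wanted to automate the final assembly rather than doing it in After Effects, something like ffmpeg can stitch a numbered JPEG sequence into a video. A minimal sketch in Python - the frame-naming pattern, frame rate and output filename here are just assumptions, not anything Max-specific:

Code:
import subprocess

# Stitch frame_0001.jpg, frame_0002.jpg, ... into a single H.264 video.
# Assumes ffmpeg is installed and the rendered frames use this naming pattern.
subprocess.run(
    [
        "ffmpeg",
        "-framerate", "25",          # playback frame rate (assumed)
        "-i", "frame_%04d.jpg",      # numbered input frames (assumed pattern)
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",       # widely compatible pixel format
        "out.mp4",
    ],
    check=True,
)

Either way, the point is the same: render to an image sequence first, assemble afterwards.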
 
Sorry, but I have been overclocking my systems for years and I've never had any problems due to my overclocks. I'm running an E6300 (1.8GHz) at 3.2GHz, and have done for two years across two different motherboards. Even in a proper production environment, a mild overclock can still be done without compromising stability, providing the machine is stress tested before being put to work.
 
Yeah, I wouldn't go anywhere near a 4GHz overclock on a production machine - I put my latest build (i7) through two days of Prime95 and it came out fine at 3.8GHz; mental ray, however, was a different story, and only played nicely once I eased off to 3.6GHz. Watercooling probably would help, but still, for a machine that has to run unsupervised for long periods (does anyone really want to babysit renders all night?) stability is key.
 