Home server AI stack anyone?

Hi All,

Skimmed through a few pages of threads and didn't spot any home server AI stack builds. Has anyone here built themselves a home server for their own AI stack?
I'm interested in assembling a budget rat rod server, so I'd really appreciate suggestions on the minimum spec for the minimum budget to keep it viable. The first build would be purely to test it out before I commit some serious spondoolicks to it. Ideally, start small and upgrade as I go.
Will most likely put Ubuntu Server on it.

From initial reading, the GPU is key, while I can probably save a few quid on the CPU and motherboard?

Any thoughts or suggestions welcome.

Thanks!
 
Personally, I haven't gone beyond using a Dell QBM1250 with Intel 265 for AI-related tasks (as an offload from my main systems). For any serious local LLM use, etc., you quickly need video cards with a LOT of VRAM.
 
Anyway, my current budget home PC build is happily running on a B550 board with a Ryzen 5500G.

Am I too boring to consider a build on B450 with a Ryzen 5500 or 5600X and an RTX 3060 12GB?
What form factor to pick? Mini-ITX in a Metalfish T60, perhaps? Would there be an appropriately sized server PSU for this?
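On the PSU question, a back-of-envelope sum of the published power figures for the parts I'm considering might help. The TDP/TGP numbers below are the vendor-quoted ones (65 W for the 5600X, 170 W for the RTX 3060); the allowance for the rest of the system and the 1.5x headroom multiplier are just my own rules of thumb, not a standard:

```python
# Back-of-envelope PSU sizing for the parts mentioned above.
# TDP/TGP figures are vendor-published; the extras allowance and
# the 1.5x headroom multiplier are assumptions, not a standard.

parts_w = {
    "Ryzen 5 5600X (65 W TDP)": 65,
    "RTX 3060 (170 W TGP)": 170,
    "Motherboard/RAM/SSD/fans (rough allowance)": 60,
}

load = sum(parts_w.values())   # sustained draw, ~295 W
recommended = load * 1.5       # headroom for transients and PSU efficiency
print(f"load ~{load} W, PSU ~{recommended:.0f} W")
```

By that sketch, a quality ~450 W SFX unit would fit a Mini-ITX build like this, though GPU transient spikes are worth reading up on before cutting it fine.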

Thanks.
 
I'm a bit shocked by RAM prices. I picked up a pair of new 16GB DDR4 Corsair sticks for about £50 less than two years ago. 32GB of RAM seems to cost £250 now. Wow. Should have invested my life savings into RAM sticks in 2024.

Does anyone have build plans for a budget-friendly time machine? Happy to buy used parts and Chinese knock-offs.

Looks like I'll just re-purpose my current box into an ATX case to run some AI crunching on a new GPU.
 
12GB on a 3060 isn't enough for anything other than a tiny model with a minimal context window. That might be OK for your purposes but it'd have to be VERY specific tasks with models suitable for that - you won't get much choice.

Anything under 32GB of VRAM isn't going to be much use, but I suppose YMMV...
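To put some rough numbers on the VRAM point above: a model's footprint is approximately its weights plus the KV cache for the context window. The figures below are rules of thumb, and the layer/hidden-size numbers for the 7B example are assumptions typical of that class of model, not any specific release:

```python
# Rough VRAM estimate for a local LLM: weights + KV cache.
# Bytes per parameter depends on quantisation (~2 at FP16, ~0.5 at 4-bit).

def weights_gb(params_b: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GB (params in billions)."""
    return params_b * bytes_per_param

def kv_cache_gb(layers: int, hidden: int, context: int,
                bytes_per_val: int = 2) -> float:
    """Approximate KV cache: 2 (K and V) * layers * hidden * context * bytes."""
    return 2 * layers * hidden * context * bytes_per_val / 1e9

# A hypothetical 7B-class model at 4-bit quantisation:
w = weights_gb(7, 0.5)             # ~3.5 GB of weights
kv = kv_cache_gb(32, 4096, 8192)   # ~4.3 GB of KV cache at 8k context
print(f"~{w + kv:.1f} GB before runtime overhead")
```

That comes to roughly 7.8 GB before framework overhead, which is why a 12GB card is already tight at long contexts and why the 32GB figure above isn't far off for bigger models.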
 