Advice on speccing an AI Workstation to build around a 5090, when I get to the front of the queue lol.

Associate · Joined 7 Jan 2025 · Posts: 10 · Location: Reading
Hey - I am looking to finally get a local system; up till now I've rented cloud servers by the minute. I will need a whole AI workstation building, and will need help speccing it up. I currently do all my imaging and video workflows on an L40S (48 GB VRAM, 64 GB RAM), so I'm thinking 5090 + 128 GB RAM; no gaming needed. It will be running local Ollama / other LLMs and ComfyUI workflows, and will probably be Unix-based for ease of LoRA training for Hunyuan Video etc. Ideally I want to keep the total cost under £5k. I already have a screen I use with my MacBook, and was thinking about 4 TB of fast storage. I have 900 Mb home broadband via fibre, so something that can take advantage of that through Ethernet - and as power efficient as possible, as I think it's going to cost a fortune to run in the UK at 30p a kWh? Any experience on that gratefully received. Thank you. I hope I haven't opened a can of worms...
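For anyone weighing up the 32 GB VRAM point: a rough rule of thumb for whether a model fits is parameters × bytes per parameter, plus some headroom for the KV cache and activations. A quick sketch (the 20% overhead factor is a guess, not a measured figure):

```python
def vram_needed_gb(params_b: float, bits: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate for an LLM's weights: parameters (in billions)
    times bytes per parameter, with ~20% headroom for KV cache/activations.
    The overhead factor is an assumption, not a benchmark."""
    return params_b * (bits / 8) * overhead

# A 70B model at 4-bit quantisation: ~42 GB, so it won't fit in a 5090's 32 GB
print(round(vram_needed_gb(70, 4), 1))
# A 32B model at 4-bit: ~19 GB, fits with room for context
print(round(vram_needed_gb(32, 4), 1))
```

By this estimate a single 5090 comfortably runs 4-bit models up to roughly the 32B class, which is one reason people plan for a second card later.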
 
If there is no gaming involved, a 9950X would be fine with 64 GB or 128 GB of CL30 6000 MHz RAM. There are plenty of reasonably priced motherboard options. A 4 TB 990 Pro or equivalent is £200 odd. A functional, reasonably priced case is what, £80-150. For the power supply, pick and choose from the 1000 W PSUs. Cooling can be done cheaply with air or an AIO - Thermalright do both at a fraction of the price. The two biggest expenses are of course the CPU and the video card.

Any thoughts on which 5090 you are going for? I think your build will come to a shade over £4k with everything needed, maybe even slightly less.

I know you mentioned no gaming is needed, but it will still be able to play games at a very high level, simply because your work needs a high-end CPU and GPU anyway.
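To sanity-check the "shade over £4k" figure against the parts list above, here's a quick tally. Every price here is a hypothetical placeholder in the ballpark of the figures mentioned in this thread, not a quote:

```python
# Hypothetical UK street prices (GBP) - placeholders for a sanity check,
# not real quotes; the 5090 figure is the quoted MSRP.
build = {
    "Ryzen 9 9950X": 550,
    "RTX 5090 (MSRP)": 1999,
    "64 GB DDR5-6000 CL30": 180,
    "4 TB NVMe (990 Pro class)": 220,
    "AM5 motherboard": 200,
    "1000 W PSU": 160,
    "Case": 100,
    "Air/AIO cooler": 80,
}
total = sum(build.values())
print(f"£{total}")  # comes out around £3.5k at these assumed prices
```

At these assumed prices the build lands around £3.5k, which leaves headroom under the £5k budget even if the 5090 sells well above MSRP.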
 
Thank you - I didn't realise there was a choice of 5090s? I want 32 GB of VRAM - probably the fastest version available, as I'd like to get five years' use out of it, though the way AI progresses that's probably not realistic. Is the high power use likely to happen only when really pushing it, or also when it's just left running? I will probably use it remotely in the house most of the time, via my iPad Pro.
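On the running-cost question, the arithmetic at 30p/kWh is straightforward. A sketch, assuming roughly 575 W board power for the 5090 plus ~150 W for the rest of the system under full load, and ~80 W total at idle (both wattages are assumptions, not measurements):

```python
def cost_per_hour_gbp(watts: float, pence_per_kwh: float = 30.0) -> float:
    """Electricity cost per hour in GBP at the given total draw."""
    return watts / 1000 * pence_per_kwh / 100

# Full tilt: ~575 W card + ~150 W rest of system (assumed figures)
print(f"£{cost_per_hour_gbp(725):.2f}/hour")   # roughly 22p/hour
# Idle: maybe ~80 W total (a guess; depends heavily on the build)
print(f"£{cost_per_hour_gbp(80):.3f}/hour")
```

So the headline scare is mostly about sustained full load; an idle box left on for remote access costs pennies per hour by comparison.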
 
I saw the MSRP of the 5090 seems to be $1,999 / £1,999 (already a con lol), but I see 4090s are already going for way more than that - is there some kind of demand-based price hiking from all the suppliers of these?
 
Seems like there is some kind of new PC architecture being recommended now for the 5090 in AI:
NVIDIA NIM microservices and AI Blueprints will be available starting in February with initial hardware support for GeForce RTX 50 Series, GeForce RTX 4090 and 4080, and NVIDIA RTX 6000 and 5000 professional GPUs. Additional GPUs will be supported in the future.

NIM-ready RTX AI PCs will be available from Acer, ASUS, Dell, GIGABYTE, HP, Lenovo, MSI, Razer and Samsung, and from local system builders Corsair, Falcon Northwest, LDLC, Maingear, Mifcon, Origin PC, PCS and Scan.

Learn more about how NIM microservices, AI Blueprints and NIM-ready RTX AI PCs are accelerating generative AI https://nvidianews.nvidia.com/news/nvidia-launches-ai-foundation-models-for-rtx-ai-pcs
 
All of the 5090s have 32 GB of VRAM. If you are going to be running them non-stop 24/7, get the one with the best cooling solution you can. Traditionally these have been the Asus Strix and MSI Suprim cards for the past two generations or so, but they cost a premium. Alternatively, the TUF series from Asus is pretty good as well.

If you are going to be running it for five years, at least the Zotacs have a five-year warranty.

If the 9950X and 5090 don't have enough computing power for your use case then unfortunately there is nothing on the market that can exceed them from a home-user point of view. You would need to step up to true workstation territory (Threadripper and workstation cards from Nvidia), which would deffo exceed your £5k budget.
 
Thanks for the advice all - I had a chat with NVIDIA, and I think I'll leave everything to settle for a while and see what options emerge for two-card systems, once everyone figures out the power issues etc.
 
Well, I do believe the Nvidia professional card setups still support multi-GPU linking (NVLink, the successor to SLI), which is what lets you connect two GPUs together. The difference between the Nvidia pro cards and the normal home/gaming cards is that with the pro series Nvidia controls the quality and specifications, whereas for the home market they hand the specs to the manufacturers and leave it up to them.

The cost increases by a lot though, and by a lot I mean it makes the 5090 look like the card for us peasants.
 
I was thinking more along the lines of getting a single 5090 now and adding a second card later - the systems I use have multi-GPU addressing, and those that don't will by the time the cards are out. I think the cold hard reality is that, as an obsessed hobby AI'er, I'm still more cost-effective on cloud L40S servers at $1 an hour for a while longer.
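The cloud-vs-local break-even is easy to sketch. Assuming $1/hour for the cloud L40S (as quoted), a hypothetical $/£ rate of 0.80, a £5k build, and ~22p/hour of electricity under load (roughly 725 W at 30p/kWh - all of these are assumptions, not current figures):

```python
# Back-of-envelope break-even: cloud L40S at $1/hour vs a ~£5k local build.
GBP_PER_USD = 0.80                 # hypothetical exchange rate
cloud_gbp_per_hour = 1.0 * GBP_PER_USD
local_build_gbp = 5000
electricity_gbp_per_hour = 0.22    # assumed ~725 W draw at 30p/kWh

# The local box "pays back" once saved cloud fees cover the hardware cost:
breakeven_hours = local_build_gbp / (cloud_gbp_per_hour - electricity_gbp_per_hour)
print(round(breakeven_hours))      # several thousand hours of GPU time
```

At these assumed numbers you'd need on the order of 8,000+ hours of actual GPU time before the local build wins on cost alone, which backs up the "cloud for a while longer" conclusion for intermittent hobby use.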
 
From what I gather from reading your posts, even though you are a self-proclaimed hobbyist, your use case far exceeds 95% of the peeps here. Hobbyist or professional, you need advice from the peeps here that need/use that kind of processing power. I only play League of Legends - my biggest concern is whether my top laner is inting like a mofo. Needing multiple GPUs for AI work is above my pay grade. Although I am one of the very few, or a small minority, that loved it when Nvidia/AMD allowed multi-GPU use in gaming setups.
 