
AI machine build

You're not doing that on £200, and you've made multiple threads regarding the same topic.

Out of interest, what is your overall budget, and what components do you already have?
 
Looking around, it seems RTX GPUs go for around £200, so I was hoping to find someone who's built out a local AI setup with something similar and can advise. Wasn't expecting this kind of reaction, tbh.
 
For £200, and given your initial statements, you're giving the impression you want to build an in-house AI model for your tasks. That's going to be slow going. I'd look at the second-hand market, but even then you might be better off subscribing to an existing commercial AI service such as ChatGPT.

Again, what is your existing hardware, and what are your goals?

My expectations may be higher, but I'd be looking at something around 3090-level performance to build a worthwhile custom AI assistant in house for even semi-commercial use, and preferably multiples of said card.
 
For running AI locally, hassle free, you would need an NVIDIA GPU with a lot of VRAM (preferably 16GB+).

For £200-£300, I would recommend the RX 9060 XT 16GB variant if you NEED brand new. It is AMD, so you will have to do various workarounds such as installing ZLUDA and AMD's ROCm framework. It may not work properly with all AI software, so be sure whatever you want to run supports it. However, it does have 16GB of VRAM, which is good. If you prefer to go the NVIDIA, hassle-free route, the RTX 3060 12GB seems to be on sale for around £249; however, it is many years old now, so you're basically buying an obsolete product with less VRAM.
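
Whichever way you go, it's worth doing a quick sanity check once the drivers are installed. A minimal sketch, assuming you've installed a PyTorch build with the matching backend (ROCm builds expose the GPU through the same torch.cuda API as CUDA builds):

    # Quick check that PyTorch can actually see the GPU and its VRAM.
    # Works for NVIDIA (CUDA) and AMD (ROCm) builds of PyTorch alike.
    import torch

    if torch.cuda.is_available():
        print("GPU found:", torch.cuda.get_device_name(0))
        total_vram_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
        print(f"VRAM: {total_vram_gb:.1f} GB")
    else:
        print("No GPU backend available - check your CUDA/ROCm install.")

If that prints the card name and the full VRAM amount, most local AI software should be able to use it.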

Your best bet would be to look at the second-hand market.
 
I was looking at the 3060 12GB, as I'm not after super-fast speeds or huge VRAM; I just want local AI to assist with coding, design and some sprite generation.
 
Hi
I'm looking to build a developer machine with a local AI assistant for coding and design. What GPU would be best with a budget of £200?

I doubt you're going to get very good local AI for something like coding, quite frankly. You'd be WAY better off using something like Grok on a subscription; even just the basic X Premium version at £8 a month is good. I use that all the time and was experimenting with having it write chunks of code for GameMaker, and it does a great job at writing boilerplate code.

I do dabble with offline AI for image generation and video, and the real bottleneck is VRAM: it limits the length of video you can reasonably generate, and sometimes the size/resolution of images if you're doing a lot of upscaling in your workflows. Going for NVIDIA cards with Tensor cores is best, but honestly at £200 that's really tough. I would genuinely put that towards a cheap online service over a video card.
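
To give an idea of what working around a VRAM limit looks like in practice, here's a rough sketch, assuming the Hugging Face diffusers and accelerate libraries; the checkpoint name is just an example, swap in whatever model you actually use. Half-precision weights plus attention slicing and CPU offload keep peak VRAM down at the cost of some speed:

    # Rough sketch: fit a Stable Diffusion pipeline onto a card with limited VRAM.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "stabilityai/stable-diffusion-2-1",  # example checkpoint, use your own
        torch_dtype=torch.float16,           # fp16 halves the weight footprint
    )
    pipe.enable_attention_slicing()          # lower peak VRAM, slightly slower
    pipe.enable_model_cpu_offload()          # park idle sub-models in system RAM

    image = pipe("pixel art spaceship sprite", height=512, width=512).images[0]
    image.save("sprite.png")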

If you absolutely want to get a video card to do it yourself, something used from an earlier generation like the NVIDIA 3000 or 4000 series would be the best value for money; see if you can pick up a used 3070 or something like that.
 
Thanks folks. I think I'll go for the cheaper 3060 with 12GB. On delving deeper, my requirements seem to fit into 7B LLMs, and for image generation I'll rarely be going above 1024 pixels. I'll see how it goes and upgrade when someone gives me a pile of dosh. Thanks again.
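
For anyone finding this later, a minimal sketch of the sort of thing I'm planning to run, assuming llama-cpp-python built with GPU support and a quantised 7B GGUF file (the model path below is just a placeholder). A 4-bit 7B model is roughly 4-5GB of weights, which leaves headroom on a 12GB card for context:

    # Run a quantised 7B model fully offloaded to the GPU for coding help.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/codellama-7b-instruct.Q4_K_M.gguf",  # placeholder path
        n_gpu_layers=-1,   # offload every layer to the GPU
        n_ctx=4096,        # context window; bigger contexts use more VRAM
    )

    out = llm("Write a Python function that flips a sprite horizontally.",
              max_tokens=256)
    print(out["choices"][0]["text"])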
 