Getting GitHub Copilot Pro for $10 a month is well worth it. It works in VS Code, JetBrains and Xcode. Use VS Code and install the add-on called Copilot.
You don’t need 128GB, let alone 256GB, to dabble with local AI. Even the 7B models are fine for messing with, and any CPU/RAM setup is going to be glacially slow compared to something that’ll fit on a GPU.

I'm working on a new build at the moment, and initially it was going to be micro-ATX with two RAM slots, so I bought 128GB.
I've put most of the bits together, but then I found an ATX case that would fit where I want it, so I'm trying to decide whether to ditch the micro board and go full ATX, so I can have 256GB of RAM to last me a few years and be able to dabble with local AI.
Not sure it's worth it though. Would the bump be worth it, you reckon?
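For what it's worth, here's a minimal sketch of what "dabbling" with a 7B model on CPU looks like, assuming llama-cpp-python and a GGUF quant you've already downloaded (the model path below is just a placeholder):

from llama_cpp import Llama  # pip install llama-cpp-python

# Placeholder path: any 7B GGUF quant (e.g. a Mistral-7B Q4 file) will do.
llm = Llama(
    model_path="models/mistral-7b-instruct.Q4_K_M.gguf",
    n_ctx=4096,    # context window
    n_threads=8,   # CPU threads; decode speed is limited by memory bandwidth, not RAM size
)

out = llm("Summarise why unified memory helps local LLMs.", max_tokens=64)
print(out["choices"][0]["text"])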
You end up with single-digit tokens per second when running large models on CPU. The Mac Studio/MacBook Air aren’t bad options, as the unified memory means the GPU has direct access to system memory… although to load a Mac Studio up costs megabucks (£10k+).

Well, I wanted 128GB anyway, as I frequently max out my 32GB in Adobe products.
I tested Topaz a while back, and it shifts to using your normal RAM when the video card's memory caps out, so I figure the more of that you have, the better.
I've seen vids on YouTube of people with Macs and 512GB of RAM, so… there are ways to fill it up if you try! It'll probably be a while before I bother to get a 50-series graphics card though.
I already spend a lot on the frontier models every month through work.
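On the single-digit tokens per second point quoted above, a rough back-of-the-envelope holds: decode speed on a dense model is roughly memory bandwidth divided by the bytes read per token (about the model size). The figures below are illustrative assumptions, not benchmarks:

# Rough bandwidth-bound estimate: tokens/s ~= memory bandwidth / model size.
# All numbers are illustrative assumptions, not measured results.
def tokens_per_second(model_gb: float, bandwidth_gb_s: float) -> float:
    return bandwidth_gb_s / model_gb

print(tokens_per_second(40, 80))    # ~2 tok/s: 70B at 4-bit (~40GB) on dual-channel DDR5 (~80GB/s)
print(tokens_per_second(40, 800))   # ~20 tok/s: same model on a Mac Studio's unified memory (~800GB/s)
print(tokens_per_second(4.5, 80))   # ~18 tok/s: a 7B at 4-bit (~4.5GB) on the same CPU box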