Does anyone have experience using AMD GPUs for offline AI?
I'm currently running a 10 GB RTX 3080 in my SFF living room PC connected to a 4K LG OLED TV. I use the PC for a mix of media consumption, gaming (recently discovered Helldivers 2), and some AI (mostly text generation in LM Studio). The whole thing runs seamlessly, but I'm ready to take things up a notch. I'm willing to consider the upcoming 9070 XT, given its 16 GB of VRAM and (potentially) 7900 XT/4080 levels of raster performance (as well as the improved RT, which would look great on this TV).
My one concern is the number of conflicting reports I've read online about people struggling to get LLMs running on AMD.
Anyone here that can share their experience of AI on AMD?