GPU for AI (considering RTX 3060 12GB)?

I'm interested in getting a GPU for running AI apps locally (mainly text and image). As I understand it, I'd be best off considering Nvidia cards with as much VRAM as possible. Given a very limited budget (up to about £250), I could probably stretch to a second-hand RTX 3060 12GB card, but I'm open to other suggestions.

Is there anything to choose between different 3060 cards? They'll all have the same number of CUDA cores and Tensor cores, and things like ray tracing are irrelevant to me.

NB: I don't play games, and the only other GPU-intensive activity I might undertake is some video editing (probably not 4K in the foreseeable future).

Thanks in advance
 
I'd have a look at the A770 too, since articles like this suggest it could be quite good if drivers/apps are optimised to take advantage (Arc has dedicated matrix hardware, much as Nvidia has Tensor cores).
 
I'd also suggest the RTX 3060 12GB for the budget you're working with. Tests so far suggest the 4060 isn't any better, so you'd need to go up a tier to get any more performance and capability out of it (i.e. more VRAM).

Whilst Intel and even AMD can be used for AI, they have more hoops to jump through. So if you're just starting out, Nvidia cards are the easier starting point: far more software is compatible with them than with the other GPUs, which may need work to get going or, in some cases, are near impossible to get going without a rewrite of the software to support them.
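
To make that concrete, here's a minimal sketch (assuming PyTorch, the stack most local AI apps sit on) of how the three vendors show up. Nvidia works with a stock install; the others need their own builds or backends before the device is even visible:

Code:
# Rough sketch of the compatibility gap, assuming PyTorch.
# A stock install picks up an Nvidia card straight away; AMD needs a
# ROCm build of PyTorch, and Intel Arc needs the XPU backend.
import torch

def pick_device():
    if torch.cuda.is_available():
        # Nvidia out of the box (AMD also lands here on a ROCm build)
        return torch.device("cuda")
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        # Intel Arc via the XPU backend, where supported
        return torch.device("xpu")
    return torch.device("cpu")  # fallback: always works, but slow

print(pick_device())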

:: edit ::
The 3000 series is suggested because of the position it has carved out in AI software; it was the most widely supported GPU generation for quite a while. The 2000 series is supported as well, but those cards lack the VRAM and raw performance of the 3000 series. The 4000 series has largely been seen as a minor side-grade: unless the silicon brings a real performance boost (or more VRAM), you only get marginal improvements in the most-used AI software, and there's a price premium over the 3000 series on top. So if you're just starting out, in that budget range, and don't want to spend your time debugging, the Nvidia 3000 series would be your best bet, I'd say. (Although remember: get as much VRAM as possible.)
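
On that last point, a quick hypothetical check (again assuming PyTorch with a CUDA-capable card) of how much VRAM the card actually reports, since that's the number that gates what you can run:

Code:
# Print the installed card's name and total VRAM.
# On a 3060 12GB this should report roughly 12 GB.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM")
else:
    print("No CUDA device found")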
 
I'd have a look at the A770 too, since articles like this suggest it could be quite good…
That's an interesting suggestion. The A770 has a generous 16GB of VRAM; however, it was never widely available in this country. Whereas the price in the US seems to have fallen to nearer $200, here in the UK it is over £300.
 
I'd also suggest the RTX 3060 12GB for the budget you're working with… the Nvidia 3000 series would be your best bet, I'd say.
Really good advice. Thanks.
 
Stable Diffusion 3 was released to influencers last week. No date yet for any public release. I haven't seen any solid leaks about memory requirements, other than that they'll be higher but the model will "be configurable for all types of hardware".

I only use Stable Diffusion XL (SDXL) in a basic way: a couple of plugins, a single-stage rendering node, the standard 2x upscaler, then automatic ADetailer. It cranks out a single picture using 8GB of VRAM. If I push the batch number up to 24 it uses 12GB. I understand a fancier 4x upscaling stage using its own custom base model will double the VRAM usage.
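
If it helps, here's roughly what a basic single-image SDXL run looks like as code, sketched with the Hugging Face diffusers library (my actual setup adds the plugins and the ADetailer pass on top), including the CPU-offload option that keeps 8-12GB cards workable:

Code:
# Minimal single-image SDXL run via Hugging Face diffusers.
# enable_model_cpu_offload() trades some speed for VRAM headroom.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
)
pipe.enable_model_cpu_offload()

image = pipe("a lighthouse at dusk", num_inference_steps=30).images[0]
image.save("out.png")

# Peak VRAM actually used for the run, in GB
print(f"Peak VRAM: {torch.cuda.max_memory_allocated() / 1024**3:.1f} GB")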

So I'm guessing 12GB will be fine for basic Stable Diffusion 3 use, but you'll need more if you want to really get into it. @OP It's a real tough decision you have between the 3060 and the 16GB Intel A770.
 