
Used GPU price increase - due to memory, or??

So I’m currently selling my 3090 and seeing the prices steadily go up and up, probably in line with the current market for memory etc. But a lot of prospective buyers say they’re looking to build a home AI server.

Now I get it, a 3090 has a chunky amount of VRAM, but why the big rush to have home AI? It seems an expensive outlay when you can just use ChatGPT. Are people making money from home AI like they were with the last GPU gold rush (bitcoin)? What am I missing?
 
As an educated guess, it's the last generation of consumer card that can be used in SLI, so you can get two and pool 48GB of VRAM. If you need lots of VRAM, that makes it a bit of a bargain compared to an NVIDIA RTX PRO 5000 48GB.

But yes, considering it's more than 5 years old and its performance is pretty similar to a 5070, it is pretty crazy that they're selling second-hand for £700+ when brand-new 5070s are £550.
 
...Seems an expensive outlay when you can just ChatGPT it. Are people making money from home AI like they were with the last GPU gold rush (bitcoin)? What am I missing?
Privacy and cost I imagine.

A subscription costs £8+/month (OpenAI as my example); they store everything you write, and it can be recalled if the powers that be ask. They will likely increase prices or start running ads soon, as they aren't actually making any money.
Download an open model onto your home machine and it's free* forever: no deprecation, no monthly subscriptions, no privacy problems, and no outages when AWS or Azure go down. You can download any open model, be it language, video or image, and play away to your heart's content.

*electricity, initial outlay, blah blah :)
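To put rough numbers on that asterisk, here's a quick back-of-envelope comparison of a subscription against the electricity cost of running a card locally. Every figure here (GPU power draw, usage hours, unit rate) is an illustrative assumption, not a real tariff:

```python
# Rough sketch: cloud subscription vs. electricity for a local GPU.
# All figures below are illustrative assumptions, not real prices.

SUBSCRIPTION_GBP_PER_MONTH = 8.00   # example plan price from the post above
GPU_POWER_KW = 0.35                 # assumed 3090 draw under load (~350 W)
HOURS_PER_DAY = 2                   # assumed daily usage
ELECTRICITY_GBP_PER_KWH = 0.28      # assumed UK unit rate

def monthly_electricity_cost(power_kw, hours_per_day, rate_per_kwh, days=30):
    """Electricity cost of running the GPU under load for a month."""
    return power_kw * hours_per_day * days * rate_per_kwh

local = monthly_electricity_cost(GPU_POWER_KW, HOURS_PER_DAY,
                                 ELECTRICITY_GBP_PER_KWH)
print(f"Local running cost: ~£{local:.2f}/month "
      f"vs £{SUBSCRIPTION_GBP_PER_MONTH:.2f}/month subscription")
# ~£5.88/month under these assumptions
```

Of course this ignores the initial outlay on the card itself, so the comparison only works if you were buying the GPU anyway.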
 
Actually, I forgot about SLI, so I guess that's a valid point for those seeking the cards. And I sort of get the privacy thing, even though I'm not worried about that personally. But it did make me wonder if people have found a sneaky way to leverage cash from having a personal AI server.

I’ll be honest, if I get £700 for my card, I’ll be over the moon!
 
Also just more demand for old or "cheap" GPUs. People aren't playing the latest games because they are mostly garbage, so don't want to pay the latest prices for GPUs. A new GPU to play old games is a waste of money.
 
You don't even need SLI for LLMs. The 3090 has decent memory bandwidth and 24GB of VRAM; 2 x 3090 gives a lot more performance than 3 x 5060 Ti 16GB for the same or less cost, and few motherboards have more than two x16 or x8 slots anyway.
There was a lot of excitement for the 5070 Ti Super with 24GB, but that's on permanent hold.
48GB is sufficient to run the Llama 70B models, and a single card will run some quite decent smaller local models too.
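The "48GB fits a 70B model" claim comes down to quantisation arithmetic. A minimal sketch of the weights-only estimate (real usage also depends on context length, KV cache and runtime overhead, which this deliberately ignores):

```python
# Back-of-envelope VRAM estimate for model weights at different
# quantisation levels. Weights-only: KV cache and runtime overhead
# add more on top, so treat these as lower bounds.

def weights_vram_gb(n_params_billion, bits_per_weight):
    """Approximate VRAM for the weights alone, in GB (10^9 bytes)."""
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 70B model at 4-bit quantisation fits in 2 x 24GB cards:
print(f"70B @ 4-bit:  ~{weights_vram_gb(70, 4):.0f} GB")   # ~35 GB
# The same model unquantised at 16-bit would not:
print(f"70B @ 16-bit: ~{weights_vram_gb(70, 16):.0f} GB")  # ~140 GB
```

Which is why a pair of 3090s is attractive: 4-bit 70B weights land around 35GB, comfortably inside the pooled 48GB, while a single 24GB card is limited to smaller models or heavier quantisation.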

Can be used for coding, image generation, or whatever SFW/NSFW/illegal chat or story writing you don't want becoming training data when ChatGPT starts rolling out ads.
Recommended for you..... "His and Hers restraints, non chafing"
 