
GPU prices are bound to fall, AI training costs likely to drop


If true, the AI bubble is going to deflate soon, at least in terms of the GPU horsepower needed.
In simple terms, if datacenter GPU demand drops, the consumer segment will increase in relative importance and face less competition from higher-margin products.

Now let's see if they can pull another trick out of the leather jacket's sleeves...
 
Yeah, it was never going to last forever. It was shortsighted to think no one but Nvidia/the US could come up with something good.
 
Well, that's if you believe what the Chinese government is saying. Also, the costs are likely to be heavily subsidised by the Chinese government too, like they do with EVs.
 
Well, that's if you believe what the Chinese government is saying. Also, the costs are likely to be heavily subsidised by the Chinese government too, like they do with EVs.
Except it's a startup and not a government project.
From what I read, it's a team of quants who moved from crypto to LLMs and focused on training efficiency so they could reuse the hardware they already had lying around.

But even if it is subsidised by the Chinese government, as long as it takes demand away from datacenter GPUs it's good news for us.
 
It'll be interesting to see how this pans out; Nvidia futures are brutal. Plenty of meltdowns already, but things could get really ugly.
 
Interesting... :)

It might not drive down the cost of GPUs for AI training, but it might reduce the need for ultra-high VRAM and memory speeds for certain tasks (beyond a certain point). So it might be possible to get away with a 12GB 3060 (or at most a 16GB card) instead of needing a 3090/4090/5090 with 24GB+ VRAM when running transformer models.
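
To give a rough idea of why that VRAM threshold matters, here's a back-of-envelope sketch (the ~20% overhead figure and the example model size/precisions are my own illustrative assumptions, not anything from the article):

```python
# Rough back-of-envelope VRAM estimate for running a transformer model locally.
# Illustrative assumption: weights dominate memory use, plus ~20% overhead for
# activations / KV cache at modest context lengths.

def estimate_vram_gb(params_billions: float, bytes_per_param: float,
                     overhead: float = 0.2) -> float:
    weights_gb = params_billions * bytes_per_param  # 1e9 params * bytes ~= GB
    return weights_gb * (1 + overhead)

for label, bytes_per_param in [("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    print(f"7B model @ {label}: ~{estimate_vram_gb(7, bytes_per_param):.1f} GB")

# fp16: ~16.8 GB (needs a 24GB card), 8-bit: ~8.4 GB,
# 4-bit: ~4.2 GB (comfortably fits a 12GB 3060)
```

Point being, efficiency gains (quantisation, smaller distilled models) push more workloads into 12-16GB territory without buying a bigger card.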
 
Oh dear . . . :cry:



 
If anything, I'm surprised it's a Chinese company. Their tech companies are usually renowned for putting out large claims and then delivering a shart; seems this might be different.
 
Just stop buying it and they will soon get the message... but people don't, so nothing will change.
 
I expect them to launch DLAI FG (fact generation). It uses AI to interpolate what the AI is going to spit out, and inserts the facts between the traditional AI facts to give you a complete fact.
The 5000 series can in fact spit out 3 parts of the fact for only one real part, which will make it bigly faster than the older generations.

When an AI model running on an Nvidia 5000 using DLAI FG was asked who is the greatest of them all, the response was "The greatest is Nvidia", with the only traditional part being "The".

/s
 