
GPU prices are bound to fall, AI training costs likely to drop

All I see are ever-increasing prices for GPUs, and I don't see them ever returning to reasonable levels. The pricing of the 5000 series is disgusting: an 80-series card for £1,150+! The last 80-series card I bought was a GTX 780 for £315, and that was a high-end EVGA FTW card. Price increases then forced me down to a 70-series card, a 1070, again a high-end EVGA FTW, for £410. Next was a 3070 at the height of Covid, which cost me a whopping £600, and I will never pay that much again. Nvidia has increased the VRAM on the 5070 to 12GB, but I am not paying the price it's commanding for a 12GB card, so like many others I am waiting to see what the jokers at AMD can do with their new cards, although again I can see stupid prices incoming. Keep it up and I can see the end of PC gaming coming before long.
 
Well, looks like the genie is out of the bottle and open source is already spurring solutions on consumer-grade hardware:

This puts a semi-decent LLM (which, while not as good as the full thing, ought to be good enough for retrieval augmented generation) within an SME-level budget, while reducing dependence on datacenters.
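For what it's worth, here's roughly what running such a model on consumer-grade hardware looks like in practice. This is only an illustrative sketch: it assumes the llama-cpp-python bindings and a quantized GGUF model file you've already downloaded (the path is a placeholder), and exact parameter names can vary between versions.

```python
# Illustrative sketch: running a quantized open-weight LLM locally with
# llama-cpp-python (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-open-weight-model.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,        # context window; larger needs more RAM/VRAM
    n_gpu_layers=-1,   # offload as many layers as fit onto the GPU; 0 = CPU only
)

out = llm(
    "Q: Why does local inference reduce dependence on datacenters?\nA:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"].strip())
```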
 
You'd probably still use a data centre and probably still want Nvidia GPUs for the workloads, but this does give you more options.
 
I think people overhype "AI" anyway. It is dumb "AI". I asked ChatGPT who the president of the USA was yesterday, and apparently it was Joe Biden... I believe Joe left the job nine days ago.

AI is not really AI; it is just hyped-up machine learning that has created a bubble of over-investment in that sector. When they were openly discussing building nuclear reactors to power such AI data centres, you know something has gone wrong.

If the Chinese "AI" works as cheaply as they say, then AMD, Nvidia, and other companies will have a lot of spare stock soon, forcing them to drop prices whether they want to or not.

I asked today, using the 4o variant (paid, if it makes any difference), and it said:

As of January 30, 2025, the President of the United States was Donald J. Trump. He was sworn in for his second term as the 47th President on January 20, 2025. (Source: news.sky.com)

On January 30, 2025, President Trump signed several executive orders. (Source: youtube.com)

For more information about President Trump's administration and policies, you can visit the official White House website. (Source: whitehouse.gov)

Additionally, President Trump maintains an active presence on social media, where he shares updates and announcements. (Source: x.com)

For a detailed overview of President Trump's recent activities, you can watch the following video: [embedded video]

The problem with ChatGPT is that it doesn't look for information unless you specifically ask it to (and in that case it does a fairly decent summary of web searches), so by default you're getting a statistically likely answer as of the last training-set update (which doesn't happen daily or weekly, given the costs!).
I understand why it's like this by default (it's way cheaper to give a statistical answer than to do retrieval augmented generation), but it's the main reason why LLMs can be misleading.
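To make the difference concrete, here's a minimal sketch of the retrieval augmented generation idea: fetch a fresh snippet first, then build the prompt around it, so the model answers from current context rather than from its training snapshot. Everything here (the document list, the retrieve helper, the prompt template) is a made-up illustration, not any particular vendor's API.

```python
# Minimal RAG-style sketch (illustrative only): retrieve a fresh snippet,
# then build a prompt around it instead of trusting the model's training data.
from datetime import date

# Hypothetical "fresh" documents, e.g. scraped headlines or internal notes.
DOCUMENTS = [
    "2025-01-20: Donald J. Trump sworn in as the 47th President of the United States.",
    "2025-01-30: The President signed several executive orders.",
]

def retrieve(question: str, docs: list[str], top_k: int = 1) -> list[str]:
    """Toy keyword retriever: rank documents by word overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(docs, key=lambda d: len(q_words & set(d.lower().split())), reverse=True)
    return scored[:top_k]

def build_prompt(question: str, context: list[str]) -> str:
    """Stuff the retrieved context into the prompt so the LLM answers from it,
    not from its (possibly stale) training snapshot."""
    context_block = "\n".join(f"- {c}" for c in context)
    return (
        f"Today is {date.today()}. Use ONLY the context below to answer.\n"
        f"Context:\n{context_block}\n\nQuestion: {question}\nAnswer:"
    )

question = "Who is the president of the USA?"
prompt = build_prompt(question, retrieve(question, DOCUMENTS))
print(prompt)  # This prompt would then be sent to whichever LLM you're running.
```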

I do work with ML, but I'm far from an LLM expert; that said, I still think the current approach relies too much on brute force and too little on actual content analysis.
Frankly, most people are bad at asking questions, and I've yet to see an LLM ask the user to clarify what they want; instead they just spew whatever they think will please the user, which is good enough for, say, a retail environment but unacceptable for many other topics.

This is utterly dangerous, because Gen Alpha is growing up used to virtual assistants, and even part of Gen Z (and, I suspect, not a few at older ages) is starting to rely on LLMs for emotional support.
I suspect we're ending up in the cyberpunk timeline after all; next stop, ED-209 as law enforcement...
It does look things up (at least in my case), no need to ask for it specifically, but I guess it depends on what you're looking for.
 