GPU prices are bound to fall, AI training costs likely to drop

Eh, I think you still don't get it.
Things are priced in USD before any taxes or tariffs are applied.

They aren't priced based on what consumers in the USA actually pay at the till. It's the importers and retailers that are charged the tariffs, not the manufacturer or the distributor shipping to the USA.

The last time the Trump tariffs were in effect, literally nothing outside the USA was affected.
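To make the point concrete, here's a toy sketch with entirely made-up numbers (list price, tariff rate, and retail markup are all illustrative): the tariff is charged to the importer at the border, so the manufacturer's USD list price is untouched and only the landed cost in the tariffed market changes.

```python
LIST_PRICE_USD = 1000  # manufacturer's list price, before any tax or tariff (made-up)
US_TARIFF = 0.25       # hypothetical 25% import tariff
RETAIL_MARKUP = 0.10   # hypothetical retailer margin

def landed_cost(list_price, tariff_rate):
    """Cost to the importer after the tariff is applied at the border."""
    return list_price * (1 + tariff_rate)

def retail_price(list_price, tariff_rate, markup):
    """What the consumer pays at the till."""
    return landed_cost(list_price, tariff_rate) * (1 + markup)

us_price = retail_price(LIST_PRICE_USD, US_TARIFF, RETAIL_MARKUP)  # roughly $1375
row_price = retail_price(LIST_PRICE_USD, 0.0, RETAIL_MARKUP)       # roughly $1100
```

The manufacturer sees the same $1000 either way; the gap between the two retail prices is entirely the importer's tariff bill being passed down the chain in one market.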
I see what you're saying so we'll have to wait and see. I still think it will be used as an excuse to raise prices globally.
 
I see what you're saying so we'll have to wait and see. I still think it will be used as an excuse to raise prices globally.
Don't get me wrong, I don't disagree with you that opportunism strikes with higher prices. Look at what AMD tried to pull recently, so I get that.
 
I can see it now... 6000 series launch in 2.5 years. Jensen turns up on stage in trackie bottoms and top, pleading with gamers that he was only joking when he said we all have loads of dosh to afford $10K command centres.
I am old enough to remember similar tech leaders saying 3D TVs were the future, nobody wants a touchscreen phone, and 640KB should be enough for everyone. What Jensen says doesn't necessarily happen. For instance, the 5070 being as fast as a 4090.
 
I find the idea that a reduction in demand for AI GPUs will somehow reduce the cost of gaming GPUs unlikely, except in only the shortest of terms: maybe one generational fire sale to maintain company value, but that's about it. The only reason that Nvidia and TSMC could afford the insane R&D costs is the huge consecutive tech bubbles we've had. We are already well past the originally expected point of diminishing returns for silicon process nodes.

Maybe we will finally get optical chip interconnects. This was discussed even 20 years ago when I was at uni, and the advantages in energy consumption and signal propagation speed were well known then; even so, it still hasn't gone anywhere.

Maybe wide-bandgap materials will finally become cost effective and enable a step jump in clock speeds; so far they've always been held back by the vastly higher cost of the materials.

Sorry for the bleak outlook, but I genuinely think a real high-end PC will be close to Jensen's $10K again (as it was in the 90s).
 
Nvidia are a major player in the industrial revolution we are witnessing. Whether it's AI or robotics, healthcare or driving, it's clear that they are no longer prioritising gaming GPUs.
 
We've already hit peak AI; the actual intelligence part hasn't changed much, and AI is still pretty dumb. So what we're doing now is making the same dumb AI, but doing it cheaper, because we can't make it any better.

People will get fed up with it pretty quickly when they find out it won't tell them the winning lottery numbers, it won't pick the perfect partner for them, and it won't go and do their work whilst they stay in bed.
 
People will get fed up with it pretty quickly when they find out it won't tell them the winning lottery numbers, it won't pick the perfect partner for them, and it won't go and do their work whilst they stay in bed.
Yeah, but it will be cool to ask your phone whether the new mole on your back has cancer-like symptoms, automate some repetitive task like the coding assistants do, or diagnose a car problem via a back-and-forth conversation.

There are a lot of valid applications but we’re still very early in the process. I agree that it’s way overhyped in its current state, and shoved down our throats at every opportunity, but there is value.
 
I think people overhype "AI" anyway. It is dumb "AI". I asked ChatGPT who the president of the USA was yesterday; turns out it was Joe Biden... I believe Joe left the job nine days back.

AI is not really AI; it is just hyped-up machine learning that has created a bubble of over-investment in that sector. When they are openly discussing building nuclear reactors to power AI data centres, you know something has gone wrong.

If the Chinese "AI" works as cheaply as they say, then AMD, Nvidia, and other companies will have a lot of spare stock soon, forcing them to drop prices whether they want to or not.
 
I think people overhype "AI" anyway. It is dumb "AI". I asked ChatGPT who the president of the USA was yesterday; turns out it was Joe Biden... I believe Joe left the job nine days back.

AI is not really AI; it is just hyped-up machine learning that has created a bubble of over-investment in that sector. When they are openly discussing building nuclear reactors to power AI data centres, you know something has gone wrong.

If the Chinese "AI" works as cheaply as they say, then AMD, Nvidia, and other companies will have a lot of spare stock soon, forcing them to drop prices whether they want to or not.
The problem with ChatGPT is that it doesn't look for information unless you specifically ask for it (and in that case it does a fairly decent summary of web searches), so you're getting a statistically likely answer as of the last training-set update (which doesn't happen daily or weekly, given the costs!).
I understand why it's like this by default (it's way cheaper to give a statistical answer than to do retrieval-augmented generation), but it's the main reason why LLMs can be misleading.
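The stale-snapshot point above can be sketched in a few lines. This is purely illustrative (the dicts standing in for model weights and for a live search index are made up), but it shows the mechanism: without retrieval, the answer comes from whatever was true at training time; with retrieval, fresh context overrides it.

```python
# "Knowledge" frozen into the model's weights at the last training run.
TRAINING_SNAPSHOT = {"us_president": "Joe Biden"}  # stale after the inauguration

# A live source the retrieval step could consult (a dict standing in for
# a web search index in this toy example).
LIVE_INDEX = {"us_president": "Donald Trump"}

def answer_without_retrieval(question_key):
    """Statistical answer straight from the frozen training snapshot."""
    return TRAINING_SNAPSHOT.get(question_key, "I don't know")

def answer_with_retrieval(question_key):
    """RAG-style answer: try fresh retrieved context first, fall back to the weights."""
    retrieved = LIVE_INDEX.get(question_key)
    return retrieved if retrieved is not None else answer_without_retrieval(question_key)
```

Retraining (updating `TRAINING_SNAPSHOT` wholesale) is the expensive path; bolting on the lookup is cheap per query but only happens when the system is actually told to search.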

I do work with ML, but I'm far from an LLM expert; that said, I still think the current approach relies too much on brute force and too little on actual content analysis.
Frankly, most people suck at asking questions, and I've yet to see any LLM ask the user to detail what they want; instead they just spew what they think will please the user, which is good enough for, let's say, a retail environment, but unacceptable on many other topics.

This is utterly dangerous, because gen Alpha is growing up used to virtual assistants, and even part of gen Z (and, I suspect, no small number at older ages) is starting to rely on LLMs for emotional support.
I suspect we're ending up in the cyberpunk timeline after all; next stop, ED-209 as law enforcement...
 
I don't know enough to say for sure, but anything involving or coming out of China I would treat with a fistful of salt. The authorities over there will back any project or business if it furthers the government's agenda, makes China look strong, or hurts the West.
We've seen this play out with consumer goods like cheap mobile phones, which are nothing but Chinese spying devices, electric cars, and 5G technology.
 
As opposed to Korean (Samsung) phones, which are Korean spying devices, or expensive Chinese-built phones (you know, the cool ones with a piece-of-fruit logo), which datamine you for the cult of Steve?
You can also choose to be datamined by Sundar Pichai, Jeff Bezos, or Satya Nadella, according to your selection of devices...

Political bull aside, their model is open source, and censorship-free versions have already been released if you know where to look. The main issue might be bias in the training set, but that's pretty much the same problem other LLMs have.
 

OpenAI had security breaches in 2023 and 2024; I suspect a Chinese hacker probably stole OpenAI data and sold it to Liang Wenfeng, who owns both High-Flyer and DeepSeek.
The hard part is not the code; the hard part in an LLM is gathering the data, and rest assured a breach couldn't steal a datacentre...
 
Bet you weren't ready for it
[image attachment]
 