
AI in a CPU - what would/could it actually do?

We're living through 1984 right now, so I'm expecting Terminators soon enough.
When I see a naked man appear from nowhere in a ball of lightning, then I'll be worried.

I do think the AI hype train is near a peak, though, and will fade to something more normalised. I just can't see past the fact that it appears to be a means to infer and discern based on data that has gone before, rather than really looking forward, if you know what I mean.

Going back to my OP, I'm still trying to find something from the AI hype that makes me think 'oh, that would be really useful to me' in a desktop OS / CPU environment.
 
My dad and I have talked about movies from the '70s onwards, and what is 'scary' is how a lot of things have actually come true. Look at RoboCop 1 & 2. It was satire back then, but it's actually happening as we speak.
I wouldn't be surprised if the movie 'A Boy and His Dog' becomes reality lol.
 
I think we’ve already seen the peak as we’re seeing all of the failed ventures now.

Ignore all of the LLM stuff: having an NPU is nice purely because it is a) efficient at ML workloads, and b) takes that strain off the CPU. Background blur for your webcam is a useful example - depending on how it's implemented, it's pretty garbage on a CPU and can generate a lot of heat, whereas the NPU can do it without breaking a sweat. Laptops will benefit the most from it.
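To make the offload point concrete, here's a minimal sketch of the idea (the model file, its input shape and the provider choices are my assumptions for illustration, not anyone's official sample): load a small segmentation model of the sort used for background blur with ONNX Runtime, prefer an NPU/DirectML execution provider, and only fall back to the CPU if nothing better is available.

```python
import numpy as np
import onnxruntime as ort

MODEL_PATH = "background_segmentation.onnx"  # hypothetical segmentation model

# Providers are tried in order. These are real ONNX Runtime provider names,
# but which of them exist depends on your hardware and onnxruntime build.
preferred = [
    "QNNExecutionProvider",   # Qualcomm NPUs
    "DmlExecutionProvider",   # DirectML (GPU/NPU on Windows)
    "CPUExecutionProvider",   # always-available fallback
]
providers = [p for p in preferred if p in ort.get_available_providers()]

session = ort.InferenceSession(MODEL_PATH, providers=providers)
print("Running on:", session.get_providers()[0])

# Dummy 256x256 RGB frame, shaped to the model's (assumed) NCHW input.
frame = np.random.rand(1, 3, 256, 256).astype(np.float32)
input_name = session.get_inputs()[0].name
mask = session.run(None, {input_name: frame})[0]  # per-pixel foreground mask
```

The interesting bit is just the provider order: the app asks for the NPU first, and the CPU only ever sees the work when there's nothing else available to run it.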
 
The reason we are seeing all this consumer AI hardware is that all the companies providing AI services right now are eating a massive loss because they're paying for all the compute (ChatGPT, Copilot, Bard etc).

The next-generation phones and laptops are being specced with 16GB of RAM minimum and inference hardware, so consumers start paying the energy bills.
 
*Apple have entered the chat*

You’ll get 4GB of RAM and be happy about it.
 
Absolutely, hence the thread asking.

I'm now wondering how powerful the NPU sections of a CPU will be compared to a typical GPU. Would there be much point to having CPU processing if the GPU outstripped its computing power by miles?
Very good point. I'm sure CPUs will still have some use, but at the moment they definitely seem to be the limiting factor when it comes to most things.
 
If I read it rightly, the new CPUs are in the region of 40-50 TOPS, which is the right threshold to meet Windows Copilot+, but reasonable RTX 20xx cards start at about 100 and go up from there.

But then you read how lower cards are hampered by their VRAM amount, and AI performance can be boosted greatly by access to RAM… which a CPU could potentially have more of. I.e. a typical computer's 16GB of RAM is more than the average amount of GPU VRAM.
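To put rough numbers on that RAM-vs-VRAM point, here's a back-of-envelope sketch (the parameter counts, precisions and 20% overhead factor are assumptions for illustration, not benchmarks) of how big a locally-run model is at different precisions and where it would fit:

```python
# Back-of-envelope memory estimate for running a model locally.
# bytes_per_param: 2 for FP16, 1 for 8-bit, 0.5 for 4-bit quantisation.
def model_footprint_gb(params_billion, bytes_per_param, overhead=1.2):
    # 'overhead' is an assumed ~20% allowance for activations, caches, etc.
    return params_billion * bytes_per_param * overhead

for name, params in [("7B model", 7), ("13B model", 13)]:
    for precision, bpp in [("FP16", 2), ("8-bit", 1), ("4-bit", 0.5)]:
        gb = model_footprint_gb(params, bpp)
        print(f"{name} @ {precision}: ~{gb:.1f} GB "
              f"(fits 8GB VRAM: {gb <= 8}, fits 16GB system RAM: {gb <= 16})")
```

The takeaway is the arithmetic rather than the exact figures: an 8GB card runs out of room long before a machine with 16GB+ of system RAM does, even though the GPU has far more raw TOPS.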
 
Even if that were true, I doubt Microsoft would care. They would rather people be using/reliant on Windows than using a rival OS. Pirated or not.
This is where it's headed. Windows will be free, so your data and audience become the product.

macOS is free but obviously there’s the hardware tax. Linux is also free.
 
I'm not even on about, say, Win 7, 8 or even 10. Microsoft has known pretty much since Windows' inception that it is heavily pirated. They really don't care. The revenue from the patents they own far outweighs any losses from Windows licenses. They own loads of patents that all Android phones etc. use, and they make serious bank from that alone. The more people on Windows the better, simple. And they have achieved that.
 
Completely agree. I’m sure a large percentage of home users pirated Windows and Microsoft have never really made an effort to stop it or change their pricing model, but I could see them doing it now so that they can justify the data scraping.
 
Yeah, that's exactly what it is, data scraping/personal info. Hey, if they wanna watch 2 hot lesbians go at it with me, more power to them. I'll even get the drinks in, goddammit. I'm a god damn sexual tyrannosaur.
 