
AI in a CPU - what would/could it actually do?

AI is very much a thing going forward, in CPUs, software, OSes … but to me it seems that a lot of the big stuff is also backed by massive amounts of AI number-crunching hardware on the back end, or has been trained on massive datasets of pictures and literature.

Do the back ends for ChatGPT, Stable Diffusion and the like do all the learning on the large dataset, which then gets distilled into a much smaller 'thing' that can be run on a much smaller system?

So what is a chunk of silicon on a desktop CPU actually going to be able to do? I just don't get it. Can someone explain it in a simple manner?
 
Let's say you want to create a meme, or a piece of art, but you're not an artist. You simply explain to the AI what you want, you describe it, and the AI will do it for you; this can be as simple or as complex as you like.
Just to name one of thousands of use cases.

OK, so there is no reason why a CPU or GPU can't do that with its existing technology, but running the machine-learning neural net might take a lot of horsepower, a lot of power consumption and a lot of time. A dedicated AI accelerator might do it in a fraction of the time with a fraction of the power consumption.

My GPU, which has AI acceleration, did this in 3.9 seconds; the GPU barely even woke up from its sleep.

I used to think that AI was a fad. Having played with it, I don't any more.
Now you know why people who create content are raging against this, because now everyone is an artist, a content creator.
One more thing: AMD, Nvidia, Intel... all use AI to help them develop the hardware you're running.
Also, AI is being used to make breakthrough developments in cancer treatment.

Are we getting the picture yet? :)

 
From what I understand, it is comparable to the other instruction sets that your CPU supports, like AVX, which accelerate particular stuff your CPU might need to do regularly while running that workload.

I think the NPUs are a bit different and more comparable to the media encoding/decoding hardware in your GPU, in that they're offloading the task to a dedicated chip that is designed for the purpose, but I don't know how much offloading occurs (i.e. how independent they actually are from the CPU):

Wordy words:

Unlike general-purpose CPUs and GPUs, NPUs are optimized for data-driven parallel computing, making them highly efficient at processing massive multimedia data like videos and images and processing data for neural networks. They are particularly adept at handling AI-related tasks, such as speech recognition, background blurring in video calls, and photo or video editing processes like object detection.

NPUs are integrated circuits but they differ from single-function ASICs (Application-Specific Integrated Circuits). While ASICs are designed for a singular purpose (such as mining bitcoin), NPUs offer more complexity and flexibility, catering to the diverse demands of network computing. They achieve this through specialized programming in software or hardware, tailored to the unique requirements of neural network computations.
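To make the offloading idea a bit more concrete, here's a rough sketch of how it tends to look in software: with something like ONNX Runtime the same model can be handed to whichever execution provider is available, with the plain CPU as the fallback. The model file, input shape, and the QNN provider name below are placeholders/assumptions, not anything specific to these chips.

```python
# Illustrative sketch: the same network gets dispatched to different hardware
# just by changing the execution-provider list. "model.onnx" and the shapes
# are placeholders.
import numpy as np
import onnxruntime as ort

# Ask for the vendor's NPU provider if this build of onnxruntime has it,
# otherwise fall back to the plain CPU provider. "QNNExecutionProvider" is
# Qualcomm's; other vendors expose their NPUs through different providers.
wanted = ["QNNExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in wanted if p in ort.get_available_providers()]

session = ort.InferenceSession("model.onnx", providers=providers)

input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)

# The model is identical either way; only where the maths runs changes.
outputs = session.run(None, {input_name: dummy})
print("running on:", session.get_providers()[0], "output shape:", outputs[0].shape)
```

The application code barely changes, which is the point: the NPU is meant to be a drop-in place to run the same neural-network maths for less power.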
 
Currently most AI compute is just matrix multiplication. GPUs do this very fast, as graphics work also uses it a lot. CPUs can also do it, but much slower, even when using SSE/AVX. I think AI is moving to 1-bit LLMs, as they use a lot less memory and are much faster; what this means for the early NPUs I don't know, as the hardware for a 1-bit LLM is much simpler - it just uses matrix addition and subtraction.
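A toy illustration of that last point (shapes and values are made up, and real 1-bit/ternary LLMs like the BitNet papers do this at much larger scale): with ordinary weights a layer is a matrix multiply, but if every weight is constrained to -1, 0 or +1, the same layer reduces to adding and subtracting input values.

```python
# Toy sketch of why ternary (-1/0/+1) weights turn a matrix multiply into
# additions and subtractions. Sizes and values are made up for illustration.
import numpy as np

x = np.random.randn(4)                          # activations for one token
W = np.random.choice([-1, 0, 1], size=(3, 4))   # a "1-bit"/ternary weight matrix

# Ordinary path: full matrix multiplication.
y_matmul = W @ x

# Equivalent path with no multiplications: add where the weight is +1,
# subtract where it is -1, skip where it is 0.
y_addsub = np.zeros(3)
for row in range(3):
    y_addsub[row] = x[W[row] == 1].sum() - x[W[row] == -1].sum()

print(np.allclose(y_matmul, y_addsub))  # True - same result, cheaper hardware
```

That is why the hardware question is open: an accelerator built around wide multiply-accumulate arrays isn't obviously the right shape for a workload that has dropped the multiplies.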
 
Another case of companies inventing a non-existent problem so that they can sell you a solution.

There is no current application that a small NPU would solve for you. As useless as RTX on a 3060, for example.
 
The two driving factors are lower power use and lower upfront cost compared to other options. Offering dedicated hardware to accelerate workloads on the CPU adds almost nothing to a system's BOM compared to an add-in part.
 
At the moment, the best thing it can do is make people think they need a new CPU. It's great for the marketing team.
Oh, this new model has an NPU that does this many TOPS!? I probably need to buy it.

Down the line, I'm sure it will become useful.
 
I know I've been joking about my GT1030 lately, but with Copilot (Win10 desktop) I can ask it literally anything and within 5-10 seconds I have my answer. I got it to make me a Scottish Highlands desktop image and it only took about 8 seconds. It looks so realistic I actually have it as my desktop wallpaper. GPU or CPU? Don't know. This is with a W3850 Xeon, btw.
 

Tom's Hardware said:
Unfortunately, laptops with Intel Meteor Lake or AMD Ryzen 7000 CPUs aren’t powerful enough to make the cut so, if you bought one hoping it would benefit from future Windows updates, you wasted your money. The first Copilot+ PCs will only come with Qualcomm Snapdragon X series processors.
 

Interesting... for the notebooks, you have to first click on "show more" and then scroll through a lot of marketing slides to get to the bottom of the page, where you will find a small "Show all specs" button; only there will you find that all 3 of those notebooks are running Qualcomm.

It's hard to know if ARM on Windows is something Microsoft thinks is a good idea, or if it's because Qualcomm, in a bid to get their foot in the door of this market, are giving them away for about $25 a pop.

Who knows if this is a long-term thing or not; it seems to depend on how Microsoft feels about it when Qualcomm stops giving their chips away, if that's what they are doing.
 
Let's say you want to create a meme, or a piece of art, but you're not an artist. You simply explain to the AI what you want, you describe it, and the AI will do it for you; this can be as simple or as complex as you like.
Just to name one of thousands of use cases.

OK, so there is no reason why a CPU or GPU can't do that with its existing technology, but running the machine-learning neural net might take a lot of horsepower, a lot of power consumption and a lot of time. A dedicated AI accelerator might do it in a fraction of the time with a fraction of the power consumption.

My GPU, which has AI acceleration, did this in 3.9 seconds; the GPU barely even woke up from its sleep.

Are we getting the picture yet? :)
So is your Stable Diffusion running solely on your own machine, and not having to send/fetch anything from a back end?

I'm just wondering how it knows what is a mountain, sky, etc. … that is what I meant by distilling the large dataset into a smaller 'thing' that runs.
 
It runs entirely offline. It's pretty magical tbh.
You can download different checkpoints, I guess trained on different data.
I have no real need to use it, but it's pretty awesome for generating weird space/planet/explorer style landscapes. I've considered using it for creating teaching flashcards, but currently I prefer to use my own drawings.

Pretty sure the current low-power NPUs in AMD/Intel chips would be rubbish at SD. No idea, but I've not heard of anyone using them for this.
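For anyone wondering what "entirely offline" looks like in practice, local generation with the Hugging Face diffusers library goes roughly like this. A rough sketch only: the checkpoint name, prompt and the CUDA assumption are mine, and the weights still have to be downloaded once before it works without a connection.

```python
# Rough sketch of running Stable Diffusion locally with Hugging Face diffusers.
# Checkpoint name and prompt are illustrative; after the one-off download of
# the weights, generation needs no network connection at all.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # any SD checkpoint you've downloaded
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes an NVIDIA GPU; on CPU drop the float16 and expect it to be slow

image = pipe("misty Scottish Highlands at dawn, photorealistic").images[0]
image.save("highlands.png")
```

The "checkpoint" in the post above is just that weights file: swap it for one trained/fine-tuned on different data and the same few lines produce a different style of output.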
 
What it needs to replace are these weird little talentless influencers. Let an AI do all the "influencing"
Amen. Probably fabricated for clicks anyway judging by the gormless thumbnail.

Already happening, with a lot of advertising content going AI-driven. Talentless coasters are the ones in trouble; skilled artists and higher-performing people are going to be fine if they embrace AI tools.
 
I watched that video and it does really make you think that a lot of jobs will be at risk… but, as above, you will still need an element of someone coming up with the ideas of what to draw first; then AI can help derive content from that supplied idea.

I installed Fooocus to play around with things. I have got to say it's flipping amazing what gets generated, and I could really see someone savvy making good use of it.

So I do see how AI/NPU processing performance on a chip helps that side of things.
 