AI in a CPU - what would/could it actually do?

Associate · Joined 22 Jun 2018 · Posts 1,596 · Location: Doon the watah ... Scotland
AI is very much a thing going forward, in CPUs, software, OSes … but to me it seems that a lot of the big stuff is also backed by massive amounts of AI number-crunching hardware, or has been trained on massive datasets of pictures and literature.

Do the backends of ChatGPT, Stable Diffusion and the like do all the learning on the large dataset, which then gets distilled into a much smaller ‘thing’ that can be run on a much smaller system?

So what is a chunk of silicon on a desktop CPU actually going to be able to do? I just don't get it. Can someone explain it in a simple manner?
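To make the "distilled thing" idea concrete: training happens once on the big hardware, and what ends up shipped is a much smaller model. A toy sketch of knowledge distillation in PyTorch, with made-up model sizes and random data standing in for the real thing:

[CODE]
# A toy sketch of knowledge distillation in PyTorch: a big "teacher"
# network stands in for the model trained on the huge dataset, and a
# much smaller "student" is trained to mimic its outputs. The student's
# weights are the small "thing" you would actually ship and run locally.
# Sizes and data here are made up for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(512, 4096), nn.ReLU(), nn.Linear(4096, 10))
student = nn.Sequential(nn.Linear(512, 64), nn.ReLU(), nn.Linear(64, 10))

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0  # temperature: softens the teacher's outputs

for step in range(100):                       # toy training loop
    x = torch.randn(32, 512)                  # stand-in for real inputs
    with torch.no_grad():
        target = F.softmax(teacher(x) / T, dim=-1)
    loss = F.kl_div(F.log_softmax(student(x) / T, dim=-1),
                    target, reduction="batchmean")
    opt.zero_grad()
    loss.backward()
    opt.step()

# The distilled student is just a small weights file:
torch.save(student.state_dict(), "student.pt")
[/CODE]

Whatever the student managed to absorb from the teacher now lives in student.pt — that file is the distilled, much smaller thing.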
 
Associate · OP · Joined 22 Jun 2018 · Posts 1,596 · Location: Doon the watah ... Scotland
Let's say you wanted to create a meme, or a piece of art, but you're not an artist, so you simply explain to the AI what you want; you describe it and the AI will do it for you. This can be as simple or as complex as you like.
Just to name one of thousands of use cases.

Ok, so there is no reason why a CPU or GPU can't do that with its existing technology, but running the machine-learning neural net might take a lot of horsepower, a lot of power consumption and a lot of time; a specific AI accelerator might do it in a fraction of the time with a fraction of the power consumption.

My GPU, which has AI hardware, did this in 3.9 seconds; the GPU barely even woke up from its sleep.

Are we getting the picture yet? :)
So is your Stable Diffusion running solely on your own machine, and not having to send/fetch anything from a back end?

I’m just wondering how it knows what a mountain, sky etc. is … that is what I meant by distilling the large dataset into a smaller ‘thing’ that runs.
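On the "solely on your own machine" question: once the weights are downloaded, a tool like Stable Diffusion needs no backend at all. A minimal sketch using the Hugging Face diffusers library, assuming a CUDA GPU; the model ID below is one public SD 1.5 checkpoint and may need swapping for another:

[CODE]
# A minimal fully-local Stable Diffusion run using Hugging Face's
# diffusers library. Assumes a CUDA GPU with ~4GB of free VRAM; the
# model ID is one public SD 1.5 checkpoint and may need swapping.
import torch
from diffusers import StableDiffusionPipeline

# The first call downloads the weights (a few GB); after that, no
# backend is involved at all -- generation is entirely on your machine.
pipe = StableDiffusionPipeline.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

image = pipe("a mountain under a clear sky, golden hour").images[0]
image.save("mountain.png")
[/CODE]

As for how it knows what a mountain or a sky is: those word-image associations were learned during training and are baked into the downloaded weight files — which is exactly the distilled, smaller ‘thing’.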
 
Associate · OP · Joined 22 Jun 2018 · Posts 1,596 · Location: Doon the watah ... Scotland
I watched that video and it does really make you think that a lot of jobs will be at risk … but as above, you will still need someone to come up with the idea of what to draw in the first place; the AI can then help derive content from that supplied idea.

I installed Fooocus to play around with things. I have got to say it’s flipping amazing what gets generated, and I could really see someone savvy making good use of it.

So I do see how AI NPU processing performance on a chip helps that side of things.
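For a sense of what "NPU on a chip" looks like from the software side, one common route is ONNX Runtime, where an application asks for an NPU-backed execution provider and falls back to the CPU. A hedged sketch — provider names vary by platform and build, and "model.onnx" is a hypothetical file:

[CODE]
# A hedged sketch of how software targets an NPU: ONNX Runtime picks an
# "execution provider" per device. Provider names vary by platform and
# build, and "model.onnx" is a hypothetical file, not a real one.
import onnxruntime as ort

available = ort.get_available_providers()
print(available)   # e.g. CPU plus CUDA / DirectML / QNN (NPU) providers

# Prefer an NPU-capable provider if this build exposes one, else CPU.
wanted = [p for p in ("QNNExecutionProvider",   # Qualcomm NPUs
                      "DmlExecutionProvider")   # DirectML on Windows
          if p in available] + ["CPUExecutionProvider"]

session = ort.InferenceSession("model.onnx", providers=wanted)
[/CODE]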
 
Associate · OP · Joined 22 Jun 2018 · Posts 1,596 · Location: Doon the watah ... Scotland
There are definitely some positives to AI, but atm it's just a buzzword, and most people have no idea how it'll help them out day to day
Absolutely, hence the thread asking.

I’m now wondering how powerful the NPU sections of a CPU will be compared to a typical GPU. Would there be much point to having CPU processing if the GPU outstripped its computing power by miles?
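One rough way to answer that for your own hardware is to time a big matrix multiply on the CPU and the GPU and compare effective TFLOPS. A quick PyTorch sketch — not a rigorous benchmark, it measures raw matmul throughput and nothing else:

[CODE]
# A rough way to compare raw compute on your own CPU vs GPU: time a big
# matrix multiply and convert to effective TFLOPS. Not a rigorous
# benchmark -- it measures matmul throughput only, nothing else.
import time
import torch

def tflops(device, n=4096, reps=10):
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    a @ b                                      # warm-up
    if device == "cuda":
        torch.cuda.synchronize()
    t0 = time.perf_counter()
    for _ in range(reps):
        a @ b
    if device == "cuda":
        torch.cuda.synchronize()
    dt = time.perf_counter() - t0
    return reps * 2 * n**3 / dt / 1e12         # ~2*n^3 FLOPs per matmul

print(f"CPU: {tflops('cpu'):.2f} TFLOPS")
if torch.cuda.is_available():
    print(f"GPU: {tflops('cuda'):.2f} TFLOPS")
[/CODE]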
 
Associate · OP · Joined 22 Jun 2018 · Posts 1,596 · Location: Doon the watah ... Scotland
We're living through 1984 right now, so I'm expecting Terminators soon enough.
When I see a naked man appear from nowhere in a ball of lightning, then I’ll be worried.

I do think the AI hype train is near a peak, though, and will fade to something more normalised. I just can’t see past the fact that it appears to be a means to infer and discern based on data that has gone before, rather than really looking forward, if you know what I mean.

Going back to my OP, I’m still trying to find something from the AI hype that makes me think ‘oh, that would be really useful to me’ in a desktop OS / CPU environment.
 
Associate · OP · Joined 22 Jun 2018 · Posts 1,596 · Location: Doon the watah ... Scotland
If I read it rightly, the new CPUs are in the region of 40-50 TOPS, which is the threshold to meet Windows Copilot+, but reasonable RTX 20xx cards start at about 100 and go up from there.

But then you read how lower-end cards are hampered by their VRAM amount, and AI performance can be boosted greatly by access to RAM … which a CPU could potentially have more of, i.e. a typical computer's 16GB of RAM is more than the average amount of GPU VRAM.
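The VRAM point can be put into rough numbers: a model's weights need about (parameter count) × (bytes per parameter) of memory, so a 7B-parameter model in fp16 already overflows an 8GB card but fits in 16GB of system RAM. A back-of-envelope sketch, with illustrative figures rather than measurements:

[CODE]
# Back-of-envelope memory maths for the VRAM point above: weights alone
# take roughly (parameter count) x (bytes per parameter). Figures are
# illustrative, not measurements of any particular model or card.
def weights_gb(params_billions, bytes_per_param):
    return params_billions * 1e9 * bytes_per_param / 1024**3

for params, fmt, nbytes in [(1.5, "fp16", 2), (7, "fp16", 2), (7, "int4", 0.5)]:
    print(f"{params}B params @ {fmt}: ~{weights_gb(params, nbytes):.1f} GB")

# 1.5B @ fp16: ~2.8 GB  -> fits comfortably in 8GB of VRAM
# 7B   @ fp16: ~13.0 GB -> too big for 8GB VRAM, fits in 16GB system RAM
# 7B   @ int4: ~3.3 GB  -> quantisation squeezes big models onto small cards
[/CODE]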
 