AI is very much a thing going forward, in CPUs, software, OSes… but to me it seems that a lot of the big stuff is backed by massive amounts of AI number-crunching hardware, or has been trained on massive datasets of pictures and literature.
Do the backends of ChatGPT, Stable Diffusion, etc. do all the learning on the large dataset, which then gets distilled into a much smaller "thing" that can be run on a much smaller system?
So what is a chunk of silicon on a desktop CPU actually going to be able to do? I just don't get it. Can someone explain it in a simple manner?
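To make concrete what I mean by running the distilled "thing" locally, here's a rough sketch of the pattern as I understand it, using Hugging Face's transformers library and distilgpt2 (a distilled GPT-2) as a stand-in for any small model; the expensive training already happened in a datacenter, and the desktop only does inference:

```python
# Sketch: the heavy lifting (training on massive datasets) was done elsewhere.
# Locally we just download the finished, much smaller weights and run inference
# on an ordinary CPU. Requires: pip install transformers torch
from transformers import pipeline

# distilgpt2 is a distilled (smaller, faster) version of GPT-2
generator = pipeline("text-generation", model="distilgpt2", device=-1)  # device=-1 = CPU

print(generator("AI on a desktop CPU can", max_new_tokens=30)[0]["generated_text"])
```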