New high-end GPUs are hugely power-hungry. The 40 series is expected to draw 450 W to 600 W, maybe more. The internal form factor imposes considerable restrictions on cooling and noise control, which is a quandary. But we now have high-bandwidth external connections like USB 4 and 100 Gb Ethernet, so the question arises: does the GPU really need to be an internal device anymore? Why not make the GPU its own device, external to the PC? Not just a standard card sitting on an adapter in a box like current eGPU solutions, but a purpose-built external unit. Making the GPU external would allow for innovative design, cooling, and power solutions while removing the heat and noise problems from the PC.
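For a rough sense of scale, here's a quick back-of-the-envelope sketch comparing headline link rates of those external connections against a PCIe x16 slot. These are nominal figures I'm assuming, ignoring encoding and protocol overhead, so real-world throughput is lower:

```python
# Nominal link bandwidths, ignoring encoding/protocol overhead.
links_gbps = {
    "USB4":         40,       # USB4 Gen 3x2 headline rate
    "100 GbE":      100,
    "PCIe 4.0 x16": 16 * 16,  # ~16 GT/s per lane x 16 lanes
    "PCIe 5.0 x16": 32 * 16,  # ~32 GT/s per lane x 16 lanes
}

for name, gbps in links_gbps.items():
    print(f"{name:>12}: {gbps:4d} Gb/s  (~{gbps / 8:.0f} GB/s)")
```

So an external link today gives you a fraction of a full x16 slot, which is why latency-tolerant workloads are the more obvious fit unless faster external interconnects appear.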
It also frees up PCIe lanes in the PC.
Low-power GPUs can still go in the PC.
What do you think?
We're definitely heading towards more integration. Ideally they want to sell you CPU/GPU combos, forcing you to upgrade both at the same time and not allowing you to get a big CPU without a big GPU (or vice versa). Apple is already doing it, Nvidia is doing it in the data centre, and it's only a matter of time before AMD does it too. With Intel coming into the GPU market, they are also in a position to do it.
Apple and Nvidia (in the data centre) are also already doing unified CPU/GPU cache and memory; AMD is moving towards it every generation, and Intel won't be far behind.
What's likely to happen is that Intel/Nvidia/AMD will release a "motherboard" that has soldered CPU, GPU and RAM. Some of the lower-end ones will have the CPU and GPU in one chip, while higher-end models will have them on separate dies or chiplets. Some AIBs may offer various versions of the motherboard with different ports, integrated features or even support for additional RAM. You put it in your PC, attach your own cooling and storage, and connect your power supply. This will be the future of DIY PCs.