What does that mean though?
Should everybody be running a 1200W PSU just to keep 600W of spare capacity for spikes?
I won't pretend to fully understand how these transient spikes work. All I can say is that the highest at-wall draw I have seen is 560-580W with a 100% load on both CPU and GPU...
So I have roughly 300W of headroom, at minimum, for spikes?
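Rough back-of-the-envelope for that headroom figure. The PSU rating and efficiency here are assumptions I've plugged in for illustration (the post doesn't name the unit), and the key point is that the at-wall number includes the PSU's own losses, so the DC load the supply actually delivers is a bit lower:

```python
# Rough headroom check -- PSU rating and efficiency are assumed values,
# plug in your own unit's numbers.
psu_rating_w = 850          # assumed PSU capacity (not stated in the post)
measured_at_wall_w = 580    # highest at-wall draw observed
efficiency = 0.92           # assumed efficiency at this load (80+ Gold-ish)

# At-wall draw includes conversion losses; DC load on the PSU is lower.
dc_load_w = measured_at_wall_w * efficiency
headroom_w = psu_rating_w - dc_load_w
print(f"DC load ~ {dc_load_w:.0f}W, headroom ~ {headroom_w:.0f}W")
```

With those assumed numbers you land at roughly 530W of sustained DC load and a bit over 300W of rated headroom, which matches the ballpark above.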
Here OptimumTech tests 600W PSUs with 3080s:
https://www.youtube.com/watch?v=Bdohv96uGLw&t=3s
Well, a 3090 is well outside my comfort zone, so I haven't paid that much attention to it. But didn't Nvidia tame the beast a bit after release, so it is now a lot less bursty at the cost of a small bit of performance (I think it was only 1% or so)?
But a PSU should be able to handle some bursts too, which is why some people are fine with a 650W unit and a 3080 while others have trouble even with 850W. Even good-quality supplies handle bursts differently, and some will treat a millisecond-long burst as a good reason to shut down rather than risk burning something out. Both approaches are legitimate, but the trigger-happy one doesn't play too well with a GPU boost bursting to nearly 600W for a bit.

I do blame Nvidia somewhat here rather than the PSU makers: either cut back the burst and lose 1% or less of performance (which is what they did, AFAIR), or put a lot of expensive capacitors on the GPU board to smooth out the burst (if that is even possible).
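The "same burst, different outcome" point can be sketched as a toy model. Both over-power-protection policies below, and all the numbers (trip threshold, window length, wattages), are made-up illustrative assumptions, not real PSU specs: a strict policy trips on any single sample over the limit, while a lenient one averages over a short window and rides out a brief spike.

```python
# Toy model of two over-power-protection (OPP) policies.
# All numbers are illustrative assumptions, not real PSU behavior.
opp_limit_w = 780   # assumed trip threshold for a 650W-class unit
window_ms = 10      # assumed averaging window for the lenient policy

# 1 ms samples: ~500W steady load with a 3 ms GPU boost burst to 800W.
trace_w = [500] * 20
for i in range(8, 11):
    trace_w[i] = 800

# Strict policy: shut down if any instantaneous sample exceeds the limit.
strict_trip = any(p > opp_limit_w for p in trace_w)

# Lenient policy: shut down only if a 10 ms sliding average exceeds it.
windows = range(len(trace_w) - window_ms + 1)
avgs = [sum(trace_w[i:i + window_ms]) / window_ms for i in windows]
lenient_trip = any(a > opp_limit_w for a in avgs)

print(strict_trip, lenient_trip)  # the strict unit trips, the lenient one rides it out
```

Same load trace, two legitimate protection designs, opposite results, which is roughly why identical builds behave so differently on different supplies.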