Yup, there are several things going on. As shown, the 376W is total system power; you can't work out GPU power draw from a wall-plug monitor, and that's where people measure it. Some people quote efficiency-adjusted numbers (accounting for the difference between what's drawn at the plug and what the system actually uses, since the PSU loses some of that power as heat), and some don't.
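To make the efficiency adjustment concrete, here's a minimal sketch. The 85% efficiency figure is an assumed example, not a measured value for any particular PSU:

```python
# Sketch of the efficiency adjustment: wall-plug draw vs. what the
# system actually receives. The 85% PSU efficiency is an assumed
# example value -- real PSUs vary with load.

def system_power_from_wall(wall_watts, psu_efficiency=0.85):
    """Power actually delivered to components, given wall-plug draw."""
    return wall_watts * psu_efficiency

# 376W at the wall with an ~85% efficient PSU is roughly 320W
# reaching the components -- and that's still TOTAL system power,
# not the GPU alone.
print(round(system_power_from_wall(376)))  # 320
```

So even after the adjustment you still can't isolate the GPU's share without knowing what the CPU, drives, and the rest are pulling.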
Most games tend to use less than the "full" power of the card, since it isn't completely loaded every second, but other apps will use all the power available. Then you have to account for problems like Furmark giving inaccurate numbers under SLI/Crossfire.
The first [H] review showed Furmark going up only 100W from one 480gtx to 480gtx SLI, which isn't much, and they incorrectly concluded from this that the GPU doesn't draw close to 300W.
Their next review showed 470gtx's in SLI using 250W more than one on its own, but IN A GAME (Metro 2033). Furmark doesn't accurately load dual GPUs.
In a worst-case scenario I'd expect quad 480gtx's to be drawing 1200W on their own, plus another 200W for everything else. In games, where the 2nd/3rd/4th GPUs don't scale that well and so draw less power, average it out to 220W per GPU and you're at roughly 900W for the GPUs and 200W for the other bits and bobs.
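The arithmetic above can be laid out as a quick budget. All figures are the estimates quoted in the post, not measurements (the gaming GPU total comes out to 880W, which the post rounds to about 900W):

```python
# Rough power budget from the estimates above: worst case vs. typical
# gaming load for a quad-GPU setup. These are the post's estimated
# figures, not measured numbers.

GPUS = 4
WORST_CASE_PER_GPU = 300   # W, fully loaded (Furmark-style)
GAMING_AVG_PER_GPU = 220   # W, poorer scaling on GPUs 2-4
REST_OF_SYSTEM = 200       # W, CPU, drives, fans, etc.

worst_case = GPUS * WORST_CASE_PER_GPU + REST_OF_SYSTEM
gaming = GPUS * GAMING_AVG_PER_GPU + REST_OF_SYSTEM

print(worst_case)  # 1400
print(gaming)      # 1080
```

The gap between the 1080W gaming estimate and the 1400W worst case is exactly why you size the PSU for the worst case, not the typical one.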
I wouldn't even consider quad SLI on less than a 1500W PSU, maybe even a 2kW PSU to be honest, because why risk it? You might generally only play games, but if you stumble on some awesome PhysX game in the future, or forget you left something GPU-accelerated running in the background, or a Flash page using acceleration bugs out and puts a massive load on the GPU, I wouldn't risk having less capacity than they could all use in the worst-case situation.