In reply to the lovely yellow text, my green text...
Incidentally, I do agree with you, a fuse is a last line of defence to protect you in the event of too much current being drawn. It is, however, largely irrelevant to my situation as I hope never to draw that sort of current. To be anal, the maths given the worst-case scenario would be a total draw at the wall of 921.5W; add 10% for losses (very generous) and that's 1013.65W, which at 230V is 4.407A, giving me plenty of headroom under my 10A strip's rating.
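For anyone who wants to sanity-check that arithmetic, here's a rough Python sketch of the worst-case sum (the 921.5W figure comes from the kit I list further down, and the 10% loss margin is just my own generous guess):

```python
# Worst-case current at the wall, assuming a nominal 230V UK supply and
# my own estimated 921.5W total (PC + monitor + speakers + Hue bulb).
worst_case_watts = 921.5
with_losses = worst_case_watts * 1.10   # add a (very generous) 10% for losses
voltage = 230.0                          # nominal UK mains voltage

current_amps = with_losses / voltage     # I = P / V
headroom_amps = 10.0 - current_amps      # first strip is rated at 10A

print(f"Draw: {current_amps:.3f} A, headroom: {headroom_amps:.3f} A")
# Draw: 4.407 A, headroom: 5.593 A
```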
I've always been told this was an absolute no-no and thus have adhered all my life to the rule that this should never be done. Good. Keep it up.
However with the outlets in my current flat it's really bloody awkward to have extension leads all over the place... Oh-Oh
...and life would be considerably easier if I could daisy chain a few together. Ah, an easy life. When did that ever lead to trouble?
So, putting my logical head on, I cannot see the problem? Oh well, that's it decided then!
So long as the total load across two power strips does not exceed the total power limit of the first strip, it shouldn't cause an issue? Wrong. The maximum load of the first strip is based on the characteristics of the first strip... once you add another strip onto it you then have something different. You did say you had your logical head on, right? Surely the second strip is seen by the first strip as a single load though? My first strip is rated at 10A, therefore it can take 2.3kW. Remove the 8.5W (let's call it 10 for easy maths) and that leaves a theoretical maximum of 2.29kW to "spare". If the second strip draws less than this figure, surely the first strip does not know whether that load comes from multiple sources on another strip or from, say, a kettle or a fan heater, which nobody would think twice about running in this situation?
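A quick sketch of that headroom sum, assuming the strip really is rated at 10A on a 230V supply and rounding the Hue bulb up to 10W as above:

```python
# Theoretical spare capacity on strip A, assuming a 10A rating at 230V
# with only the (rounded-up) 10W Hue bulb already plugged into it.
strip_rating_amps = 10.0
voltage = 230.0

strip_capacity_watts = strip_rating_amps * voltage   # 2300W, i.e. 2.3kW
bulb_watts = 10.0                                     # 8.5W rounded for easy maths

spare_watts = strip_capacity_watts - bulb_watts
print(f"Spare capacity on strip A: {spare_watts:.0f} W")   # 2290 W
```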
Perhaps the resistance of the extra connections may cause some reduction in the power limit, but I imagine this is minimal on the high-voltage, low-current supply we have in the UK? You have quite the imagination. High voltage/low amperage? That's not unique to the UK, it's part of the laws of the Universe. It's true that voltage is inversely proportional to current (for a given power), but I've no idea what the relevance is. Using words like "perhaps" and "imagine" should be enough of an indicator that you're having a stab in the dark here, mate. A little bit of stabbing, otherwise I would not have started this thread. The relevance of I vs V is important, as high currents cause heat when passed across a resistance, not high voltage. Going by Joule heating, where P = I^2 R, a halving of voltage (such as in America) would double the current draw for the same power, and thus give a 4x increase in the heat produced if that current were passed along a wire of fixed resistance. We could expand this into the Peltier effect to measure the heat increase across the terminals within my plugs, but I don't know the Peltier coefficients of the components in my extension leads, so we can count those as negligible.
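To put rough numbers on that, here's a sketch comparing the lead heating for the same load at UK voltage and roughly half of it. The 0.05 ohm cable resistance is an arbitrary figure picked purely for illustration, not a measured value:

```python
# Joule heating in the lead for the same delivered power at two supply voltages.
# Assumes a fixed cable/contact resistance; 0.05 ohm is an arbitrary example value.
def cable_heat_watts(load_watts, supply_volts, resistance_ohms=0.05):
    current = load_watts / supply_volts        # I = P / V
    return current ** 2 * resistance_ohms      # P_heat = I^2 * R

load = 1013.65  # worst-case draw including losses, from earlier

uk = cable_heat_watts(load, 230.0)
us = cable_heat_watts(load, 115.0)

print(f"230V: {uk:.2f} W dissipated, 115V: {us:.2f} W dissipated")
print(f"Ratio: {us / uk:.1f}x")  # halving the voltage gives ~4x the heat
```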
If I were running several plugs at their limits, the way the power is delivered (as in P = IV) would be incredibly important if heat is your main concern.
So before I go and blow myself up, I am looking to plug strip B into strip A. Strip A will have one Philips Hue bulb only and be plugged into the wall. Strip B will have my PC (750W PSU with only one GPU, I'd estimate ~400W max.), my monitor and some KEF Egg speakers. These will be plugged into strip A through a WiFi plug. So you've convinced yourself that "So long as the total load across two power strips does not exceed the total power limit of the first strip, it shouldn't cause an issue" ... and then you've decided to stick a WiFi plug right in the middle of it all. The purpose of a WiFi plug is to be the conduit for 3 extension leads carrying various items of electrical equipment? Really? Incorrect, the WiFi plug carries only the computer kit, not the 8.5W (perhaps a little more due to the resistance of my Ikea lamp's wiring) from the Philips Hue bulb. Even if I maxed out my 750W PSU, my monitor (according to Samsung) draws a maximum of 100W and my speakers run off a 63W brick (if for some reason they drew the max current), so I would have a maximum of 913W. What is the difference between an extension lead with this combined power usage and, say, a 900W microwave?
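Summing those nameplate maxima (which these devices would rarely, if ever, hit at the same time), this is how the 913W and 921.5W figures fall out:

```python
# Worst-case loads per strip, using nameplate maxima for my kit.
# Real-world draw will be well below this; a PSU rating is a ceiling, not typical use.
strip_b_loads = {
    "PC (750W PSU, single GPU)": 750,
    "Samsung monitor (max)": 100,
    "KEF Egg speakers (63W brick)": 63,
}
strip_a_loads = {
    "Philips Hue bulb": 8.5,
}

strip_b_total = sum(strip_b_loads.values())               # 913W through the WiFi plug
wall_total = strip_b_total + sum(strip_a_loads.values())  # 921.5W at the wall

print(f"Strip B / WiFi plug: {strip_b_total} W")
print(f"Total at the wall:   {wall_total} W")
```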
Am I stupid? Probably not, but you do give an insight into the mental thought process that we go through as human beings, all to justify doing something that we knew was wrong right from the start. Correct, I wanted to run this setup as it's the only easy way to do it within the room my PC resides in, and as a human I am lazy and want to have my cake and eat it. Or should that be: I want my PC on the far right-hand side of my room with the nice view, and I want to power it? The whole point of a forum is to gain advice or discuss. In this case I was hoping for reasons why it is not a good idea to do it. Right now you've said it's not, but you have yet to provide real reasons why it's a bad idea?