Old DC Power Supplies... Dangerous?

Soldato · Joined: 20 Oct 2002 · Posts: 3,469 · Location: London, UK
The other week, I was testing the DC output of the 3V power adapter for my alarm clock because it had stopped working. I used a digital multimeter, and to my surprise it was outputting nearly 6V DC, despite the label clearly showing 3V :eek:

Whilst I still had my multimeter out, I wondered if it was normal for old adapters to output tonnes of extra voltage and/or current, and so I got out all my other power adapters I had sitting in my man drawer, and started testing them one by one (yes, I was bored :p).

Turns out, nearly every power adapter I had that was supposed to output 12V DC now gives between 18 and 20V, which is bad in itself; most output 5x the rated amps, and one of them, rated for 1000mA, turned out to output just over 9 amps (9000mA)!!! :eek: That's more than enough to kill somebody, surely?

Just to put that in perspective, most standard plugs have 13 amp fuses, and that's meant to be pretty extreme to say the least, intended for situations like the live wire being shorted. Suffice it to say I threw out all the dodgy power adapters before somebody plugged them into something and fried the hell out of it.

Then I had an epiphany... perhaps the reason I have all these spare adapters and no corresponding appliances is that the appliances were thrown out after being fried... by these very adapters. :rolleyes:

I did a quick search on Google to see if there were any articles on old adapters losing their voltage reduction capabilities, but found nothing.

Anyone else had random appliances die on them for apparently no reason? Perhaps check the actual DC voltage of the power supply adapter and you may have your answer ;)
 
IIRC unregulated PSUs can/will often output a higher voltage than rated if they're under no load; most cheaper transformer-based adapters are (or were) unregulated.

I've got a variable-voltage regulated supply here, and it's about 3-4 times the size of the unregulated ones I've got.
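A rough way to picture this: cheap unregulated supplies are specified at full load, and their "regulation" figure says how far the output rises when unloaded. A minimal sketch, with illustrative numbers (the 50% regulation figure is an assumption, not a measurement from this thread):

```python
# Rough sketch of why an unregulated "wall wart" reads high with no load.
# Illustrative numbers, not measurements.

def no_load_voltage(v_rated, regulation_pct):
    """Unloaded output of an unregulated supply.

    Cheap transformers are wound to hit the rated voltage at full load;
    with no load the output rises by the 'regulation' percentage.
    """
    return v_rated * (1 + regulation_pct / 100)

# A small 12V adapter with ~50% regulation (plausible for a cheap wall wart)
print(no_load_voltage(12, 50))  # -> 18.0, right in the 18-20V range reported
```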
 
The other week, I was testing the DC output of the 3V power adapter for my alarm clock because it had stopped working. I used a digital multimeter, and to my surprise it was outputting nearly 6V DC, despite the label clearly showing 3V :eek:

Whilst I still had my multimeter out, I wondered if it was normal for old adapters to output tonnes of extra voltage and/or current, and so I got out all my other power adapters I had sitting in my man drawer, and started testing them one by one (yes, I was bored :p).

Turns out, nearly every power adapter I had that was supposed to output 12V DC now gives between 18 and 20V, which is bad in itself; most output 5x the rated amps, and one of them, rated for 1000mA, turned out to output just over 9 amps (9000mA)!!! :eek: That's more than enough to kill somebody, surely?

If you connect a multimeter in current mode to it and nothing else, of course you're going to get a huge reading, because you're effectively shorting it. Ammeters are supposed to be connected in series with a circuit under load (if you want to determine the current it outputs in normal operation). I = V/R: if you have almost no resistance, you're going to get a very high current; if you short an AA battery you will get a current of tens of amps. And no, it won't kill you, because I = V/R. ;)
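Putting numbers on that: in current mode the meter's shunt is nearly a dead short, so only the supply's own internal resistance limits the current. A quick sketch, assuming an 18V open-circuit reading, ~2 ohms of winding resistance and a 0.05 ohm meter shunt (all illustrative figures, not measurements from the thread):

```python
# Why sticking an ammeter straight across a supply reads huge currents:
# the meter's shunt is close to 0 ohms, so only the supply's internal
# resistance limits the current (I = V / R). Illustrative numbers.

def short_circuit_current(v_open, r_internal, r_shunt=0.05):
    # total loop resistance is the supply's internal resistance
    # plus the ammeter's tiny shunt resistance
    return v_open / (r_internal + r_shunt)

# A "12V 1A" unregulated adapter reading ~18V unloaded, with ~2 ohms of
# winding resistance, dumps far more than its 1A rating into the meter:
print(round(short_circuit_current(18.0, 2.0), 1))  # roughly 8.8 A
```

Which is in the same ballpark as the ~9A reading reported above.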
 
Last edited:
If you connect a multimeter in current mode to it and nothing else, of course you're going to get a huge reading, because you're effectively shorting it. Ammeters are supposed to be connected in series with a circuit under load. I = V/R: if you have almost no resistance, you're going to get a very high current; if you short an AA battery you will get a current of tens of amps. And no, it won't kill you, because I = V/R. ;)

Edit: Bleh, nice ninja edit :p

Ah yes, regarding amps, I have an A in A-level Physics as well, can't believe I missed that lol, must be cos it was 4am :p

But that still doesn't explain the higher voltages, I mean if I test ordinary 1.5V AA batteries they do show as 1.5V... :confused:
 
Last edited:
Edit: Bleh, nice ninja edit :p

But that still doesn't explain the higher voltages

Bad multimeter or incorrect setting? If the input resistance of the multimeter is high, the voltage across it will increase. The only worrying thing is that the 1A transformer did not cut out after outputting 9A!
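A voltage reading is really a voltage divider between the source's effective output resistance and whatever you hang across it. A minimal sketch with made-up but plausible numbers (the 18V unloaded figure and ~6 ohms of effective source resistance, lumping in the transformer's poor regulation, are assumptions):

```python
# Minimal sketch of meter loading: the reading is a voltage divider
# between the source's effective output resistance and the thing
# connected across it. Illustrative numbers.

def measured_voltage(v_unloaded, r_source, r_load):
    # fraction of the unloaded voltage that appears across the load/meter
    return v_unloaded * r_load / (r_source + r_load)

# A DMM with ~10 Mohm input resistance barely loads the supply,
# so it faithfully shows the full (high) unloaded voltage:
print(round(measured_voltage(18.0, 6.0, 10e6), 3))  # ~18.0 V

# The intended appliance draws real current; with a 12 ohm load the
# same supply is pulled back down to its rated value:
print(round(measured_voltage(18.0, 6.0, 12.0), 1))  # 12.0 V
```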
 
Last edited:
see edit ;)

Plus for some of the DC adapters it reported perfectly correct voltages, so it's definitely the right setting, and the multimeter is brand new.
 
IIRC unregulated PSUs can/will often output a higher voltage than rated if they're under no load; most cheaper transformer-based adapters are (or were) unregulated.

I've got a variable-voltage regulated supply here, and it's about 3-4 times the size of the unregulated ones I've got.

this :)

most modern PSUs are the much smaller switch-mode type and don't require a certain amount of load to drop to the specified voltage

simple unregulated PSUs (just a transformer, bridge rectifier and smoothing caps) are really a thing of the past now...
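Those three parts also explain the ~18V readings: with no load, the smoothing cap charges to the AC *peak* (RMS × √2), not the RMS value. A sketch with assumed figures (a 14V RMS secondary, wound a bit above nominal so it still makes 12V under load, and a typical 0.7V silicon diode drop):

```python
# Why a "12V" transformer + bridge rectifier + smoothing cap gives ~18V
# with no load: the cap charges to the AC peak, not the RMS value.
# Assumed figures: 14V RMS secondary, 0.7V per silicon diode.
import math

def unloaded_dc_output(v_rms_secondary, diode_drop=0.7):
    # in a bridge rectifier, two diodes conduct at any instant
    return v_rms_secondary * math.sqrt(2) - 2 * diode_drop

print(round(unloaded_dc_output(14.0), 1))  # about 18.4 V
```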
 
Last edited:
The other week, I was testing the DC output of the 3V power adapter for my alarm clock because it had stopped working. I used a digital multimeter, and to my surprise it was outputting nearly 6V DC, despite the label clearly showing 3V :eek:

Whilst I still had my multimeter out, I wondered if it was normal for old adapters to output tonnes of extra voltage and/or current, and so I got out all my other power adapters I had sitting in my man drawer, and started testing them one by one (yes, I was bored :p).

Turns out, nearly every power adapter I had that was supposed to output 12V DC now gives between 18 and 20V, which is bad in itself; most output 5x the rated amps, and one of them, rated for 1000mA, turned out to output just over 9 amps (9000mA)!!! :eek: That's more than enough to kill somebody, surely?

Just to put that in perspective, most standard plugs have 13 amp fuses, and that's meant to be pretty extreme to say the least, intended for situations like the live wire being shorted. Suffice it to say I threw out all the dodgy power adapters before somebody plugged them into something and fried the hell out of it.

Then I had an epiphany... perhaps the reason I have all these spare adapters and no corresponding appliances is that the appliances were thrown out after being fried... by these very adapters. :rolleyes:

I did a quick search on Google to see if there were any articles on old adapters losing their voltage reduction capabilities, but found nothing.

Anyone else had random appliances die on them for apparently no reason? Perhaps check the actual DC voltage of the power supply adapter and you may have your answer ;)

as others said here, there was nothing wrong with your PSUs, so you threw them out for no reason.. unregulated PSUs read high with no load. if you had measured the voltage while the device the PSU was meant for was connected, you'd have found the voltage pretty much spot on

your 9 amp current measurement was because you shorted out the PSU with your meter.. you are meant to measure current in series within a circuit with a load. i'm very surprised you didn't pop the fuse in your meter or blow a thermal fuse in your PSU. (if unlucky and no fuse popped, you could have caused extreme heat and then a fire)

remember, the current through *you* is the figure that kills (if 9A is flowing through a circuit, touching it does not expose you to 9 amps, but to whatever the power source can push through your body to earth). on a 240V AC mains supply it takes about 7mA (7/1000ths of an amp) to cause muscles to contract and hinder your ability to let go of the source of the shock. this is why most modern houses have a breaker monitoring the balance between live and neutral to detect leaks to earth (someone being shocked, or a fault); these are rated in mA and offer some degree of protection against death
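the let-go figure above drops straight out of I = V/R. a back-of-envelope sketch, with illustrative body resistances (real values vary enormously with skin condition and contact area):

```python
# Rough sketch of shock current through a body path (I = V / R).
# Body resistance figures are illustrative only; real values vary a lot.

def body_current_mA(volts, body_resistance_ohms):
    # current through the body, converted to milliamps
    return volts / body_resistance_ohms * 1000

LET_GO_THRESHOLD_mA = 7  # the figure quoted above for muscle contraction

for r in (100_000, 10_000, 1_000):  # roughly: dry skin .. wet skin
    i = body_current_mA(240, r)
    flag = "(above let-go threshold)" if i > LET_GO_THRESHOLD_mA else ""
    print(r, "ohms ->", round(i, 1), "mA", flag)
```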

it's all about your path to earth... when doing anything electrical, use one hand if possible and keep your spare hand behind your back. the reason is that if you got a shock while your other hand was touching an earthed object such as a radiator, you'd have a direct path to earth across your heart.. fry time

to summarise, the only dangerous thing was you shorting out your power supplies while 'checking them'. best not to mess with things if you don't know what you're doing... i'm really rusty on all this stuff as i've not done it for years, but knowing the basics keeps you safe. just wading in with a meter is a bit dodgy, and in this case has cost you a bunch of perfectly good PSUs ;)
 
Last edited:
as others said here, there was nothing wrong with your PSUs, so you threw them out for no reason.. unregulated PSUs read high with no load. if you had measured the voltage while the device the PSU was meant for was connected, you'd have found the voltage pretty much spot on...

...in this case has cost you a bunch of perfectly good PSUs ;)
Well I just fished them out of my bin, I still have them :p

your 9 amp current measurement was because you shorted out the PSU with your meter.. you are meant to measure current in series within a circuit with a load. i'm very surprised you didn't pop the fuse in your meter or blow a thermal fuse in your PSU. (if unlucky and no fuse popped, you could have caused extreme heat and then a fire)
My multimeter is rated up to 20 amps for a maximum of 10 seconds at a time, and I only measured for a couple of seconds :), but anyway, come to think of it, yeah, what I did was pretty dumb. I don't know what I was thinking when I was measuring the current; I know it's supposed to be in series in a working circuit, as that's simple electronics 101. :o

I think I was just measuring the voltage across the positive & negative terminals and, without really thinking, thought I'd switch it to ammeter mode... doh :p

Oh well, no harm done, apart from my OcUK pride, but a lesson learned, lol :D
 