Matching chargers with devices, milliamps rating

What are the rules for mixing chargers and devices when their milliamp ratings differ?

I would have thought that if a device normally has a 1000mA charger and you plug in a 500mA one, it would take twice as long to charge. What would happen, though, if you put the 1000mA charger on the device that normally accepts the 500mA one?
 
Nothing, because the current is determined by the voltage and resistance: I = V/R. The rated current is simply the maximum the charger can provide.
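To put some rough numbers on that (the 5 V figure and the load resistance below are just made-up illustrations, not from any real device), a quick Python sketch:

```python
# Ohm's law sketch: the device's effective resistance sets the draw,
# not the charger's rating. Figures below are illustrative only.

def current_draw_ma(voltage_v: float, resistance_ohm: float) -> float:
    """I = V / R, returned in milliamps."""
    return voltage_v / resistance_ohm * 1000

# A hypothetical 5 V device presenting a 10 ohm effective load:
draw = current_draw_ma(5.0, 10.0)  # 500 mA
print(f"Device draws {draw:.0f} mA regardless of the charger's rating,")
print("as long as the charger can supply at least that much.")
```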
 
I always thought that in the first instance, it might overload the charger, and in the second it'd be fine. Never tried it though.
 
Yup, amps are fine to have above the device's rating, and I think the same goes for watts. The voltage needs to be correct, though, or else yes, there's the potential for "bang".
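For the watts side of that, it follows from P = V x I: at the correct, matched voltage, a higher current rating just means more power available, not more power forced into the device. A quick sketch with an assumed 5 V charger:

```python
# P = V * I: at matched voltage, a bigger current rating only
# raises the charger's ceiling, not what the device takes.
voltage = 5.0                      # assumed charger voltage (V)
for rating_ma in (500, 1000):
    max_watts = voltage * rating_ma / 1000
    print(f"{rating_ma} mA charger can supply up to {max_watts:.1f} W")
```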
 
Nope. First case: it charges half as fast.
Second case: it charges twice as fast, possibly damaging the battery.

You're wrong. So long as the voltage is correct, a higher amperage rating is fine, as the device will only draw what it requires (I = V/R). A transformer that is underrated on the current side will most likely run hot and will almost certainly not supply sufficient voltage for the device to run properly.
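As a rough illustration of the "device draws what it needs" rule (the outcomes here are a simplification - real supplies variously sag, overheat, or shut down):

```python
# Simplified sketch of the "device draws what it needs" rule.
# Real supplies behave differently when overloaded; this just
# flags whether the charger has headroom.

def charger_outcome(device_draw_ma: float, charger_rating_ma: float) -> str:
    if device_draw_ma <= charger_rating_ma:
        return "fine: charger has headroom, device draws what it needs"
    return "overloaded: expect voltage sag, heat, or shutdown"

print(charger_outcome(500, 1000))   # 500 mA device on a 1000 mA charger
print(charger_outcome(1000, 500))   # 1000 mA device on a 500 mA charger
```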
 
500 mA Device / 1000 mA Charger - Fine; no trouble, overheating, or any of that, as long as the voltage is correct.

1000 mA Device / 500 mA Charger - It may work, but the device could overdraw the charger and cause it either to shut down or to overheat. It depends on how much the device typically draws, as opposed to its maximum.

Its original charger may be rated at 1000 mA, but the device may only draw 800 mA typically at full load - and anywhere below that when it's just charging or idling - so the second option could work fine, or it could overheat and catch fire. Unless you can measure or calculate it, it's always best to use the correctly rated charger :)
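If you did want to sanity-check a mismatched pairing, a rough margin calculation looks like this (the 800 mA typical draw is the hypothetical figure from above):

```python
# Rough headroom check for a mismatched charger, using the
# hypothetical figures from the post above.

def headroom_ma(typical_draw_ma: float, charger_rating_ma: float) -> float:
    """Positive means spare capacity; negative means overdraw."""
    return charger_rating_ma - typical_draw_ma

# Device shipped with a 1000 mA charger, but typically drawing 800 mA:
for rating in (1000, 500):
    margin = headroom_ma(800, rating)
    status = "OK" if margin >= 0 else "overdrawn - risk of heat/shutdown"
    print(f"{rating} mA charger: margin {margin:+.0f} mA -> {status}")
```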
 