Does anyone here know the difference between mA & mW

Caporegime · Joined 27 Nov 2005 · Posts 25,515 · Location Guernsey
Does anyone here know the difference between mA and mW?

I'm trying to work out whether a USB port has the power to run an SSD.

USB 2.0 rated output: 5 V, 500 mA
My SSD is rated as needing 5 V, 150 mW to power it.
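In case it helps: mA is a measure of current and mW a measure of power, and they're related by P = V x I. A quick back-of-envelope sketch (plain Python, numbers taken from the two ratings above):

```python
# Power (W) = Voltage (V) x Current (A)
usb2_voltage = 5.0    # V
usb2_current = 0.5    # A (500 mA)
usb2_power_mw = usb2_voltage * usb2_current * 1000   # 2500 mW available

ssd_power_mw = 150.0                       # mW, from the SSD's rating
ssd_current_ma = ssd_power_mw / usb2_voltage   # current the SSD draws at 5 V

print(f"Port budget: {usb2_power_mw:.0f} mW, SSD needs {ssd_power_mw:.0f} mW "
      f"({ssd_current_ma:.0f} mA at 5 V)")
print("Plenty of headroom" if ssd_power_mw <= usb2_power_mw else "Not enough power")
```

So at 5 V the drive only draws about 30 mA, well inside the 500 mA a USB 2.0 port can supply.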
 
While searching around about this I found the following:

USB 3.0 could soon drive monitors, hard drives with 100W of power

The next USB 3.0 specification will provide up to 100 watts of power to devices, allowing users to power some of the more demanding gadgets on their desks without additional power supplies. The USB 3.0 Promoter Group announced that the standard would allow USB 3.0 ports to power and charge devices like notebook PCs and would remain backwards compatible with USB 2.0 devices.

USB 3.0 ports introduced high data transfer speeds of up to 5 Gb/s to compatible components, and can also supply up to 900 mA at 5 V for a maximum power output of 4.5 W. This was about twice the maximum power output of USB 2.0 ports, but a current USB 3.0 port would still struggle to power most external hard drives.

Since the new spec gives USB 3.0 more than twenty times its old power input and output, the slate of products the ports could power is much larger and includes monitors, desk lamps, and even notebook PCs. The beefier ports could clear up some of the crowding at wall outlets, and if adopted as the main power connector for items like laptops, could help eradicate proprietary ports.

The USB 3.0 Promoter Group says the new standard will be ready for industry evaluation at the end of 2011 and is set for release to manufacturers in early 2012.
http://arstechnica.com/gadgets/news...wer-laptops-monitors-soon-to-be-a-reality.ars
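For comparison, the per-port budgets in that article work out as follows (a rough sketch; the 20 V at 5 A split for the 100 W figure is the combination the finished Power Delivery spec ended up using for its top profile):

```python
# Per-port power budgets, P = V x I
ports = {
    "USB 2.0":        (5.0, 0.5),   # 5 V at 500 mA
    "USB 3.0":        (5.0, 0.9),   # 5 V at 900 mA
    "USB PD (100 W)": (20.0, 5.0),  # 20 V at 5 A, the top PD profile
}
for name, (volts, amps) in ports.items():
    print(f"{name}: {volts} V x {amps} A = {volts * amps:.1f} W")
```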

Can't wait for motherboards with these USB sockets to come out..
 
Will it still be at 5 V, though, or will we see a new breed of peripherals at 20 V / 5 A instead - perhaps with autodetect/switching? If the autoswitch ever failed, though... ouch!
 
With dumb hubs and the like, I can't see how they could change the voltage. It would just be a disaster.
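For what it's worth, one safe way to do the switching is for everything to start at 5 V and only step up after an explicit request for a profile the host has advertised, so a dumb hub that ignores the protocol just stays at 5 V. A hypothetical sketch (the profile table here is made up for illustration, not from any spec):

```python
# Hypothetical "safe" voltage negotiation: the port always starts at 5 V
# and only steps up after an explicit request for an advertised profile.
ADVERTISED = {5: 0.9, 12: 3.0, 20: 5.0}   # volts -> max amps (made-up profiles)

def negotiate(requested_volts: float) -> float:
    """Return the voltage the port will actually supply."""
    if requested_volts in ADVERTISED:
        return requested_volts
    return 5.0   # anything unrecognised falls back to the safe default

print(negotiate(20))   # 20  -- device asked for a supported profile
print(negotiate(19))   # 5.0 -- unknown request, stay at the safe default
```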
 
Hope the OP doesn't mind the thread hijack, but I've been brushing up on my very rusty knowledge of such things lately and still haven't got my head around it fully. I keep reading the water analogy, but it doesn't cut it for me.

Say we have 10 amps, which is a definite number of electrons passing a given point in an electrical circuit per second (I understand about coulombs, I just wanted to keep it as simple as possible). Then at a voltage of 10 V we have 100 W of power. At a voltage of 5 V we have 50 W of power.

What is missing for me is that in both the 100 W and 50 W examples, the same number of electrons pass a given point in the same time, yet they do different amounts of work (because of the voltage). So from a physical perspective, what is different about the electrons in the 100 W case versus the 50 W case, given that the number of electrons flowing (the amps) is the same in both?

I probably look like a plonker for asking the question, but hey :)
 

Voltage = energy per coulomb (V = E/Q). I think that answers your question: what changes with voltage is the energy each unit of charge carries, not the number of electrons flowing. If I recall correctly, the charge of an electron is constant, but the energy it delivers is not.
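Putting some numbers on that (the electron charge is about 1.6 x 10^-19 C; the rest is just the 10 A / 10 V / 5 V figures from the question):

```python
ELECTRON_CHARGE = 1.602e-19   # coulombs per electron

def describe(volts: float, amps: float) -> None:
    power = volts * amps                          # watts = joules per second
    electrons_per_sec = amps / ELECTRON_CHARGE    # same for both examples
    joules_per_electron = volts * ELECTRON_CHARGE # this is what voltage changes
    print(f"{volts} V, {amps} A -> {power} W, "
          f"{electrons_per_sec:.3g} electrons/s, "
          f"{joules_per_electron:.3g} J per electron")

describe(10, 10)   # 100 W: same electron flow, each carries twice the energy
describe(5, 10)    #  50 W
```

Same electrons per second in both cases; each one just hands over twice the energy at 10 V.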
 
That's a good point, and I can't see how it's going to power monitors, as they normally run on 240 V AC.

The external input on a monitor is 240 V AC (or, more generally, 110-120 / 220-250 V), which is almost immediately run through transformers and inverters inside it to get down to the voltages used by the actual electronics.

Think about it for a moment: a lot of monitors have an external power brick that does the voltage conversion, and every laptop that runs off a battery tends to use somewhere around 7-25 V (often, again, with internal conversion to get the right voltage for individual parts - some parts, such as the CPU, will run at 1.5 V, some at 3.3 V, etc.).

About all it would require for a lot of monitors to run at 5 V would be some changes to the internal power boards, so that rather than an external 110 V or whatever, they worked from 5 V. It would probably actually decrease the cost of the monitors a tad, as they would likely need fewer voltage conversion parts.
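To put some rough numbers on why the supply voltage matters here (the 30 W monitor figure is my own guess, not from the article): the lower the voltage, the more current the cable has to carry for the same power.

```python
MONITOR_WATTS = 30.0   # a made-up figure for a typical LCD monitor

for volts in (5, 12, 20):
    amps = MONITOR_WATTS / volts   # I = P / V
    print(f"{MONITOR_WATTS:.0f} W at {volts:2d} V needs {amps:.1f} A")
# 30 W at  5 V needs 6.0 A  -- far beyond a normal USB cable
# 30 W at 12 V needs 2.5 A
# 30 W at 20 V needs 1.5 A
```

Which is presumably why a 100 W port would negotiate a higher voltage rather than pushing huge currents at 5 V.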
 
Thanks for the replies. I've done a bit more reading and I think I remember now. Electrons orbiting a nucleus with higher energy (or the way we model them, anyway) sit in a higher-energy orbit. I can visualise this as the electron being pulled towards the protons in the nucleus by the Coulomb force, but at higher energy this 'distance' is increased (hence the higher orbit).

I understand the equations; it was just what happens at a fundamental physical level that I could not remember well.
 