http://arstechnica.com/gadgets/news...wer-laptops-monitors-soon-to-be-a-reality.ars

USB 3.0 could soon drive monitors, hard drives with 100W of power
The next USB 3.0 specification will provide up to 100 watts of power to devices, allowing users to power some of the more demanding gadgets on their desks without additional power supplies. The USB 3.0 Promoter Group announced that the standard would allow USB 3.0 ports to power and charge devices like notebook PCs and would remain backwards compatible with USB 2.0 devices.
USB 3.0 ports introduced high data transfer speeds of up to 5Gb/s to compatible components, and can supply up to 900mA at 5V, for a maximum power output of 4.5W. That is roughly twice the maximum power output of USB 2.0 ports, but a current USB 3.0 port would still struggle to power most external hard drives.
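The arithmetic behind those figures is just power = voltage × current; a quick sketch comparing the two generations:

```python
# Power budget arithmetic for USB ports: P (watts) = V (volts) * I (amps).

def power_watts(volts: float, amps: float) -> float:
    """Maximum power a port can deliver at a given voltage and current."""
    return volts * amps

usb2 = power_watts(5.0, 0.5)   # USB 2.0: up to 500mA at 5V
usb3 = power_watts(5.0, 0.9)   # USB 3.0: up to 900mA at 5V

print(f"USB 2.0: {usb2}W, USB 3.0: {usb3}W")  # prints "USB 2.0: 2.5W, USB 3.0: 4.5W"
```

At 4.5W the "roughly twice" claim checks out, and it's clear why a typical spinning external drive (often needing 10W or more to spin up) is still out of reach.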
Since the new spec gives USB 3.0 more than twenty times its old power capacity, the slate of products the ports could power is much larger and includes monitors, desk lamps, and even notebook PCs. The beefier ports could clear up some of the crowding at wall outlets, and if adopted as the main power connector for items like laptops, could help eradicate proprietary ports.
The USB 3.0 Promoter Group says the new standard will be ready for industry evaluation at the end of 2011 and is set for release to manufacturers in early 2012.
That's a good point, and I can't see how it's going to power monitors, as they are normally 240V AC. You'd need pretty chunky cables for 20A.
With dumb hubs and suchlike I can't see how they could change the voltage; it would just be a disaster. Will it still be at 5V, though, or will we see a new breed of peripherals at 20V/5A instead, perhaps with autodetect/switching? If the autoswitch ever failed though... ouch!
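The cable-current worry above is easy to put numbers on: for a fixed 100W budget, the required current is I = P / V. A small sketch (the voltage levels below are illustrative guesses, not values from the spec):

```python
# Current needed to deliver 100W at a few candidate bus voltages: I = P / V.
# The voltages here are assumptions for illustration, not spec-defined profiles.

TARGET_WATTS = 100.0

for volts in (5.0, 12.0, 20.0):
    amps = TARGET_WATTS / volts
    print(f"{volts:>4}V -> {amps:.1f}A")
# prints:
#  5.0V -> 20.0A
# 12.0V -> 8.3A
# 20.0V -> 5.0A
```

This is why a higher-voltage mode like the speculated 20V/5A is attractive: 20A at 5V really would demand very thick conductors, while 5A is in normal cable territory.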
Hope the OP doesn't mind the thread hijack, but I've been brushing up on my very rusty knowledge of such things lately and still haven't got my head around it fully. I keep reading the water analogy, but it doesn't cut it for me.
Say we have 10 amps, which is a definite number of electrons passing a given point in an electrical circuit per second (I understand about coulombs, just wanted to keep it as simple as possible). Then at a voltage of 10V we have 100W of power. If we have a voltage of 5V then we have 50W of power.
What is missing for me is that in both the 100W and 50W examples, the same amount of electrons are passing a given point at the same time, yet they are doing different amounts of work (because of the voltage). So from a physical perspective, what is different about the electrons passing through the electric circuit between the 100W and 50W examples (as amount of electrons flowing is the same in both cases i.e. the same amps)?
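One way to see the difference: voltage is energy per unit of charge (1 volt = 1 joule per coulomb), so in the 100W case the same electrons each carry twice as much energy as in the 50W case. A quick sketch of the numbers in the question:

```python
# Same current (charge per second), different energy per unit of charge.
# Voltage is joules per coulomb, so P = V * I = (J/C) * (C/s) = J/s = watts.

amps = 10.0  # coulombs per second: identical electron flow in both cases
for volts in (10.0, 5.0):
    watts = volts * amps
    print(f"{amps}A at {volts}V: each coulomb carries {volts}J -> {watts}W")
# prints:
# 10.0A at 10.0V: each coulomb carries 10.0J -> 100.0W
# 10.0A at 5.0V: each coulomb carries 5.0J -> 50.0W
```

So the count of electrons per second is unchanged; what differs is the energy each unit of charge gives up as it crosses the circuit.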
I probably look like a plonker for asking the question, but hey!