Simply put: normal computer chips can send electrical signals as a "1" or a "0", while quantum computers can hold data as a "1", a "0", or a "1" and a "0" at the same time. That lets them work on far larger quantities of data at once.
A few people have said this. Current electronics can do "0", "1", or "not set" quite happily. Adding a third state doesn't represent a massive increase in computational capacity: a three-state symbol carries log2(3) ≈ 1.58 bits of information, not dramatically more than the single bit a binary symbol carries.
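For what it's worth, here's a minimal NumPy sketch (a toy statevector simulation of my own, not anything from a real quantum library) of where the actual difference lies: a third classical state barely helps, but n qubits are described by 2^n complex amplitudes that all evolve together.

```python
# Toy sketch: a third classical state vs. an n-qubit register.
import numpy as np

n = 3
print(3 ** n)   # an n-trit register still holds just ONE of 27 values
print(2 ** n)   # an n-qubit register carries 8 amplitudes simultaneously

# Build the statevector for |000> and apply a Hadamard gate to each
# qubit, giving an equal superposition of all 8 basis states.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
op = H
for _ in range(n - 1):
    op = np.kron(op, H)

state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0
state = op @ state
print(state.real)  # eight equal amplitudes of 1/sqrt(8) ~ 0.354
```

Measuring still collapses the register to a single 3-bit answer, so the claimed speedups depend on arranging interference so the useful answers come out with high probability.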
I *think* quantum computing is a phrase invented by the media that loosely covers a lot of ideas. Transistors based on light exist, and I'm fairly sure logic gates do too. There are hopes of storing binary data in the spin states of electrons (spintronics), which would represent a considerable increase in information density. Finally, there are ideas relating to entanglement and wavefunction collapse that might yield fast computation. I believe all these, and various other things, get described as "quantum computing".
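The entanglement/collapse part can be sketched the same way. Here's a toy NumPy example (again just matrix algebra on a statevector, not real hardware) that builds the standard two-qubit Bell state and samples measurements: each outcome is individually random, but the two bits are always perfectly correlated.

```python
# Toy sketch: Bell state (|00> + |11>)/sqrt(2) and measurement collapse.
import numpy as np

rng = np.random.default_rng(0)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply Hadamard to the first qubit, then CNOT.
state = np.zeros(4, dtype=complex)
state[0] = 1.0
state = CNOT @ np.kron(H, I2) @ state

probs = np.abs(state) ** 2         # [0.5, 0, 0, 0.5]
for _ in range(5):
    outcome = rng.choice(4, p=probs)
    print(format(outcome, "02b"))  # always "00" or "11", never "01"/"10"
```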
It is unlikely that we are on the cusp of throwing out our current technology entirely. The next step beyond silicon is probably carbon; Intel have built prototype transistors on artificial diamond sheet. The reasoning is that doped carbon can survive higher temperatures than silicon, which pushes back the barrier of getting 200 W out of a tiny surface area. Incremental changes are always far more likely than revolutionary ones.
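A quick back-of-the-envelope check on that 200 W point (the die area here is my own assumption, just a round number for illustration, not a quoted Intel figure):

```python
# Rough power density of a hot CPU die. Die area is an assumed value.
power_w = 200.0
die_area_cm2 = 1.5                     # assumption: a large desktop die
print(power_w / die_area_cm2)          # ~133 W/cm^2
# For comparison, a kitchen hob element is on the order of 10 W/cm^2,
# which is why the thermal ceiling of the material matters so much.
```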


