Check it out, the IBM project is super cool:
A classical computer has a memory made up of bits, where each bit is represented by either a one or a zero. A quantum computer instead maintains a sequence of qubits. A single qubit can represent a one, a zero, or any quantum superposition of those two states; a pair of qubits can be in any quantum superposition of 4 states, and three qubits in any superposition of 8 states. In general, a quantum computer with n qubits can be in an arbitrary superposition of up to 2^n different states simultaneously (whereas a normal computer can only be in one of those 2^n states at any one time). A quantum computer operates by setting the qubits in a controlled initial state that represents the problem at hand and then manipulating those qubits with a fixed sequence of quantum logic gates. The sequence of gates to be applied is called a quantum algorithm. The calculation ends with a measurement, which collapses the system of qubits into one of the 2^n pure states, where each qubit reads out as a zero or a one. The outcome can therefore be at most n classical bits of information. Quantum algorithms are often non-deterministic, in that they provide the correct solution only with a certain known probability.
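If you like code better than prose, here is a minimal sketch of that description using NumPy. It simulates an n-qubit register as a vector of 2^n complex amplitudes, puts every qubit into superposition with a Hadamard gate, and then "measures" by sampling one classical outcome. (This is an illustrative toy on a classical machine, of course, not how IBM's hardware works; the helper function is mine, not from any quantum library.)

```python
import numpy as np

n = 3  # three qubits -> a superposition over 2^3 = 8 states
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # controlled initial state: |000>

# The Hadamard gate sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def apply_single_qubit_gate(state, gate, target, n):
    """Apply a 2x2 gate to qubit `target` by building the full
    2^n x 2^n operator as a Kronecker product: I (x) ... (x) gate (x) ... (x) I."""
    op = np.array([[1]], dtype=complex)
    for q in range(n):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

# The "quantum algorithm": a fixed sequence of gates, here H on each qubit.
for q in range(n):
    state = apply_single_qubit_gate(state, H, q, n)

# Measurement collapses the register to one of the 2^n classical states,
# sampled with probability |amplitude|^2 -- here, uniform over 8 outcomes.
probs = np.abs(state) ** 2
outcome = np.random.choice(2**n, p=probs)
print(format(outcome, f"0{n}b"))  # at most n classical bits come out
```

Note the asymmetry the paragraph above describes: the state vector holds 2^n amplitudes, but a single measurement hands you back only n bits, chosen probabilistically.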
Canadian Prime Minister Justin Trudeau recently attempted to explain this at a press conference announcing $50 million in funding to the Perimeter Institute for Theoretical Physics for quantum computing research (incidentally D-Wave is a Canadian company). He said this:
Normal computers work by sending power through a wire or not, a one or a zero, the binary system. What quantum states allow for is much more complex information to be encoded into a single bit. A regular computer bit is either a one or a zero. On or off. A quantum state can be much more complex than that because, as we know, things can be both particles and waves at the same time, and the uncertainty around quantum states allows us to encode more information into a much smaller computer.
So why is this so hard to explain? Because it contradicts our reality. It’s not how we are wired to think. We operate at a fundamentally different cosmic scale than the quantum world. The opposite is also true. The distance from Earth to the center of our galaxy is 27,000 light years. So the light you see when you glance up at that bright spot in the Milky Way has been making the journey to reach your eyes for twenty-seven thousand years! That is a massive, inconceivable distance. And that is only halfway across our own galaxy. The distance to other galaxies? Completely beyond our ability to comprehend. So it stands to reason that if the super large scale is impossible to understand, then the super small scale should also be impossible to understand.
Charles Bennett, an information theorist at IBM, has an interesting way of looking at this. Not by explaining it, per se, but by forming an analogy to a dream:
Quantum information is like the information in a dream. If you even start trying to share it with another person and talk about it, you start forgetting the dream and only remembering what you said about it.
So here is where you are expecting me to finally give a better explanation. I think I can do that, and I have attempted it in my forthcoming novel, Spheria, which makes heavy use of quantum computing as part of its premise. I can’t wait to share both the book and the explanation with you soon. Please subscribe to my mailing list and I will let you know the day it is released.