Much of the disruptive force of quantum computing can be reduced to a single and relatively old idea: Shor's algorithm. Devised in 1994 by Peter Shor, then at Bell Labs and now a mathematician at MIT, the algorithm is designed to find the prime factors of very large numbers—a computation whose difficulty lies at the heart of the cryptographic schemes that are themselves at the heart of network security. An algorithm that can find these factors within a reasonable amount of time would be cryptographic doomsday.
In the near term, we don't have to worry. Shor's algorithm is a quantum algorithm, meaning one designed to run on a quantum computer. We don't have a general quantum computer of any real scale and likely won't for some time, so breathe easy for the moment. Implementing Shor's algorithm in the real world remains a tremendous challenge—it may be possible to realize the algorithm at small scales, but to live up to its full RSA-cracking potential, the implementation needs to scale up to large quantum machines.
A scheme for accomplishing just that is described in this week's Science by physicists at MIT and the University of Innsbruck in Austria. They implemented Shor's algorithm at a relatively small scale—using just five qubits—in a way that can be extended to much larger computers (the ones that don't exist yet, that is). Using this scheme, they factored the number 15 on a trapped-ion quantum computer, a machine that represents qubits with charged ions suspended in electromagnetic fields and manipulates them with lasers.
Recall that a qubit is the quantum version of a regular old binary bit. While a binary bit can take one of two values, a 1 or a 0, a qubit can, thanks to some peculiar quantum properties, represent information as combinations of 1s and 0s, or superpositions of 1s and 0s. Compared to a simple 1 or 0, this is a far richer unit of information: it winds up being a lot like having a bit that can be both a 1 and a 0 at the same time. That's about the essence of quantum strangeness, really.
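To make the superposition idea a bit more concrete, here's a minimal sketch in plain Python (a toy illustration of the textbook picture, not anything from the paper; the function names are our own). A qubit's state is modeled as a pair of amplitudes over the 0 and 1 outcomes, and measuring it collapses the superposition into a plain 0 or 1:

```python
import random

# A qubit can be written as a pair of amplitudes (a, b) over the basis
# states 0 and 1, with |a|^2 + |b|^2 = 1. A plain bit is (1, 0) or (0, 1);
# a superposition is anything in between.
def equal_superposition():
    """An equal mix of 0 and 1: each outcome has probability 1/2."""
    amp = 1 / 2 ** 0.5
    return (amp, amp)

def measure(state):
    """Measurement collapses the superposition: the qubit becomes a plain
    0 or 1, with probabilities given by the squared amplitudes."""
    a, _b = state
    return 0 if random.random() < abs(a) ** 2 else 1

state = equal_superposition()
outcomes = [measure(state) for _ in range(10000)]
# Roughly half the measurements come out 0 and half come out 1.
```

The point of the sketch is the asymmetry it captures: the state holds both possibilities at once, but any attempt to read it yields only one ordinary bit.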
There are a lot of catches involved in working with qubits, however. For one thing, they're incredibly fragile: it doesn't take much of a disturbance to flatten all of that in-between 1-and-0 information into just a plain 1 or 0. You can then imagine a qubit as a carefully balanced object just looking for an excuse to topple over. This is a big part of the quantum computing challenge.
Study co-author Isaac Chuang, a professor of physics and professor of electrical engineering and computer science at MIT, created a quantum computer capable of factoring the number 15 back in 2001. It was a cool demonstration, and the first experimental implementation of Shor's algorithm, but it wasn't scalable: the number of qubits it required would have ballooned as the numbers to be factored grew.
"Once you had too many atoms, it was like a big forest—it was very hard to control one atom from the next one," Chuang offered in a statement. "The difficulty is to implement [the algorithm] in a system that's sufficiently isolated that it can stay quantum mechanical for long enough that you can actually have a chance to do the whole algorithm."
Chuang and his group achieved this stability by reducing the number of qubits involved. The essence of the advance is intuitive enough: by outputting the result one digit at a time, the computation requires fewer qubits. In their design, four qubits are used for the actual computation, while the single remaining qubit moves information within the computer and outputs the result. It's basically a clever way of recycling qubits.
Factoring the number 15 with Shor's algorithm normally takes about 12 qubits, so you can see how the number of qubits the algorithm requires explodes as the number to be factored grows. (Note that the lower limit on RSA keys is 1024 bits, or 309 decimal digits.)
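The arithmetic at the heart of Shor's algorithm, by the way, can be sketched classically in a few lines of Python. This is a toy illustration of the number theory, not the paper's quantum implementation: the period-finding loop below takes exponential time classically, and that loop is precisely the step a quantum computer speeds up.

```python
from math import gcd

def find_period(a, n):
    """Find the order r of a modulo n: the smallest r with a^r = 1 (mod n).
    Classically this is a brute-force search; quantumly it's fast."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Turn the period into factors. If r is even and a^(r/2) is not
    -1 mod n, then gcd(a^(r/2) - 1, n) and gcd(a^(r/2) + 1, n) are
    nontrivial factors of n. Otherwise, retry with a different base a."""
    r = find_period(a, n)
    if r % 2 == 1:
        return None  # odd period: pick another a
    half = pow(a, r // 2, n)
    if half == n - 1:
        return None  # trivial square root: pick another a
    return gcd(half - 1, n), gcd(half + 1, n)

print(shor_factor(15, 7))  # (3, 5): base 7 has period 4 mod 15
```

For N = 15 and base a = 7, the period is 4, and the two gcd computations recover the factors 3 and 5. That whole reduction from factoring to period finding runs on ordinary hardware; only the period search itself needs the qubits.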
Even with numbers this small, there is still work to be done in ensuring the fidelity of the system and making it more reliable. To factor larger numbers means scaling the computer itself, not just the algorithm, and that in itself remains an enormous challenge.