Quantum mechanics has already brought us the transistor and the laser. But, says Seth Lloyd, quantum computing will bring further advances, like truly secure cryptography

We are in the midst of a technological revolution. Because of the rapid advance in sensitivity and accuracy of manufacturing techniques, the technologies that we can construct are pushing up against the limits of size and accuracy allowed by the fundamental laws of physics.

Quantum mechanics is the branch of physics that governs how tiny things behave. As the size of the components of machines pushes down towards the atomic scale, quantum mechanics becomes increasingly important in regulating how those machines function. Quantum mechanics is famously weird: an electron can be in two places at once, and Albert Einstein himself protested against what he called quantum mechanics’ apparent “spooky action at a distance”.

Weird but wonderful
This quantum weirdness gives rise to problems in constructing quantum technologies. In particular, quantum devices tend to be more sensitive to noise than their classical counterparts. But quantum weirdness also gives rise to opportunities: quantum computers can solve problems classical computers can’t; quantum cryptographic systems provide provably secure communications; and the very sensitivity of quantum systems to noise allows engineers to construct sensors, detectors and measurement devices intrinsically more powerful than anything possible using classical effects alone.

This article reviews the advance of quantum technologies, beginning with familiar technologies such as the laser and the transistor, and ending with speculative technologies such as quantum computers and quantum cryptographic systems. I’ll show how quantum weirdness allows one to construct devices that approach the bounds of accuracy and power allowed by the fundamental laws of physics. Sometimes, weird is good.

Quantum history
At the turn of the 20th century, Max Planck, a German physicist, was trying to understand the way that hot glowing objects emit light. An idealised object that absorbs light at all frequencies is called a “black body”, and the light that it emits when heated is called “black-body radiation”. Planck had access to good measurements of the amounts of energy that black bodies were emitting, and he was attempting to construct a model that would explain those amounts in terms of James Clerk Maxwell’s theory of electromagnetism. But no matter how he calculated the amount of energy emitted as black-body radiation, the answer he got was infinity: the black bodies should be radiating an infinite amount of energy per second.

Planck puzzled over this result. Not only were his calculations giving the wrong answer, they were off the actual answer by an infinite amount. Clearly, some radical reconception of radiation was in order. After trying a variety of solutions, Planck guessed that perhaps the apparently continuous waves of light emitted by black bodies were at bottom discrete. Suppose that rather than coming in continuous waves, light came in little “chunks”, which Planck called “quanta” (after the Latin quantus, or “how much”). Each quantum of light would then behave like a tiny particle rather than a wave; these particles of light later came to be known as “photons”, after the Greek word for light. Planck found that if he incorporated his guess into his model of how black bodies radiate, not only did he get rid of the annoying infinities, he could also reproduce the experimental observations exactly. Quantum mechanics was born.
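Planck’s fix can be seen in a few lines of arithmetic. The minimal numerical sketch below (an illustration, assuming standard SI constants and an arbitrarily chosen 5,000 K black body; the function names are my own) compares the classical spectral formula, which grows without limit at high frequencies and so integrates to infinity, with Planck’s quantised formula, in which an exponential factor cuts the emission off:

```python
import math

h = 6.626e-34   # Planck's constant (J*s)
k = 1.381e-23   # Boltzmann's constant (J/K)
c = 3.0e8       # speed of light (m/s)
T = 5000.0      # temperature of the black body (K)

def classical(nu):
    """Classical spectral energy density: grows as nu^2 without limit,
    which is why the total radiated energy comes out infinite."""
    return 8 * math.pi * nu**2 * k * T / c**3

def planck(nu):
    """Planck's quantised spectral energy density: the exponential
    factor suppresses emission at high frequencies."""
    return (8 * math.pi * h * nu**3 / c**3) / math.expm1(h * nu / (k * T))

# The two formulas agree at low frequencies, but at high frequencies the
# classical one keeps growing while Planck's falls away.
for nu in (1e13, 1e14, 1e15):
    print(f"nu = {nu:.0e} Hz: classical = {classical(nu):.3e}, "
          f"Planck = {planck(nu):.3e}")
```

At low frequencies the exponential factor tends to 1 and the two formulas coincide, which is why the classical theory worked well for long-wavelength measurements.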

Waves are made of particles. Sound waves, too, are made of small chunks, called “phonons”. Having discovered the particulate nature of waves, physicists began to look more closely at particles such as electrons and atoms. Particles – electrons, atoms or, for that matter, basketballs – were found to possess wave-like qualities: their positions were not well defined, but seemed to extend throughout space, just as a water wave extends over the ocean. In addition, the positions and velocities of the particles seemed to wiggle about in a wave-like fashion, with well-defined spatial periods and frequencies. This wave-like nature of particles helped to explain the characteristic energies of the light emitted by atoms – a pattern the Danish physicist Niels Bohr had built into his model of the atom.

Waves are made of particles; particles have waves attached. This yin-yang relationship between waves and particles is called “wave-particle duality”. In the first two decades of the 20th century, wave-particle duality was invoked to throw light on a host of previously unexplained phenomena, such as the way in which light kicks electrons off the surface of metals (the photo-electric effect), the structure of the hydrogen atom, the nature of heat transfer, and many other effects.

At the end of the 1920s, the German physicist Werner Heisenberg and the Austrian Erwin Schrödinger developed a formal theory of quantum mechanics that made the somewhat fuzzy concept of wave-particle duality mathematically precise. Heisenberg’s uncertainty principle, for example, codified the idea that measuring one quantity – position, say – necessarily disturbs the value of another, such as momentum. With the development of a complete mathematical theory, the floodgates opened, and quantum mechanics washed away the accumulated detritus of unsolved problems of how small things behave. From the behaviour of elementary particles, to the periodic table of the elements, to the life cycles of stars, to the origins of the universe itself, the predictions of quantum mechanics were confirmed over and over again. Today, weird though it may be, quantum mechanics is probably the best-confirmed scientific theory ever, supported by literally millions of data points.

Quantum-mechanical engineering
Where science makes its discoveries, engineering soon follows. Engineering is science put into practice. In fact, engineering often gets up such a head of steam that it surges in front of science and gives rise to fundamental discoveries about the natural world (the steam engine preceded the scientific account of thermodynamics; the telegraph and telephone preceded the mathematical theory of communication; the computer preceded computer science, and so on). No sooner had quantum mechanics been developed than it was put to use. Electron guns and luminescent phosphors created the cathode-ray tube and prototype televisions. The quantized energy levels of electrons in semiconductors were exploited to construct transistors. Quantum mechanics was used to unravel the structure of the chemical bond and gave rise to chemical engineering.

The laser is a familiar example of quantum-mechanical engineering. The word laser is an acronym, standing for Light Amplification by Stimulated Emission of Radiation. We have seen how Planck used quantum mechanics to explain the emission of radiation by hot black bodies. Einstein immediately saw the usefulness of quantum mechanics and used the concept of quanta of light to show how photons could kick electrons off the surface of a metal. (Einstein was awarded the Nobel prize for his early work on quantum mechanics, which is ironic, as he was always suspicious of the “weird” aspects of the field.) He also showed that photons were a particularly gregarious kind of particle. Photons like to snuggle up together: the more of them you cram into a particular space, the happier they are. This cuddly nature of photons is the basis for lasers. If an atom is ready to emit a particle of light, and you send another photon by, the atom preferentially emits its photon “on top of” the photon zipping by, a process known as stimulated emission. With lots of atoms and lots of photons, the result is a concentrated beam of light consisting of many photons snuggled on top of each other – a laser.

Exploiting weirdness
Over the past decade or so, quantum-mechanical engineers have managed to put to use some of the weird and funky aspects of quantum mechanics that Einstein protested against. The fact that an electron can be in two places at once has non-trivial implications for constructing computers. A conventional digital computer works by dividing up information into its smallest components – bits – and transforming those bits one or two at a time. A bit represents the distinction between two states, conventionally called 0 and 1. In an electronic computer, a bit is registered by electrons: a billion electrons over here represents a 0, a billion electrons over there represents a 1. In a quantum computer, a bit can be registered by a single electron: one electron over here represents 0, one electron over there represents 1.

So far, so good: a quantum computer just registers a bit using one electron rather than billions of electrons. But according to the well-established laws of quantum mechanics, an electron doesn’t have to be either “here” or “there”: it can just as easily be here and there at the same time! In some weird quantum sense, such an electron registers 0 and 1 at once. This ability to register two values simultaneously is strange and counter-intuitive. Such quantum bits, or “qubits”, possess a new ability that conventional bits do not possess. Quantum computers build on this ability to perform computations in ways that classical computers can’t. For example, even a relatively small quantum computer consisting of a few thousand error-corrected quantum bits could break all existing public-key cryptosystems (public-key cryptosystems guarantee the security of publicly exchanged information, such as bank and credit card transactions performed over the Internet).
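The idea of a bit that registers 0 and 1 at once can be sketched in a few lines of code. In this minimal simulation (the state representation and function names are illustrative assumptions, not a real quantum-computing library), a qubit is a pair of amplitudes whose squares give the probabilities of reading 0 or 1, and a Hadamard operation puts it “here and there at the same time”:

```python
import math
import random

# A qubit's state is a pair of amplitudes (a0, a1) with
# a0^2 + a1^2 = 1: the probabilities of reading out 0 or 1.
zero = (1.0, 0.0)   # definitely "here" (the bit value 0)

def hadamard(state):
    """Put the qubit into an equal superposition of 0 and 1 --
    the quantum analogue of being in two places at once."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

def measure(state):
    """Measurement forces a definite answer: 0 or 1 at random, with
    probabilities given by the squared amplitudes."""
    a0, _ = state
    return 0 if random.random() < a0**2 else 1

superposed = hadamard(zero)   # amplitudes (0.707..., 0.707...)
counts = [measure(superposed) for _ in range(10000)]
print("fraction of 1s:", sum(counts) / len(counts))   # close to 0.5
```

The power of a real quantum computer comes from doing this with many qubits at once, so that a register explores vastly many bit patterns in superposition; this single-qubit sketch only shows the basic ingredient.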

Luckily for people who want to protect their information, current quantum computers contain fewer than 10 quantum bits and are not a threat to e-commerce or national security at the moment. But the size of quantum computers is increasing rapidly, and in a decade or two quantum computers may indeed constitute a technology that can disrupt secure information exchange.

Fortunately, quantum mechanics not only threatens secure communications but also offers a remedy: provably secure information exchange. Quantum cryptography relies on the fact that measuring a quantum system necessarily perturbs that system, a feature embodied in the Heisenberg uncertainty principle already mentioned.

Two parties who wish to establish a shared cryptographic key for secure communications can exploit the Heisenberg uncertainty principle to detect the presence of eavesdroppers and to foil their attempts to gain information about the secure communication. The security of conventional public-key cryptography is protected only by the fact that certain mathematical problems, such as factoring large numbers, are apparently hard; any day, a sufficiently smart person might figure out a way to break them (for example, using quantum computers). By contrast, the security of quantum cryptography is guaranteed by the laws of physics themselves: no-one, however smart, can break those laws.
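The eavesdropper-detection idea can be illustrated with a toy simulation in the spirit of the BB84 quantum key distribution protocol (the two-basis encoding and all function names here are illustrative assumptions, not a faithful model of real hardware). A spy who measures the qubits in transit cannot avoid disturbing some of them, and the disturbance shows up as an error rate of roughly 25 per cent when the two legitimate parties compare a sample of their bits:

```python
import random

def send_qubits(n):
    """The sender encodes n random bits, each in a randomly chosen basis
    (0 = rectilinear, 1 = diagonal)."""
    bits  = [random.randint(0, 1) for _ in range(n)]
    bases = [random.randint(0, 1) for _ in range(n)]
    return bits, bases

def measure_qubit(bit, basis, measure_basis):
    """Measuring in the right basis recovers the bit; measuring in the
    wrong basis disturbs the qubit and yields a random result."""
    return bit if basis == measure_basis else random.randint(0, 1)

def error_rate(n, eavesdrop):
    alice_bits, alice_bases = send_qubits(n)
    bits, bases = alice_bits, alice_bases
    if eavesdrop:
        # The spy measures each qubit in a random basis, irreversibly
        # disturbing it, then sends on what she prepared.
        eve_bases = [random.randint(0, 1) for _ in range(n)]
        bits = [measure_qubit(b, ba, eb)
                for b, ba, eb in zip(bits, bases, eve_bases)]
        bases = eve_bases
    bob_bases = [random.randint(0, 1) for _ in range(n)]
    bob_bits = [measure_qubit(b, ba, bb)
                for b, ba, bb in zip(bits, bases, bob_bases)]
    # Keep only the positions where sender's and receiver's bases agreed;
    # without a spy, these bits match perfectly.
    kept = [(a, b) for a, b, ab, bb
            in zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    return sum(a != b for a, b in kept) / len(kept)

print("error rate, no spy:  ", error_rate(4000, False))   # 0.0
print("error rate, with spy:", error_rate(4000, True))    # about 0.25
```

The spy picks the wrong basis on about half the qubits, and each of those gives the receiver a random answer half the time, hence the telltale 25 per cent error rate. Noticing it, the two parties simply discard the key and try again, so the eavesdropper learns nothing useful.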

Quantum cryptography today is in its initial marketing phase, with several systems commercially available and prototype systems installed and tested.

The future
Quantum mechanics’ very weirdness makes it a fruitful source of new and innovative technologies. Many of these technologies – the laser, the transistor, the electron microscope – have already had a large impact on society. Others – quantum computers and quantum cryptography – are just beginning to have an impact. What other impacts might quantum weirdness have? That’s for the next generation of quantum-mechanical engineers to discover.

Seth Lloyd
Seth Lloyd is professor of quantum-mechanical engineering at the Massachusetts Institute of Technology. His research focuses on how physical systems process information.