As billions of dollars flow into quantum computing and countries build secure communications networks using quantum encryption, the importance of quantum information science has become increasingly difficult to ignore.
This year’s Breakthrough Prize in Fundamental Physics honors four pioneers who combined math, computer science and physics to do “fundamental work in the field of quantum information”. The prize is shared between Charles Bennett of IBM, Gilles Brassard of the University of Montreal, David Deutsch of the University of Oxford and Peter Shor of the Massachusetts Institute of Technology.
“These four people really contributed a lot to the emergence of quantum information theory,” explains Nicolas Gisin, an experimental quantum physicist at the University of Geneva. “It’s nice to see these awards getting closer to my heart.”
The Breakthrough Prizes were co-founded in 2012 by the Russian-Israeli billionaire and physicist Yuri Milner, and they have been generously supported by other tycoons, including co-founders Mark Zuckerberg and Sergey Brin. Much as Alfred Nobel’s prize-funding fortune was born from his invention of dynamite, Milner’s past financial ties to the Kremlin have drawn attention, especially in light of Russia’s ongoing invasion of Ukraine. In previous interviews, Milner has emphasized his independence and his donations to Ukrainian refugees. A spokesperson pointed out to American Scientist that Milner moved to the United States in 2014 and has not returned to Russia since.
But recognition for quantum information science has not always come so readily – or with such financial support. Broadly speaking, the field combines two theories: quantum mechanics, which describes the counterintuitive behavior of the atomic and subatomic world, and information theory, which details the mathematical and physical limits of computation and communication. Its history is a complicated one, with sporadic advances often overlooked by mainstream scientific journals.
In 1968, Stephen Wiesner, then a graduate student at Columbia University, developed a new way to encode information with polarized photons. Among other things, Wiesner proposed that the inherently fragile nature of quantum states could be used to create counterfeit-resistant quantum money. Unable to publish many of his heady theoretical insights and drawn to religion, Wiesner, who died last year, largely left academia to become a construction worker in Israel.
Before Wiesner left Columbia, he passed on some of his ideas to another young researcher. “One of my roommates’ boyfriends was Stephen Wiesner, who started telling me about his ‘quantum money,’” Bennett recalls. “[It] sounded interesting to me, but it didn’t feel like the start of a whole new area.” In the late 1970s, Bennett met Brassard, and the two began discussing Wiesner’s money, which they believed might require the unlikely task of trapping photons with mirrors to create a quantum banknote.
“Photons aren’t meant to stay, they’re meant to travel,” says Brassard, explaining their thinking. “And if they travel, what could be more natural than to communicate?” The protocol Bennett and Brassard proposed, called BB84, would launch the field of quantum cryptography. Later detailed and popularized in American Scientist, BB84 allowed two parties to exchange messages in complete secrecy. If a third party eavesdropped, they would leave indelible evidence of their interference, like damaging a quantum wax seal.
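The logic behind that seal can be sketched without any quantum hardware. Below is a toy classical simulation (all names and parameters are illustrative, not from the BB84 paper) of the protocol’s “sifting” step: Alice sends each bit in a randomly chosen polarization basis, Bob measures in his own random basis, and the two publicly compare bases – never bits – keeping only the positions where the bases matched.

```python
import random

def bb84_sift(n=32, seed=1):
    """Toy BB84 sifting: matching measurement bases yield a shared key."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("+x") for _ in range(n)]  # + rectilinear, x diagonal
    bob_bases   = [rng.choice("+x") for _ in range(n)]
    # When Bob guesses Alice's basis, he reads her bit faithfully;
    # when he guesses wrong, quantum mechanics gives him a coin flip.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Publicly compare bases (not bits); keep only matching positions.
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

key_a, key_b = bb84_sift()
print(key_a == key_b)  # → True: with no eavesdropper, the sifted keys agree
```

An eavesdropper who measures in her own random bases disturbs roughly a quarter of the sifted bits, which Alice and Bob can detect by sacrificing and comparing a random sample – the “broken seal.”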
As Bennett and Brassard developed quantum cryptography, another radical idea began to emerge: quantum computing. In a now famous meeting at MIT Endicott House in Dedham, Massachusetts, in May 1981, physicist Richard Feynman proposed that a computer exploiting quantum principles could solve problems impossible for any computer bound by the laws of classical physics. Although he didn’t attend the conference, Deutsch heard about the idea and was hooked. “I gradually became more and more convinced of the links between computation and physics,” he says.
Talking with Bennett later that year, Deutsch had a crucial epiphany: mainstream computing theory was then grounded in the wrong physics – Isaac Newton’s “classical” mechanics and Albert Einstein’s relativity rather than the deeper quantum reality. “So I thought about rewriting the theory of computation, basing it on quantum theory instead of basing it on classical theory,” Deutsch says matter-of-factly. “I didn’t expect anything fundamentally new to come out of it. I just expected it to be more rigorous.” Soon, however, he realized he was describing a very different kind of computer: one that produced the same answers but arrived at them through the principles of quantum mechanics.
Deutsch’s new theory provided a crucial link between quantum mechanics and information theory. “It made quantum mechanics accessible to me in my computer language,” says Umesh Vazirani, a computer scientist at the University of California, Berkeley. Later, with the Australian mathematician Richard Jozsa, Deutsch proposed, as a proof of principle, the first algorithm that would be exponentially faster than classical algorithms, although it did nothing practical.
But more useful applications soon emerged. In 1991, Artur Ekert, then a graduate student at Oxford, proposed a new quantum cryptography protocol, E91. The technique caught the attention of many physicists for its elegance and practicality – and because it was published in a leading physics journal. “It’s a great idea. It’s a bit surprising that Ekert isn’t on the list of winners of this year’s Breakthrough Prize in Fundamental Physics,” Gisin says.
Two years later, when Bennett, Brassard, Jozsa, computer scientist Claude Crépeau, and physicists Asher Peres and William Wootters proposed quantum teleportation, physicists paid attention. The new technique would let one party transmit a quantum state to another via entanglement, the quantum correlation that can link objects such as electrons. Despite popular science-fiction claims, the technique does not enable faster-than-light messaging, but it greatly expanded the possibilities for real-world quantum communications. “It’s the most mind-blowing idea,” says Chao-Yang Lu, a quantum physicist at the University of Science and Technology of China, who helped implement the technique from space.
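The protocol is small enough to simulate directly. Below is a state-vector sketch under standard textbook conventions (the variable names and qubit ordering are my assumptions, not the 1993 paper’s notation): Alice entangles her data qubit with her half of a shared pair, measures both, and sends Bob two ordinary classical bits; Bob then repairs his qubit with a phase and/or bit flip – which is also why nothing travels faster than light, since the classical bits must arrive first.

```python
import numpy as np

I = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])   # bit flip
Z = np.array([[1., 0.], [0., -1.]])  # phase flip
H = np.array([[1., 1.], [1., -1.]]) / np.sqrt(2)
P0 = np.diag([1., 0.])               # projector onto |0>
P1 = np.diag([0., 1.])               # projector onto |1>

def kron(*ms):
    out = np.array([[1.]])
    for m in ms:
        out = np.kron(out, m)
    return out

def teleport(psi, m1, m2):
    """Teleport qubit state psi, given Alice's measurement outcome (m1, m2)."""
    bell = np.array([1., 0., 0., 1.]) / np.sqrt(2)     # shared pair (|00>+|11>)/√2
    state = np.kron(psi, bell)                         # qubits: data, Alice, Bob
    state = (kron(P0, I, I) + kron(P1, X, I)) @ state  # CNOT: data controls Alice
    state = kron(H, I, I) @ state                      # Hadamard on data qubit
    # Project onto Alice's outcome and renormalize (simulated measurement).
    state = kron(P1 if m1 else P0, P1 if m2 else P0, I) @ state
    state /= np.linalg.norm(state)
    bob = state.reshape(2, 2, 2)[m1, m2, :]            # Bob's qubit amplitudes
    # Bob applies the correction dictated by the two classical bits.
    return np.linalg.matrix_power(Z, m1) @ np.linalg.matrix_power(X, m2) @ bob

psi = np.array([0.6, 0.8])  # an arbitrary qubit state
for m1 in (0, 1):
    for m2 in (0, 1):
        assert np.allclose(teleport(psi, m1, m2), psi)  # Bob recovers psi
```

Whichever of the four outcomes Alice obtains, Bob ends up holding an exact copy of the original state – while Alice’s own copy is destroyed by her measurement, as the no-cloning theorem demands.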
Words like “revolution” are overused to describe progress in science, which is usually laborious and incremental. But in 1994, Shor quietly started one. While working at AT&T Bell Laboratories, he had absorbed discussions from Vazirani and Bennett. “I started thinking about useful things you could do with a quantum computer,” he says. “I thought it was a long shot. But it was a very interesting area. So I started working on it. I didn’t really tell anyone about it.”
Inspired by the success of other quantum algorithms at periodic or repetitive tasks, Shor developed an algorithm capable of dividing numbers into their prime factors (e.g., 21 = 7 × 3) exponentially faster than any classical algorithm. The implications were immediately obvious: prime factorization was the backbone of modern encryption. At last, quantum computers had a truly revolutionary practical application. Shor’s algorithm “just made it clear that you have to drop everything” to work on quantum computing, says Vazirani.
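The quantum speedup lives entirely in one step – finding the period, or “order,” of a number modulo N – while the surrounding arithmetic is classical. The sketch below (helper names are mine) rehearses that classical reduction by brute-forcing the order, which is feasible here only because 21 is tiny; for a 500-digit number, this loop is exactly what a quantum computer would replace.

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r ≡ 1 (mod n) — the step Shor's algorithm speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical rehearsal of Shor's reduction: recover factors of n from the order of a."""
    r = order(a, n)
    if r % 2:
        return None              # odd order: retry with a different a
    y = pow(a, r // 2, n)        # a^(r/2) mod n
    f1, f2 = gcd(y - 1, n), gcd(y + 1, n)
    return sorted({f1, f2} - {1, n}) or None

print(shor_classical(21, 2))  # → [3, 7]
```

For n = 21 and a = 2, the order is r = 6 (since 2⁶ = 64 ≡ 1 mod 21), so y = 2³ = 8, and gcd(7, 21) = 7 and gcd(9, 21) = 3 hand over both prime factors.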
Although Shor had found a powerful use case for a quantum computer, he hadn’t solved the harder problem of how to build one, even in theory. The fragile quantum states that these devices could exploit to outperform classical computing also made them extremely vulnerable to errors. Moreover, error-correction strategies for classical computers could not be used in quantum computers. Undeterred, at a conference on quantum computing in Turin, Italy, in 1995, Shor bet other researchers that a quantum computer would factor a 500-digit number before a classical computer could. (Even with today’s conventional supercomputers, factoring a 500-digit number would likely take billions of years.) Nobody took Shor’s bet, and some demanded a third option: that the sun burns out first.
Two types of errors plague quantum computers: bit errors and phase errors. They are akin to flipping a compass needle from north to south or from east to west, respectively. Unfortunately, correcting bit errors worsens phase errors, and vice versa: a more accurate north bearing means a less accurate east or west bearing. But later in 1995, Shor discovered how to combine bit correction and phase correction – a chain of operations akin to solving a Rubik’s Cube without disturbing a completed face. Shor’s algorithm itself will remain inefficient until quantum computers become more powerful (the largest number yet factored with the algorithm is 21, so classical factoring stays ahead – for now). But his error-correction scheme made quantum computing possible, if not yet practical. “That’s when it all became real,” says Brassard.
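The compass analogy can be made concrete with the two Pauli matrices involved: a bit flip (X) and a phase flip (Z) are each invisible in the basis where the other is glaring, which is why a code must watch both at once, as Shor’s does. A minimal numpy sketch (state labels are my own):

```python
import numpy as np

# Pauli operators: X is a bit flip, Z is a phase flip.
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

ket0 = np.array([1., 0.])               # "north": the state |0>
plus = np.array([1., 1.]) / np.sqrt(2)  # "east":  the state |+> = (|0>+|1>)/√2

# A bit flip swings north to south but leaves east untouched...
assert np.allclose(X @ ket0, [0., 1.])   # |0> -> |1>
assert np.allclose(X @ plus, plus)       # |+> unchanged
# ...while a phase flip leaves north untouched but swings east to west.
assert np.allclose(Z @ ket0, ket0)       # |0> unchanged
assert np.allclose(Z @ plus, np.array([1., -1.]) / np.sqrt(2))  # |+> -> |->
```

A code that checks only the |0>/|1> axis never notices Z errors, and one that checks only the |+>/|−> axis never notices X errors; Shor’s 1995 construction nests one kind of check inside the other.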
All this work has led to new visions of quantum mechanics and computing. For Deutsch, it inspired an even more fundamental theory of ‘constructors’ – which he says describes the ‘set of all physical transformations’. Others remain agnostic about the likelihood of further profound insights emerging from the quantum realm. “Quantum mechanics is really weird, and I don’t think there will ever be an easy way to understand it,” Shor says. Asked whether his work on quantum computing makes the nature of reality easier or harder to understand, he playfully replies, “It certainly makes it more mysterious.”
What began as an eclectic hobby or intellectual curiosity has now far exceeded the wildest imaginings of the field’s pioneers. “We never thought it would ever become practical. It was great fun thinking about these crazy ideas,” Brassard says. “At some point we decided we were serious, but people didn’t follow us. It was frustrating. Now that the field is recognized to this extent, it is extremely gratifying.”