Quantum computing is all the rage. It seems like hardly a day goes by without some news outlet describing the extraordinary things this technology promises. Most commentators forget, or just gloss over, the fact that people have been working on quantum computing for decades, and without any practical results to show for it.
We’ve been told that quantum computers could “provide breakthroughs in many disciplines, including materials and drug discovery, the optimization of complex systems, and artificial intelligence.” We’ve been assured that quantum computers will “forever alter our economic, industrial, academic, and societal landscape.” We’ve even been told that “the encryption that protects the world’s most sensitive data may soon be broken” by quantum computers. It has gotten to the point where many researchers in various fields of physics feel obliged to justify whatever work they are doing by claiming that it has some relevance to quantum computing.
Meanwhile, government research agencies, academic departments (many of them funded by government agencies), and corporate laboratories are spending billions of dollars a year developing quantum computers. On Wall Street, Morgan Stanley and other financial giants expect quantum computing to mature soon and are eager to figure out how this technology can help them.
It’s become something of a self-perpetuating arms race, with many organizations seemingly staying in the race if only to avoid being left behind. Some of the world’s top technical talent, at places like Google, IBM, and Microsoft, are working hard, and with lavish resources in state-of-the-art laboratories, to realize their vision of a quantum-computing future.
In light of all this, it’s natural to wonder: When will useful quantum computers be built? The most optimistic experts estimate it will take 5 to 10 years. More prudent ones predict 20 to 30 years. (Similar predictions have been voiced, by the way, for the last 20 years.) I belong to a tiny minority that answers, “Not in the foreseeable future.” Having spent decades conducting research in quantum and condensed-matter physics, I’ve developed my very pessimistic view. It’s based on an understanding of the gargantuan technical challenges that would have to be overcome to ever make quantum computing work.
The idea of quantum computing first appeared nearly 40 years ago, in 1980, when the Russian-born mathematician Yuri Manin, who now works at the Max Planck Institute for Mathematics, in Bonn, first put forward the notion, albeit in a rather vague form. The concept really got on the map, though, the following year, when physicist Richard Feynman, at the California Institute of Technology, independently proposed it.
Realizing that computer simulations of quantum systems become impossible to carry out when the system under scrutiny gets too complicated, Feynman advanced the idea that the computer itself should operate in the quantum mode: “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy,” he opined. A few years later, University of Oxford physicist David Deutsch formally described a general-purpose quantum computer, a quantum analogue of the universal Turing machine.
The subject did not attract much attention, though, until 1994, when mathematician Peter Shor (then at Bell Laboratories and now at MIT) proposed an algorithm for an ideal quantum computer that would allow very large numbers to be factored much faster than could be done on a conventional computer. This extraordinary theoretical result triggered an explosion of interest in quantum computing. Many thousands of research papers, mostly theoretical, have since been published on the subject, and they continue to come out at an increasing rate.
The basic idea of quantum computing is to store and process information in a way that is very different from what is done in conventional computers, which are based on classical physics. Boiling down the many details, it’s fair to say that conventional computers operate by manipulating a large number of tiny transistors working essentially as on-off switches, which change state between cycles of the computer’s clock.
The state of the classical computer at the start of any given clock cycle can therefore be described by a long sequence of bits corresponding physically to the states of individual transistors. With N transistors, there are 2^N possible states for the computer to be in. Computation on such a machine fundamentally consists of switching some of its transistors between their “on” and “off” states, according to a prescribed program.
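This state counting is easy to make concrete. Here is a minimal Python sketch (a toy illustration, not a model of real hardware) that enumerates every possible state of a small N-bit machine:

```python
from itertools import product

def classical_states(n_bits):
    """Enumerate every possible state of an n-bit classical machine.

    Each state is a definite tuple of 0s and 1s, one entry per transistor,
    so there are exactly 2**n_bits of them.
    """
    return list(product([0, 1], repeat=n_bits))

states = classical_states(3)
assert len(states) == 2 ** 3          # 3 bits -> 8 distinct states
assert (0, 1, 0) in states            # each state is a definite bit pattern
```

The key point for the argument that follows: at any instant the machine occupies exactly one of these discrete states.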
Illustration: Christian Gralingen
In quantum computing, the classical two-state circuit element (the transistor) is replaced by a quantum element called a quantum bit, or qubit. Like the conventional bit, it also has two basic states. Although a variety of physical objects could reasonably serve as quantum bits, the simplest thing to use is the electron’s internal angular momentum, or spin, which has the peculiar quantum property of having only two possible projections on any chosen axis: +1/2 or –1/2 (in units of the Planck constant). For whichever axis is chosen, you can denote the two basic quantum states of the electron’s spin as ↑ and ↓.
Here’s where things get weird. With the quantum bit, those two states aren’t the only ones possible. That’s because the spin state of an electron is described by a quantum-mechanical wave function. And that function involves two complex numbers, α and β (called quantum amplitudes), which, being complex numbers, have real parts and imaginary parts. Those complex numbers, α and β, each have a certain magnitude, and according to the rules of quantum mechanics, their squared magnitudes must add up to 1.
That’s because those two squared magnitudes correspond to the probabilities for the spin of the electron to be in the basic states ↑ and ↓ when you measure it. And because those are the only outcomes possible, the two associated probabilities must add up to 1. For example, if the probability of finding the electron in the ↑ state is 0.6 (60 percent), then the probability of finding it in the ↓ state must be 0.4 (40 percent); nothing else would make sense.
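The amplitude rule can be checked with a few lines of Python, using the 60/40 example from the text (the particular complex values chosen here are just one of infinitely many pairs giving those probabilities):

```python
import math

# A single qubit's state is a pair of complex amplitudes (alpha, beta).
# Their squared magnitudes are the probabilities of measuring the spin
# as "up" or "down", and those probabilities must add up to 1.
alpha = complex(math.sqrt(0.6), 0.0)  # P(up)   = |alpha|**2 = 0.6
beta = complex(0.0, math.sqrt(0.4))   # P(down) = |beta|**2  = 0.4

p_up = abs(alpha) ** 2
p_down = abs(beta) ** 2

assert math.isclose(p_up + p_down, 1.0)  # the normalization condition
```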
In contrast to a classical bit, which can only be in one of its two basic states, a qubit can be in any of a continuum of possible states, as defined by the values of the quantum amplitudes α and β. This property is often described by the rather mystical and intimidating statement that a qubit can exist simultaneously in both of its ↑ and ↓ states.
Yes, quantum mechanics often defies intuition. But this concept shouldn’t be couched in such perplexing language. Instead, think of a vector positioned in the x-y plane and canted at 45 degrees to the x-axis. Somebody might say that this vector simultaneously points in both the x- and y-directions. That statement is true in some sense, but it’s not really a useful description. Describing a qubit as being simultaneously in both ↑ and ↓ states is, in my view, similarly unhelpful. And yet, it’s become almost de rigueur for journalists to describe it as such.
In a system with two qubits, there are 2^2, or 4, basic states, which can be written (↑↑), (↑↓), (↓↑), and (↓↓). Naturally enough, the two qubits can be described by a quantum-mechanical wave function that involves four complex numbers. In the general case of N qubits, the state of the system is described by 2^N complex numbers, which are restricted by the condition that their squared magnitudes must all add up to 1.
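The exponential growth in amplitudes comes from the tensor (Kronecker) product that composes qubit states. A small sketch in plain Python, assuming real amplitudes for simplicity:

```python
def kron(a, b):
    """Tensor (Kronecker) product of two state vectors given as lists.

    Composing an n-amplitude vector with an m-amplitude vector yields
    an (n * m)-amplitude vector, so each added qubit doubles the length.
    """
    return [x * y for x in a for y in b]

up = [1, 0]    # the basis state "spin up"
down = [0, 1]  # the basis state "spin down"

# The two-qubit basis state (up, down) has 2**2 = 4 amplitudes:
pair = kron(up, down)
assert pair == [0, 1, 0, 0]

# Adding a third qubit doubles the amplitude count to 2**3:
triple = kron(pair, up)
assert len(triple) == 2 ** 3
```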
While a conventional computer with N bits at any given moment must be in one of its 2^N possible states, the state of a quantum computer with N qubits is described by the values of the 2^N quantum amplitudes, which are continuous parameters (ones that can take on any value, not just a 0 or a 1). This is the origin of the supposed power of the quantum computer, but it is also the reason for its great fragility and vulnerability.
How is information processed in such a machine? That’s done by applying certain kinds of transformations, dubbed “quantum gates,” that change these parameters in a precise and controlled manner.
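Mathematically, a quantum gate is a unitary matrix multiplied into the vector of amplitudes. As an illustration (the Hadamard gate is a standard textbook example, not one singled out by the article), here is a sketch with real amplitudes:

```python
import math

# The one-qubit Hadamard gate: a unitary matrix that turns a definite
# "up" state into an equal-magnitude mix of "up" and "down".
h = 1.0 / math.sqrt(2.0)
H = [[h, h],
     [h, -h]]

def apply_gate(gate, state):
    """Multiply a gate matrix into a state vector of amplitudes."""
    return [sum(gate[i][j] * state[j] for j in range(len(state)))
            for i in range(len(gate))]

out = apply_gate(H, [1.0, 0.0])  # start in the "up" state
# Both outcomes now have probability |amplitude|**2 = 0.5:
assert all(math.isclose(abs(a) ** 2, 0.5) for a in out)
```

Note that the gate nudges continuous quantities like 1/√2 into the amplitudes, a point the article returns to later.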
Experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000. So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2^1,000, which is to say about 10^300. That’s a very big number indeed. How big? It is much, much greater than the number of subatomic particles in the observable universe.
To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.
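The arithmetic behind that claim is easy to verify (the figure of roughly 10^80 particles in the observable universe is the commonly quoted order-of-magnitude estimate, not a precise count):

```python
import math

# Express 2**1000 as a power of ten: 1000 * log10(2) ≈ 301, so
# 2**1000 ≈ 10**301, i.e. "about 10^300" as stated in the text.
digits = 1000 * math.log10(2)
assert math.floor(digits) == 301

# Python's arbitrary-precision integers can compare the numbers directly:
assert 2 ** 1000 > 10 ** 300 > 10 ** 80  # far beyond the particle count
```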
At this point in a description of a possible future technology, a hardheaded engineer loses interest. But let’s continue. In any real-world computer, you have to consider the effects of errors. In a conventional computer, those occur when one or more transistors are switched off when they are supposed to be switched on, or vice versa. This unwanted occurrence can be dealt with using relatively simple error-correction methods, which make use of some level of redundancy built into the hardware.
In contrast, it’s absolutely unimaginable how to keep errors under control for the 10^300 continuous parameters that must be processed by a useful quantum computer. Yet quantum-computing theorists have succeeded in convincing the general public that this is feasible. Indeed, they claim that something called the threshold theorem proves it can be done. They point out that once the error per qubit per quantum gate is below a certain value, indefinitely long quantum computation becomes possible, at a cost of substantially increasing the number of qubits needed. With those extra qubits, they argue, you can handle errors by forming logical qubits using multiple physical qubits.
How many physical qubits would be required for each logical qubit? No one really knows, but estimates typically range from about 1,000 to 100,000. So the upshot is that a useful quantum computer now needs a million or more qubits. And the number of continuous parameters defining the state of this hypothetical quantum-computing machine, which was already more than astronomical with 1,000 qubits, now becomes even more ludicrous.
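The back-of-the-envelope arithmetic behind “a million or more,” taking the low end of both estimates quoted above:

```python
# Low-end estimates from the text (both could be up to 100x larger):
logical_qubits = 1_000        # logical qubits needed for a useful machine
physical_per_logical = 1_000  # physical qubits per logical qubit

total_physical = logical_qubits * physical_per_logical
assert total_physical == 1_000_000  # hence "a million or more qubits"
```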
Even without considering these impossibly large numbers, it’s sobering that no one has yet figured out how to combine many physical qubits into a smaller number of logical qubits that can compute something useful. And it’s not like this hasn’t long been a key goal.
In the early 2000s, at the request of the Advanced Research and Development Activity (a funding agency of the U.S. intelligence community that is now part of Intelligence Advanced Research Projects Activity), a team of distinguished experts in quantum information established a road map for quantum computing. It had a goal for 2012 that “requires on the order of 50 physical qubits” and “exercises multiple logical qubits through the full range of operations required for fault-tolerant [quantum computation] in order to perform a simple instance of a relevant quantum algorithm….” It’s now the end of 2018, and that ability has still not been demonstrated.
Illustration: Christian Gralingen
The huge amount of scholarly literature that’s been generated about quantum computing is notably light on experimental studies describing actual hardware. The relatively few experiments that have been reported were extremely difficult to carry out, though, and they must command respect and consideration.
The goal of such proof-of-principle experiments is to show the possibility of carrying out basic quantum operations and to demonstrate some elements of the quantum algorithms that have been devised. The number of qubits used for them is below 10, usually from 3 to 5. Apparently, going from 5 qubits to 50 (the goal set by the ARDA Experts Panel for the year 2012) presents experimental difficulties that are hard to overcome. Most likely they are related to the simple fact that 2^5 = 32, while 2^50 = 1,125,899,906,842,624.
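That comparison is worth spelling out: multiplying the qubit count by ten multiplies the size of the state space by a factor of 2^45, not by ten.

```python
# Scaling from 5 to 50 qubits, measured by the number of amplitudes:
assert 2 ** 5 == 32
assert 2 ** 50 == 1_125_899_906_842_624

# The state space grows by 2**45 (about 35 trillion), not by a factor of 10:
assert 2 ** 50 // 2 ** 5 == 2 ** 45
```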
By contrast, the theory of quantum computing does not appear to meet any substantial difficulties in dealing with millions of qubits. In studies of error rates, for example, various noise models are being considered. It has been proved (under certain assumptions) that errors generated by “local” noise can be corrected by carefully designed and very ingenious methods, involving, among other tricks, massive parallelism, with many thousands of gates applied simultaneously to different pairs of qubits and many thousands of measurements done simultaneously, too.
A decade and a half ago, ARDA’s Experts Panel noted that “it has been established, under certain assumptions, that if a threshold precision per gate operation could be achieved, quantum error correction would allow a quantum computer to compute indefinitely.” Here, the key words are “under certain assumptions.” That panel of distinguished experts did not, however, address the question of whether these assumptions could ever be satisfied.
I argue that they can’t. In the physical world, continuous quantities (be they voltages or the parameters defining quantum-mechanical wave functions) can be neither measured nor manipulated exactly. That is, no continuously variable quantity can be made to have an exact value, including zero. To a mathematician, this might sound absurd, but this is the unquestionable reality of the world we live in, as any engineer knows.
Sure, discrete quantities, like the number of students in a classroom or the number of transistors in the “on” state, can be known exactly. Not so for quantities that vary continuously. And this fact accounts for the great difference between a conventional digital computer and the hypothetical quantum computer.
Indeed, all of the assumptions that theorists make about the preparation of qubits into a given state, the operation of the quantum gates, the reliability of the measurements, and so forth, cannot be fulfilled exactly. They can only be approached with some limited precision. So, the real question is: What precision is required? With what exactitude must, say, the square root of 2 (an irrational number that enters into many of the relevant quantum operations) be experimentally realized? Should it be approximated as 1.41 or as 1.41421356237? Or is even more precision needed? There are no clear answers to these crucial questions.
While various strategies for building quantum computers are now being investigated, an approach that many people consider the most promising, initially undertaken by the Canadian company D-Wave Systems and now being pursued by IBM, Google, Microsoft, and others, is based on using quantum systems of interconnected Josephson junctions cooled to very low temperatures (down to about 10 millikelvins).
The ultimate goal is to create a universal quantum computer, one that can beat conventional computers in factoring large numbers using Shor’s algorithm, performing database searches by a similarly famous quantum-computing algorithm that Lov Grover developed at Bell Laboratories in 1996, and other specialized applications that are suitable for quantum computers.
On the hardware front, advanced research is under way, with a 49-qubit chip (Intel), a 50-qubit chip (IBM), and a 72-qubit chip (Google) having recently been fabricated and studied. The eventual outcome of this activity is not entirely clear, especially because these companies have not disclosed the details of their work.
While I believe that such experimental research is beneficial and may lead to a better understanding of complicated quantum systems, I’m skeptical that these efforts will ever result in a practical quantum computer. Such a computer would have to be able to manipulate, on a microscopic level and with enormous precision, a physical system characterized by an unimaginably huge set of parameters, each of which can take on a continuous range of values. Could we ever learn to control the more than 10^300 continuously variable parameters defining the quantum state of such a system?
My answer is simple. No, never.
I believe that, appearances to the contrary, the quantum computing fervor is nearing its end. That’s because a few decades is the maximum lifetime of any big bubble in technology or science. After a certain period, too many unfulfilled promises have been made, and anyone who has been following the topic begins to get annoyed by further announcements of impending breakthroughs. What’s more, by that time all the tenured faculty positions in the field are already occupied. The proponents have grown older and less enthusiastic, while the younger generation seeks something completely new and more likely to succeed.
All these problems, as well as a few others I’ve not mentioned here, raise serious doubts about the future of quantum computing. There is a tremendous gap between the rudimentary but very hard experiments that have been carried out with a few qubits and the extremely developed quantum-computing theory, which relies on manipulating thousands to millions of qubits to calculate anything useful. That gap is not likely to be closed anytime soon.
To my mind, quantum-computing researchers should still heed an admonition that IBM physicist Rolf Landauer made decades ago when the field heated up for the first time. He urged proponents of quantum computing to include in their publications a disclaimer along these lines: “This scheme, like all other schemes for quantum computation, relies on speculative technology, does not in its current form take into account all possible sources of noise, unreliability and manufacturing error, and probably will not work.”
Editor’s note: A sentence in this article originally stated that concerns over required precision “were never even discussed.” This sentence was changed on 30 November 2018 after some readers pointed out to the author instances in the literature that had considered these issues. The amended sentence now reads: “There are no clear answers to these crucial questions.”
About the Author
Mikhail Dyakonov does research in theoretical physics at Charles Coulomb Laboratory at the University of Montpellier, in France. His name is attached to various physical phenomena, perhaps most famously Dyakonov surface waves.