Anatomy of Hype: Quantum Computing
The claims of what quantum computing can do are dizzying: from forecasting financial crashes to solving optimization problems to leading to an “apocalypse.”
There is, however, a wide gulf between those who boost quantum computing and those with a more grounded view. This gulf exposes how a technology that is incredibly difficult to understand gets distorted through oversimplification and marketing.
On the one hand, consulting firms, newspapers, and startups are urging tech leaders to “act now” in response to quantum computing developments and prepare for a “quantum revolution.”
On the other, bona fide quantum computing scientists, like Scott Aaronson, cringe at claims that these “magical uber-machines” (his sarcasm) will cure cancer and solve global warming.
Let’s begin by examining what the hype camp is saying.
Solving the impossible
McKinsey is representative of those who hold a bullish view on quantum computing. Like other consulting companies and popular media, they expect the technology to be useful to companies within 5 to 10 years:
Solving the impossible in a few hours of computing time, finding answers to problems that have bedeviled science and society for years, unlocking unprecedented capabilities for businesses of all kinds—those are the promises of quantum computing, a fundamentally different approach to computation.
Invoking superposition, a property of quantum mechanics, McKinsey predicts that quantum computers will be exponentially faster than their classical peers:
Classical computers are programmed with bits as data units (zeros and ones). Quantum computers use so-called qubits, which can represent a combination of both zero and one at the same time, based on a principle called superposition.
Qubits interacting together in a superposition state would perform millions or billions of calculations simultaneously. A calculation that may take years to solve using a supercomputer could be solved within minutes or hours using a quantum computer.
Linking “massive parallelization” to quantum computing ignited the imagination of many. Quantum computers are widely expected to accelerate autonomous vehicles, solve optimization problems in real-time, model diseases and their cures, and of course, pose a serious cybersecurity threat.
What the scientist says
Scott Aaronson is a professor of computer science at the University of Texas at Austin, specializing in quantum computing and computational complexity theory. He discusses quantum computing on his blog, writes in different publications in an accessible manner, and is a renowned contributor to the field.
Time and again, Aaronson stresses that there’s no such thing as parallelization in quantum computing. This is because of something called the observer effect: in quantum mechanics, when a quantum phenomenon is observed (i.e. measured), it ceases to be in superposition. So, by extension, if a quantum computer holds many simultaneous states, we can read (i.e. measure) only one of those states. This is simply physics. Aaronson quips that this would yield a random solution, so we might as well not use a quantum computer at all for that.
The notion that qubits can simultaneously be 0s and 1s AND that these simultaneous states can be measured amounts to a “fundamental misstep of quantum computing popularization,” according to Aaronson.
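Aaronson's point can be seen in a toy simulation (my illustration, not his). Below, a register of n qubits in a uniform superposition "holds" all 2**n basis states at once, but a measurement returns just one of them, sampled at random, so the "parallel" states are never all read out:

```python
import random

# Toy model, NOT a real quantum simulator: a register of n qubits in a
# uniform superposition has 2**n basis states, each with amplitude
# 1/sqrt(2**n). Measuring collapses it to ONE state, sampled with
# probability |amplitude|**2 -- here, uniformly at random.
def measure_uniform_superposition(n_qubits):
    num_states = 2 ** n_qubits
    amplitude = (1.0 / num_states) ** 0.5          # 1/sqrt(2**n)
    probabilities = [amplitude ** 2] * num_states  # each = 1/2**n
    return random.choices(range(num_states), weights=probabilities)[0]

# "Computing all answers at once" and then measuring yields a random
# candidate each time -- no better than classical guessing:
outcomes = {measure_uniform_superposition(3) for _ in range(1000)}
```

Repeated runs scatter across all eight outcomes, which is exactly why Aaronson quips that naive "parallelism" gives you a random answer you could have guessed without a quantum computer.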
Quantum computers aren’t what they’re made out to be. They won’t solve impossible problems. For example, Accenture says that quantum computers will solve the Traveling Salesman Problem; according to Aaronson, almost all experts believe quantum computers will offer no dramatic speedup on such problems.
Qubits, Aaronson explains, are not simply 0s and 1s at the same time. They carry richer information in the form of amplitudes, which are related to probabilities: an amplitude is a complex number whose squared magnitude gives the probability of measuring the qubit as a 1 or a 0. Something called interference happens between qubits. Just like waves cancelling or amplifying each other, interference can cancel out or magnify amplitudes:
The goal in devising an algorithm for a quantum computer is to choreograph a pattern of constructive and destructive interference so that for each wrong answer the contributions to its amplitude cancel each other out, whereas for the right answer the contributions reinforce each other. If, and only if, you can arrange that, you’ll see the right answer with a large probability when you look. The tricky part is to do this without knowing the answer in advance, and faster than you could do it with a classical computer.
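The choreography Aaronson describes can be illustrated with the simplest possible case, a standard one-qubit textbook example (my sketch, not from the article): applying the Hadamard gate twice to a qubit that starts in |0⟩. The second gate makes the two computational paths interfere, constructively for |0⟩ and destructively for |1⟩:

```python
import math

# One-qubit toy: a state is a pair of amplitudes [amp_of_|0>, amp_of_|1>].
s = 1 / math.sqrt(2)
H = [[s, s],
     [s, -s]]  # Hadamard gate as a 2x2 matrix

def apply(gate, state):
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

state = [1.0, 0.0]       # start in |0>
state = apply(H, state)  # amplitudes ~[0.707, 0.707]: an equal superposition
state = apply(H, state)  # second Hadamard: the two paths interfere

# For |0> the contributions add (constructive interference);
# for |1> they cancel exactly (destructive interference).
probabilities = [abs(a) ** 2 for a in state]
# probabilities ~= [1.0, 0.0]: measurement now yields |0> with certainty
```

A useful algorithm has to arrange this kind of cancellation so the wrong answers destroy each other while the right one survives, which is precisely the hard part Aaronson points to.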
The choreography of amplitudes and interference to arrive at the right solutions makes quantum computers unique and therefore specialized to a few problems. Aaronson says that the best uses for a quantum computer are to simulate quantum phenomena. Beyond that, other applications are elusive.
What about the quantum cybersecurity threat? In theory, a quantum computer may be able to break certain cryptography. But in practice, this is not going to happen anytime soon. A quantum computer with millions of qubits (versus the 53 in today’s Google machine) would be required. That is an incredibly expensive and complicated machine to build, if it can be built at all.
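To see why so many qubits matter, consider what the relevant quantum algorithm (Shor's) actually does: it factors a number N by finding the period r of f(x) = a**x mod N, and the quantum hardware is needed only for that period-finding step. The hedged sketch below finds the period by classical brute force, which takes exponentially long for cryptographic-size N; the example numbers (N = 15, a = 7) are mine, for illustration:

```python
import math

def find_period(a, N):
    # Classical brute-force period finding: smallest r with a**r % N == 1.
    # This step is exponentially slow classically; it is the ONLY part a
    # quantum computer would speed up in Shor's algorithm.
    r, value = 1, a % N
    while value != 1:
        r += 1
        value = (value * a) % N
    return r

def shor_style_factor(N, a):
    r = find_period(a, N)  # the quantum subroutine in the real algorithm
    # With an even period, gcd(a**(r/2) +/- 1, N) reveals the factors.
    return (math.gcd(a ** (r // 2) - 1, N),
            math.gcd(a ** (r // 2) + 1, N))

print(shor_style_factor(15, 7))  # period r = 4, factors -> (3, 5)
```

Factoring a toy N = 15 is trivial, but doing the period-finding step quantumly for a 2048-bit RSA modulus is where the estimates of millions of error-corrected qubits come from.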
The path to hype
Referring to the quantum computer over-hype, a commenter on Aaronson’s blog said:
I’m worried currently that the biggest risk to eventually building large quantum computers are not the great technical and scientific challenges ahead, but the potential loss of credibility our field is facing because we are being so reckless.
The commenter, who is a quantum computing researcher, went on to pinpoint the academic culture as the starting point of a cascading set of conditions that led to the over-hype:
It has become acceptable to overpromise on quantum computing timelines and use-cases when submitting grant proposals and papers. The quantum computing community seems to have justified this hype in the name of science.
From this point, money flowed. And why not? Science is credible; it’s a great bet! A gold rush set in with expectations of a $64B market size by 2030. Consulting companies and the media took note. They began writing about the topic, setting up practices to advise clients, and preparing to capture some of the market share.
But explaining quantum computing to a non-specialist is close to impossible. So simplifications and abridgements became the public narrative. With simplifications come errors, misconceptions, and of course, overstated expectations.
We have seen this cycle play out in Artificial Intelligence. It’s now common to hear that “fully autonomous vehicles are more complex than previously thought” and that AI for medical diagnosis is questionable.
Hype costs. As that blog commenter said, the cost could be the loss of credibility in a scientific field. Other costs are misallocation of investments and bad business advice.
This doesn’t mean that we shouldn’t pay attention to or get excited about a scientific breakthrough. But maybe we need to be humble first and acknowledge our limits as we try to understand science.
And maybe, the funding of scientific research should be the opposite of a popularity contest.
Notes & Sources: