Thursday, May 22, 2014

Quantum or not?


Here’s the original text of my article on D-Wave’s “quantum computers” (discuss) in the March issue of La Recherche.

_________________________________________________________________

Google has a quantum computer, and you could have one too. If you have $10m to spare, you can buy one from the Canadian company D-Wave, based in Burnaby, British Columbia. The aerospace and advanced technology company Lockheed Martin has also done so.

But what exactly is it that you’d be buying? Physically, the D-Wave 2 is almost a caricature: a black box the size of a large wardrobe. But once you’ve found the on-switch, what will the machine do?

After all, it has only 512 bits – or rather, qubits (quantum bits) – which sounds feeble compared with the billions of bits in your iPad. But these are bits that work according to quantum rules, and so they are capable of much, much more. At least, that is what D-Wave, run by engineer and entrepreneur Geordie Rose, claims. But since the company launched its first commercial product in 2011, it has been at the centre of a sometimes rancorous dispute about whether its machines are truly quantum computers at all, and whether they can really do things that today’s classical computers cannot.

One of the problems is that D-Wave seemed to come out of nowhere. Top academic and commercial labs around the world are struggling to juggle more than a handful of qubits at a time, and they seem to be very far from having any kind of computer that can solve useful problems. Yet the Canadian company just appeared to pop up with black boxes for sale. Some scepticism was understandable. What’s more, the D-Wave machines use a completely different approach to quantum computing from most other efforts. But some researchers wonder if they are actually exploiting quantum principles at all – or even if they are, whether this has any advantages over conventional (classical) computing, let alone over more orthodox routes to quantum computers.

While there is now wider acceptance that something ‘quantum’ really is going on inside D-Wave’s black boxes, the issue of what they can achieve remains a matter of debate. But that debate has opened up questions that are broader and more interesting than simply whether there has been a bit of over-zealous salesmanship. At root the issues are about how best to harness the power of quantum physics to revolutionize computing, and indeed about what it even means to do so, and how we can know for sure if we have done so. What truly makes a computer quantum, and what might we gain from it?

Defying convention

Quantum computing was first seriously mooted in the 1980s. Like classical computing, it manipulates data in binary form, encoded in 1’s and 0’s. But whereas a classical bit is either in a 1 or 0 state independently of the state of other bits, quantum bits can be placed in superpositions of these states, which can moreover be correlated (entangled) with one another. Quantum rules then enable the qubits to be manipulated using shortcuts, while classical computers have to slog through many more logical steps to get the answer.
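To make that concrete, here is the standard textbook notation (my addition for illustration; it does not appear in the original article). Two qubits can occupy a weighted combination of all four classical two-bit values at once, and in an entangled state such as the Bell state below, neither qubit has a definite value of its own, yet the two are perfectly correlated:

```latex
% A classical 2-bit register holds exactly one of the values 00, 01, 10, 11.
% Two qubits can instead be in a superposition of all four basis states,
% with the weights' squared magnitudes summing to 1:
\[
|\psi\rangle = a\,|00\rangle + b\,|01\rangle + c\,|10\rangle + d\,|11\rangle,
\qquad |a|^2 + |b|^2 + |c|^2 + |d|^2 = 1
\]
% An entangled example (a Bell state): measuring either qubit gives 0 or 1
% at random, but the two outcomes always agree.
\[
|\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr)
\]
```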

Much of the difficulty lies in keeping groups of qubits in their delicate quantum states for long enough to carry out the computation. They rapidly lose their mutual coordination (become “decoherent”) if they are too disturbed by the thermal noise of their environment. So qubits must be well isolated from their surroundings and kept very cold. Like some other research groups, D-Wave makes qubits from tiny rings of superconducting material, where, roughly speaking, the 1’s and 0’s correspond to electrical currents circulating in opposite directions. They’re kept at a temperature of a few hundredths of a degree above absolute zero, and most of the volume in the black box is needed to house the cooling equipment. But while the other labs have prototypes with perhaps half a dozen qubits that are nowhere near the marketplace, D-Wave is out there selling its machines.

According to the Canadian company’s researchers, they’ve got further because D-Wave has chosen an unconventional strategy. One of the key problems with quantum computing is that quantum rules are not entirely predictable. Whereas a classical logic gate will always give a particular output for a particular input of 1’s and 0’s, quantum physics is probabilistic. Even if the unpredictability is very small, it rapidly multiplies for many qubits. As a result, most quantum computer architectures will rely on a lot of redundancy: encoding the information many times so that errors can be put right.

D-Wave’s approach, called quantum annealing, allegedly circumvents the need for all that error correction (see Box 1). It means that the circuits aren’t built, like classical computers and most other designs for quantum computers, from ‘logic gates’ that take particular binary inputs and convert them to particular outputs. Instead, the circuits are large groups of simultaneously interacting qubits, rather like the atoms of a magnet influencing one another’s magnetic orientation.
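For readers who want the mathematics behind the magnet analogy (standard notation, added here for illustration rather than taken from the article): an array of pairwise-interacting two-state elements is described by the Ising model, and a problem is posed to an annealer by choosing the biases and couplings so that the lowest-energy configuration of the qubits encodes the answer.

```latex
% Ising energy of N coupled two-state 'spins', s_i = +1 or -1.
% The local biases h_i and pairwise couplings J_ij are the programmer's
% knobs: they are chosen so that the minimum of E encodes the answer.
\[
E(s_1,\ldots,s_N) \;=\; \sum_{i} h_i\, s_i \;+\; \sum_{i<j} J_{ij}\, s_i s_j,
\qquad s_i \in \{-1,+1\}
\]
```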

___________________________________________________________________________

Box 1: Quantum annealing

Computing by quantum annealing means looking for the best solution to a problem by searching simultaneously across the whole ‘landscape’ of possible solutions. It’s therefore a kind of optimization process, exemplified by the Travelling Salesman problem, in which the aim is to find the most efficient route that visits every node in a network. The best solution can be considered the ‘lowest-energy’ state – the so-called ground state – of the collection of qubits. In classical computing, that can be found using the technique of simulated annealing, which means jumping around the landscape at random looking for lower-lying ground. By allowing for some jumps to slightly increase the energy, it’s possible to avoid getting trapped in local, non-optimal dips. Quantum annealing performs an analogous process, except that rather than hopping classically over small hills and cols, the collective state of the qubits can tunnel quantum-mechanically through these barriers. What’s more, it works by starting with a flat landscape and gradually raising up the peaks, all the while keeping the collective state of the qubits pooled in the ground state.
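The classical half of that picture is easy to sketch in code. Below is a toy simulated-annealing routine (my own illustrative example – the landscape, step size and cooling schedule are arbitrary choices, nothing to do with D-Wave’s hardware): it takes random local steps, always accepts downhill moves, and sometimes accepts uphill ones so the search can escape local minima, with the tolerance for uphill moves shrinking as the ‘temperature’ falls.

```python
import math
import random

def simulated_annealing(energy, x0, steps=10000, t_start=2.0, t_end=0.01):
    """Minimize energy(x) by random local moves, accepting uphill moves
    with probability exp(-dE/T) so the search can escape local minima."""
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for k in range(steps):
        # Exponential cooling schedule from t_start down towards t_end.
        t = t_start * (t_end / t_start) ** (k / steps)
        x_new = x + random.uniform(-0.5, 0.5)   # random local move
        e_new = energy(x_new)
        # Downhill moves are always accepted; uphill moves occasionally.
        if e_new < e or random.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# A bumpy one-dimensional 'landscape' with many local minima and a
# global minimum near x = 0.
bumpy = lambda x: x * x + 3 * math.sin(5 * x)
print(simulated_annealing(bumpy, x0=8.0))
```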

Optimization problems like the Travelling Salesman belong to a class called NP-hard problems. Working out how the amino acid chain of a protein folds up into its most stable shape is another example, of immense importance in molecular biology. These are very computationally intensive challenges for classical computers, which generally have to simply try out each possible solution in turn.
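To see why ‘trying each possible solution in turn’ quickly becomes hopeless, here is a brute-force Travelling Salesman search (again my own toy example, not from the article). The number of distinct tours grows factorially with the number of cities: with the starting city fixed, 20 cities already give 19! ≈ 10^17 orderings to check.

```python
from itertools import permutations

def tour_length(order, dist):
    """Total length of a closed tour visiting cities in the given order."""
    return sum(dist[order[i]][order[(i + 1) % len(order)]]
               for i in range(len(order)))

def brute_force_tsp(dist):
    """Exhaustive search over every ordering of the cities - the
    factorial growth is what makes this intractable as n rises."""
    n = len(dist)
    best = min(permutations(range(1, n)),          # fix city 0 as the start
               key=lambda p: tour_length((0,) + p, dist))
    return (0,) + best, tour_length((0,) + best, dist)

# A tiny 4-city example with symmetric distances.
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
print(brute_force_tsp(d))
```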

Quantum annealers, explains physicist Alexandre Zagoskin of Loughborough University in England, a cofounder of D-Wave who left the company in 2005, are analog devices: less like digital logic-gate computers and more like slide rules, which calculate with continuous quantities. Logic-gate computers are ‘universal’ in the sense that they can simulate each other: you can perform the same computation using electrical pulses or ping-pong balls in tubes. “The obsession with universal quantum computing created unrealistic expectations, overhype, disillusionment and fatigue, and it keeps many theorists developing software for non-existing quantum ‘Pentiums’”, says Zagoskin. “In the foreseeable future we can make, at best, quantum slide rules like quantum annealers.”

_____________________________________________________________________

That makes a quantum annealer well suited to solving some problems but not others. “Instead of trying to build a universal computer”, says computer scientist Catherine McGeoch of Amherst College in Massachusetts, “D-Wave is going for a sort of ‘quantum accelerator chip’ that aims to solve one particular class of optimization problem. But this is exactly the class of problem where theoreticians think the important quantum speedups would be found, if they exist. And they are important computational problems in practice.” As Zagoskin puts it, D-Wave might be a few-trick device, but “if the trick is useful enough, and the performance-to-price ratio is good, who cares?”

Aside from avoiding error correction, quantum annealing (QA) allegedly has other advantages. Daniel Lidar, scientific director of the University of Southern California–Lockheed Martin Quantum Computing Center in Los Angeles, which uses D-Wave’s latest commercial machine D-Wave 2, explains that it relaxes the need for qubits to be switched fast, compared to the logic-gate approach. That in turn means that less heat is generated, and so it’s not such a struggle to keep the circuits cool when they contain many qubits. Mark Johnson, D-Wave’s chief scientific officer, sees various other practical benefits of the approach too. “We believe a quantum annealing processor can be built at a useful scale using existing technology, whereas one based on the gate model of quantum computing cannot”, he says. There are many reasons for this, ranging from greater resistance against decoherence to less taxing demands on the materials, on control of the environment, and on interfacing with users.

However, there are arguments about whether QA actually solves optimization tasks faster than classical algorithms. “No good reason has ever been given”, says John Smolin of IBM’s T. J. Watson Research Center in Yorktown Heights, New York, where much of the pioneering theory of quantum computation was developed in the 1980s.

Put it to the test

Perhaps the best way to resolve such questions is to put D-Wave’s machines to the test. McGeoch, with Cong Wang at Simon Fraser University in Burnaby, has pitted them against various classical algorithms for solving NP-hard optimization problems. They primarily used a D-Wave circuit called Vesuvius-5, with 439 working qubits, and found that it could find answers in times that were at least as good as, and in some cases up to 3,600 times faster than, the classical approaches. The speed-up got better as the number of elements in the problem (the number of destinations for the salesman, say) increased, up to the maximum of 439.

But not everyone is persuaded. Not only is D-Wave playing to its strengths here, but the speed-up is sometimes modest at best, and there’s no guarantee that faster classical algorithms don’t exist. Smolin still doubts that D-Wave’s devices “have solved any problem faster than a fair comparison with conventional computers.” True, he says, it’s harsh to compare D-Wave’s brand new machine with those coming from a 50-year, trillion-dollar industry. But that after all is the whole point. “History has shown that silicon-based computers always catch up in the end”, he says. “Currently, D-Wave is not actually faster at its own native problem than classical simulated annealing – even a relatively naive program running on standard hardware, written by me, more or less keeps up. If I spent $10 million on custom hardware, I expect I could beat the running time achieved by D-Wave by a very large amount.” And the real question is whether D-Wave can maintain an advantage as the problems it tackles are scaled up. “There is no evidence their larger machines are scaling well”, Smolin says.

Besides, Lidar notes, it remains a big challenge to express computational problems in a way that D-Wave can handle. “Most optimization problems such as protein folding involve a large amount of preprocessing before they can be mapped to the current D-Wave hardware”, he says. “The pre-processing problem may itself be computationally hard.”

Quite aside from the putative speed advantages, how can we tell if D-Wave’s machines are using quantum rules? There’s some suggestive evidence of that. In 2011 Johnson and colleagues at D-Wave reported that an 8-qubit system on the 128-qubit chip of D-Wave 1 showed signs that it was conducting true quantum annealing, because the experimental results didn’t fit with what the qubits were predicted to do if they were behaving just like classical bits. Lidar and his colleagues subsequently conducted more exacting tests of D-Wave 1, finding that even 108 coupled qubits find their ground state in a way that doesn’t fit with the predictions of classical simulated annealing. Perhaps surprisingly, a ‘quantum’ signature of the behaviour remains even though the timescale needed to find the optimal solution was considerably longer than that over which thermal noise can scramble some of the quantum organization. “In this sense QA is more robust against noise than the gate model”, says Lidar.

But he stresses that “these types of experiments can only rule out certain classical models and provide a consistency check with other quantum models. There’s still the possibility that someone will invent another classical model that can explain the data, and such models will then have to be ruled out one at a time.” So none of these tests, he explains, is yet a “smoking gun” for quantum annealing.

Besides, says Zagoskin, the more qubits you have, the less feasible it becomes to simulate (using classical computers) the behaviour of so many coherent qubits to see how they should act. To anticipate how such a quantum computer should run, you need a quantum computer to do the calculations. “The theory lags badly behind the experimental progress, so that one can neither predict how a given device will perform, nor even quantify the extent of its ‘quantumness’”, he says.

Joseph Fitzsimons of the Center for Quantum Technologies of the National University of Singapore finds Lidar’s tests fairly persuasive, but adds that “this evidence is largely indirect and not yet conclusive.” Smolin is less sanguine. “My opinion is that the evidence is extremely weak”, he says. The whole question of what it means to “be quantum” is a deep and subtle one, he adds – it’s not just a matter of showing you have devices that work by quantum rules, but of showing that they give some real advantage over classical devices. “No one is denying that the individual superconducting loops in the D-Wave machine are superconducting, and it is well accepted that superconductivity is a manifestation of a quantum effect.” But quantum rules also govern the way transistors in conventional computers function, “and one doesn’t call those quantum computers.”

How would you know?

To assess the performance of a quantum computer, one needs to verify that the solutions it finds are correct. Sometimes that’s straightforward, as for example with factorization of large numbers – the basis of most current encryption protocols, and one of the prime targets for quantum computation. But for other problems, such as computer simulation of protein folding, it’s not so easy to see if the answer is correct. “This raises the troubling prospect that in order to accept results of certain quantum computations we may need to implicitly trust that the device is operating correctly”, says Fitzsimons.
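Factorization illustrates that asymmetry neatly: finding the factors of a large number is believed to be classically hard, but checking a claimed answer takes a single multiplication. A minimal sketch (the numbers are illustrative, not a real cryptographic key):

```python
# Finding the prime factors of a large n is believed to be hard for
# classical computers; verifying a claimed factorization is trivial.
n = 136117223861                    # = 104729 * 1299709, a toy semiprime
claimed_p, claimed_q = 104729, 1299709
print(claimed_p * claimed_q == n)   # True: one multiplication suffices
```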

One alternative, he says, is to use so-called interactive proofs, where the verifier forms a judgement about correctness on the basis of a small number of randomly chosen questions about how good the solution is. Fitzsimons and his collaborators recently demonstrated such an interactive proof of quantum effects in a real physical “quantum computer” comprising just four light-based qubits. But he says that these methods aren’t applicable to D-Wave: “Unfortunately, certain technological limitations imposed by the design of D-Wave’s devices prevent direct implementation of any of the known techniques for interactive verification.”

For NP-hard problems, this question of verification goes to the heart of one of the most difficult unresolved problems in mathematics: there is as yet no rigorous proof that such problems can’t be solved faster by classical computers – perhaps we just haven’t found the right algorithm. In that case, showing that quantum computers crack these problems faster doesn’t prove that they use (or depend on) quantum rules to do it.

Part of this problem also comes down to the lack of any agreement on how quantum computers might achieve speed-up in the first place. While early proposals leant on the notion that quantum computers would be carrying out many computations in parallel, thanks to the ability of qubits to be in more than one state at once, this idea is now seen as too simplistic. In fact, there seems to be no unique explanation for what might make quantum computation faster. “Having a physical quantum computer of interesting size to experiment on has not produced answers to any of these open theoretical questions”, says McGeoch. “Instead experiments have only served to focus and specialize our questions – now they are more numerous and harder to answer. I suppose that's progress, but sometimes it feels like we’re moving backwards in our understanding of what it all means.”

None of this seems about to derail the commercial success of D-Wave. “We plan to develop processors with more qubits”, says Johnson. “Of course there are more dimensions to processor performance, such as the choice of connecting qubits, or the time required to set up a problem or to read out a solution. We’re working to improve processor performance along these lines as well.”

There’s certainly no problem of sheer size. The superconducting integrated-circuit chip at the core of D-Wave’s devices is “somewhat smaller than my thumbnail”, says Johnson. Besides, the processor itself takes up a small fraction of this chip, and has a feature size “equivalent to what the semiconductor industry had achieved in the very late 1990s”. “Our expectation is that we will not run out of room on our chip for the foreseeable future”, he says. And D-Wave’s bulky cooling system is capable of keeping a lot more than 512 qubits cold. “D-Wave's technology seems highly scalable in terms of the number of qubits they place and connect on a chip”, says Lidar. “Their current fridge can support up to 10,000 qubits.” So those black boxes look set to go on getting more powerful – even if it is going to get even harder to figure out exactly what is going on inside them.
