The AI–quantum computing mash-up: will it revolutionize science?

Call it the Avengers of futuristic computing. Put together two of the buzziest terms in technology, machine learning and quantum computers, and you get quantum machine learning. Like the Avengers comic books and films, which bring together an all-star cast of superheroes to build a dream team, the result is likely to attract a lot of attention. But in technology, as in fiction, it is important to come up with a good plot.

If quantum computers can ever be built at large-enough scales, they promise to solve certain problems much more efficiently than can ordinary digital electronics, by harnessing the unique properties of the subatomic world. For years, researchers have wondered whether those problems might include machine learning, a form of artificial intelligence (AI) in which computers are used to spot patterns in data and learn rules that can be used to make inferences in unfamiliar situations.
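As a concrete, purely illustrative example of that loop, the NumPy sketch below fits a simple rule, a straight line, to noisy data and then uses it to make an inference at a point the model has never seen; every number in it is invented.

```python
# Toy machine-learning loop: spot a pattern in data, learn a rule, infer.
# Purely illustrative; the data and numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=50)  # noisy linear pattern

slope, intercept = np.polyfit(x, y, deg=1)          # "learn" the rule
print(slope * 12.0 + intercept)                     # infer at unseen x = 12
```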

Now, with the release of the high-profile AI system ChatGPT, which relies on machine learning to power its eerily human-like conversations by inferring relationships between words in text, and with the rapid growth in the size and power of quantum computers, both technologies are making big strides forwards. Will anything useful come of combining the two?

Many technology companies, including established corporations such as Google and IBM, as well as start-up firms such as Rigetti in Berkeley, California, and IonQ in College Park, Maryland, are investigating the potential of quantum machine learning. There is strong interest from academic scientists, too.

CERN, the European particle-physics laboratory outside Geneva, Switzerland, already uses machine learning to look for signs that certain subatomic particles have been produced in the data generated by the Large Hadron Collider. Scientists there are among the academics who are experimenting with quantum machine learning.

"Our idea is to use quantum computers to speed up or improve classical machine-learning models," says physicist Sofia Vallecorsa, who leads a quantum-computing and machine-learning research group at CERN.

The big unanswered question is whether there are scenarios in which quantum machine learning offers an advantage over the classical variety. Theory shows that for specialized computing tasks, such as simulating molecules or finding the prime factors of large whole numbers, quantum computers will speed up calculations that could otherwise take longer than the age of the Universe. But researchers still lack sufficient evidence that this is the case for machine learning. Others say that quantum machine learning could spot patterns that classical computers miss, even if it isn't faster.

Researchers' attitudes towards quantum machine learning shift between two extremes, says Maria Schuld, a physicist based in Durban, South Africa. Interest in the approach is high, but researchers seem increasingly resigned to the lack of prospects for short-term applications, says Schuld, who works for the quantum-computing firm Xanadu, headquartered in Toronto, Canada.

Some researchers are beginning to shift their focus to the idea of applying quantum machine-learning algorithms to phenomena that are inherently quantum. "Of all the proposed applications of quantum machine learning, this is the area where there's been a pretty clear quantum advantage," says physicist Aram Harrow at the Massachusetts Institute of Technology (MIT) in Cambridge.

Over the past 20 years, quantum-computing researchers have developed a plethora of quantum algorithms that could, in theory, make machine learning more efficient. In a seminal result in 2008, Harrow, together with MIT physicists Seth Lloyd and Avinatan Hassidim (now at Bar-Ilan University in Ramat Gan, Israel), invented a quantum algorithm [1] that is exponentially faster than a classical computer at solving large sets of linear equations, one of the challenges that lie at the heart of machine learning.
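To see why linear equations matter here, consider ridge regression, a workhorse machine-learning method whose training step reduces to solving a linear system A w = b. The NumPy sketch below solves such a system classically; it is this solve step, not this particular code, that the Harrow-Hassidim-Lloyd algorithm promises to accelerate, and only under conditions such as sparsity and good conditioning of A. All data are invented.

```python
# Classical sketch of the task HHL targets: solving A w = b, here arising
# as the normal equations of ridge regression. Illustrative data only.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))                       # feature matrix
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + 0.1 * rng.normal(size=100)         # noisy targets

lam = 0.1                                           # ridge penalty
A = X.T @ X + lam * np.eye(5)                       # system matrix
b = X.T @ y

w = np.linalg.solve(A, b)                           # classical dense solve;
print(np.round(w, 2))                               # HHL's exponential speed-up
                                                    # applies only with caveats
```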

But in some cases, the promise of quantum algorithms has not panned out. One high-profile example occurred in 2018, when computer scientist Ewin Tang found a way to beat a quantum machine-learning algorithm [2] devised in 2016. The quantum algorithm was designed to provide the type of suggestion that Internet shopping companies and services such as Netflix give to customers on the basis of their previous choices, and it was exponentially faster at making such recommendations than any known classical algorithm.

Tang, who at the time was an 18-year-old undergraduate student at the University of Texas at Austin (UT), wrote an algorithm that was almost as fast but could run on an ordinary computer. Quantum recommendation was a rare example of an algorithm that seemed to provide a significant speed boost on a practical problem, so her work "put the goal of an exponential quantum speed-up for a practical machine-learning problem even further out of reach than it was before", says UT quantum-computing researcher Scott Aaronson, who was Tang's adviser. Tang, who is now at the University of California, Berkeley, says she continues to be "pretty sceptical" of any claims of a significant quantum speed-up in machine learning.
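For context, the recommendation task both algorithms address can be framed as low-rank matrix approximation: assume users' tastes are governed by a few hidden factors, reconstruct the sparse ratings matrix at low rank, and suggest the items with the highest reconstructed scores. The sketch below shows that framing with a plain truncated SVD; it is not Tang's sampling-based algorithm, and the toy matrix is invented.

```python
# Low-rank recommendation framing (not Tang's algorithm): reconstruct a
# user-by-item ratings matrix at rank k, then rank items for one user.
import numpy as np

rng = np.random.default_rng(2)
ratings = rng.integers(0, 6, size=(20, 10)).astype(float)  # toy ratings

U, s, Vt = np.linalg.svd(ratings, full_matrices=False)
k = 3                                                      # assumed hidden rank
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]             # rank-k reconstruction

user = 0
top_items = np.argsort(approx[user])[::-1][:3]             # top-3 suggestions
print(top_items)
```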

A potentially even bigger problem is that classical data and quantum computation don't always mix well. Roughly speaking, a typical quantum-computing application has three main steps. First, the quantum computer is initialized, which means that its individual memory units, called quantum bits or qubits, are placed in a collective entangled quantum state. Next, the computer performs a sequence of operations, the quantum analogue of the logical operations on classical bits. In the third step, the computer performs a read-out, for example by measuring the state of a single qubit that carries information about the result of the quantum operations. This could be whether a given electron inside the machine is spinning clockwise or anticlockwise, say.
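A minimal way to see those three steps is to simulate them classically. The NumPy sketch below initializes two qubits, entangles them with standard Hadamard and CNOT operations, and reads out one qubit; the specific gate choices are illustrative assumptions, not taken from the article.

```python
# The three steps of a quantum computation, simulated with NumPy.
import numpy as np

rng = np.random.default_rng(3)

# Step 1: initialization -- two qubits in the state |00>.
state = np.zeros(4, dtype=complex)
state[0] = 1.0

# Step 2: operations -- Hadamard on qubit 0, then a CNOT, producing the
# entangled Bell state (|00> + |11>) / sqrt(2).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
state = CNOT @ np.kron(H, I) @ state

# Step 3: read-out -- measure qubit 0; the outcome is probabilistic.
p0 = abs(state[0])**2 + abs(state[1])**2            # probability of reading 0
print(rng.choice([0, 1], p=[p0, 1 - p0]))
```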

Algorithms such as the one by Harrow, Hassidim and Lloyd promise to speed up the second step, the quantum operations. But in many applications, the first and third steps could be extremely slow and negate those gains [3]. The initialization step requires loading classical data on to the quantum computer and translating them into a quantum state, often an inefficient process. And because quantum physics is inherently probabilistic, the read-out often has an element of randomness, in which case the computer has to repeat all three stages multiple times and average the results to get a final answer.

Once the "quantumized" data have been processed into a final quantum state, it could take a long time to get an answer out, too, according to Nathan Wiebe, a quantum-computing researcher at the University of Washington in Seattle. "We only get to suck that information out of the thinnest of straws," Wiebe said at a quantum machine-learning workshop in October.
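Both bottlenecks can be made concrete in a few lines: "loading" a classical vector amounts to normalizing it into state amplitudes (preparing an arbitrary such state generally requires a circuit that grows with the size of the data), and reading values back out means sampling random measurement outcomes over and over and averaging them. A hedged NumPy sketch, with invented data:

```python
# Why steps 1 and 3 can be slow: amplitude encoding and shot-averaged read-out.
import numpy as np

rng = np.random.default_rng(4)
data = np.array([3.0, 1.0, 2.0, 1.0, 0.5, 2.5, 1.5, 0.5])  # classical data

# Step 1 (loading): encode 8 numbers into the amplitudes of 3 qubits.
amplitudes = data / np.linalg.norm(data)

# Step 3 (read-out): each measurement returns one random basis index;
# the encoded values are recovered only as averages over many shots.
probs = amplitudes**2
shots = 10_000
samples = rng.choice(len(probs), size=shots, p=probs)
estimate = np.bincount(samples, minlength=len(probs)) / shots
print(np.round(estimate, 3))    # approaches probs only as shots grow
print(np.round(probs, 3))
```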

When you ask almost any researcher what applications quantum computers will be good at, the answer is "probably not classical data", says Schuld. "So far, there is no real reason to believe that classical data needs quantum effects."

Vallecorsa and others say that speed is not the only metric by which a quantum algorithm should be judged. There are also hints that a quantum AI system powered by machine learning could learn to recognize patterns in the data that its classical counterparts would miss. That might be because quantum entanglement establishes correlations among quantum bits, and therefore among data points, says Karl Jansen, a physicist at the DESY particle-physics lab in Zeuthen, Germany. "The hope is that we can detect correlations in the data that would be very hard to detect with classical algorithms," he says.
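One concrete form this idea takes in the research literature is the "quantum kernel": map each data point to a quantum state, entangle the qubits, and use the overlap between two such states as a similarity measure for a classical learner such as a support vector machine. The NumPy sketch below simulates a deliberately simple, assumed feature map (angle encoding of two features plus one entangling CNOT); it illustrates the idea and is not a method described in the article.

```python
# Quantum-kernel sketch: similarity of two data points as the squared
# overlap of their (entangled) encoded quantum states. Feature map assumed.
import numpy as np

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def feature_state(x):
    """Angle-encode two features, one qubit each, then entangle with a CNOT."""
    rot = lambda t: np.array([np.cos(t / 2), -1j * np.sin(t / 2)])
    return CNOT @ np.kron(rot(x[0]), rot(x[1]))

def quantum_kernel(x1, x2):
    """Kernel value in [0, 1]; could feed a classical support vector machine."""
    return abs(np.vdot(feature_state(x1), feature_state(x2)))**2

print(quantum_kernel(np.array([0.1, 1.2]), np.array([0.3, 0.9])))
```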

Quantum machine learning could help to make sense of particle collisions at CERN, the European particle-physics laboratory near Geneva, Switzerland. Credit: CERN/CMS Collaboration; Thomas McCauley, Lucas Taylor (CC BY 4.0)

But Aaronson disagrees. Quantum computers follow well-known laws of physics, and therefore their workings and the outcome of a quantum algorithm are entirely predictable by an ordinary computer, given enough time. Thus, the only question of interest, he says, is whether the quantum computer is faster than a perfect classical simulation of it.

Another possibility is to sidestep the hurdle of translating classical data altogether, by using quantum machine-learning algorithms on data that are already quantum.

Throughout the history of quantum physics, a measurement of a quantum phenomenon has been defined as taking a numerical reading using an instrument that lives in the macroscopic, classical world. But a nascent technique known as quantum sensing allows the quantum properties of a system to be measured using purely quantum instrumentation. Load those quantum states on to a quantum computer's qubits directly, and quantum machine learning could then be used to spot patterns without any interface with a classical system.

When it comes to machine learning, that could offer big advantages over systems that collect quantum measurements as classical data points, says Hsin-Yuan Huang, a physicist at MIT and a researcher at Google. "Our world inherently is quantum-mechanical. If you want to have a quantum machine that can learn, it could be much more powerful," he says.

Huang and his collaborators have run a proof-of-principle experiment on one of Google's Sycamore quantum computers [4]. They devoted some of its qubits to simulating the behaviour of a kind of abstract material. Another section of the processor then took information from those qubits and analysed it using quantum machine learning. The researchers found the technique to be exponentially faster than classical measurement and data analysis.

Doing the collection and analysis of data fully in the quantum world could enable physicists to tackle questions that classical measurements can only answer indirectly, says Huang. One such question is whether a certain material is in a particular quantum state that makes it a superconductor able to conduct electricity with practically zero resistance. Classical experiments require physicists to prove superconductivity indirectly, for example by testing how the material responds to magnetic fields.

Particle physicists are also looking into using quantum sensing to handle data produced by future particle colliders, such as at LUXE, a DESY experiment that will smash electrons and photons together, says Jansen, although the idea is still at least a decade away from being realized, he adds. Astronomical observatories far apart from each other might also use quantum sensors to collect data and transmit them, by means of a future quantum internet, to a central lab for processing on a quantum computer. The hope is that this could enable images to be captured with unparalleled sharpness.

If such quantum-sensing applications prove successful, quantum machine learning could then have a role in combining the measurements from these experiments and analysing the resulting quantum data.

Ultimately, whether quantum computers will offer advantages to machine learning will be decided by experimentation, rather than by mathematical proofs of their superiority or lack thereof. "We can't expect everything to be proved in the way we do in theoretical computer science," says Harrow.

"I certainly think quantum machine learning is still worth studying," says Aaronson, whether or not there ends up being a boost in efficiency. Schuld agrees. "We need to do our research without the confinement of proving a speed-up, at least for a while."
