How Does Quantum Machine Learning Work?

Quantum machine learning works by using quantum bits, or ‘qubits’, to perform computations on input data using quantum circuits.

The output of a quantum circuit is a probability distribution over possible results for a given input, and the most probable output is identified as the ‘correct’ solution used to train the model.

Because each run of a quantum circuit samples from a large space of potential solutions, quantum circuits can often improve the performance of a model faster than classical machine learning methods.

Quantum machine learning computation examples

Unlike a classical bit, which can only be ‘0’ or ‘1’, a qubit can be 0, 1, 0+1, or 0-1. Computing with multiple qubits simultaneously can also generate a superposition of states like |00...00> and |11...11>, which has no classical analog.

Superposition is a quantum mechanical phenomenon that allows a qubit to exist in a linear combination of the values 0 and 1. Comparatively, a classical bit can only take one of those two values at a time.

For instance, a pair of classical bits can only represent one of four possible value combinations at a time: 00, 01, 10, or 11. By leveraging superposition, a pair of qubits can occupy more complex states, like 01+10.
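As a minimal NumPy sketch (an illustration, not code from any quantum SDK), these states can be written as ordinary vectors: each qubit is a 2-dimensional vector, pairs of qubits combine via the tensor product, and the 01+10 state appears as a superposition with no classical counterpart.

```python
import numpy as np

# Basis states for a single qubit as 2-dimensional vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# A single qubit in equal superposition: (|0> + |1>) / sqrt(2).
plus = (ket0 + ket1) / np.sqrt(2)

# Two-qubit basis states via the tensor (Kronecker) product.
ket01 = np.kron(ket0, ket1)   # |01>
ket10 = np.kron(ket1, ket0)   # |10>

# The superposition (|01> + |10>) / sqrt(2): no classical pair of
# bits can hold "01" and "10" at the same time.
psi = (ket01 + ket10) / np.sqrt(2)

print(psi)  # amplitudes over the basis |00>, |01>, |10>, |11>
```

Note that `psi` has nonzero amplitude on both the |01> and |10> basis states simultaneously, which is exactly the 01+10 state described above.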

The exponentially large space of quantum states means a solution prepared by a quantum circuit has the potential to be very different from, and in some cases better than, one produced by a classical model.

Quantum circuits in quantum machine learning

Quantum circuits are computations run on quantum data using a specific sequence of quantum gates. Quantum circuits can also incorporate data from classical computation into their computation.

Quantum circuit measurements are probabilistic in nature. For example, if you measure a quantum state that looks like (|0> + |1>) in a particular basis, you would get |0> or |1>, each with equal probability. Similarly, for larger qubit states, measuring in a particular basis gives different outcomes with different probabilities, much like drawing samples from a probability distribution.
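This sampling behavior is easy to mimic classically for small states. The sketch below (an illustration with NumPy, not from the article) applies the Born rule, squaring the amplitudes of the (|0> + |1>) state to get outcome probabilities, then draws repeated "shots" from that distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Normalized state (|0> + |1>) / sqrt(2).
state = np.array([1.0, 1.0]) / np.sqrt(2)

# Measurement probabilities are the squared amplitudes (Born rule).
probs = np.abs(state) ** 2   # [0.5, 0.5]

# Repeated measurement = sampling from this probability distribution.
shots = rng.choice([0, 1], size=1000, p=probs)
print(np.bincount(shots))  # roughly 500 zeros and 500 ones
```

Each individual shot returns 0 or 1; only the aggregate counts reveal the underlying distribution, which is why quantum algorithms typically run a circuit many times.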

Modeling distributions with quantum circuits

Quantum machine learning works by training quantum circuits and optimizing them to arrive at the best solution (i.e., the solution with the greatest probability of occurring). The model’s insights can then be used to answer a question or solve a problem.

Quantum algorithms can be implemented using parameterized circuits that can be trained to learn specific tasks. A circuit is prepared from a set of differentiable gates that depend on a set of variational parameters, and those parameters are adjusted to perform the desired task with as few gates as possible.

In the example above, you can see how the amplitudes of the quantum wave function define a probability distribution over a set of N binary random variables. Each bit in the basis state is a different random variable, and the squared amplitude of each basis state gives the probability of that sample appearing in the data set.
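To make the correspondence concrete, here is a small NumPy illustration (an assumption-free restatement of the idea, not code from the article): a 2^N-dimensional state vector for N = 3 qubits, whose squared amplitudes form a valid probability distribution over all 3-bit strings.

```python
import numpy as np

N = 3
rng = np.random.default_rng(1)

# A random, normalized 2^N-dimensional quantum state.
amps = rng.normal(size=2**N)
amps /= np.linalg.norm(amps)

# Squared amplitudes assign a probability to each N-bit basis state.
probs = amps ** 2
for idx, p in enumerate(probs):
    print(f"|{idx:0{N}b}>  p = {p:.3f}")

# The probabilities sum to 1: a valid distribution over N binary variables.
print(probs.sum())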

To fit such a probability distribution, you need to tune the parameters of your model to capture the different correlations between the random variables.

With N parameters you can essentially set the average value of each individual variable. To fit the pairwise correlations between the variables in your probability distribution you need roughly N^2 additional parameters, and fitting the three-body correlations between the different variables requires roughly N^3 more. If you fit all N-body terms, you need on the order of 2^N parameters.
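The counting argument above can be checked directly. The exact number of k-body correlation terms among N variables is the binomial coefficient C(N, k), which scales like N^k for small k, and summing over all orders recovers the 2^N total (a worked arithmetic sketch, not from the article):

```python
from math import comb

N = 10

# Number of parameters needed for correlations of each order k:
# one per subset of k variables, i.e. C(N, k) ~ N^k for small k.
for k in range(1, N + 1):
    print(f"order {k}: {comb(N, k)} parameters")

# Summing every order: sum_k C(N, k) = 2^N - 1 parameters in total.
total = sum(comb(N, k) for k in range(1, N + 1))
print(total, 2**N - 1)  # 1023 1023
```

For N = 10 this is already over a thousand parameters; at N = 50 the full count exceeds 10^15, which is the exponential blow-up the quantum state sidesteps.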

But a quantum state is a 2^N-dimensional vector that encodes all of these complex correlations at once. In this way, N qubits can represent probability distributions that are exponentially hard to reproduce with classical computers.

It is believed quantum computers can be trained to capture correlations, and therefore classify data, more efficiently than classical computers.

Quantum machine learning algorithm examples

In general, quantum machine learning works via two main categories of algorithms: fault-tolerant and near-term algorithms.

Fault-tolerant quantum algorithms 

Fault-tolerant algorithms make use of the “HHL” subroutine to offer a concrete exponential speedup over classical algorithms, based on its ability to perform certain linear algebra quickly.

They are built around a quantum algorithm for solving systems of linear equations. In the example below, you have a 2^N × 2^N matrix A and a 2^N-dimensional vector b, and you want to find the vector x that solves the equation Ax = b.

You can map the vector b and the matrix A to a quantum state, and the HHL algorithm can then invert A and give you a quantum state that solves the equations. This can be done in a time of order N rather than order 2^N, giving a concrete exponential speedup.
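For reference, here is the classical version of the problem HHL targets, sketched in NumPy (this is the baseline computation, not the quantum algorithm itself): solving Ax = b for a 2^N × 2^N system, where the classical cost grows polynomially in the dimension 2^N, i.e. exponentially in N.

```python
import numpy as np

N = 4                 # number of qubits; the system has dimension 2^N
dim = 2**N

rng = np.random.default_rng(2)
A = rng.normal(size=(dim, dim))   # 2^N x 2^N matrix
b = rng.normal(size=dim)          # 2^N-dimensional vector

# Classical solve: the cost scales polynomially in dim = 2^N,
# i.e. exponentially in N -- the scaling HHL aims to beat.
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))  # True
```

Even at N = 30 qubits, the classical matrix would have over 10^18 entries, which is why an algorithm operating directly on the quantum state is attractive.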

These matrix-inversion and linear-equation-solving methods appear in several classical machine learning problems, such as support vector machine classification and Gaussian process regression. In the quantum setting, you can therefore apply these algorithms to solve problems with exponentially large data sets faster than classical algorithms.

However, it’s important to know that these algorithms require very deep quantum circuits, and quantum hardware is not yet at the point where these methods can be applied in practice.

Near-term quantum algorithms

Near-term quantum algorithms can be implemented using parameterized circuits that can be trained to learn specific tasks. In other words, we prepare a circuit from a set of differentiable gates that depend on a set of variational parameters, and we adjust those parameters to perform the desired task with as few gates as possible.
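A minimal sketch of this variational loop (simulated with NumPy, assuming a single-qubit toy task rather than any specific published algorithm): one differentiable RY rotation gate with a single parameter theta, adjusted by gradient ascent until the circuit outputs |1> with the highest possible probability.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation gate as a 2x2 matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def prob_one(theta):
    """Probability of measuring |1> after applying RY(theta) to |0>."""
    state = ry(theta) @ np.array([1.0, 0.0])
    return state[1] ** 2   # equals sin^2(theta/2)

# Train the single variational parameter by gradient ascent so the
# circuit outputs |1> with the highest probability (optimum: theta = pi).
theta, lr = 0.5, 0.4
for _ in range(100):
    grad = 0.5 * np.sin(theta)    # d/dtheta of sin^2(theta/2)
    theta += lr * grad

print(round(prob_one(theta), 4))  # 1.0
```

Real variational algorithms follow the same pattern at scale: many parameterized gates, a measured cost function, and a classical optimizer updating the parameters between circuit runs.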

These types of quantum algorithms are designed to run on near-term devices.

Instead of seeking an exponential speedup by storing big data in a dense quantum state, these algorithms map smaller datasets into the exponentially large feature space given by the quantum Hilbert space, where the quantum computer can perform linear algebra on the resulting vectors.
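The feature-map idea can be sketched classically for small inputs (an illustrative toy encoding, not a method named in the article): each feature rotates one qubit, the tensor product builds a 2^n-dimensional state, and the overlap between two encoded states acts as a similarity measure, much like a kernel function.

```python
import numpy as np

def feature_map(x):
    """Encode a classical vector into a 2^n-dimensional state by
    rotating one qubit per feature (a simple product-state encoding)."""
    state = np.array([1.0])
    for xi in x:
        qubit = np.array([np.cos(xi / 2), np.sin(xi / 2)])
        state = np.kron(state, qubit)
    return state

def kernel(x, y):
    """Similarity of two inputs as the squared overlap of their states."""
    return np.dot(feature_map(x), feature_map(y)) ** 2

a = np.array([0.1, 1.2, 0.7])
b = np.array([0.2, 1.0, 0.9])
print(round(kernel(a, a), 6))  # 1.0 -- identical inputs overlap fully
print(kernel(a, b) < 1.0)      # True -- different inputs overlap less
```

This toy encoding is easy to simulate because it is a product state; the interest in quantum feature maps comes from entangling encodings whose overlaps are believed hard to compute classically.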

Quantum Machine Learning Use Cases

Quantum machine learning has the potential to produce better, more accurate machine learning models, especially when used to analyze complex data with many interacting variables.

This is one of the reasons quantum computing has great promise for commercialization across a wide range of use cases. IonQ is currently exploring better ways to load cargo or reduce fuel consumption with Airbus, researching optimized lithium compounds in batteries with the Hyundai Motor Company, and more. This is in addition to our availability on the Azure Quantum platform and our hybrid quantum system partnership with Dell Technologies.