
Quantum Natural Language Processing With IonQ Hardware

Last updated: January 18, 2024
IonQ Staff

At the QNLP conference | Photo by Destiny Chen

IonQ recently participated in the second Quantum Natural Language Processing (QNLP) conference in Oxford. It was a terrific opportunity to hear from other researchers in this growing field, and to contribute by describing some of the language processing systems we’ve developed at IonQ over the past few months.

At the previous QNLP conference in 2019, the process of running programs on real quantum computers was barely getting started. Back then, even the most established quantum natural language processing (NLP) research initiative had yet to announce a successful implementation on quantum hardware, a feat that was finally accomplished in 2020.

Just a couple of years later, so much has happened! New results from quantum computers in AI are being published almost every month, with applications including probabilistic reasoning, financial modeling, and image classification. It’s a feverishly exciting time: quantum algorithms have been developed theoretically since the 1980s, but mainly in equations and simulations. The transition to real quantum implementations is happening now, and happening quickly!

Natural language processing is one of the main application areas. Quantum mechanics and NLP use the same mathematics for key operations, a pattern that was recognized by the early 2000s and has become even more prevalent as neural networks have carried the use of vectors and tensors throughout AI. At IonQ we have been working systematically to develop prototypes of some of these language components. Counting the wait for hardware to become available, it took the field over a decade to build the first QNLP application. Using our access to IonQ Harmony, we’ve built the second, third, and fourth in just a few months!

We’ve chosen examples that are deliberately simple and reusable – rather than focus on a particular NLP technique or language theory, we’ve looked for examples that can be useful in many AI topics. All of the examples presented at the conference involve composition of some sort – the ability to put different parts together and perform computational tasks using the resulting product.

The first application is text classification: the challenge of deciding an appropriate topic label for a given sentence or document. We started with word-based classification, counting word statistics into a “term-document matrix” during training, and then adding these weights together for the words in each new text to choose the best topic label. The algorithm is deliberately basic, going back at least to the vector space model for information retrieval, first developed in the 1960s. The key point demonstrated here is that the common mathematical language of vectors lets us produce quantum implementations of standard AI techniques. Even this simple challenge required a method for building quantum circuits that perform state addition, solved in this case partly with the help of quaternions.
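To make the word-based baseline concrete, here is a minimal classical sketch of term-document classification. The toy corpus, topic labels, and `classify` helper are illustrative assumptions, not the circuits presented at the conference:

```python
import numpy as np
from collections import Counter

# Toy training corpus of (text, topic) pairs -- illustrative only.
training = [
    ("ion trap qubit gate", "quantum"),
    ("qubit entanglement circuit", "quantum"),
    ("striker scored a goal", "sports"),
    ("the match ended in a draw", "sports"),
]

# Build the vocabulary and a term-document (here, term-topic) count matrix.
vocab = sorted({w for text, _ in training for w in text.split()})
topics = sorted({label for _, label in training})
counts = np.zeros((len(vocab), len(topics)))
for text, label in training:
    for word, n in Counter(text.split()).items():
        counts[vocab.index(word), topics.index(label)] += n

def classify(text):
    """Add the per-topic weights of each known word; pick the best topic."""
    scores = np.zeros(len(topics))
    for word in text.split():
        if word in vocab:
            scores += counts[vocab.index(word)]
    return topics[int(np.argmax(scores))]

print(classify("entanglement in an ion trap"))  # -> "quantum"
```

The quantum version replaces the classical vector addition with circuits that prepare and add quantum states, which is where the state-addition method mentioned above comes in.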

So far this demonstrates that quantum computers can do real work, though it is work we could do already. Thanks to wave-like phenomena such as interference and entanglement, quantum computers can go further and express more sophisticated relationships. Our next experiment works with ordered pairs of words. Quantum probability models have shown promise in this area for information retrieval and concept modeling, and we show that the way a quantum circuit learns to “fit” a training distribution can be used to generate and propose new relationships.
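As a rough illustration of the fitting idea, the sketch below trains a small parameterized two-qubit circuit (simulated with NumPy) so that its measurement distribution approximately matches a toy distribution over ordered word pairs. The ansatz, target probabilities, and finite-difference optimizer are all assumptions for illustration, not the circuit or training method from our experiment:

```python
import numpy as np

# Toy target distribution over ordered word pairs -- illustrative numbers.
pairs = [("from", "Boston"), ("from", "Cambridge"),
         ("to", "Boston"), ("to", "Cambridge")]
target = np.array([0.4, 0.1, 0.1, 0.4])

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]])

def circuit_probs(params):
    """Ansatz: RY on each qubit, CNOT, then RY on each qubit again."""
    a, b, c, d = params
    state = np.array([1.0, 0.0, 0.0, 0.0])       # start in |00>
    state = np.kron(ry(a), ry(b)) @ state
    state = CNOT @ state
    state = np.kron(ry(c), ry(d)) @ state
    return np.abs(state) ** 2                     # measurement probabilities

def loss(params):
    return np.sum((circuit_probs(params) - target) ** 2)

# Fit the parameters by simple finite-difference gradient descent.
params = np.random.default_rng(0).uniform(0, np.pi, 4)
for _ in range(2000):
    grad = np.array([(loss(params + 0.01 * e) - loss(params - 0.01 * e)) / 0.02
                     for e in np.eye(4)])
    params -= 0.5 * grad

for pair, p in zip(pairs, circuit_probs(params)):
    print(pair, round(float(p), 3))
```

Once the circuit fits the training pairs, sampling from it proposes new pairs with related statistics, which is the “generate and propose new relationships” step in miniature.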

As well as generating new relationships, composition often involves leaving some options behind. For example, “Cambridge” may refer to several places, including those in England and Massachusetts. If someone is traveling “from Boston to Cambridge”, knowledge of geography suggests that in this case we’re talking about Massachusetts. In language, the interplay between words is a crucial help in choosing the appropriate meaning - for example, “Java” may refer to an island or a programming language: the phrase “visit Java” refers to the island, while “learn Java” probably refers to the programming language. This behavior can be modeled in linear algebra by representing nouns as vectors and other words, such as adjectives and verbs, as matrices that operate on these vectors. We demonstrate a simple quantum circuit that performs a similar “sense selection” operation on the “visit Java” example.
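Here is a minimal classical linear-algebra sketch of that sense-selection idea, with toy sense vectors and hand-chosen verb matrices (the weights are illustrative assumptions, not learned values):

```python
import numpy as np

# Two orthogonal "sense" vectors for the ambiguous noun "Java" -- toy model.
island   = np.array([1.0, 0.0])   # Java the island
language = np.array([0.0, 1.0])   # Java the programming language

# The ambiguous word starts as an equal superposition of its senses.
java = (island + language) / np.sqrt(2)

# Verbs act as matrices (weighted projectors) on noun vectors;
# the 0.9/0.1 weights are hand-picked for illustration.
visit = 0.9 * np.outer(island, island) + 0.1 * np.outer(language, language)
learn = 0.1 * np.outer(island, island) + 0.9 * np.outer(language, language)

def select_sense(verb, noun):
    """Apply the verb matrix and renormalize, like a measurement update."""
    v = verb @ noun
    return v / np.linalg.norm(v)

print(select_sense(visit, java))  # leans strongly toward the island sense
print(select_sense(learn, java))  # leans toward the programming-language sense
```

The quantum circuit version does the analogous thing with amplitudes: composing the verb with the noun suppresses the incompatible sense.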

The key message here is that real quantum computers are performing examples of language processing tasks. But they are still small. An older ten-qubit system could represent 2¹⁰ variables, which is a classical kilobyte, but with thirty or more qubits in the new IonQ Forte, we can in theory represent 2³⁰ variables, which is a gigabyte, and every ten additional qubits add roughly three decimal zeros to that figure. The systems above do not yet take advantage of this scale, and it is often not yet clear how best to do so.
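The back-of-the-envelope arithmetic: an n-qubit state has 2ⁿ amplitudes, and 2¹⁰ ≈ 10³, so every ten extra qubits multiply the count by roughly a thousand:

```python
# Number of amplitudes an n-qubit quantum state can represent: 2**n.
for n in (10, 20, 30, 40):
    print(f"{n} qubits -> 2**{n} = {2**n:,} amplitudes")
```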

There is a lot more work to do in this space and we at IonQ will continue to be a part of it. We hope to have even more to report at the next QNLP.
