# Quantum Glossary

Explore and learn common terminology in the quantum computing world

**2 Qubit Gate Equivalent (2QGE)** is the base unit of capacity, a normalized measure of gate operations as run by a given quantum system, defined as a single customer-submitted two-qubit gate or equivalent operation.

Equivalence is determined as a fraction (or multiple) of the pulse-level gate duration of the given gate as compared to a two-qubit gate on the same system.
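The normalization above can be sketched as a small calculation. This is an illustrative sketch, not IonQ's actual accounting code; the gate durations are hypothetical numbers chosen only to show the arithmetic.

```python
def gate_2qge(gate_duration_us: float, twoq_duration_us: float) -> float:
    """2QGE cost of one gate: its pulse-level duration as a fraction
    (or multiple) of the system's two-qubit gate duration."""
    return gate_duration_us / twoq_duration_us

def circuit_2qge(durations_us, twoq_duration_us):
    """Total 2QGE for a circuit: the sum of per-gate equivalents."""
    return sum(gate_2qge(d, twoq_duration_us) for d in durations_us)

# Hypothetical numbers: the two-qubit gate takes 200 us, a single-qubit
# gate takes 20 us (so it costs 0.1 2QGE). Two of each:
total = circuit_2qge([200, 200, 20, 20], twoq_duration_us=200)  # 2.2 2QGE
```

A two-qubit gate always costs exactly 1 2QGE on its own system; everything else is scaled relative to it.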

**Algorithmic benchmarking** is IonQ’s preferred benchmarking method. It uses algorithms or subroutines that correlate to real-world applications to assess quantum computer performance, much as LINPACK does for classical supercomputers.

It’s impossible to evaluate the computational power of a quantum system purely by its physical qubit count. Noise, connectivity limitations, and other sources of error limit the number of useful operations one can perform, and below a certain threshold, not all qubits can usefully participate in computation at the same time. We use the **Algorithmic Qubit** metric as a way to describe the number of “useful” qubits in a system.

**Barium** is a silvery alkaline-earth metal, atomic number 56. IonQ has recently started exploring barium as an alternative qubit species because its slightly more complex structure offers higher fundamental gate and readout fidelities when controlled correctly, and because it primarily interacts with light in the visible spectrum, allowing additional opportunities for us to use standard fiber optic technologies in parts of the system.

**Circuit depth** is the number of entangling gates performed in a given quantum circuit, i.e. a quantum circuit that uses thirty entangling gates would be said to have a circuit depth of thirty, regardless of how many qubits it uses to do so.

**Circuit width** is the number of entangled qubits in a single quantum circuit, i.e. a quantum circuit that entangles six qubits would be said to have a circuit width of six, regardless of how many gates it uses to do so.
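The two counting rules above can be made concrete with a toy circuit representation. This is a simplified, hypothetical encoding (a list of gates, each naming the qubits it touches), not any real SDK's format, and it uses the glossary's definitions of depth and width.

```python
# A circuit as a list of (name, qubits) gates -- a hypothetical encoding.
circuit = [
    ("h", (0,)),        # single-qubit gate: does not count toward depth
    ("cnot", (0, 1)),   # entangling gate
    ("cnot", (1, 2)),
    ("cnot", (3, 4)),
]

def circuit_depth(circuit):
    """Depth, per the definition above: number of entangling gates."""
    return sum(1 for _, qubits in circuit if len(qubits) > 1)

def circuit_width(circuit):
    """Width, per the definition above: number of distinct entangled qubits."""
    entangled = {q for _, qubits in circuit if len(qubits) > 1 for q in qubits}
    return len(entangled)

circuit_depth(circuit)  # 3 entangling gates
circuit_width(circuit)  # 5 qubits: 0, 1, 2, 3, 4
```

Note that the two numbers are independent: adding more CNOTs between qubits 0 and 1 would raise the depth but leave the width unchanged.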

A **classical computer** is what most people would just call “a computer.” We call it classical because its approach to storing and calculating information uses *classical* mechanics: information is stored as a 0 or a 1, and all operations are simple combinations of these basic building blocks. In most classical computers, a binary bit is represented by the presence or absence of electrical current in a semiconductor device called a transistor.

Sometimes described in technical specs as “T2 time,” **coherence time** is how long a qubit can maintain coherent phase; that is, how long it can successfully maintain the critical quantum qualities, like superposition and entanglement, necessary for computation. Without these, you could use the qubits like classical bits, but there wouldn’t be much utility in that. Ions have a major advantage over many other qubit technologies here, with coherence times measured in minutes, potentially thousands of times longer than other platforms.

The **complexity** of an algorithm or problem in computer science can be defined as the quantifiable amount of computing resources required to run it, usually described in terms of time and memory requirements. Certain types of problems require an exponentially greater amount of classical resources (either time, memory, or both) to solve as more variables are added, eventually becoming impossible at large scales. Some (but not all) of these classically intractable problems can be unlocked with the power of quantum computing.
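A quick way to feel exponential scaling is to look at the memory needed to simulate a quantum state classically: an n-qubit state vector has 2^n complex amplitudes. This is a minimal sketch, assuming 16 bytes per complex amplitude (the usual double-precision complex size).

```python
def statevector_bytes(n_qubits: int) -> int:
    """Memory to store an n-qubit state vector classically:
    2**n complex amplitudes at 16 bytes each."""
    return (2 ** n_qubits) * 16

statevector_bytes(30)  # about 17 GB -- feasible on a large machine
statevector_bytes(50)  # about 18 petabytes -- classically intractable
```

Every added qubit doubles the requirement, which is why brute-force classical simulation hits a wall in the mid-tens of qubits.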

Sometimes described as *topology*, **connectivity** describes which qubits can perform gates with which other qubits within a quantum computer. Trapped ions have the benefit of *all-to-all* connectivity, where each qubit can be directly entangled with any other qubit. Many other platforms are limited in their connectivity, which creates additional overhead and potentially introduces error.
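The overhead mentioned above can be illustrated by counting the SWAP operations needed to bring two distant qubits together on a limited-connectivity device. This is a simplified sketch (shortest path via BFS, one SWAP per intermediate hop); real compilers use more sophisticated routing, and the 5-qubit line device here is hypothetical.

```python
from collections import deque

def swap_overhead(coupling, a, b):
    """SWAPs needed to make qubits a and b adjacent, given the device's
    allowed two-qubit pairs: shortest path length minus one.
    Returns 0 for directly connected pairs, None if unreachable."""
    graph = {}
    for u, v in coupling:
        graph.setdefault(u, set()).add(v)
        graph.setdefault(v, set()).add(u)
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == b:
            return max(dist - 1, 0)
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

# Hypothetical 5-qubit line device: 0-1-2-3-4 (limited connectivity)
line = [(0, 1), (1, 2), (2, 3), (3, 4)]
swap_overhead(line, 0, 4)  # 3 SWAPs before the gate can run

# All-to-all connectivity (as in trapped ions): every pair is direct
all_to_all = [(i, j) for i in range(5) for j in range(i + 1, 5)]
swap_overhead(all_to_all, 0, 4)  # 0
```

Each extra SWAP is itself a noisy entangling gate, which is how limited connectivity translates into added error.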

There are many ways to make a quantum computer — trapped ions, neutral atoms, superconducting circuits, photonics, nitrogen vacancies in diamond, and more. In 2000, physicist David DiVincenzo proposed five conditions necessary for a device to be considered a quantum computer: it has to be **scalable** (can plausibly expand to tens or hundreds of qubits), you have to be able to **initialize** the qubits to the same state, perform a **universal** set of quantum gates (i.e. not just annealing), and allow for individual **measurement**. The qubits also have to have a **long coherence time**, long enough for the initialization, gates, and readout to actually be performed.

**Entanglement** is a property of quantum mechanics where two particles, even when physically separated, behave in ways conditionally dependent on each other. This phenomenon can be harnessed for certain types of quantum logic gates in quantum information science, and is critical to expressing a quantum computer’s full power.
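The conditional dependence can be seen in a Bell state, the simplest entangled state. A minimal stdlib-only sketch, representing the two-qubit state directly as four amplitudes rather than via any quantum SDK:

```python
import math

# Two-qubit state as 4 amplitudes in the order [|00>, |01>, |10>, |11>].
# The Bell state (|00> + |11>) / sqrt(2): the qubits always agree.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]

def outcome_probs(state):
    """Born rule: probability of each basis outcome is |amplitude|^2."""
    return [abs(a) ** 2 for a in state]

probs = outcome_probs(bell)
# probs == [0.5, 0.0, 0.0, 0.5]: each individual qubit looks random,
# yet the outcomes 01 and 10 never occur -- measuring one qubit
# immediately fixes what the other will read.
```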

Sometimes also called a *logical qubit*, an **error-corrected qubit** is a group of physical qubits that are logically combined, using techniques called *error correction* encoding, to act as one much higher-quality qubit for computational purposes.
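The idea of combining unreliable physical units into one reliable logical unit can be sketched with the classical 3-bit repetition code. This is a toy analogy only: real quantum error correction detects errors via stabilizer measurements without directly reading the data qubits, which would destroy the quantum state.

```python
# Toy repetition code: one logical bit stored in three physical bits.
def encode(bit):
    """Store a single logical bit redundantly in three physical bits."""
    return [bit, bit, bit]

def decode(bits):
    """Majority vote corrects any single bit-flip error."""
    return 1 if sum(bits) >= 2 else 0

codeword = encode(1)   # [1, 1, 1]
codeword[0] ^= 1       # a bit-flip error hits one physical bit
decode(codeword)       # still decodes to 1
```

The quantum versions are far more involved (they must also handle phase errors and avoid measuring the data), but the principle is the same: many noisy physical qubits, one higher-quality logical qubit.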

**EGT** is a proprietary IonQ ion trap technology that enables multi-core operation. It achieves this by allowing tighter ion confinement and reduced heating, which in turn allows more precise qubit control.

**Fault tolerance** refers to a system’s ability to accommodate errors in its operation without losing the information it is processing and/or storing. To achieve fault tolerance in quantum computing, we need three things: more qubits, higher-quality qubits, and error correction, ultimately allowing for much larger, longer, and more complex computation. This is considered the endgame for quantum computing, as a scalable, fault-tolerant quantum computer has the potential to unlock the ability to solve problems in physics, mathematics, computer science, and the physical sciences that are impossible to solve today.

**Gate fidelity** is a way to describe how much noise (or error) each operation introduces during a quantum algorithm. It is defined as 100% minus the error rate; i.e. a *fidelity* of 99% is the same as an *error rate* of 1%.
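Small per-gate errors compound quickly over a circuit. A minimal sketch, assuming independent, uncorrelated gate errors so the chance of an error-free run is just the product of the individual gate fidelities (a rough rule of thumb, not a rigorous error model):

```python
def circuit_success_estimate(gate_fidelity: float, n_gates: int) -> float:
    """Rough estimate of the chance a circuit runs with no gate errors,
    assuming each gate fails independently."""
    return gate_fidelity ** n_gates

circuit_success_estimate(0.99, 100)   # ~0.37: 99% fidelity over 100 gates
circuit_success_estimate(0.999, 100)  # ~0.90: one extra "nine" helps a lot
```

This is why a seemingly small fidelity improvement can dramatically extend the depth of circuits a machine can usefully run.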

**Gate speed** is how long it takes to perform a quantum gate. While raw gate speed could become a factor for time-to-solution in a fault-tolerant computer, the most important consideration for gate speed in NISQ systems is that it is fast enough for the computation to complete before the qubits lose *coherence*. That is, the total duration of all the gates in the algorithm needs to be shorter than the qubit *coherence time*.
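That constraint is just a comparison of two durations. A minimal sketch with hypothetical numbers (it ignores readout and other overheads, and real circuits can also run gates in parallel):

```python
def fits_in_coherence(gate_durations_us, t2_us):
    """True if the summed gate time stays within the qubit coherence
    window -- the condition described above, with everything in microseconds."""
    return sum(gate_durations_us) < t2_us

# Hypothetical: 500 gates at 100 us each, against a coherence time of
# one minute (6e7 us), in the spirit of trapped-ion T2 times.
fits_in_coherence([100] * 500, t2_us=6e7)  # True: 0.05 s of gates vs 60 s
```

The same 500 gates against a millisecond-scale coherence time would fail the check, even if each individual gate were faster.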

An *Ion Trap* or *Ion Trap Chip* is the heart of a trapped-ion quantum computer. It contains many microfabricated electrodes that together create a field that holds (“traps”) the ions in place, ready for computation. Imagine a maglev train, where the train is a microscopic line of trapped ions — the trap is the apparatus responsible for levitation. While ion traps might seem exotic, they can actually be produced with commercial fabrication technology.

At the end of a quantum computation, the answer is **measured**. The exponentially large computational space available during computation collapses down to a binary string, with each qubit going from a superposition state to a 1 or a 0. The state of the qubits just before measurement determines the probability of each possible outcome. Measurement can be a confusing part of quantum computation: because it forces superpositions to collapse probabilistically, people sometimes assume that all quantum computation is probabilistic in nature, but this is not true. Ignoring noise, every step in a quantum computation up to measurement is completely deterministic.
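The deterministic-then-probabilistic split can be sketched by sampling measurement outcomes from a fixed set of amplitudes. A minimal two-qubit, stdlib-only sketch (the amplitudes stand in for the deterministic evolution; only the sampling step is random):

```python
import random

def measure(amplitudes, shots=1000, seed=0):
    """Sample measurement outcomes for a 2-qubit state: each bitstring
    occurs with probability |amplitude|^2 (the Born rule)."""
    probs = [abs(a) ** 2 for a in amplitudes]
    labels = [format(i, "02b") for i in range(len(amplitudes))]
    rng = random.Random(seed)
    counts = {}
    for _ in range(shots):
        outcome = rng.choices(labels, weights=probs)[0]
        counts[outcome] = counts.get(outcome, 0) + 1
    return counts

# Equal superposition of |00> and |11>: roughly half the shots each.
measure([0.5 ** 0.5, 0, 0, 0.5 ** 0.5])
```

Running the same state through `measure` always yields the same outcome *distribution*; only which shot lands where is random, which is exactly the distinction the paragraph above draws.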

One component of IonQ’s technical roadmap, a **multi-core QPU** is a single quantum processor with multiple quantum compute zones — much like a multi-core processor in a classical computer — that can compute in parallel and be entangled by moving and recombining ion chains.

Coined by John Preskill, the **Noisy Intermediate Scale Quantum (NISQ)** Era is considered to be the first era of quantum computation, where modestly sized devices with tens to hundreds of noisy qubits may be able to provide early quantum advantage, but will still be limited by noise and size. We are beginning to enter this era now, and will leave it when we achieve fault tolerance at scale.

For quantum computers to compute correctly, they must be isolated from the environment around them. Any interaction with the environment, or imperfection in the control systems that perform gates, introduces *noise*. As noise accumulates, the overall likelihood that an algorithm will produce a successful answer goes down. With too much noise, a quantum computer is no longer useful at all.