The journey to realizing functional quantum computers will be long and it’s a path that Q-CTRL is committed to making as easy as possible for you.

And by easy, we mean less difficult. Building a universal quantum computer with millions of entangled, coherent quantum bits running complex algorithms is not going to be simple or straightforward. But there are many successes to be achieved along the way.

## Error Correction Comes at a Cost

Much of the great promise supporting our community’s aspirations for quantum computing at scale comes, in the long term, from quantum error correction (QEC).

QEC is a process whereby the quantum information stored in a single qubit is distributed across other supporting qubits; we say that this information is “encoded” in a logical quantum bit. This procedure protects the integrity of the original quantum information even while the quantum processor runs - but at a cost.

Depending on the nature of the hardware and the type of algorithm you choose to run, the number of physical qubits needed to support a single logical qubit varies - but current estimates put it at about 1,000 to one. That’s huge.

Our community isn’t yet at the stage where we have a fully encoded logical qubit, but there is a global research effort under way trying to work out how to get us there, how we can make QEC more efficient, and what can be done in the interim.

For near-term technology - what Caltech’s John Preskill has called Noisy Intermediate-Scale Quantum (NISQ) technology - quantum error correction is not required, but errors remain a major concern. In this regime Q-CTRL directly improves the performance of quantum operations and enhances the coherence of quantum bits in these systems.

“It’s as simple as that: by driving down hardware error rates you expand the functionality of your system; you expand what it can do before it suffers an algorithmic error,” Q-CTRL CEO Professor Michael Biercuk said.

We’ve talked about the NISQ regime and how Q-CTRL can help before. But what about future functionality?

As we move towards universal, fault-tolerant quantum computing, error correction will be critical - but will be used as a last resort.

Professor Stephen Bartlett is a theoretical quantum physicist at the University of Sydney working on the development of quantum error correction. He is not associated with Q-CTRL but is an academic colleague of Professor Biercuk.

He said: “Error correction is resource expensive. You want to do everything to reduce or eliminate errors first.”

Professor Bartlett said quantum error correction is slow, costly, and complex, so using mechanisms to suppress errors before you run any algorithms is essential.

“You want to get not just below the error threshold but as far below as possible,” he said.

## Dive Below that Error Threshold

The error threshold is the break-even point: the point at which all the resources you have spent performing logical encoding and running the process of error detection and correction are balanced out by the gains in reducing the overall likelihood of error.

But break-even isn’t really good enough. The further below the error threshold you are, the more physical resources you have available to do the real work you want a quantum computer to do: processing quantum algorithms.

One approach to implementing QEC is known as concatenated quantum error correction. This is by no means the most modern QEC encoding mechanism, but the mathematics of how concatenated QEC gives benefits is easy to understand.

Professor Biercuk said: “How much gain you realize in performing this type of ‘simple’ quantum error correction scales like a ‘super exponential’ based on a ratio of error likelihood to error threshold.”

It is explained with this formula for the likelihood of error when QEC is being applied:

p_L = p_th (p / p_th)^(2^k)

Where p_L is the likelihood of error (a probability between zero and one); p_th is the error threshold and k is the level of concatenation (a measure of how much encoding overhead is incurred).

Mathematically this means that being below threshold corresponds to values of the physical error rate p for which p / p_th is less than one.

“Here the fundamental driver is the ratio of these error probabilities,” Professor Biercuk said. “If p is only just below p_th then in order to achieve a target final error probability we need to make k very large, incurring huge overheads. In this instance, QEC still enables large-scale computation in principle, but doing so simply becomes impractical.”

However, if, for instance, p is one thousand or one million times smaller than p_th, the pay-off is huge. We achieve the same target error rate with very low values of k.

“At Q-CTRL, our solutions drive the error per gate, p, down as far as we can, and in so doing increase the efficiency of quantum error correction,” he said.
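To make that scaling concrete, here is a short Python sketch of the concatenated-QEC formula. The 1% threshold and the 10^-15 target logical error rate are illustrative assumptions for the example, not figures from Q-CTRL or any specific hardware.

```python
def logical_error(p, p_th, k):
    """Logical error rate after k levels of concatenated QEC:
    p_L = p_th * (p / p_th) ** (2 ** k)."""
    return p_th * (p / p_th) ** (2 ** k)

def levels_needed(p, p_th=1e-2, target=1e-15):
    """Smallest concatenation level k that reaches the target error rate."""
    k = 0
    while logical_error(p, p_th, k) > target:
        k += 1
    return k

# Just below threshold (p = 0.9 * p_th): deep concatenation, huge overhead.
print(levels_needed(9e-3))  # → 9

# A thousand times below threshold (p = p_th / 1000): very shallow encoding.
print(levels_needed(1e-5))  # → 3
```

Because k sits in a double exponent, each halving of p buys far more than a factor of two in logical error rate - which is why pushing well below threshold matters so much.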

Even though the specific approach of concatenated QEC is in many senses outmoded, Professor Bartlett explains that the fundamental story remains unchanged.

“The principle is the same: the lower the ratio between error likelihood and the threshold, the more functionality you can get from your error correcting code.

“Professor Biercuk’s quantum control research has been shown to be effective in driving down the likelihood of error,” Professor Bartlett said.

Professor Biercuk said: “Some of the calculations we’ve done suggest that considering existing quantum error correcting codes and ambitious target problems, combined with the achievable performance afforded by Q-CTRL solutions, we can in principle reduce the physical-to-logical qubit ratio from 1000:1 to about 50:1 or better.

“We expect those ratios to continue improving as both the solutions and the hardware platforms mature. That’s an enormous boost to a user’s ability to deploy their hardware for actually solving computational problems rather than just correcting errors.”
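As a back-of-the-envelope illustration of what that ratio change means (the 100-logical-qubit algorithm size is an assumption chosen for the example, not a figure from the article):

```python
# Hypothetical algorithm requiring 100 logical qubits (illustrative number).
logical_qubits = 100

# Physical-qubit budgets at the two physical-to-logical ratios quoted above.
without_control = logical_qubits * 1000  # 1000:1 encoding overhead
with_control = logical_qubits * 50       # 50:1 encoding overhead

print(without_control)  # → 100000
print(with_control)     # → 5000
```

A twenty-fold reduction in physical qubits for the same logical workload - qubits that can instead be spent on computation.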

But the story doesn’t end there.

## Impact of Correlated Noise on QEC

Most current approaches to quantum error correction assume interference in qubits comes from uncorrelated noise.

However, we know from research in the lab that this is not generally a realistic assumption. Many error sources are correlated - either temporally or spatially. For instance, any amplifier whose output slowly changes in time, resulting in over- or under-rotation of a qubit, produces correlated errors.

Professor Biercuk said: “While this doesn’t fundamentally destroy quantum error correction, it does mean that some of the most optimistic error thresholds being reported may become tighter if you relax the assumptions about statistical independence.”

At Q-CTRL the control strategies we employ act like filters to reduce the effect of noise on a qubit, ultimately resulting in lower error probabilities. But they come with an ancillary benefit - the filtering approach primarily reduces slowly fluctuating noise; what’s left over and actually experienced by the qubit has been “whitened”.

This work is being developed by Quantum Control Engineer Claire Edmunds. Ms Edmunds has published this work from her university research in a paper on the physics arXiv.

She said: “What we do is get rid of the low-frequency noise by using it against itself.” By this she means that our controls use special symmetries to effectively cancel the effects of slowly varying noise.

The control transforms the strongly correlated laboratory noise into the sort of uncorrelated error that is easily dealt with in QEC. Q-CTRL’s solutions don’t just reduce errors, they also change their statistical properties to improve the efficiency of error correction.

We call this process **Qubit Virtualization**.
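The statistical effect can be illustrated with a toy model. The sketch below is not Q-CTRL’s actual control scheme - a simple moving-average high-pass filter stands in for the filtering behavior described above. It suppresses a slowly drifting (strongly correlated) noise component, leaving a nearly uncorrelated residual:

```python
import random

random.seed(1)
n = 4000

# Laboratory-style noise: a slow random-walk drift (strongly correlated
# in time) plus fast white noise (uncorrelated).
drift, noise = 0.0, []
for _ in range(n):
    drift += random.gauss(0, 0.1)
    noise.append(drift + random.gauss(0, 1.0))

def lag1_autocorr(xs):
    """Correlation between successive samples; near 1 for slow noise."""
    mean = sum(xs) / len(xs)
    xs = [x - mean for x in xs]
    return sum(a * b for a, b in zip(xs, xs[1:])) / sum(x * x for x in xs)

# Stand-in for a control "filter": subtract a centered moving average,
# which removes the slowly varying (low-frequency) component.
w = 25
filtered = []
for i, x in enumerate(noise):
    window = noise[max(0, i - w):i + w + 1]
    filtered.append(x - sum(window) / len(window))

print(lag1_autocorr(noise))     # strongly correlated (well above zero)
print(lag1_autocorr(filtered))  # near zero: the residual is "whitened"
```

The residual noise is both smaller and statistically much closer to the uncorrelated model that standard QEC analyses assume.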

## The Moral of the Story

At Q-CTRL we attack the problem of errors in quantum computing from two complementary perspectives.

Our solutions are essential for immediate use in quantum technology to suppress errors, even in the current era, before logically encoded quantum operations employing QEC are available.

And in the future our control solutions will serve as a complement to QEC, dramatically improving its efficiency through both error suppression and **Qubit Virtualization**.