Q-CTRL digest

Making Quantum Error Correction practical

January 12, 2024
Written by
Michael J. Biercuk

Last year we saw a wide range of major announcements highlighting demonstrations of Quantum Error Correction (QEC) aimed at achieving what is known as fault-tolerant quantum computing. Most notably, an experiment from Harvard triggered speculation that we’ve entered the regime of logically encoded quantum computing.

Here we set out to help you understand what this means and to provide real insights into where we go from here - no PhD required.

Quantum computers suffer from an Achilles’ heel that is simply not present in conventional computers: error-prone hardware. These errors arise in the underlying quantum information carriers, called qubits, on timescales much shorter than a second. Hardware failure is almost unheard of in conventional “classical” computing, where processors can run continuously for approximately one billion years before a transistor fails.

This enormous gap between what stable classical computers can achieve and the faulty error-prone quantum computers we have today has led to a huge amount of research investment and attention.

Here at Q-CTRL we specialize in stabilizing quantum hardware against errors so users can dramatically expand what’s possible on a real device. For instance, with 127 qubits we’ve shown that using our infrastructure software you can push IBM hardware right to its intrinsic limits, enabling record-setting performance across a wide range of benchmarks.

But the hardware still has limits! What do we do to break through those?

Understanding Quantum Error Correction

QEC is an algorithm designed to identify and fix errors in quantum computers. In combination with the theory of fault-tolerant quantum computing, QEC suggests that engineers can in principle build an arbitrarily large quantum computer that, if operated correctly, would be capable of arbitrarily long computations (so long as some basic conditions are met - more on that later).

This would be a stunningly powerful achievement. The prospect that it can be realized underpins the entire field of quantum computer science: Replace all quantum computing hardware with “logical” qubits running QEC, and even the most complex and valuable algorithms come into reach. For instance, Shor's algorithm could be deployed to crack codes or hack Bitcoin with just a few thousand error-corrected logical qubits. On its face, that doesn't seem far from the 1,000+ qubit machines promised by 2023. (Spoiler alert: this is the wrong way to interpret these numbers).

In a cartoon depiction, QEC involves “smearing” the information in one physical qubit over many hardware devices, encoding it to form what’s known as a logical qubit. One logical qubit carries the same information as a single physical qubit, but now, if QEC is run in the right way, any failures of the constituent hardware units can be identified and fixed while preserving the stored quantum information! While it sounds futuristic, it actually draws from validated mathematical approaches used to engineer special “radiation hardened” classical microprocessors deployed in space or other extreme environments where hardware errors are much more likely to occur.
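To make this cartoon concrete, here is a minimal sketch in Python - illustrative only, using the simplest three-bit repetition code and classical bit flips rather than any real hardware-level QEC scheme - of how redundancy plus parity checks lets errors be identified and undone without ever reading out the encoded value directly:

```python
import random

def encode(bit):
    """Encode one logical bit redundantly across three physical bits."""
    return [bit, bit, bit]

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def syndrome(bits):
    """Measure pairwise parities only; this never reveals the encoded value."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Infer from the syndrome which single bit (if any) flipped, and undo it."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    if flip is not None:
        bits[flip] ^= 1
    return bits

# Compare raw vs. encoded failure rates at a physical flip probability p.
p, trials = 0.05, 100_000
raw_fail = sum(random.random() < p for _ in range(trials)) / trials
enc_fail = sum(correct(apply_noise(encode(0), p))[0] != 0 for _ in range(trials)) / trials
print(f"unencoded failure ~ {raw_fail:.4f}, encoded failure ~ {enc_fail:.4f}")
```

Real QEC replaces the bit flips with quantum errors and the parity reads with syndrome measurements on ancilla qubits, but the core logic - infer the failure from parities, then undo it without disturbing the stored information - is the same.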

QEC is real and has seen many partial demonstrations in laboratories around the world—initial steps that make it clear it's a viable approach. And recently, a beautiful experiment showed that many logically encoded qubits could be run and made to interact in order to execute a full algorithm - an astounding demonstration!

Source: IEEE Spectrum. The quantum error correction cycle. In an iterative loop, errors are indirectly inferred from an encoded logical qubit in order to apply localized corrections to faulty hardware elements. Encoding is simplified from actual QEC strategies for visual clarity.

Quantum Error Correction tends to introduce more error than correction

So, is this the end of the story? Do we just rinse and repeat, making more logical qubits in order to build arbitrarily large machines that are resilient against errors?

Not quite. There’s one major problem: right now QEC does not, in general, make quantum computers perform better. In fact, aside from a handful of beautiful demonstrations, it has almost always made things worse in the experiments published to date.

If we’re now able to run QEC, why aren’t things just getting better and better? The QEC procedure itself consumes resources—more qubits and many additional operations—and the extra work needed to apply QEC tends to introduce more error than it corrects.

The point at which the improvement balances the newly added errors is generically called the error threshold (a measure of the bare hardware performance). Clearly we want our hardware to operate below (better than) the error threshold before running QEC. And many hardware teams are getting close to this break-even point - so will that finally enable arbitrarily large quantum computations?

Even crossing this break-even point and achieving functioning QEC doesn't mean we suddenly enter an era with no hardware errors—it just means QEC actually gives some net benefit! There’s a huge difference between “some net benefit” and enough error reduction to run a large and complex algorithm with high likelihood of success.
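To see why the threshold acts as a tipping point, consider a small illustrative calculation using the textbook scaling ansatz for a distance-d code (the constants A and p_th below are assumed for illustration, not measured values): below threshold, larger codes suppress logical errors; above it, larger codes make them worse.

```python
# Illustrative only: textbook scaling ansatz p_L ≈ A * (p / p_th)^((d + 1) / 2)
# for a distance-d code, with assumed constants A = 0.1 and threshold p_th = 1e-2.
A, p_th = 0.1, 1e-2

def logical_error_rate(p_physical, distance):
    """Logical error rate predicted by the ansatz for a given code distance."""
    return A * (p_physical / p_th) ** ((distance + 1) / 2)

for p in (2e-2, 5e-3, 1e-3):  # above, just below, and well below threshold
    rates = [logical_error_rate(p, d) for d in (3, 5, 7)]
    trend = "worse" if rates[-1] > rates[0] else "better"
    summary = ", ".join(f"{r:.1e}" for r in rates)
    print(f"p = {p:.0e}: p_L at d = 3, 5, 7 -> {summary} (bigger codes: {trend})")
```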

Making Quantum Error Correction practically useful

In order to make QEC practically useful we actually have to first reduce errors well below the error threshold. It turns out that the power of QEC grows as the hardware performance gets better - a truly virtuous circle!

We need to ask how far below the error threshold we need to be for QEC to deliver practical benefits in applications we care about.

According to some calculations, in order to run a large, high-value algorithm like Shor’s algorithm for codebreaking, if your error rate is just barely below the threshold you may need a ratio of >100,000:1 – over 100,000 physical qubits to encode just one logical qubit. Returning to the idea of running Shor’s algorithm with a few thousand logical qubits, that suddenly implies a few hundred million physical devices! Compare that number to the recent pronouncements of 127-qubit utility-scale machines from IBM and you see just how painful the notion is.

But pushing the hardware well below the error threshold can reduce this overhead penalty massively! In short, if we follow the math it looks like we can reduce this number from >100,000:1 → ~50:1 if we make the hardware errors more than 1,000x lower than the threshold. You can learn more about this detailed example from my recent Keynote at IEEE Quantum Week.
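As a rough back-of-the-envelope sketch - with assumed constants and a simplified surface-code-style model, not the exact calculation behind the figures quoted here - you can see how the physical-per-logical overhead collapses as the hardware moves further below threshold:

```python
# Rough, illustrative estimate (assumed constants): physical qubits needed per
# logical qubit to reach a target logical error rate of ~1e-12 per operation,
# using p_L ≈ A * (p/p_th)^((d+1)/2) and ~2*d^2 physical qubits per code patch.
A, target_p_L = 0.1, 1e-12

def physical_per_logical(ratio_to_threshold):
    """Smallest odd code distance d meeting the target, and its qubit cost."""
    d = 3
    while A * ratio_to_threshold ** ((d + 1) / 2) > target_p_L:
        d += 2
    return d, 2 * d * d

for r in (0.9, 0.1, 1e-3):  # barely below, 10x below, 1,000x below threshold
    d, n = physical_per_logical(r)
    print(f"p/p_th = {r}: distance {d}, ~{n:,} physical qubits per logical qubit")
```

The precise numbers depend on the code, the decoder, and the target algorithm, but the qualitative collapse in overhead as p/p_th shrinks is the point.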

The resource-scaling of QEC makes clear that errors need to be over 1,000x below the error threshold in order to deliver practical benefits at a level suitable for execution of useful algorithms. Example scaling behavior based on Phys. Rev. Lett. 98, 190504 (2007) and New Journal of Physics 16, 093045 (2014). Target logical failure rate based on scaling estimates in Advances in Cryptology - ASIACRYPT 2017 pp 241-270. These numbers should be considered indicative rather than absolute as both encoding and algorithmic protocols are improved.

The notion of simply throwing QEC at an error-prone processor and hoping it will deliver improvements is a bit like trying to cool a room by blasting the air conditioning while leaving all of the doors and windows open. You might feel the cool air, but it’s exceptionally inefficient and takes far too long to actually lower the temperature in the room.

The scenario of running QEC alone on bare hardware is analogous to running an air conditioner in a room with all doors and windows open. The heat leaks overwhelm the AC unit and force it to run at full blast, and it still often doesn't achieve the desired temperature.
Combining QEC and error suppression in our air conditioning analogy leads to faster cooling with lower effort by first patching the heat leaks.

Combining error suppression with QEC closes all of the windows and doors, and adds insulation to the room first. Then running the air conditioning for a short while quickly gets you to a comfortable temperature!

Success stories from the community

Better than this analogy is the evidence we already have in hand. Three separate manuscripts from Nord Quantique, the University of Sydney, and Q-CTRL published in 2023 show how the combination of our error suppression tools with QEC encoding makes QEC perform better – whether in the quality of logical encoding or QEC’s ability to actually identify and correct errors.

In Nord Quantique’s work, the inclusion of our technology for hardware optimization meant the difference between achieving only breakeven performance with QEC or actually achieving a net enhancement in logical qubit lifetime!

Results published by Nord Quantique show how QEC by itself does not lead to a net improvement in system performance. However, combining QEC with error suppression (purple bars, “With QEC, optimized”) enables an experimental enhancement of logical-qubit lifetime. From arXiv:2310.11400.

Even teams working independently are implicitly using error suppression in their most advanced QEC demos because they know it’s essential to push the hardware to its limits as a first step. Google integrated the core technique of dynamical decoupling into the execution of their impressive QEC experiments with large codes. And any time an experimentalist uses “DRAG” or a similar pulse-shaping technique at the gate level, they’re really combining error suppression with QEC!
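As a flavor of what gate-level error suppression looks like, here is a schematic numpy sketch of a DRAG-style pulse: a Gaussian drive on the in-phase channel with a quadrature correction proportional to its time derivative. The amplitude, width, and beta coefficient are placeholder values, not calibrated settings for any real device.

```python
import numpy as np

# Schematic sketch of a DRAG-corrected drive envelope: a Gaussian in-phase (I)
# pulse plus a quadrature (Q) component proportional to its time derivative.
# All parameters below are placeholders for illustration, not calibrated values.
t = np.linspace(0, 40e-9, 401)            # 40 ns gate sampled every 0.1 ns
sigma, amp, beta = 10e-9, 1.0, 0.5e-9     # assumed width, amplitude, DRAG weight

i_env = amp * np.exp(-((t - t.mean()) ** 2) / (2 * sigma**2))  # in-phase drive
q_env = -beta * np.gradient(i_env, t)                          # DRAG quadrature

print(f"peak I = {i_env.max():.3f}, peak |Q| = {np.abs(q_env).max():.3f}")
```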

Combining error suppression with QEC appears to be the only viable path forward for three reasons:

  1. We need the hardware to perform at its intrinsic limits in order to get far below the error threshold, and no matter how good the hardware may be, imperfections in all of the classical signals sent to the hardware can always cause errors that need cleaning up.
  2. We know that the action of error suppression improves the compatibility between the typical errors experienced in hardware and the mathematical assumptions behind QEC.
  3. Alternative techniques like error mitigation – which have shown promise in the NISQ era by post-processing imperfect results – appear largely incompatible with the “single-shot” approach to real-time QEC.

Moving forward to an error-corrected quantum future

We’re thrilled to be working with the hardware manufacturers and the QEC encoding teams to deliver a future in which QEC is both beneficial and practically relevant.  And because of the issues identified above, there’s work to be done by the entire community to make this real.

Achieving practical QEC ultimately requires four areas to progress in parallel:

  1. Intrinsic hardware performance
  2. Error suppression in hardware operation
  3. QEC encoding and protocols
  4. Repetitive execution in real time

Most of the media attention has been on the third point, but without the others, error correction will remain a scientific pursuit rather than a source of practical utility. At Q-CTRL, where we aim to make quantum technology useful, we’re working to ensure everything needed for practical QEC is delivered!

Get in touch to learn more about how our professional quantum EDA tools for error suppression or fully integrated performance management can help you accelerate the path to real and useful Quantum Error Correction!
