Q-CTRL digest

Q-CTRL survey reveals R&D community wishlist for quantum hardware

November 29, 2022
Written by
Michael J. Biercuk

We’ve heard a lot from potential end users about their views on the field of quantum computing and how they plan to engage with this transformational new technology to gain strategic advantage.

But what about the boots on the ground? What are key researchers in the industry - from hardware-focused physicists and engineers to algorithm developers - thinking about? Where do they see the field going? What do they need?

Earlier this year, we engaged with the R&D community driving quantum computing forward and identified several previously unarticulated dynamics with the potential to force a rethink in prevailing business analyses. We summarize key findings below.

Watch out for emerging "dark horse" technologies

Superconducting cloud quantum computers are truly dominant among quantum computing consumers. Among researchers, however, the story is quite different. Even with a sample biased toward the solid-state research community (superconducting and semiconducting devices), we found a major discrepancy between quantum computing consumers and R&D teams in which qubit technologies predominate. Roughly half of all survey respondents reported access to real hardware, whether directly, commercially, or through a collaborator.

Among respondents from all backgrounds (quantum algorithm developers, quantum theorists, and quantum hardware R&D), the responses revealed a large background investment in neutral-atom systems in particular. In fact, neutral atoms appeared second only to superconducting circuits in prominence, ahead of trapped ions. Among teams operating their own hardware, neutral atoms almost matched superconducting systems in prominence. This aligns with the emergence of a number of major commercial initiatives in neutral-atom quantum computing from Atom Computing, Pasqal, QuEra, and ColdQuanta.

Trapped ions place third in priority among the research community, though surprisingly they saw little uptake among researchers who rely on cloud access, despite systems from Quantinuum and IonQ already being online and systems from AQT being accessible via private engagement.

We also note a “rear-guard” action from other solid-state devices, including NV centers and semiconductor quantum dots, which rivaled trapped ions in scale of uptake. These, too, align with new initiatives from commercial organizations including Diraq and Quantum Brilliance.

Is this indicative of a threat to the current dominance of superconducting systems? Is their head start sustainable, or is it too great for rivals to overcome? Is the historical “difficulty” of building trapped-ion systems blocking an expansion of research? Will we see a greater shift as more systems using alternative technologies such as neutral atoms and trapped ions become available through providers like Amazon Braket?

The survey responses provided new and evocative insights on these questions.

Researchers face surprisingly common needs and bottlenecks

As part of the survey, we also asked for feedback from three key groups of stakeholders in the quantum community, including experimentalists, theoreticians and algorithm developers, to understand their research needs and challenges.

Each group might be expected to have divergent needs supporting its research. However, our analysis found a surprising convergence in the highest-priority needs identified across these disparate sectors.

Most importantly, the need to simulate complex device dynamics beyond simplified circuit-level models and the need to characterize hardware systems both appeared among the top four responses from all respondent groups. Experimentalists uniquely pointed to enhanced automation in hardware tuneup and calibration as their top need (tasks relevant only to teams operating hardware).

It appears that theorists and even algorithm developers see a need for better insight into what’s really happening inside hardware before scaling their work to larger systems. This highlights a gap between contemporary achievable hardware performance and the performance of conventional simulators. This in turn should place renewed emphasis on the quality of service provided by quantum computing platform vendors.

Moreover, given the level of interest in the topic, it appears that the commercial simulation packages offered by platforms such as Qiskit, Braket, and NVIDIA may be leaving a gap in utility. We surmise this gap comes from the drive to prioritize support for larger circuits (larger numbers of qubits), which naturally comes at the expense of the realism of the simulated dynamics.

A “conventional” quantum simulator may give insight into the expected output of a perfect quantum computer, but most offerings appear to be deficient in providing insights into the actual dynamics experienced by real hardware.
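To make this distinction concrete, here is a minimal toy sketch, not any vendor’s simulator, contrasting an idealized circuit-level simulation with one that inserts a simple depolarizing-noise step between gates. The gate sequence, noise channel, and 5% error rate are purely illustrative assumptions.

```python
import numpy as np

# Hadamard gate and the |0><0| density matrix
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)

def apply_gate(rho, U):
    """Noiseless (unitary) gate action on a density matrix."""
    return U @ rho @ U.conj().T

def depolarize(rho, p):
    """Depolarizing channel: with probability p the state is scrambled to I/2."""
    return (1 - p) * rho + p * np.eye(2) / 2

# Circuit: H, then H again -- a perfect device returns exactly to |0>
rho_ideal = apply_gate(apply_gate(rho0, H), H)

# Device-aware version: the same circuit with noise between the gates
rho_noisy = apply_gate(depolarize(apply_gate(rho0, H), p=0.05), H)

p0_ideal = rho_ideal[0, 0].real  # 1.0
p0_noisy = rho_noisy[0, 0].real  # 0.975
print(p0_ideal, p0_noisy)
```

Even this crude error channel separates the two answers; real devices involve far richer dynamics (control errors, crosstalk, correlated noise) that simple circuit-level models miss entirely.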

Cutting-edge labs emphasize a need for continued device-level improvement and simulators won’t be enough for the theory community

Most industry roadmaps highlight increases in system size over the coming years, and our team has focused on delivering as much useful performance from this hardware as possible. Our survey revealed what researchers are truly focused on in terms of quantum hardware capabilities.

We found that over the next 2-3 years most theorists and algorithm developers will need access to systems with more than 50 qubits to meet their research objectives. In stark contrast, hardware researchers’ needs will be largely unchanged, with roughly half satisfied using devices with fewer than 5 qubits.

In our analysis, this is a revealing dichotomy. The research community is clearly indicating that the paths of researchers who are quantum computer end users and those who are effectively research-grade system manufacturers are diverging. In short, experimentalists are implicitly identifying a wide range of critical challenges to overcome in the coming years before they focus on system scaling. They are signaling that, based on their experience and expertise, “more” is not the only answer.

Nonetheless, one must not ignore the importance of increased system size - beyond the levels accessible in conventional simulators - in driving continued advances in theory and algorithms. Systems with more than 50 qubits are not simulable for most tasks, meaning there is an emerging source of demand for larger-scale hardware systems. Moreover, with simulators out of the picture for most, hardware performance will become an increasingly important consideration, as the gap between achieved and potential quantum volume is currently large on most major systems.
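The 50-qubit threshold can be sanity-checked with simple arithmetic: a full state-vector simulation must store 2^n complex amplitudes, so memory grows exponentially with qubit count. A quick back-of-the-envelope sketch, assuming double-precision complex amplitudes (16 bytes each):

```python
# Memory required to hold the full state vector of an n-qubit system,
# assuming one complex128 amplitude (16 bytes) per basis state.
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

print(statevector_bytes(30) / 2**30)  # 16.0 GiB -- feasible on a workstation
print(statevector_bytes(50) / 2**50)  # 16.0 PiB -- far beyond any single machine
```

At 50 qubits the raw storage alone reaches the petabyte scale, which is why general-purpose state-vector simulation stops being an option well before that point.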

Does this disparity suggest that hardware researchers “know” something most don’t about the remaining outstanding challenges in the field? Are industry roadmaps for scaling getting ahead of the need for continued innovation at the device level? Is the continued interest in basic device-level research indicative of outstanding hardware challenges that are unappreciated among end users? Are major plays in classical simulators doomed to a short shelf life because researcher needs are already overtaking them?

These are some of the high-level insights gleaned from a comprehensive survey focused on the real needs, challenges, and future plans of the quantum computing research community. What this community is thinking today will shape the quantum industry - and the experience of end users - for the next decade.

To learn more about our hardware-agnostic technology for quantum-device simulation, characterization, and performance optimization, visit our Documentation.