Key players' insights
The race to commercial quantum computers is one of the most exciting technological challenges of our generation. The potential to disrupt the computing status quo in unprecedented ways is fueling nations and organizations to accelerate the realization of quantum computing and its application to problems of socio-economic importance. The quantum computing community and industry are at a truly exciting point: the potential of quantum technologies is within reach in the next 5 to 10 years. We can not only comprehend the impact on user applications, but are now increasingly focused on the questions around practical adoption. This practical adoption will necessitate looking beyond the quantum computer itself and into the environment and ecosystem in which it will be used. We as an industry must prepare to best serve the users who will benefit from our technologies. While perhaps unique in its nature, quantum computing is of course not the first disruptive technology, nor the last (we hope).

The application of advanced computing to use-cases of scientific, industrial, and societal importance is ubiquitous. Since the advent of the first mainframes and supercomputers, these computing capabilities have been iteratively democratized and are today applied by organizations around the world every second of every day on systems located in data centers, supercomputing centers, and the cloud.


The advanced or high performance computing (HPC) community has always been at the forefront of technology and continues to play a pivotal role in novel technology adoption.  Novel technologies are often first adopted within the scientific research community whose longer-term horizon is well suited for early basic research, prototyping, and ecosystem development; following which these new approaches are adopted into commercial applications and enterprises.  Today’s quantum world is highly fragmented, with many quantum architecture modalities and an evolving technology stack.  Broad adoption of quantum computing will require an ecosystem based on standards and a consistent developer and user experience across modalities with predictable results.  These are areas in which the HPC community has vast experience and capacity.  As such, this path through the HPC community is also the likely arc for quantum computing.  Today’s HPC platforms increasingly depend on heterogeneous architectures to meet user requirements. This heterogeneity reflects the varied characteristics of algorithms, applications, and workflows, with the opportunity for quantum to further accelerate portions of these.


There are some lessons that we can learn from the HPC community regarding novel technology adoption. The first is that the need for greater capability, such as computational performance for higher-resolution simulations, drives technology exploration. Novel technologies not only speed up existing applications but also open the opportunity to develop entirely new approaches. This is of course what we expect with quantum computing.

The second lesson is that novel technologies rarely replace entire applications; they are typically applied to accelerate specific algorithms. The reasons are twofold. One is that large-scale applications are the result of hundreds of person-years of development, with a legacy of output that cannot simply be replaced. The other is that novel technologies need only be applied to areas aligned with their strengths. Fueled in part by the so-called Cambrian Explosion of specialized processors in the late 2010s, the ability to bring multiple technologies to bear on a use-case has driven heterogeneity into HPC systems. A typical end-user workflow can be composed of several approaches, such as simulation and machine learning, leveraging many different technologies such as x86 processors, GPUs, and AI accelerators. The outcome is a set of technologies matched to a range of computational requirements. Hybrid quantum-classical computing is of course an example of this. Even stand-alone quantum computing is heterogeneous by definition, requiring classical processing for functions such as quantum error correction.
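As an illustration of the hybrid quantum-classical pattern, such workflows typically iterate between a quantum device and a classical optimizer. The sketch below is a toy, self-contained version of that loop in plain Python: the "QPU" is a classically simulated single qubit, the gradient is estimated with the parameter-shift rule, and all names (`qpu_expectation`, `optimize`) are illustrative assumptions, not any vendor's API.

```python
import math

def qpu_expectation(theta):
    # Toy "QPU" call: classically simulate <Z> after an RY(theta) rotation.
    # |psi> = RY(theta)|0> = [cos(theta/2), sin(theta/2)], so <Z> = cos(theta).
    # In a real hybrid workflow this would dispatch to quantum hardware.
    return math.cos(theta)

def optimize(theta=2.0, lr=0.4, steps=50):
    # Classical side of the loop: gradient descent, with the gradient
    # estimated via the parameter-shift rule (two extra circuit evaluations).
    for _ in range(steps):
        grad = 0.5 * (qpu_expectation(theta + math.pi / 2)
                      - qpu_expectation(theta - math.pi / 2))
        theta -= lr * grad
    return theta

theta = optimize()
# Minimizing <Z> drives theta toward pi, where <Z> = -1.
```

Even in this toy form, the structure shows why such workloads are inherently heterogeneous: the tight inner loop alternates between classical optimization and quantum (here, simulated) evaluation.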


On the whole, the key to success is the effective integration of multiple technologies into a single system. The same is true for the successful adoption of any individual technology. This is the key takeaway as we think about the practical adoption of quantum computing.
In practice, novel technology adoption follows a cadence of stand-alone exploration and validation followed by integration. Examples can be found across industries, such as computer-aided simulation in engineering or risk modeling in financial services. Weather prediction and climate modeling are often cited in the HPC community as the prototypical applications with an insatiable thirst for computing performance. These are highly complex simulations spanning a broad range of physical and temporal scales. Since the first practical applications in the 1970s, they have tracked and pushed every new computing technology; today they can leverage hundreds of thousands of processor and accelerator cores. When machine learning became viable some years ago, the first applications were stand-alone, demonstrating the potential of machine and deep learning. The next step was tight integration into simulations, and with it came a new category of application: Cognitive Simulation. Looking into quantum's future, the first applications are also likely to be stand-alone simulations, such as an improved understanding of atmospheric chemistry. This advancement will then find its way into multi-component earth system models, with quantum algorithms embedded in a highly heterogeneous environment composed of multiple processing technologies. Perhaps this will represent another category: Complex Quantum Cognitive Simulation!


So what are the implications for the quantum computing stack? In the context of today's stand-alone quantum computers, the technology stack runs from the software development layer through the quantum control layer down to the QPUs. Integration, however, requires a view that extends upward into the broader traditional computing infrastructure, with the control layer as the central point of integration. The challenge is that quantum computing has unique operational characteristics, such as calibration, and that the various QPU architectures function differently. In addition, achieving effective integration into a heterogeneous environment that delivers the necessary performance advantages requires careful consideration of data movement, system management, and distributed application processing. Embracing heterogeneity will be necessary to move toward practical quantum computing and, ultimately, the broad adoption of this disruptive technology.
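To make the integration point concrete, the sketch below models a heterogeneous workflow scheduler in plain Python: each stage of a multi-component workflow is routed to the resource class suited to it, with QPUs reached through the control layer. All names here (`Task`, `dispatch`, the resource labels) are hypothetical illustrations under the assumptions in this piece, not an actual scheduler or vendor API.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    kind: str  # "simulation", "ml", or "quantum"

# Hypothetical mapping of kernel types onto a heterogeneous resource pool;
# note the QPU is reached through the control layer, the integration point.
RESOURCES = {
    "simulation": "x86 CPU partition",
    "ml": "GPU / AI accelerator",
    "quantum": "QPU via control layer",
}

def dispatch(workflow):
    """Assign each task to its matching resource class."""
    return [(task.name, RESOURCES[task.kind]) for task in workflow]

# A toy multi-component workflow in the spirit of an earth system model.
workflow = [
    Task("transport simulation", "simulation"),
    Task("surrogate-model inference", "ml"),
    Task("atmospheric chemistry kernel", "quantum"),
]
plan = dispatch(workflow)
```

The design choice this sketch highlights is that the quantum resource appears as one more entry in a heterogeneous pool, rather than as a separate system; data movement, scheduling, and management then become shared concerns of the whole stack.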

Per Nyberg, Vice President of Strategic Markets and Alliances @ Quantum Machines