Commercializing Quantum Computing: How to Design Sustainable Roadmaps Across the Quantum Stack | Deep Tech Catalyst

A chat with Dave Grimm, Partner @ AlbionVC

Welcome to the 95th edition of Deep Tech Catalyst, the educational channel from The Scenarionist where science meets venture!

If you’re building toward the edge of what’s technically possible—this one’s for you.

Quantum computing sits at the frontier of Deep Tech: rich with promise, but still unproven. The physics is sound, the potential is massive, and yet the path to commercial impact remains open—and hotly contested.

To explore what it really takes to build, fund, and scale in this space, we’re joined by Dave Grimm, Partner at AlbionVC.

In this episode, we cover:

  • What makes quantum computing fundamentally different—and why it won’t replace classical compute

  • Why materials science will likely deliver the first real proof of value

  • How to think about business models before the market even exists

  • What early-stage founders must prove to raise capital in a high-risk, high-capex environment

  • How institutions—from pharma to government—are shaping the early demand

Let’s dive in. ⚛️




BEYOND THE CONVERSATION — STRATEGIC INSIGHTS FROM THE EPISODE

What Actually Makes Quantum Different

Quantum computing introduces a fundamental departure from the way information has traditionally been processed. Unlike classical bits, which exist in a binary state of either zero or one, quantum bits—or qubits—can exist in a superposition of both states simultaneously.

This property allows a quantum system to evaluate many possible outcomes in parallel, rather than sequentially.

The result is exponential scaling: a quantum system with n qubits can, in theory, represent 2^n different configurations at once. This unlocks computational power that quickly surpasses even the most advanced supercomputers.

For context, a system with a few hundred high-fidelity qubits could represent more states simultaneously than there are atoms in the observable universe. This is not simply a theoretical curiosity—it redefines what is computationally possible for certain classes of problems.
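
To make that scaling concrete, here is a minimal Python sketch (our own illustration, not from the episode) of what 2^n growth means in practice: the memory a classical machine would need just to store an n-qubit quantum state.

```python
# Illustration: the classical memory needed just to store an n-qubit state
# vector. Each of the 2^n amplitudes is a complex number (16 bytes each at
# double precision).

def statevector_bytes(n_qubits: int) -> int:
    """Bytes required to hold 2**n_qubits complex128 amplitudes."""
    return (2 ** n_qubits) * 16

for n in (10, 30, 50, 100):
    print(f"{n:>3} qubits -> {statevector_bytes(n):.3e} bytes")

# 10 qubits fit in kilobytes, 30 need ~17 GB, 50 need ~18 PB, and at 100
# qubits the requirement exceeds all storage humanity has ever built.
```

This is why classical simulation of quantum systems hits a wall so quickly, and why the hybrid architectures discussed below route only the hardest kernels to quantum hardware.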

Importantly, quantum computers are not universally superior.

They are optimized for a specific set of problems, particularly those involving complex optimization, molecular modeling, or scenarios governed by quantum phenomena themselves. For everything else, classical systems remain more efficient, more scalable, and more practical. Understanding where quantum applies—and where it doesn’t—is critical to its effective deployment.

Orchestrating Hand-Offs Between Classical and Quantum Workloads

Rather than replacing classical compute, quantum technologies are expected to operate alongside it, functioning as highly specialized tools within a broader computational architecture.

In practice, this means workflows will be divided: the bulk of the data processing, control logic, and systems management will continue to be handled by classical infrastructure, while quantum resources are used selectively for specific, computationally intensive steps.

This hybrid model envisions quantum computing as a service layer—a callable resource embedded within a classical pipeline. The interaction may resemble an API call, where classical systems initiate a targeted request to a quantum processor to resolve a particular subproblem, such as simulating a quantum system or identifying a global optimum in a complex search space.

Once the quantum operation is complete, the results are returned and integrated back into the classical environment.
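
As a rough sketch of that pattern, the snippet below mocks the quantum side so the flow is runnable end to end. MockQuantumService and its run method are hypothetical stand-ins for a real vendor SDK, not any actual API.

```python
# A minimal, runnable sketch of the hybrid "API call" pattern described above.
# Everything quantum here is mocked: `MockQuantumService` is a hypothetical
# stand-in for a real provider's client library, and the "circuit" is just an
# opaque payload.

import random

class MockQuantumService:
    """Stand-in for a cloud quantum provider's client library."""

    def run(self, circuit: dict, shots: int) -> dict:
        # A real client would serialize the circuit, POST it to a hosted QPU,
        # and poll for results; here we fabricate measurement counts.
        counts = {}
        for _ in range(shots):
            outcome = "".join(random.choice("01") for _ in range(circuit["n_qubits"]))
            counts[outcome] = counts.get(outcome, 0) + 1
        return counts

def hybrid_pipeline() -> str:
    circuit = {"n_qubits": 3}                                # classical pre-processing:
                                                             # encode the hard kernel
    counts = MockQuantumService().run(circuit, shots=1024)   # the quantum "API call"
    return max(counts, key=counts.get)                       # classical post-processing

print("Most frequent measurement:", hybrid_pipeline())
```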

This architectural model has important implications for how quantum computing will be deployed, monetized, and accessed.

It positions quantum as a complementary resource rather than a replacement—valuable precisely because of its specificity, not in spite of it.

As computing strategies evolve, the challenge will lie not only in developing the quantum hardware itself, but also in designing seamless orchestration between classical and quantum workloads. The value lies at the intersection, where each paradigm is allowed to do what it does best.



Where Early Value Will Show Up First

Materials Science

Among the wide range of potential quantum computing applications, one category stands out as both technically aligned and commercially meaningful in the near term: materials science.

This is not coincidental.

Many problems in this field—such as modeling the behavior of electrons within novel compounds—are governed by quantum mechanics at their core. Electrons do not behave like classical particles: their positions, energy states, and interactions are probabilistic, and describing them accurately within a material often requires quantum-level precision.

Classical computers are limited in this respect. Even the most advanced simulation tools rely on approximations, which quickly break down as the system grows in complexity.

Quantum computers, by contrast, offer the possibility of simulating these behaviors more precisely and at a greater scale. This could enable breakthroughs in designing new superconducting materials, more efficient batteries, or novel catalysts—domains where even incremental improvements can have system-wide impact.

The opportunity is not just theoretical.

These are applications where existing tools fall short, and where unlocking deeper understanding would have immediate downstream effects. In a field like battery chemistry, for example, the ability to predict and test new cathode materials with high accuracy could shorten development cycles and reduce the need for costly trial-and-error experimentation in physical labs.

Drug Discovery and Finance

Beyond materials, another domain with early quantum relevance is pharmaceuticals. Like materials science, drug discovery involves complex molecular interactions governed by quantum effects.

Modeling how a molecule binds to a receptor, or how a compound behaves within a biochemical pathway, often depends on precise simulations that are beyond the limits of classical compute.

Here too, quantum systems may offer a level of fidelity that could radically improve how compounds are selected, tested, and optimized—potentially streamlining the drug development process.

In contrast, other sectors—such as finance—may take longer to benefit in a meaningful way. While optimization problems abound in financial services, the current scale and fidelity of quantum hardware are not yet sufficient to unlock transformative performance.

The potential is there, particularly in areas like portfolio optimization or risk modeling, but the commercial advantage remains a few steps removed. For now, interest from the financial sector remains largely exploratory.

The LK-99 Moment and What It Revealed About Modeling Limits

One of the most illustrative examples of quantum computing’s potential came not from a quantum breakthrough itself, but from a moment of collective scientific curiosity.

In 2023, a team claimed to have discovered a room-temperature superconductor known as LK-99. The announcement triggered a global wave of attention. Within days, dozens of labs attempted to replicate the findings, racing to confirm whether this material truly defied one of physics’ most persistent limitations.

At the center of the original claim was a quantum phenomenon: tunneling.

While LK-99 ultimately failed to meet expectations, the incident revealed something critical. Even if the material had been viable, researchers lacked the tools to model and optimize its behavior with the necessary precision. The inability to explore its quantum properties in detail—due to the limitations of classical simulation—left the field effectively stuck.

This moment served as a reminder: the bottleneck isn’t just in discovery, but in understanding.

A functioning quantum computer could, in theory, help navigate these kinds of problems with far greater accuracy, allowing researchers not only to identify candidate materials, but to optimize them with purpose.

The LK-99 episode was fleeting, but it underscored the scale of value that could be unlocked when quantum modeling becomes both viable and accessible.

Supremacy as a Technical Proof, Not an Outcome

The quantum computing field has already reached one major milestone: quantum supremacy. This refers to the moment when a quantum computer successfully performs a calculation that no classical computer could complete in a reasonable time.

Google’s demonstration of quantum supremacy, though later debated in some circles, remains a valid and important proof that quantum systems can outperform classical systems—at least in principle.

However, the problem solved during that experiment was, by design, a toy problem. It was engineered to highlight quantum advantage without having any real-world utility.

As such, it marked a significant technical achievement, but not a commercial one.

Supremacy showed that the technology works. What it did not show is that quantum computers can solve meaningful problems better than classical alternatives in ways that matter to industry or society.

In that sense, supremacy is a gate—but not the destination. It clears the way for the next phase of development, but leaves the burden of proving value still squarely on the table.

Demonstrating Useful Advantage—and Then Commercial Advantage

The next inflection point in the quantum journey is what many in the field refer to as “quantum advantage.” This is where a quantum system solves a problem of real value—scientific, industrial, or economic—better than any classical approach.

That distinction matters. It moves quantum computing from the realm of pure theory into practical contribution. The challenge is that no such demonstration has been convincingly delivered yet.

Once that threshold is crossed, a second milestone follows: proving commercial advantage. This is when quantum computing is not only useful in principle but valuable in practice—delivering outcomes that justify the cost, complexity, and operational effort required to run quantum systems.

In some cases, that might mean advancing the state of scientific knowledge in a way that enables new products. In others, it could mean reducing the time or capital needed to develop a technology that already has a known market.

Each of these steps—supremacy, advantage, and commercial impact—acts as a stage gate. Supremacy tells us the door is open. Advantage tells us it leads somewhere valuable. Commercial impact shows that the journey can be monetized.

For now, the race is to demonstrate advantage. And when it happens—likely in a tightly scoped domain like materials or drug discovery—it will trigger a wave of experimentation aimed at replicating and scaling that early success.

That will be the true beginning of quantum computing’s role in industry, and the point where commercial models start to crystallize.



Potential Business Models for Quantum Computing Startups

Paying for Access to Core Machines

Even at this early stage in the field, it’s already possible to imagine how quantum computing might be monetized.

One of the most straightforward paths resembles the cloud computing model: a small number of highly specialized, capital-intensive quantum machines operated by a few players, with customers purchasing access on demand.

In this scenario, quantum compute becomes a scarce but centralized resource—offered as a service to research institutions, pharmaceutical companies, materials scientists, or anyone with a problem that justifies the cost.

The pricing structure may take the form of compute time—measured in qubit cycles or calibrated around the complexity of the task—or be tied to specific algorithms or outcomes.

For example, a company developing a new drug might spend millions on AI compute, then allocate additional budget for targeted quantum simulations. That demand would likely be met by cloud-style access to a third-party quantum machine, rather than building one internally.
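
As a back-of-the-envelope sketch of what per-use pricing could look like, consider the toy calculation below. Both rates are invented for illustration; real providers price per task, per shot, or per second of machine time, and vary widely.

```python
# Back-of-the-envelope sketch of usage-based pricing for quantum access.
# Both rates below are invented for illustration only.

PER_TASK_FEE = 0.30      # hypothetical flat fee per submitted job, in USD
PER_SHOT_FEE = 0.00035   # hypothetical fee per circuit execution, in USD

def estimated_bill(n_jobs: int, shots_per_job: int) -> float:
    """Total cost of a batch of jobs under a per-use pricing model."""
    return n_jobs * (PER_TASK_FEE + shots_per_job * PER_SHOT_FEE)

# A drug-discovery team running 200 targeted simulations at 10,000 shots each:
print(f"Estimated bill: ${estimated_bill(200, 10_000):,.2f}")   # $760.00
```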

Algorithms as an On-Demand Layer

Layered on top of the hardware is a second area of potential value capture: quantum algorithms. These are not just generic code libraries—they are finely tuned for specific use cases and optimized for the constraints of current quantum hardware, such as limited coherence times and error rates.

As quantum systems evolve, so too will the algorithmic approaches, making them a dynamic and essential layer in the stack.

One possible model sees these algorithms functioning like software modules or APIs: callable by classical systems when needed, billed per use, and developed by a growing ecosystem of specialists.

In this model, quantum algorithm developers are not competing to build hardware, but instead positioning themselves as indispensable intermediaries. They translate real-world problems into quantum-native formulations and make the machines useful—regardless of who wins the race on the hardware side.

This level of abstraction also creates optionality. An algorithmic layer can be deployed across different modalities—ion traps, neutral atoms, superconducting qubits—without being locked into a single platform.

That flexibility reduces technology risk and opens the door to broader adoption as multiple hardware approaches mature.
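
In code, that hardware-agnostic positioning looks like programming against an abstract backend interface. The sketch below is illustrative, with invented class names rather than any vendor's actual SDK.

```python
# Sketch of a hardware-agnostic algorithm layer: the algorithm is written once
# against an abstract backend, and each qubit modality plugs in underneath.
# All class and method names are illustrative, not a real API.

from abc import ABC, abstractmethod

class Backend(ABC):
    """Minimal contract any qubit modality must satisfy."""

    @abstractmethod
    def run(self, circuit: dict, shots: int) -> dict:
        ...

class SuperconductingBackend(Backend):
    def run(self, circuit: dict, shots: int) -> dict:
        return {"00": shots}   # placeholder result

class IonTrapBackend(Backend):
    def run(self, circuit: dict, shots: int) -> dict:
        return {"00": shots}   # placeholder result

def materials_simulation(backend: Backend) -> dict:
    """The 'product': one algorithm, deployable on whichever hardware wins."""
    circuit = {"ansatz": "hardware-efficient", "n_qubits": 2}
    return backend.run(circuit, shots=1000)

# Same algorithm, two modalities, no platform lock-in:
for hw in (SuperconductingBackend(), IonTrapBackend()):
    print(type(hw).__name__, materials_simulation(hw))
```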

Building Proprietary Applications and Licensing the Output

A more vertically integrated path also exists. Rather than offering compute or algorithms to external users, some quantum companies may choose to build proprietary applications internally.

In this model, the quantum stack is developed and used in-house to solve high-value problems—such as designing novel materials or discovering new molecules.

The output is not compute time or algorithmic access, but intellectual property: a patentable discovery, a manufacturable material, or a licensable formulation.

This approach demands far more upfront effort. It requires not only deep quantum expertise, but also capabilities in adjacent domains—materials engineering, chemical synthesis, and pharmaceutical development.

It may even require building full-stack research operations alongside the quantum platform itself. But the potential upside is significant: the company retains ownership of the entire value chain and shares directly in the commercial returns of whatever breakthrough is achieved.

Each of these models—selling access, licensing algorithms, or creating proprietary outputs—presents trade-offs. The first could offer scalability. The second, strategic flexibility. The third, concentrated upside.

At this stage, no single model has proven itself. The field is still experimenting. But the contours of value capture are becoming clearer—and founders must begin to define where in this evolving stack they intend to compete.

3 Essentials Founders Must Nail to Raise Capital

1. Qubit Fidelity Is Non-Negotiable

Among the many technical challenges facing quantum hardware development, one stands out as fundamental: qubit fidelity. Fidelity refers to how reliably a qubit can perform computations before its quantum state deteriorates—a phenomenon known as decoherence.

When decoherence occurs, the qubit loses its ability to remain in a superposition, effectively becoming useless for further computation.

This limitation defines the boundaries of what is currently possible. Algorithms must be incredibly efficient, designed to complete within the fleeting window before coherence is lost.

For hardware developers, improving fidelity is not simply a matter of performance—it is the precondition for usefulness. Until that constraint is addressed, the scope of problems that quantum computers can solve will remain narrow.
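
A rough model makes this concrete. If every gate succeeds with probability f (the fidelity) and a useful circuit needs d gates, the whole run succeeds with roughly f^d, so small fidelity gains compound dramatically. A minimal sketch, with illustrative numbers:

```python
# Rough illustration of why fidelity is the binding constraint: if each gate
# succeeds with probability f, a d-gate circuit succeeds with roughly f**d.
# (A simplification that ignores error correction and error structure.)

def circuit_success(gate_fidelity: float, depth: int) -> float:
    return gate_fidelity ** depth

for f in (0.99, 0.999, 0.9999):
    p = circuit_success(f, depth=1000)
    print(f"gate fidelity {f}: ~{p:.3%} chance a 1,000-gate circuit runs clean")

# 0.99   -> ~0.004%  (useless)
# 0.999  -> ~36.8%
# 0.9999 -> ~90.5%
# Each extra "nine" of fidelity buys an order of magnitude more usable depth.
```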

For founders, this creates a clear strategic imperative. A team that can demonstrate meaningful progress on fidelity—whether through materials, control systems, or architecture—commands attention, even in the absence of a defined commercial model. Investors understand that unlocking fidelity unlocks everything else.

2. Avoiding Single-Modality Risk Across the Stack

A second challenge for founders lies in navigating the fragmented and still-evolving hardware landscape. Today, multiple modalities are being pursued—neutral atoms, ion traps, superconducting circuits—each with its own trade-offs, none with definitive long-term dominance.

Betting too early or too narrowly on a single modality exposes a company to significant risk. If that technology fails to scale or loses momentum, the entire business may be stranded.

Some of the most promising startups in the space are avoiding this trap by building across the stack—focusing on layers such as algorithms, network orchestration, or middleware that are agnostic to the underlying hardware.

These companies retain flexibility. They remain relevant regardless of which qubit platform ultimately proves superior.

This approach requires discipline. It often means turning away from core hardware development in favor of tooling, integration, or interoperability—areas that may seem less glamorous but are strategically critical.

It also requires a deep understanding of the ecosystem as a whole: knowing where value will shift as hardware matures, and how to position accordingly.

3. Design the Capital Stack Properly

Quantum companies are, by definition, long-term bets. Their development cycles are measured in years, not quarters. Technical milestones come before commercial traction.

Capital needs are high. In this context, how a company raises money—and who it raises it from—matters enormously.

Founders need to design a capital stack that supports the full journey.

This means partnering with investors who understand the timelines, who can support multiple rounds, and who bring more than just money: access to strategic collaborators, guidance on non-dilutive funding, and experience navigating complex technical risk.

It also means avoiding early structures that create downstream friction—cap tables that constrain future raises, or syndicates that are misaligned on expectations.

Ultimately, the goal is to build a foundation strong enough to carry the company to meaningful inflection points: a fidelity breakthrough, a proof of quantum advantage, or a product milestone that unlocks commercial pathways.

In Deep Tech, early-stage success is less about short-term traction and more about sustained clarity of purpose—technical, strategic, and financial.


Early Adopters: Who’s Paying Now?

Government and Defense as Early Demand Drivers

In the current quantum landscape, commercial demand remains limited, but institutional interest is growing—and in some sectors, already material.

Government agencies and defense organizations represent the clearest early market. Their motivations are strategic as much as technological: the ability to simulate complex physical systems, optimize cryptographic protocols, or model scenarios at atomic scale has national security implications.

As a result, major governments are actively funding quantum programs and building dedicated infrastructure to support domestic capabilities.

This makes public institutions one of the few sources of near-term funding and engagement for quantum ventures. In some cases, partnerships may involve research grants, prototype deployments, or direct procurement of quantum hardware and services.

These relationships often come with extended timelines and regulatory oversight, but they also offer validation, resources, and long-term alignment—especially for startups tackling foundational challenges in the stack.

Pharma’s Near-Term Engagement and Finance’s Timing

Beyond government, the pharmaceutical industry has emerged as one of the most engaged corporate sectors. Large pharma players are actively tracking quantum developments, particularly for their potential to accelerate early-stage drug discovery.

The idea of simulating molecular interactions with greater fidelity—especially in cases where quantum effects play a role—has clear implications for pipeline efficiency and R&D productivity.

Several pharmaceutical companies are already exploring collaborations with quantum startups, investing in algorithm development, or supporting use-case exploration internally.

These partnerships are often structured around shared learning, rather than immediate commercial output—but they represent serious intent.

The financial sector, by contrast, remains largely in a monitoring phase. While quantum optimization holds long-term potential for areas such as portfolio construction or risk modeling, current hardware limitations constrain the practical value.

Institutions in finance are beginning to explore the space, but few are yet in a position to drive early demand.

Working with Today’s Platform Leaders Without Competing Head-On

One final consideration for founders is how to position alongside, rather than against, today’s major platform players. Companies like Google, IBM, and others are heavily invested in quantum hardware development and have made public strides in advancing the state of the art.

For most startups, the path to relevance is not to compete directly on hardware, but to integrate with, build upon, or complement these existing efforts.

This might mean developing software layers that work across multiple platforms, or creating application-specific tools that leverage access to machines via cloud interfaces.

Even when a startup is building its own hardware, maintaining interoperability with larger systems—or at minimum, understanding how to navigate the broader ecosystem—is essential.

The most effective quantum startups are those that recognize the structure of the market as it exists today: fragmented, fast-moving, and heavily subsidized by institutional players.

Building a commercially viable company in this environment requires not just technical differentiation, but strategic positioning—knowing which relationships to pursue, when to partner, and how to retain control over long-term value creation.



Disclaimer
Please be aware: the information provided in this publication is for educational purposes only and should not be construed as financial or legal advice or a solicitation to buy or sell any assets or to make any financial decisions. Moreover, this content does not constitute legal or regulatory advice. Nothing contained herein constitutes an offer to sell, or a solicitation of an offer to buy, any securities or investment products, nor should it be construed as such. Furthermore, we want to emphasize that the views and opinions expressed by guests on The Scenarionist do not necessarily reflect the opinions or positions of our platform. Each guest contributes their unique viewpoint, and these opinions are solely their own. We remain committed to providing an inclusive and diverse environment for discussion, encouraging a variety of opinions and ideas. It is essential to consult directly with a qualified legal or financial professional to navigate the landscape effectively.