From Copper to Light: How to Venture into Optical Interconnect | Deep Tech Catalyst

A chat with Henry Huang, Investment Director @ Micron Ventures

Welcome to the 99th edition of Deep Tech Catalyst, the educational channel from The Scenarionist where science meets venture!

Optical interconnect is moving from backbone networks into the heart of AI infrastructure. As data centers push past the limits of copper, the question is no longer whether light can carry the load—it’s how to deliver speed and power efficiency at a cost the market will accept.

How do Deep Tech founders build optical solutions that clear today’s speed class and still scale economically?

To unpack this shift and its commercial implications, we’re joined by Henry Huang, Investment Director at Micron Ventures!

Key takeaways from the episode (TL;DR):

💡 From Copper to Light
Optics offers higher throughput and lower energy per bit, moving data over fiber and waveguides—now extending toward chip-to-chip links.

💸 Cost Is the Gatekeeper
Complex component stacks, thermal sensitivities, packaging, and yield keep optics pricier than copper until manufacturing scales.

🧩 Two Paths to Innovate
Win as a best-in-class component or as an integrated system—either way, manufacturability and economics decide adoption.

🏗️ Selling to Hyperscalers
Align with internal roadmaps, prove fit within existing programs, and show a credible path from current foundry partners to volume production.

🚀 What It Takes to Compete
1.6T throughput is table stakes; efficiency (energy per bit, bandwidth density) and repeatable yield turn performance into product.

🎯 Focus to Win
Start with a sharp beachhead—earn revenue, harden the process, and expand only when the cost structure and reliability hold at scale.


✨ For more, see Membership | Deep Tech Briefing | Insights | VC Guides


BEYOND THE CONVERSATION — STRATEGIC INSIGHTS FROM THE EPISODE

What Optical Interconnect Is and Why It Matters

Optical interconnect is a simple idea with far-reaching consequences: move information as light instead of electricity.

The function does not change—components still need to talk to one another, data still needs to flow cleanly from point A to point B—but the medium does. Rather than charging and discharging copper traces and wires, information travels on photons guided through optical fiber, carved into on-chip waveguides, or steered across carefully aligned free-space paths.

The moment the carrier switches from electrons to light, the entire profile of the link shifts. Loss behaves differently. Distance behaves differently.

And most importantly for modern systems, the relationship between throughput and power consumption changes in a way that opens headroom copper can no longer provide.

Why Power Efficiency and Speed Are the Decisive Advantages

Copper holds its status for good reasons. It is proven, it is reliable, and decades of engineering have wrung out extraordinary performance.

But physics does not negotiate.

Driving an electrical link is a repetitive act of charging and discharging conductors, and energy is spent each time.

At the scale of a modern data center, those losses compound into real power walls. Optical transmission redraws the equation. For a given data rate, moving bits as photons can be markedly more power-efficient, and the achievable bandwidth is higher.
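The charging-and-discharging argument can be made concrete with back-of-envelope arithmetic: link power is simply energy per bit times data rate, so savings of a few picojoules per bit compound across an entire data center. The pJ/bit figures and link count below are illustrative assumptions, not numbers from the episode.

```python
# Back-of-envelope link power: P = (energy per bit) x (data rate).
# All pJ/bit figures and the link count are illustrative assumptions.

def link_power_watts(pj_per_bit: float, gbps: float) -> float:
    """Power drawn by one link at a given energy per bit and data rate."""
    return pj_per_bit * 1e-12 * gbps * 1e9

copper_pj = 10.0    # assumed electrical SerDes + channel energy, pJ/bit
optical_pj = 4.0    # assumed optical link energy, pJ/bit
rate_gbps = 1600    # 1.6 Tb/s, the current working frontier

links = 100_000     # hypothetical fleet-wide link count
copper_kw = link_power_watts(copper_pj, rate_gbps) * links / 1e3
optical_kw = link_power_watts(optical_pj, rate_gbps) * links / 1e3
print(f"copper: {copper_kw:.0f} kW, optical: {optical_kw:.0f} kW")
```

Under these assumed figures, the same fleet of links draws well under half the power when the energy per bit drops from 10 pJ to 4 pJ, which is why the metric dominates buyer comparisons.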

That combination—lower energy per bit and higher throughput—is not a marginal gain. It is the core advantage that turns optics from a niche upgrade into a requirement for the next generation of infrastructure.

As more data is produced and more of it must move—AI being a major driver but hardly the only one—the system that can carry it faster and with less energy becomes the system that sets the pace.

Copper remains the default today, but the trend line is clear: where power efficiency and speed define the limit, light becomes the medium that keeps the entire stack viable.


The Hidden Alpha in Harsh-Environment Electronics | The Scenarionist

The overlooked “picks and shovels” of space, geothermal, and nuclear systems. The path to outlier returns is simple: survive heat, vibration, and radiation.


Where Complexity, Yield, and Thermal Realities Bite

Optics Still Trails Copper on Cost

The promise of optics is straightforward: higher throughput at lower energy per bit.

The obstacle is equally clear: cost.

Copper retains its dominance not because it is superior in performance, but because it is cheaper and deeply integrated into existing manufacturing flows. When budgets are tight and deployment needs to happen at scale, that price delta determines adoption.

The economics are not a mystery to practitioners.

An optical link is a collection of specialized parts and processes, each with its own yield profile and failure modes. When those elements are combined, the aggregate cost structure is harder to compress than the single-material, highly standardized copper path.

The Bill of Materials Becomes a System Problem

An optical link begins with a light source. Lasers remain the mainstream choice, though alternatives such as micro-LEDs can also drive communication.

The source must then be modulated, which pushes design into materials and devices optimized for high data rates and low drive power. The signal travels through fiber or integrated waveguides and is recovered at the receiver by photodetectors. None of these components operates in isolation.

Each introduces constraints that propagate into assembly, packaging, and test. The engineering challenge is not only to make each block work, but to make them work together repeatedly and at yield.

When a single stage underperforms, the penalty shows up as rework, scrap, or tighter process windows—all of which feed directly into cost.
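The compounding can be sketched in a few lines: in a deliberately simple model, stage yields multiply, and the cost sunk into every started unit is carried by the units that survive final test. Stage names, costs, and yields below are hypothetical, chosen only to show the mechanics.

```python
# Why a multi-stage optical assembly is hard to cost-compress:
# yields multiply across stages, and cost per *good* unit is total
# cost divided by compound yield. All numbers here are illustrative
# assumptions, not figures from the episode.
from math import prod

stages = {                 # stage: (cost per started unit, yield)
    "laser attach": (8.0, 0.95),
    "modulator":    (6.0, 0.97),
    "fiber attach": (5.0, 0.96),
    "final test":   (2.0, 0.98),
}

total_cost = sum(cost for cost, _ in stages.values())
compound_yield = prod(y for _, y in stages.values())
cost_per_good = total_cost / compound_yield

print(f"compound yield: {compound_yield:.1%}")   # ~86.7%
print(f"cost per good unit: ${cost_per_good:.2f}")
```

Even with every individual stage above 95%, the compound yield lands in the mid-80s, and every point lost there flows straight into the price a copper-based alternative never has to carry.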

Thermal Sensitivity Turns Into Operational Risk

Even with solid devices on the bench, the system changes once heat is in the loop. Light sources generate thermal load. Modulators used in optical communication can be sensitive to temperature, and their operating point can drift as the environment shifts.

Put the pieces in close proximity, and interactions multiply. What looks stable in isolation can become fragile when the full assembly is powered and pushed at speed.

The practical result is additional control, compensation, and packaging complexity to hold performance inside the window that a data center requires. That overhead is not theoretical; it is part of the real cost of delivering an optical link that is reliable over time.

Manufacturing Remains the Bottleneck to Scale

Cost is as much about process as it is about devices. The semiconductor industry’s muscle memory is built around CMOS—flows, tooling, and supply chains that naturally favor silicon and copper.

Silicon photonics fits some pieces of that puzzle, but not all.

Specialized fabrication for photonic devices, advanced packaging, and precision fiber attach are still less standardized and less scaled than their electrical counterparts. The consequence is visible in yield curves and unit cost.

Large incumbents with packaging depth and fiber-optic manufacturing discipline can narrow the gap, but that expertise is not yet ubiquitous.

A Maturing Ecosystem

There is momentum. Major foundries and packaging houses are investing in silicon photonics and optical assembly, and capabilities are expanding.

That is good for the category. It does not, however, equal immediate access for early companies. Big manufacturers will prioritize big customers, and their slots will fill with programs from hyperscalers and leading system vendors.

For a startup, the path usually begins with smaller foundry and packaging partners who are a better fit for stage and volume. That is a workable route, provided the roadmap never loses sight of mass production.

The question is always the same: can the process move from small lots to meaningful volume without breaking yield, and can it do so at a cost that holds up against copper?

Yield Is the Core of Unit Economics

Ultimately, the business case reduces to repeatability. A prototype that functions is necessary, but it is not sufficient. The market buys parts that ship in volume with predictable performance and acceptable margins.

That means aligning device design, thermal behavior, assembly flow, and supplier choice around yield targets that make a price point possible. When that alignment is in place, optics becomes a credible alternative.

Until then, copper remains the default—not because it is better physics, but because it is better economics.



Landing Your First Customer: 4 Tips for Founders

Across new approaches—new sources, advanced modulators, packaging breakthroughs, and full-stack systems—the common thread is focus.

Component specialists win by making their block unequivocally better and then proving that it slots cleanly into existing roadmaps.

System players win by convincing customers that an integrated solution reduces complexity, accelerates deployment, and holds its performance under real operating conditions.

In both cases, the bar is not a demo; it is manufacturability, yield, and credible economics. With that in mind, here are four considerations before approaching your first B2B customer.

1. Fit the Roadmap Before You Pitch the Product

In the AI hardware infrastructure stack, the most important buyers—hyperscalers and high-performance computing providers—are not discovering optics for the first time.

Many run internal programs and maintain deep optical interconnect expertise while also engaging external vendors.

Winning a place in that environment begins with alignment, not invention.

The task is to understand the customer’s roadmap, the specific problems they are prioritizing, and how an external solution can fit alongside, or inside, efforts already underway.

Without that context, even a strong device struggles to find traction, because the decision is less about a single metric and more about how a technology lands in a complex program with defined milestones and dependencies.

2. Manufacturing Is Part of the First Meeting

The practical barrier is manufacturability. There are only so many fabs specialized in silicon photonics, and many are small to medium operations that can support development but not immediate large-scale production.

Packaging and assembly capacity are similarly constrained.

The ecosystem is improving—major players are expanding silicon photonics and optical packaging capabilities—but those gains flow first to the largest customers.

The consequence for an early company is straightforward: expect to begin with partners sized to your stage and volumes, and build a plan that bridges from those partners to mass production as the product matures.

3. Prioritization Favors the Biggest Programs

Access follows gravity. Large manufacturers will prioritize large customers, and slots will be claimed by programs from names everyone knows.

That is not a reason to stand down; it is a reason to be precise.

Choose foundry and packaging partners that can deliver at the pace your stage requires, prove repeatability with them, and keep a clear line of sight to the next rung of scale.

The conversation with a strategic buyer improves when the manufacturing path is not theoretical but evidenced by runs that hit the target specifications reliably.

4. Cost Discipline Travels With You

The refrain does not change as you move from prototype to pilot: cost, cost, cost. Even with a differentiated architecture, the economics will be judged on yield, assembly flow, and the degree to which the process can be repeated without surprises.

The right partners at the right time help, but they do not replace the need to design for manufacturability from the outset.

The stronger position in front of a hyperscaler or HPC buyer is a solution that already lives within their speed class, demonstrates competitive energy per bit and bandwidth density, and shows a credible path to volume that brings unit cost down as yields climb.

When those elements are in place, the conversation shifts from “can this work here?” to “how fast can we roll it into the plan?”



The Economic Sequence: Throughput → Efficiency → Cost

1. Throughput Is the Ticket to Enter

In this market, speed is not a differentiator; it is eligibility. The industry cadence sets the bar, and individual companies do not get to move it.

Eight-hundred gigabit links are already treated as commodity. The working frontier is 1.6 terabits per second, and any serious contender must show that level of performance.

Demonstration matters in the concrete sense—delivering the stated throughput under realistic operating conditions, not a lab-limited experiment. If a product cannot live in the current speed class, it does not join the conversation.

2. Energy per Bit and Bandwidth Density Decide Who Stays

Once the speed bar is cleared, selection shifts to efficiency. Buyers compare peers within the same data rate on two fronts: energy per bit and bandwidth density.

The language is practical—energy per bit for power, and how much performance can be packed into a given silicon footprint.

Clearing 1.6T at excessive power does not earn a slot; it simply moves the bottleneck to the thermal budget. Hitting the same rate with lower energy and tighter integration is the way a new architecture proves it carries a real advantage.
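One way to see the thermal-budget point is to ask how many 1.6T modules an assumed cooling budget can host at different energies per bit. The faceplate budget and pJ/bit values below are illustrative assumptions, not specifications.

```python
# Clearing the speed class at too-high power just relocates the
# bottleneck to the thermal budget. The cooling budget and pJ/bit
# values are illustrative assumptions.
RATE_GBPS = 1600              # 1.6 Tb/s per module
THERMAL_BUDGET_W = 400        # assumed cooling budget per faceplate

def modules_within_budget(pj_per_bit: float) -> int:
    """How many modules at this energy per bit fit the thermal budget."""
    watts_per_module = pj_per_bit * 1e-12 * RATE_GBPS * 1e9
    return int(THERMAL_BUDGET_W // watts_per_module)

for pj in (4.0, 8.0, 16.0):
    print(f"{pj} pJ/bit -> {modules_within_budget(pj)} modules")
```

Under these assumptions, halving the energy per bit roughly doubles the number of 1.6T modules the same cooling envelope can support, which is the sense in which efficiency, not raw speed, decides who stays.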

3. Yield and Manufacturability Turn Performance Into Product

After throughput and efficiency, the decision turns on whether the device can be built repeatedly at an acceptable cost. Yield is the hinge.

A compelling prototype that ships poorly is not a product, and buyers have learned to ask how a process behaves outside a controlled run. This is where foundry partners, packaging choices, and assembly flow become business variables.

The mandate is to show that the same performance can be produced at scale with yields high enough to support pricing that holds up in the field.

That means designing for manufacturability early, aligning with partners that match the company’s stage, and proving yield improvement as lots grow.

A Credible Path to Unit Economics

To recap, the economic story follows a simple sequence:

  • First, meet the industry speed class so the device is relevant.

  • Second, show energy per bit and density that compare favorably within that class.

  • Third, deliver yield data from runs that are large enough to matter.

When those elements line up, cost curves begin to make sense, and the unit economics become credible to a hyperscaler.

The value proposition is not a claim about potential; it is a set of numbers that show how performance translates into power savings and how manufacturing translates into price.

That is the point at which a technology shifts from interesting to adoptable.

Focus to Win: Niche → Revenue → Expansion

In optical interconnect, ambition often outruns timing. The total addressable market for full-stack optical systems is enormous, but the path to reach it is not direct.

The companies that build lasting traction begin by focusing on a single, well-defined use case where their technology delivers a clear, measurable advantage.

Narrow focus is not a compromise—it is a strategy for survival.

By concentrating resources on one application, a team can refine its process, validate performance under real operating conditions, and demonstrate early revenue before chasing scale. The alternative—spreading too thin across multiple end markets—usually leads to dilution of both capital and credibility.

The Test Equipment Beachhead: A Case Study

One instructive example comes from the modulator space. A company we discussed, which works with thin-film lithium niobate—a material prized for high-speed, low-power modulation—chose not to launch directly into the mainstream communications market.

Instead, it targeted test equipment vendors, a smaller but technically demanding niche.

Test systems need precise light generation and modulation to validate other optical components, making them ideal proving grounds for new photonic technologies. The volumes are moderate, but the margins are healthy, and the customers are sophisticated enough to recognize technical merit.

By serving that segment first, the company was able to mature its process, refine its cost structure, and accumulate real operating data. The discipline of shipping into a professional, paying customer base sharpened the technology faster than internal testing alone could have done.

Earning the Right to Scale

With those fundamentals in place—validated performance, stable yield, and production experience—the company could credibly step toward higher-volume applications.

After demonstrating its advantage in test equipment, the next logical targets became optical plugs and communication links. At that point, the technology was no longer experimental; it was a working platform with a known cost profile and proven manufacturability.

This staged approach mirrors how Deep Tech businesses often win: by earning the right to expand. Each stage of commercialization is a test of readiness—not only technical, but operational and financial. By the time the company moves into larger markets, the core process is mature enough to withstand the pressure of scale.

Discipline > Velocity

In an environment fueled by AI demand and record infrastructure spending, it is tempting to move fast, broaden scope, and claim a large market story early. Yet in Deep Tech, discipline outperforms speed.

A narrow beachhead that generates revenue and hardens the process creates leverage that no projection can replace. It anchors valuation to evidence, aligns investor expectations with operational reality, and allows the company to scale on its own terms.

Optical interconnect may promise vast potential, but the companies that capture it are those that sequence their ambition—first mastering a specific problem, then expanding from a position of proven strength.


Rumors

The Optical Interconnect Rush: Powering the New AI Network Stack | Rumors

Oct 2

Six Startups, One Rumor. A new generation of photonic technologies is rewiring data‑center networks for the AI era—at light speed and with radical efficiency.


Disclaimer
Please be aware: the information provided in this publication is for educational purposes only and should not be construed as financial or legal advice or a solicitation to buy or sell any assets or to make any financial decisions. Moreover, this content does not constitute legal or regulatory advice. Nothing contained herein constitutes an offer to sell, or a solicitation of an offer to buy, any securities or investment products, nor should it be construed as such. Furthermore, we want to emphasize that the views and opinions expressed by guests on The Scenarionist do not necessarily reflect the opinions or positions of our platform. Each guest contributes their unique viewpoint, and these opinions are solely their own. We remain committed to providing an inclusive and diverse environment for discussion, encouraging a variety of opinions and ideas. It is essential to consult directly with a qualified legal or financial professional to navigate the landscape effectively.