Can AI data pains lead to photonics business opportunities?

AlbionVC partner Dave Grimm on why now is the time for photonics chips to shine

Marc Ambasna-Jones

Photonics chips may represent the next leap forward for the IT industry, not by replacing compute, but by unlocking new levels of infrastructure performance and energy efficiency at a time when AI workloads are pushing conventional systems to their limits.

As the global race to scale AI intensifies, the focus is shifting from compute to connectivity. Data transfer inefficiencies are emerging as the real bottlenecks in performance, particularly for hyperscalers and infrastructure providers designing multi-GPU clusters. Photonics, once considered a niche innovation, is now becoming critical infrastructure.

“The real constraint isn’t compute,” says Dave Grimm, partner at AlbionVC. “It’s about how quickly you can move data in and out of GPUs. That’s where photonics is starting to show real value.”

Grimm explains that the issue lies not in how fast models can be trained, but in how quickly and efficiently data can be shuffled across networks.

“Owners and operators of the best AI infrastructure at the moment are finding that it’s data transfer, moving the data into and out of wherever you’re computing and around your servers, that is probably a bigger limiting factor than how fast you can actually compute.”

Photonics technology enables faster, more energy-efficient data movement compared to traditional electrical connections. This has clear implications for AI workloads, where latency and power consumption are both under increasing pressure. It also aligns with broader goals around operational efficiency in data centres, which are reaching the limits of available power supply.

“The large hyperscalers will talk about their data centres currently being power bottlenecked,” says Grimm. “If they could upgrade the amount of power they could put in, they could do more usable compute. So if you can take out, say, the networking layer’s worth of power, 20% of the power of the data centre, what it means is you don’t save 20% of your energy cost. It means you turn exactly the same energy on, but it all goes into compute.”
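Grimm's arithmetic can be made concrete with a short sketch. The 20% networking share comes from his quote; the 30 MW facility cap is a hypothetical round number used purely for illustration.

```python
def reallocated_compute_power(total_mw: float, networking_share: float):
    """Within a fixed power envelope, freeing the networking layer's share
    does not reduce the total draw; it shifts that power into compute."""
    compute_before = total_mw * (1.0 - networking_share)
    compute_after = total_mw  # same envelope, networking power reallocated
    uplift = compute_after / compute_before - 1.0
    return compute_before, compute_after, uplift

# Hypothetical 30 MW power-capped data centre, 20% spent on networking:
before, after, uplift = reallocated_compute_power(30.0, 0.20)
print(f"compute power: {before:.1f} MW -> {after:.1f} MW (+{uplift:.0%})")
```

Note that removing a 20% overhead yields a 25% compute uplift, not 20% — the freed power is measured against the smaller compute baseline.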

Recent developments have helped the commercial case. In 2025, Nvidia launched its Spectrum-X and Quantum-X switches, using co-packaged optics to deliver 1.6 Tb/s per port. AMD, meanwhile, acquired photonics IP company Enosemi, joining a broader movement by major chipmakers to integrate optical components directly into AI chips. Companies like Lightmatter, Celestial AI, and Ayar Labs are also bringing optical interconnect solutions closer to commercial readiness. These technologies aim to reduce the burden on traditional interconnects, which are struggling to keep pace with escalating model sizes and throughput requirements.

Energy efficiency is an attractive proposition

The energy efficiency argument is particularly interesting. Researchers at Columbia University and within the IEEE Photonics Society estimate that up to 40% of AI system power is consumed by data movement rather than computation.

According to research published in the IEEE Photonics Journal, next-generation silicon photonics platforms can reduce interconnect energy to as low as 120 fJ/bit, significantly outperforming traditional copper-based systems. Meanwhile, IDTechEx forecasts that the silicon photonics market will surpass $54 billion by 2035, driven largely by AI infrastructure requirements.
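A back-of-envelope calculation shows what an energy-per-bit figure means at data centre scale. The 120 fJ/bit number is from the article; the ~5 pJ/bit electrical baseline and the 100 Tb/s aggregate bandwidth are assumed round numbers for comparison only.

```python
def interconnect_watts(bandwidth_tbps: float, energy_per_bit_j: float) -> float:
    """Continuous power drawn by a link fabric moving data at the given
    aggregate bandwidth and energy cost per bit."""
    bits_per_second = bandwidth_tbps * 1e12
    return bits_per_second * energy_per_bit_j

# Assumed 100 Tb/s fabric: ~5 pJ/bit electrical vs 120 fJ/bit photonic
copper = interconnect_watts(100.0, 5e-12)
photonic = interconnect_watts(100.0, 120e-15)
print(f"electrical: {copper:.0f} W, photonic: {photonic:.0f} W")
```

Under these assumptions the same traffic drops from roughly 500 W to about 12 W of interconnect power, which is the bandwidth-per-watt gain the following paragraph describes.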

By optimising bandwidth-per-watt, photonics solutions can help operators reallocate power toward actual processing, improving model performance without expanding data centre footprints. For hyperscale operators already constrained by grid capacity, this is a significant advantage. It allows for greater computational throughput without requiring physical expansion or costly energy contracts.

In effect, photonics is enabling AI-native infrastructure by increasing both the efficiency and the scalability of compute environments. The impact extends beyond performance. Energy pricing volatility, environmental targets, and operational risk mitigation are all driving demand for infrastructure that delivers more compute per watt.

Europe may have a competitive advantage. Institutions such as the University of Southampton and the Max Planck Institute, combined with long-standing expertise in optical systems manufacturing, provide a deep base of talent and intellectual property. Historic industrial clusters in regions like Devon are also playing a role, offering experienced workforces and legacy infrastructure to support emerging photonics businesses. 

Venture investors are increasingly aware of this and the potential to support next-generation infrastructure. AlbionVC, for example, has backed Oriole Networks, a start-up building full-stack photonic networking solutions for AI workloads.

“Photonics is one of the few areas where Europe can compete on infrastructure,” Grimm notes. “But it needs coordinated investment across research, commercialisation, and manufacturing.”

He also points to supply chain concerns.

“The photonic supply chain is stretched everywhere right now because it’s suddenly an interesting place to play. Amazon and Nvidia are buying up massive amounts of photonic supply chain capacity to serve their own needs. We need more production facilities online now, not in five years.”

Demands for sovereign infrastructure will increase

The broader context includes a growing interest in sovereign AI infrastructure, where nations seek to develop or host data-intensive platforms domestically. In this scenario, photonics is viewed not just as a performance booster but as a strategic capability. With hyperscalers like AWS, Google, and Microsoft already investing heavily in their own optical stacks, the window for regional innovation is limited unless policy, academia, and capital converge to accelerate local efforts.

Beyond hardware, further innovation is expected in optical memory and switching architectures. Although optical computing remains a longer-term proposition due to limitations in nonlinear function processing and phase-change memory durability, near-term advances in network architecture are likely to have a significant impact on AI deployment economics.

Research into neuromorphic photonic processors and chiplet-based interconnects is also gaining ground, offering promising pathways to integrate optical solutions into broader system designs.

Companies like Siloton and Duality Quantum Photonics in the UK illustrate how specialist photonics applications, ranging from health diagnostics to quantum communication, are also building commercial momentum. Their ability to manufacture and validate photonic components locally will be critical to reducing supply chain friction and enabling faster go-to-market for emerging infrastructure products.

In parallel, telecom and datacentre suppliers are exploring how photonic switching and routing could replace electronic alternatives in specific workloads. The goal is not necessarily to rewire everything optically but to achieve meaningful savings and performance gains in targeted, high-throughput scenarios. Co-packaged optics, for instance, could see adoption in machine learning inference workloads, where repeated operations strain traditional network fabrics.

As AI infrastructure grows more complex, technologies that improve efficiency without requiring entirely new paradigms will gain traction. Photonics, by slotting into existing data centre architectures while addressing clear operational limits, is one such enabler.

For investors, operators, and governments, the case is becoming clearer: if AI is to scale sustainably, data infrastructure must evolve, and photonics is ready to meet that demand. The next wave of infrastructure innovation may not come from chips that compute faster, but from chips that help everything else move faster.

As Grimm puts it, “the entire industry is rethinking what infrastructure looks like for AI. If you can build the system that moves data better, faster, and with less power, you’re building the foundation for everything else to scale.”

Marc Ambasna-Jones / Editor-in-chief

Working as a technology journalist and writer since 1989, Marc has written for a wide range of titles on technology, business, education, politics and sustainability, with work appearing in The Guardian, The Register, New Statesman, Computer Weekly and many more.
