How Photonics Can Light the Way for Higher Performing Multi-Die Systems

Kenneth Larsen, Twan Korthorst

Nov 21, 2023 / 5 min read

Multi-die systems are making big waves in the semiconductor industry. Through heterogeneous integration of dies in a single package, design teams are discovering new ways to breathe more life into Moore’s law. At the same time, these complex architectures are opening pathways to explore new types of components that can further optimize systems for power, performance, and area (PPA), such as photonics.

Photonics harnesses the speed of light for fast, low-power, high-capacity data transfer. A tremendous amount of data needs to move swiftly between the components of a multi-die system, and exploiting the advantages of light is one way to mitigate heat dissipation and energy consumption concerns while delivering fast data transmission. Simply put, for bandwidth-intensive designs such as high-performance computing (HPC) and AI/machine learning, copper interconnects burn up too much of the power budget. By overcoming some of the limitations of traditional electronic circuits, photonic ICs are ideally suited for this realm.
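To make the power-budget argument concrete, here is a minimal back-of-envelope sketch in Python. It uses nothing more than the relation power = energy per bit × bandwidth; the pJ/bit and bandwidth figures are illustrative assumptions, not measured values for any particular copper or optical link.

# Back-of-envelope comparison of interconnect power draw, using the
# simple relation: power (W) = energy per bit (J/bit) * bandwidth (bit/s).
# The pJ/bit and bandwidth figures below are illustrative assumptions,
# not measured data for any specific technology.

def link_power_watts(energy_pj_per_bit: float, bandwidth_tbps: float) -> float:
    """Return steady-state link power in watts."""
    return energy_pj_per_bit * 1e-12 * bandwidth_tbps * 1e12

aggregate_bw_tbps = 10.0     # assumed aggregate die-to-die bandwidth
copper_pj_per_bit = 5.0      # assumed electrical SerDes energy per bit
optical_pj_per_bit = 1.0     # assumed optical I/O energy per bit

print(f"Copper : {link_power_watts(copper_pj_per_bit, aggregate_bw_tbps):.1f} W")
print(f"Optical: {link_power_watts(optical_pj_per_bit, aggregate_bw_tbps):.1f} W")

Because the relation is linear in bandwidth, the absolute gap in watts between the two assumed energy-per-bit figures widens as aggregate bandwidth climbs into the tens of terabits per second.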

Technologies for heterogeneous integration make it possible to marry components of different process nodes and types, including electrical and photonic circuits. Read on to learn more about how integrating photonic components into multi-die systems can offer another answer to greater demands for bandwidth, energy efficiency, and density of next-generation chips. 


Computing at the Speed of Light

Optical fiber connections have traditionally been used between racks in data centers, with copper interconnects linking the various hardware components within them. However, as copper runs out of steam for higher bandwidth designs, there's a growing trend toward using optical connections over increasingly shorter distances. Thus far, optical I/Os have been demonstrated in core silicon such as switches, CPUs, and GPUs, while the technology is being explored for optical interposers and die-to-die interconnects. Even a few years ago, optical components were not part of the multi-die system discussion; today, they are very much part of the conversation.

Thanks to the properties of light, photonics can extend the reach and increase the capacity of data transmission. The result is just what compute-intensive applications need: increased bandwidth and speed with reduced latency and power consumption. In the world of high-performance applications, copper interconnects are becoming cost-prohibitive due to their bandwidth limitations and power disadvantages.

Limitations in functional scaling and yield of monolithic SoCs for compute-intensive HPC applications are driving the move to multi-die systems. Multi-die systems can take the form of a disaggregated approach, where a large die is partitioned into smaller dies for better system yield and cost versus monolithic dies. Or, they can consist of dies from different process technologies, assembled to achieve optimal system functionality and performance. Different dies are interconnected horizontally or stacked inside 2.5D or 3D packages. Regardless of the approach, these complex, interdependent systems deliver higher bandwidth, ultra-short latency, power efficiency, and size advantages compared to large monolithic SoCs.  
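To see why partitioning a large die can improve yield, consider a simple defect-limited model. The sketch below uses the classic Poisson yield relation, Y = exp(-A * D0); the die areas and defect density are assumed purely for illustration and do not describe any particular process.

import math

# A minimal sketch of the yield argument for die disaggregation, using the
# simple Poisson yield model Y = exp(-A * D0). Area and defect-density
# values are illustrative assumptions, not process data.

def die_yield(area_cm2: float, defect_density_per_cm2: float) -> float:
    """Probability that a die of the given area is defect-free (Poisson model)."""
    return math.exp(-area_cm2 * defect_density_per_cm2)

d0 = 0.2                          # assumed defects per cm^2
large_die_cm2 = 8.0               # one monolithic ~800 mm^2 die
chiplet_cm2 = large_die_cm2 / 4   # the same logic split across four chiplets

print(f"Monolithic die yield: {die_yield(large_die_cm2, d0):.1%}")   # ~20%
print(f"Per-chiplet yield:    {die_yield(chiplet_cm2, d0):.1%}")     # ~67%

Because each small die can be tested before assembly, a defect costs one chiplet rather than the entire large die, which is where the system-level yield and cost advantage comes from.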

Optical I/O chiplets that bring silicon photonics together with electrical components can deliver greater performance in a multi-die system without the power penalty for AI and other next-generation workloads. Down the road, we could see optical interconnects supporting the short reach between dies in a multi-die system, enabling the components to be placed closer together for greater density. With data volumes in the terabytes and beyond, optical interconnects can deliver higher performance at lower energy than the silicon interposers commonly used today. As distances between components continue to shrink, organic interposers with small silicon chiplets may be tapped to drive electrical signals. What if those chiplets were made of optical components? Optical links, alongside higher-performance electrical links, could answer the continued call for greater chip density and bandwidth.


Balancing Bandwidth and Energy Efficiency Demands with Photonic ICs

There are two key applications being investigated or developed that bring photonics together with multi-die systems: optical I/O for CPU/GPU systems and optical I/O for high-bandwidth memory (particularly useful for addressing the memory bandwidth gap in AI training applications). Transceiver technology itself is starting to move to silicon photonics; for example, we are already seeing high-speed transceivers in silicon driving optical connections. If optical links can be used to disaggregate compute in the data center, providing high speed with very little latency, the number of connections in these applications can also be reduced.

As data centers continue to evolve, optical chiplets and I/Os are being placed ever closer to silicon ASICs. Given this trajectory, we could eventually see these components stacked inside 3D packages to deliver even greater density and energy-efficiency advantages. Down the road, pluggable transceivers, including high-speed optical transceivers, may give way to co-packaged optics that bring together electrical and photonic dies for increased bandwidth capacity and reduced power. Co-packaged optics as part of a multi-die system provide an even more powerful way to balance bandwidth, density, and energy efficiency demands.

With lasers driving the photonic circuit, integration of the “optical power supply” emerges as a key challenge. There are concerns around heat, reliability, serviceability, and cost to contend with. Integrated lasers are demonstrating an advantage in mitigating these concerns. OpenLight, for example, offers an open silicon photonics platform with integrated lasers that eases the process of integrating silicon photonics into chip designs.

Illuminating New Pathways for Innovation

Designing and verifying multi-die systems containing photonic circuits comes with unique challenges. The traditional chip design flow must be adapted to account for the interdependencies from architecture exploration through signoff and manufacturing. For any design containing co-packaged optics, the optical and electrical components must be simulated together, not in isolation. Synopsys can help address these challenges. The Synopsys Multi-Die System Solution, which includes EDA and IP, enables early architecture exploration, rapid software development and system validation, efficient die/package co-design, robust and secure die-to-die connectivity, and enhanced manufacturing and reliability. Synopsys photonic solutions provide the industry’s only seamless design flow for photonic systems, devices, and ICs. With our team’s expertise and tool flows, we’re the only EDA vendor providing a comprehensive multi-die/multi-domain solution for simulating, implementing, and verifying optical signals and photonic circuits.

Looking ahead, it’s hard to fathom a semiconductor future without photonics or multi-die systems. As our world becomes increasingly intelligent and connected, bandwidth demands will only continue to rise. Through envelope-pushing innovations such as photonics and multi-die systems (and the ability to bring the two together), engineers are laying the foundation for new innovations that will surely transform how we live, work, and play. 

Drive Multi-Die Innovation

Synopsys is empowering technology visionaries with a comprehensive and scalable multi-die solution for fast heterogeneous integration.
