Douwe Geuzebroek, chief technology officer at BrillianceRGB, co-authored this post.
Ever use a retailer’s app to see how their furniture might look in your home? Or played around with a social media app that lets you alter your appearance? And, even if you didn’t directly participate, you probably remember witnessing the Pokémon Go craze, right? These are just a few examples of how augmented reality (AR) has made its way into our lives.
While Google Glass did not live up to its hype, there is now steady progress in the development of, and demand for, AR systems that overlay computer-generated content onto the real world. The ideal pair of AR glasses—one that is functionally sophisticated yet lightweight and comfortable enough to potentially wear all day—has not yet been invented. Delivering a solution with high-resolution imagery and low energy consumption has proven elusive. However, as engineers continue to envision what might be possible, it’s becoming clear that their path to success may rely on photonic ICs (PICs).
By harnessing the power of light, PICs deliver fast, low-power, and high-capacity data transfer. They also support the miniaturized laser beam scanning technology that could open the doors to higher performing, more comfortable AR glasses. BrillianceRGB, which develops photonic chips for laser projection applications in AR and virtual reality (VR) glasses, certainly understands the value of PICs for AR applications. Read on to learn how this Enschede, Netherlands-based company has used Synopsys photonic technology to create the smallest, most efficient RGB laser for AR glasses.
AR technology is not a recent invention. Its roots actually date back to 1968, when a Harvard computer scientist, Ivan Sutherland, created an AR head-mounted display system. Commercially, the first AR application appeared in 2008, an advertising application developed in Germany. Today, the potential for AR has broadened. While the gaming industry and those involved in the metaverse have obvious needs for AR systems, the technology is also finding its way into industrial applications, namely to create greater efficiencies. For example, faced with travel restrictions during the COVID-19 pandemic, ASML created an AR solution that enabled its customer support engineers to “visit” its customers’ cleanrooms to help keep their lithography machines serviced and up and running. In the medical field, doctors are using AR for training, to prepare for surgeries, and to access patient vitals.
Writing in Forbes, futurist Bernard Marr projects this to be the year that we encounter more of an “immersive internet,” where technologies like AR and VR usher in a more engaging, collaborative, and interactive online experience. To bring AR systems like glasses into the mainstream to support an immersive internet, engineers will need to address various challenges. One of the toughest is whether it’ll be possible to create glasses that will work for everyone, including people who wear prescription lenses, as well as in any environment, such as a bright, sunny day. But first, designers will need to produce AR glasses that look and feel like the glasses people are accustomed to wearing. So far, the products that are being rolled out carry small improvements, but no one has been able to deliver the Holy Grail. At least not yet.
AR utilizes optics to create a simulated environment that enhances the real one. AR glasses work in a similar manner to heads-up displays in vehicles, where the simulated content is projected via the glasses in front of the wearer’s eyes. The eyes are essentially the receivers, while microdisplays or lasers are the light sources and the lenses are the optical elements. The microdisplays can be based on organic light-emitting diodes (OLEDs) or liquid crystal displays (LCDs). The optical elements combine light from the microdisplays or lasers with light from the real world, projecting augmented information onto the wearer’s view of the real world.
Current designs use light guides to transfer the image created by the source to the eye, with designers using multiple tools to simulate the resulting performance. Along the optical workflow, grating couplers are designed to inject the display image into, and extract it out of, a light guide—ideally formed by conventional eyewear—using RSoft DiffractMOD™ rigorous coupled wave analysis (RCWA) and FullWAVE™ finite-difference time-domain (FDTD) software. The gratings are then added to models in LightTools illumination design software, which simulates the radiometric performance to ensure that the wearer sees a clear image with minimal contrast degradation from stray light. Tool integration simplifies these design iterations as the various components are refined and combined along the workflow.
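To give a feel for the physics these tools solve, here is a minimal sketch of the first-order phase-matching condition that sets a grating coupler’s period. The effective indices and angle below are illustrative assumptions only; real values depend on the waveguide stack and come from a rigorous solver such as the RCWA and FDTD tools mentioned above.

```python
import math

def grating_period(wavelength_nm: float, n_eff: float,
                   theta_deg: float = 0.0, order: int = 1) -> float:
    """Solve the grating coupler phase-matching condition
    n_eff = sin(theta) + m * wavelength / period  (coupling to air, n ~ 1)
    for the grating period, returned in nanometers."""
    return order * wavelength_nm / (n_eff - math.sin(math.radians(theta_deg)))

# Illustrative wavelengths and effective indices for an RGB source;
# actual n_eff values must come from a mode solver for the real stack.
for color, wl_nm, n_eff in [("red", 638, 1.75),
                            ("green", 520, 1.78),
                            ("blue", 450, 1.80)]:
    print(f"{color}: period ~ {grating_period(wl_nm, n_eff):.0f} nm")
```

Note how each color needs a different period—one reason full-color light-guide design requires careful co-simulation of the gratings rather than a single one-size-fits-all structure.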
PICs have gained favor in applications like hyperscale data centers, which benefit from their fast, low-latency data transfer. For AR glasses, PICs provide a way to handle light more efficiently, resulting in lighter, more energy-efficient glasses that can deliver crisp, vibrant, seamless holographic images. Using PICs, BrillianceRGB has developed the smallest and most efficient red, green, and blue (RGB) light source. At 4×7 mm² (with 4×4.5 mm² as the latest target) and with output power up to 100 mW, this solution enables its customers to overcome challenges around miniaturization, integration, energy efficiency, and general comfort for AR projection applications.
To create the smallest possible RGB laser module, BrillianceRGB uses silicon nitride-based PICs. Photonics and semiconductor development methods have also enabled the team to develop a highly scalable manufacturing and packaging process: the laser chip is mounted face-down against other advanced components using “flip chip” technology. The company is now working to turn its existing proof of concept into customer-specific prototypes that integrate with customers’ glasses for volume production.
BrillianceRGB laser chips are designed using Synopsys OptoDesigner, a solution many of its veteran engineers know well, having used it since their university days. The tool’s algorithmic module helps engineers design and optimize photonic components, waveguides, and complete chips. In PICs, waveguides confine and direct the flow of light energy and, when well designed, provide an efficient combination and transmission of the red, green, and blue colors. With the OptoDesigner solution, BrillianceRGB achieved first-time-right waveguide routing, saving time and effort thanks to the solution’s built-in photonic-aware algorithms. OptoDesigner is now fully integrated with Synopsys OptoCompiler, the industry’s only unified electronic and photonic design environment, with support for simulation, layout, and verification of PICs. OptoCompiler is also used to design and optimize meta-surfaces for AR applications.
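Why does efficient waveguide combination matter so much at these power levels? Every coupling and combining stage along a photonic path subtracts a few tenths of a decibel, and the losses compound. The sketch below illustrates this loss-budget arithmetic with purely hypothetical numbers—the stage losses shown are assumptions for illustration, not BrillianceRGB’s actual figures.

```python
def combiner_output_mw(input_mw: float, losses_db: list[float]) -> float:
    """Cascade insertion losses (each in dB) along a photonic path and
    return the optical power that survives, in milliwatts."""
    total_db = sum(losses_db)
    return input_mw * 10 ** (-total_db / 10)

# Hypothetical loss budget for one color path through an RGB combiner:
# laser-to-chip coupling, two combiner stages, routing, and out-coupling.
budget_db = [1.0, 0.5, 0.5, 0.3, 1.0]   # 3.3 dB total (illustrative)
out = combiner_output_mw(10.0, budget_db)  # 10 mW at the laser facet
print(f"{out:.2f} mW reaches the output")
```

Shaving even half a decibel per stage compounds into a meaningful battery-life gain for all-day glasses, which is why photonic-aware routing and first-time-right layout are more than a convenience.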
The next couple of years should be interesting, as we could start to see more mainstream use of AR glasses. So, don’t be surprised if you see someone working on a virtual screen during your train ride to work. In the meantime, photonic technologies are demonstrating their value in bringing more innovative applications of AR systems to the world.