Picture thousands of shimmering chandeliers, each made of intricate copper wires and suspended from the ceiling in a large, cold ballroom. Inside the delicate collection of loops in each chandelier sit several blocks, chilled by a helium line to sustain extremely low temperatures. Imagine that this contraption of human ingenuity can solve problems two hundred million times faster than today's fastest supercomputer. While this setup may sound like a scene from Jules Verne's Voyages Extraordinaires set in 2020, it describes a real-life center of quantum computers with the potential to revolutionize the world of problem solving.
Harnessing the fundamentals of quantum mechanics and operating at temperatures approaching absolute zero (0 Kelvin, or -273.15 °C), quantum computers can theoretically perform millions of computations at once and solve the most complex of problems in a matter of minutes, tasks that would take today's classical computers and supercomputers thousands of years to complete. They can take on tasks such as financial risk calculations, molecular science, intelligent traffic control, vaccine discovery, and weather forecasting. They're well-equipped to drive applications across industries powered by 5G, artificial intelligence (AI), and the internet of things (IoT). The possibilities to power exciting advances in various fields are endless.
Although quantum computing has been around as a concept since the 1980s, only in the last few years has the timeline of its growth, and of the applications that could build a new computing era from circuits to atoms, become clearer. While a saturating Moore's law continues to slow the growth of traditional semiconductor chips, the pace of innovation and interest in this emerging field, even from non-traditional industries, continues to rise.
Read on to learn more about the fundamentals of quantum computing, why companies are betting on the quantum leap, its impact on chip design, the opportunity for photonics, and what to expect going forward.
While classical computers store information in binary bits representing either a 0 or 1, quantum computers use quantum bits (qubits) that can store data in a state of superposition where they can be 0, 1, or a hybrid of both simultaneously, owing to the peculiar nature of quantum physics.
Each individual qubit can occupy a continuum of states between 0 and 1, not just the two discrete values of a classical bit. This allows a quantum computer to process information in parallel, unlike the sequential nature of classical computers. Because the number of states a register can represent grows exponentially with the qubit count, just 30 qubits can encode roughly a billion (2^30) basis states at once!
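To make that arithmetic concrete, an n-qubit register is described by 2^n complex amplitudes, so the state space doubles with every qubit added. A minimal Python sketch of the count (just the arithmetic, not a quantum simulation):

```python
# Number of complex amplitudes needed to describe an n-qubit register: 2**n
for n in (1, 2, 10, 30, 50):
    print(f"{n:>2} qubits -> {2**n:,} basis states")

# 30 qubits -> 1,073,741,824 basis states (roughly a billion), which is why
# simulating even mid-sized quantum registers on classical hardware quickly
# becomes intractable.
```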
As an analogy, let's take the scenario of a coin toss. According to classical probability, when you flip a coin, there is a 50:50 chance of it landing either heads-up or tails-up. Bring the theories of quantum mechanics into the mix, however, and the outcome can be heads, tails, or a superposition of both heads and tails at once, depending on the coin's orientation. This ability to explore an exponential number of possibilities over large amounts of data allows quantum computing to outperform a traditional system, making it an incredibly attractive and practical proposition. The well-known thought experiment of Schrödinger's cat, in which an imaginary cat is both alive and dead simultaneously because its fate is linked to a random subatomic event that may or may not occur, illustrates the same paradox of quantum superposition.
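The coin-toss analogy can be made concrete in a few lines of NumPy: prepare a "quantum coin" in the state (|0⟩ + |1⟩)/√2 by applying a Hadamard gate to |0⟩, then sample measurements. This is a plain state-vector sketch, not tied to any particular quantum SDK:

```python
import numpy as np

# Hadamard gate: puts |0> into an equal superposition of |0> and |1>
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])   # the classical starting state, "heads"
state = H @ ket0              # (|0> + |1>) / sqrt(2): the quantum coin

probs = np.abs(state) ** 2    # Born rule: measurement probabilities
print("P(0), P(1) =", probs)  # -> [0.5, 0.5]

# Measuring collapses the superposition to a definite outcome
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=probs)
print("observed frequencies:", np.bincount(samples) / len(samples))
```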
Through a property called entanglement, pairs of qubits can also be intertwined with each other, allowing two qubits to share a single quantum state even when they are physically separated. This cuts down the time and compute power required to process information, irrespective of how far apart these systems are. This unusual phenomenon appears to defy the traditional scientific norm that no information can be transmitted faster than the speed of light (in practice, entanglement carries no usable information on its own), and it baffled several researchers and scientists, including Albert Einstein, who famously called it "spooky action at a distance."
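As a rough illustration of what "a single shared quantum state" means, the sketch below builds the Bell state (|00⟩ + |11⟩)/√2 with NumPy and samples measurements: each qubit's individual outcome is random, yet the two always agree, which is the correlation entanglement provides without transmitting any usable information.

```python
import numpy as np

# Two-qubit basis ordering: |00>, |01>, |10>, |11>
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|00> + |11>) / sqrt(2)

probs = np.abs(bell) ** 2                    # only |00> and |11> can be observed
rng = np.random.default_rng(1)
outcomes = rng.choice(4, size=10_000, p=probs)

qubit_a = outcomes // 2   # measured value of the first qubit
qubit_b = outcomes % 2    # measured value of the second qubit

print("P(joint outcomes):", np.round(probs, 2))
print("qubits always agree:", np.all(qubit_a == qubit_b))  # True
```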
With the quantum computing market projected to reach $64.98 billion by 2030 from just $507.1 million in 2019, a handful of companies, venture capitalists, and governments worldwide continue to invest heavily in the industry.
What draws foundries, EDA, and semiconductor companies toward quantum computing is its capability to search, optimize, and simulate enormous amounts of data, including systems of linear equations, exponentially faster than classical approaches. The use of computer simulations to develop and optimize semiconductor materials, process technologies, and devices is key to optimized chip performance. Currently, the industry's fastest two-qubit gate in silicon completes an operation in an impressive 0.8 nanoseconds, approximately 200 times faster than other existing spin-based two-qubit gates.
Commercial EDA tools enabled for superconducting electronics (SCE) can support larger-scale designs, higher quality, and broader adoption of superconducting technology. Creating new circuit designs could make simple superconducting devices much cheaper to manufacture and deliver the promise of quantum computing sooner.
Unlike existing electronics design processes, where there are known commercial methods to scale devices and make transistors smaller, denser, and more interconnected to create enormously complex processors, there are still many unknowns in preparing designs that can both scale to quantum computing applications and remain resilient to fabrication variation.
Today, some of the key approaches used to build scalable quantum architectures include ion traps, CMOS (complementary metal–oxide–semiconductor) silicon, and new innovations in photonics. To advance the design of quantum computers, a combination of silicon and photonics architectures, along with enhanced automation tools that support SCE and deep cryogenic temperatures, will be vital.
For quantum computing to truly offer a fundamentally different way of calculation and shift from scientific research to widespread deployment, there are three key challenges that need to be addressed:
Companies need to invest in multiple next-generation tools so designers can successfully simulate, emulate, and thoroughly verify chip designs, which adds considerable time and computing power to the equation, even with the best classical computers in the mix. However, if quantum computers can indeed perform these calculations at once, they will drastically cut simulation time from weeks to hours and give EDA suppliers and customers new ways to innovate high-tech, intelligent hardware platforms and software applications.
A hot niche in the quantum computing realm is the use of photonics. The field of integrated photonics began in the early 1990s with long-haul telecommunication, where copper links were replaced by optical fiber links that carry more wavelengths, or colors of light, simultaneously so more data can pass through. As its applications advanced to today's high-speed optical transceivers, researchers and commercial companies learned that devices mimicking typical electronic components like transistors and capacitors could be built by manipulating the properties of light, giving birth to trends like bio-sensing, programmable photonics, quantum computing, and 3D sensing for applications like health monitoring and autonomous vehicles.
Fueled by data- and cloud-driven communications, the relentless demand for bandwidth and connectivity at low power has spearheaded recent advancements in photonics and photonic IC technologies. From developing new types of quantum light sources to finding ways to manipulate and detect quantum states of light, innovations by photonic integrated circuit (PIC) designers, migration to 2.5D and 3D form factors, and growing interest from major foundries in photonic enablement have led to greater photonic-electronic integration in silicon design plans.
This extends to silicon photonics, which applies photonics using components that work with light (photons) and silicon as the main optical medium. Combining powerful electronic and photonic circuits not only offers a link between computation and communication, but also consumes less power, increases efficiency, and enables major advances in integration and bandwidth to support the latest advancements in optical data communications.
With research and discoveries in integrated photonics accelerating, such as combining silicon switching technologies with single-photon detectors built from superconducting nanowires, we expect the technology to play a major role in addressing the current challenges of quantum computers in the not-too-distant future.
As organizations and scientists look to expand their quantum journey and advance the possibilities of testing how qubits perform in different environments, the use of computer simulations to optimize semiconductor process technologies will be essential. As part of the Synopsys TCAD product family, we offer a comprehensive suite of products that includes industry-leading process and device simulation tools, as well as a graphical user interface (GUI) driven environment to manage and analyze simulations.
QuantumATK is one such atomic-scale modeling and simulation tool; it simulates the electronic, structural, magnetic, optical, and thermal properties and transport mechanisms of a wide variety of materials before wafer-based data is available, heavily reducing time and cost investments.
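QuantumATK's workflows are script-driven; as a loose, generic illustration of what "atomic-scale modeling before wafer data" involves (using the open-source ASE library rather than QuantumATK's own API, purely as a stand-in), a minimal sketch might build a bulk-silicon model whose properties a materials simulator would then compute:

```python
# Generic sketch with ASE (not QuantumATK) of setting up an atomic-scale
# silicon model before any wafer data exists.
from ase.build import bulk

si = bulk("Si", "diamond", a=5.43)  # primitive diamond-lattice Si cell, a in angstroms
print("atoms in primitive cell:", len(si))
print("cell volume (A^3):", round(si.get_volume(), 2))
print("atomic positions:\n", si.get_positions())
# A materials simulator would attach a calculator to this structure to predict
# its electronic, thermal, or transport properties.
```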
We are also seeing an explosion of new design activity as integrated photonics expands from data communications into markets such as AI, LiDAR, and quantum computing. Harnessing the properties of light is opening new possibilities in performance, power, and footprint for next-generation technologies. This means that design teams need proven design tools to verify complex PIC designs quickly and accurately. Supporting the industry's growth in this segment, the Synopsys 3DIC Compiler platform integrating OptoCompiler is the first unified semiconductor and photonic design platform, combining mature photonic technology with our proven 3DIC integration platform for speedy design implementation and verification.
We believe that optical data communication has the potential to become the preferred interconnect technology at ever-shorter distances, from active optical cables down to silicon die-to-die connections. Giving IC designers the ability to bridge the silicon-photonic gap with photonic circuit simulation tools, so PICs can be created much like in a standard CMOS flow, will support the development roadmap of optical communications and light-based quantum processors going forward.
The quantum race is already underway, and the rapid growth in this space is exciting. That said, we are still in the early stages of benchmarking and commercializing the promise of quantum computing to its full potential; Richard Feynman's revolutionary idea is only beginning to see practical deployment. With the growing investment of tens of billions of dollars from venture capitalists, academia, and governments, it is very likely that quantum computing will reach the current speed of AI innovation in the next 10 to 15 years.
The future with photonics is also encouraging. Integrating quantum photonics and large-scale semiconductor-based quantum computing using CMOS technologies will help pave the way for numerous breakthroughs and advances in quantum architecture, making the promise to deliver enormous computational power and solve complex problems not just imaginable, but possible.