This article was originally published in EE Times.
Over the past decade, artificial intelligence (AI) has matured from a laboratory curiosity into a pervasive technology applied to everything from fraud detection and website chatbots to the smartphone app that reads you the weather on request. Propelled by a host of consumer, business, healthcare, and industrial applications, the AI market is poised to reach $1,597 billion by 2030, up from $87 billion in 2021, a compound annual growth rate (CAGR) of 38.1%. This explosive growth rests on a foundation of semiconductor innovation, from the AI systems-on-chip (SoCs) that perform parallel data processing at high speed to the AI tools used to design and fabricate those chips.
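As a quick sanity check on those figures, the growth rate follows directly from the two endpoints over the nine-year span from 2021 to 2030:

$$\text{CAGR} = \left(\frac{\$1{,}597\text{B}}{\$87\text{B}}\right)^{1/9} - 1 \approx 0.382,$$

which agrees with the quoted 38.1% to within rounding of the endpoint values.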
Today, the semiconductor infrastructure that powers AI is no longer the sole domain of incumbent chip vendors and foundries. Aided increasingly by AI tools, enterprises ranging from automotive manufacturers to hyperscalers to device builders are developing innovative solutions by creating their own custom chips.
In this article, we look at AI trends in 2023 that promise to both drive this innovation and help address some of the core challenges the industry faces today.
With over 2 billion AI-enhanced devices projected to ship in 2025, edge computing continues to be a huge market driver for AI, spurring innovation in both technology and applications. As more AI processing moves to the edge, much of the coming year's effort and innovation will center on one aspect in particular: integration. An edge device's AI system needs to perform a variety of tasks, which requires not just different types of compute capability but also different types of memory, connectivity, and, of course, sensor input. Building systems that integrate these heterogeneous components into a functional, power-efficient, and manufacturable design is a significant challenge. Beyond semiconductor development and fabrication, the effort must encompass other engineering domains such as mechanical, optical, and electrical design, along with both digital and analog semiconductor design.
Integration in the data center presents a different but equally diverse set of challenges. Although the data center use case may be more digital and predictable, that doesn't make the AI problem any easier. To deliver the deep compute performance required, multiple die must be integrated into a single silicon device. These components consist mostly of dense digital logic: the raw compute horsepower needed to accelerate large neural networks. An example might be the integration of eight identical CPUs into a single 3D device, which presents unexpected challenges involving substrates, die-to-die communication, heat, noise, and, above all, power.
As chip and fabrication complexity continues to rise, the adoption of AI design tools is growing exponentially. Last year, Wired reported on the world's first AI-designed chip. In just one year, the number of commercial chips designed with AI has increased by at least an order of magnitude. We expect this trend to continue, going from the hundreds of designs that Synopsys has helped tape out this year to thousands of designs in 2023, as the proliferation of AI design technology accelerates, training data sets become more comprehensive, and design teams learn to leverage the strengths of this new breed of tools both deeply and broadly.
As the technology matures over the coming year, AI-driven design capabilities will deliver productivity breakthroughs in new areas of chip design and help teams create more complex designs that meet power, performance, and area (PPA) targets. Don't be surprised to see reinforcement-learning-based applications for a range of design challenges reach the market in 2023 and win adoption quickly, from digital design to custom design and verification.
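As a rough illustration of the idea, and not a description of any commercial tool, the sketch below uses a simple epsilon-greedy bandit, one of the most basic reinforcement-learning techniques, to search a handful of hypothetical tool settings. The evaluate_ppa() reward function is a made-up stand-in for what would really be full synthesis and place-and-route runs:

```python
import random

# Hypothetical, simplified sketch of RL-style design-space exploration.
# A real flow would score candidates with actual implementation runs;
# evaluate_ppa() below is a toy model for illustration only.

ACTIONS = [  # candidate tool settings (illustrative values)
    {"clock_ns": 1.0, "utilization": 0.70},
    {"clock_ns": 0.9, "utilization": 0.75},
    {"clock_ns": 0.8, "utilization": 0.80},
    {"clock_ns": 0.8, "utilization": 0.85},
]

def evaluate_ppa(cfg):
    """Toy reward, higher is better; stands in for a real PPA measurement."""
    perf = 1.0 / cfg["clock_ns"]             # faster clock -> better performance
    power_penalty = 0.6 * perf               # faster clocks tend to burn more power
    area_penalty = 0.5 * cfg["utilization"]  # denser placement risks congestion
    noise = random.gauss(0, 0.02)            # run-to-run variation
    return perf - power_penalty - area_penalty + noise

# Epsilon-greedy bandit: balance exploring new settings vs. exploiting good ones.
estimates = [0.0] * len(ACTIONS)
counts = [0] * len(ACTIONS)
epsilon = 0.2

for trial in range(200):
    if random.random() < epsilon:
        a = random.randrange(len(ACTIONS))                            # explore
    else:
        a = max(range(len(ACTIONS)), key=lambda i: estimates[i])      # exploit
    reward = evaluate_ppa(ACTIONS[a])
    counts[a] += 1
    estimates[a] += (reward - estimates[a]) / counts[a]               # running mean

best = max(range(len(ACTIONS)), key=lambda i: estimates[i])
print("best setting:", ACTIONS[best], "estimated reward:", round(estimates[best], 3))
```

The appeal of the approach is that the loop needs no prior model of the design space; it simply learns which settings pay off from the rewards it observes, which is why it scales to spaces far too large for exhaustive sweeps.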
One of the most challenging and time-consuming aspects of developing a new AI application is creating a model and then optimizing and training it to perform a specific task. It's critical work that, at present, can't be fully automated. As AI applications proliferate, model building is increasingly becoming a barrier to advancement. This has spurred growing research into what are known as foundation models.
A “foundation model” is an AI model that you design once and then train using very large data sets to achieve various objectives. Once trained, the model can be adapted to many different applications. The goal is to spend less time architecting and engineering a new model specifically for each application. Instead, you can take an established foundation model and teach it to do new things by presenting it with different kinds of data. The sheer scale of foundation models allows users to achieve entirely new capabilities that were previously unattainable using earlier architectures.
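To make the adaptation step concrete, here is a minimal fine-tuning sketch, assuming the open-source Hugging Face transformers library and the publicly available distilbert-base-uncased checkpoint; the four-example sentiment dataset is obviously a toy stand-in for a real task corpus:

```python
import torch
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Start from a pretrained foundation model rather than designing one from scratch.
name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=2)

# Tiny illustrative dataset; a real task would use thousands of labeled examples.
texts = ["great product", "terrible experience", "works perfectly", "waste of money"]
labels = [1, 0, 1, 0]
enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

class TinyDataset(torch.utils.data.Dataset):
    def __len__(self):
        return len(labels)
    def __getitem__(self, i):
        item = {k: v[i] for k, v in enc.items()}
        item["labels"] = torch.tensor(labels[i])
        return item

# Fine-tune: a brief pass over task-specific data adapts the pretrained weights.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=3,
                           per_device_train_batch_size=2),
    train_dataset=TinyDataset(),
)
trainer.train()
```

The point of the sketch is the ratio of effort: all of the architecture and pretraining work is inherited from the foundation model, and adapting it to a new task reduces to a short training run on task data.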
Foundation models are driving another AI evolution that will take center stage in 2023: generative AI. This new wave of AI focuses on creating new content, building on the underlying models' ability to train on very large bodies of work, including text, images, speech, and even 3D signals from sensors. Depending on the input, the same foundation model can be trained to synthesize new content, whether composing art or music or generating text for chatbots. Generative AI will make creating new content breathtakingly easy because the models are not designed for a specific task; they're designed for the art of learning.
AI applications are enormously powerful, but they can also consume massive amounts of energy. A recent study published at the ACM Conference on Fairness, Accountability, and Transparency found that a single training run of a popular transformer model can consume roughly eight times the average annual household energy budget in the United States. Clearly, if things don't change, scaling out these AI applications will critically impact our energy consumption, our infrastructure, and the way we build data centers.
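For scale, a back-of-the-envelope figure, assuming the commonly cited U.S. average of roughly 10,700 kWh of household electricity use per year (the study's own accounting may differ):

$$8 \times 10{,}700\ \text{kWh} \approx 85{,}600\ \text{kWh} \approx 86\ \text{MWh}$$

for a single training run.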
Excitingly, AI design tools can directly help our global efforts toward net zero by optimizing AI processor chips for energy efficiency. According to an informal study Synopsys recently conducted, optimizing an AI chip with AI design tools delivered roughly 8% energy savings on average. Now apply that 8% savings to every data center in the world. That represents an enormous amount of energy, and this study only scratched the surface.
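A rough, hedged estimate of what that could mean, assuming the IEA's figure of roughly 200 to 250 TWh of annual global data center electricity use and, optimistically, that the 8% savings applied across that entire footprint:

$$0.08 \times 200\ \text{TWh} \approx 16\ \text{TWh per year},$$

on the order of the annual electricity consumption of well over a million U.S. households.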
And the benefits extend beyond chip design: think optimizing a nation's energy grid, maximizing crop yield, minimizing water consumption, reducing the energy required to cool buildings, and more. These are just a few of the ways AI can both reduce its own carbon footprint and bring compensating benefits to the environment.
Over the next year, advances in AI tools and functions will support innovation across the industry, providing easier access for new entrants and leveling the playing field for smaller enterprises. To keep moving forward, enterprises need AI chipsets that deliver more compute at higher speeds while consuming less power. Synopsys's broad portfolio of AI design solutions and IP provides companies with an avenue to competitive differentiation, whether they are in the application space, the chip development space, or both. We believe in the ability of AI to change the world through disruptive technologies. That's why we are focused on making it a little easier and faster to create these potent silicon devices.