Over the past year, industry thought leaders Bryan Dickman and Joe Convey have published a series of articles and whitepapers entitled “The Quest for Bugs.” A quest is usually defined as a long search for something that is difficult to find. Hardware verification is exactly that: a long, difficult search, always constrained by cost, time, and quality. As they like to point out, “ASIC hardware verification is a resource-limited quest to find as many bugs as possible before shipping.” Don’t be disheartened. It’s a hard problem, but high quality is achievable with the right set of integrated approaches.
So, what are the key challenges facing ASIC and IP core hardware verification teams today? Let’s characterize this quest for bugs in terms of four challenges: “bug avoidance,” “bug hunting,” “bug analysis (or debug),” and “bug absence.”
Avoiding bugs at the point of design capture is one of the most effective practices for delivering high-quality designs that work. Of course, complexity is your enemy and design bugs are inevitable, but early validation of the design architecture, before RTL coding, is the best way to eliminate costly high-level design and architecture errors. Moving forward with a badly architected design can make power and performance requirements difficult to achieve in the later implementation stages, and you may find yourself endlessly iterating the RTL code, accumulating complexity and technical debt, with the attendant risk of introducing difficult and potentially critical bugs. Historically, many teams have done this architectural analysis statically, with spreadsheets. The modern and more effective way is to do it dynamically, by simulating a virtual prototype running realistic software payloads and realistic I/O traffic profiles. At this abstraction level you want capabilities that let you explore architecture decisions and analyze both the functional performance and the power performance of your candidate solutions.
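To make that concrete, here is a deliberately tiny sketch of the kind of what-if question a virtual prototype answers dynamically: given two bus masters with particular traffic profiles sharing an interconnect, what utilization and arbitration latency result? Everything in it, from the arbiter to the burst length to the issue probabilities, is hypothetical and purely illustrative; a real virtual prototype models this at far higher fidelity, driven by actual software payloads rather than random traffic.

```python
# Toy architecture-exploration model. All numbers and names are invented;
# a real virtual prototype (e.g., a SystemC/TLM model) works at far higher
# fidelity with real software and traffic traces.
import random

random.seed(42)

CYCLES = 100_000
BURST_LEN = 4                            # cycles a bus grant is held
ISSUE_PROB = {"cpu": 0.10, "dma": 0.05}  # per-cycle request probabilities

queue = []        # outstanding requests: (master, issue_cycle)
busy_until = 0    # cycle at which the bus next becomes free
latencies = []
busy_cycles = 0

for cycle in range(CYCLES):
    # Each master issues requests according to its traffic profile.
    for master, prob in ISSUE_PROB.items():
        if random.random() < prob:
            queue.append((master, cycle))
    # Simple arbiter: grant the oldest request once the bus is free.
    if queue and cycle >= busy_until:
        _, issued = queue.pop(0)
        latencies.append(cycle - issued)
        busy_until = cycle + BURST_LEN
    if cycle < busy_until:
        busy_cycles += 1

print(f"bus utilization:       {busy_cycles / CYCLES:.1%}")
print(f"average grant latency: {sum(latencies) / len(latencies):.1f} cycles")
```

Even a model this crude shows the value of the dynamic approach: change one parameter, rerun, and see the system-level effect immediately, rather than re-deriving a spreadsheet.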
Further down the development lifecycle, as hardware designers start to develop the RTL code, there are good practices they can apply to ensure the RTL is “correct-by-design.” An assertion-based verification flow really helps to cross-check the intent of the coded behaviors, with the added benefit of feeding both static (formal) verification and dynamic (simulation or hardware acceleration) workflows downstream to catch more bugs, and catch them faster and more efficiently.
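As a flavor of what such an assertion captures, consider a simple handshake rule: every request must be acknowledged within a bounded number of cycles. In a real flow this would be written once, as an SVA property, and reused by both simulation and formal engines; the Python below is only a sketch of that intent, with hypothetical signal names and a hypothetical bound.

```python
# Sketch of the intent behind a handshake assertion. In RTL this would be
# a single SVA property, reusable by simulation and formal alike, e.g.:
#   assert property (@(posedge clk) req |-> ##[1:3] ack);
# Signal names and the 3-cycle bound here are hypothetical.

MAX_ACK_DELAY = 3

def check_req_ack(trace):
    """trace: list of (req, ack) samples, one per clock cycle."""
    pending = []  # issue cycles of requests still awaiting an ack
    for cycle, (req, ack) in enumerate(trace):
        if ack and pending:
            pending.pop(0)            # oldest request is acknowledged
        if req:
            pending.append(cycle)
        for issued in pending:        # flag any request past its bound
            assert cycle - issued <= MAX_ACK_DELAY, (
                f"req at cycle {issued} not acked by cycle {cycle}")

check_req_ack([(1, 0), (0, 0), (0, 1)])   # acked in time: passes silently
try:
    check_req_ack([(1, 0), (0, 0), (0, 0), (0, 0), (0, 0)])
except AssertionError as err:
    print(err)                            # the checker catches the hang
```

The point is that the designer states intent once, close to the code, and every downstream engine can police it.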
For those inevitable bugs that have not been avoided earlier, there is the ever-present challenge of finding them, and for this you usually need a multitude of complementary tools and methodologies. As Bryan and Joe note in “The Dilemmas of Hardware Verification,” the dilemma of completeness is a constant challenge for all verification teams. Constrained-random strategies such as UVM have become mainstream for RTL simulation over the last two decades and have enabled high levels of productivity in building and executing effective bug-hunting testbenches. Computer hardware and simulation engines have evolved so that verification efficiency and effectiveness are orders of magnitude better than they were 10 years ago. At the same time, the capabilities of formal engines and formal verification methodologies have matured to a level where a static approach to bug hunting is complementary and highly effective. As you wring out most of the bugs using both dynamic simulation and static formal, you need to elevate the bug quest to faster dynamic platforms that let you run realistic software at the full system or chip level at multi-MHz speeds. Hardware acceleration with effective debug capabilities, in the form of fast emulation or fast FPGA prototyping systems, provides the platforms of choice for this later stage of bug hunting. The bugs that remain at this stage will be more complex, and if you miss them, they are most likely to end up causing problems in final silicon.
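The core idea behind constrained random is easy to sketch: draw stimulus at random, but only from the legal space that the constraints define, so every test cycle exercises a valid yet unplanned scenario. UVM expresses this with SystemVerilog rand fields and constraint blocks handled by the simulator’s constraint solver; the toy Python below substitutes simple rejection sampling, and its transaction fields and constraints are invented for illustration.

```python
# Toy constrained-random transaction generator. UVM does this with rand
# fields and constraint blocks solved by the simulator; rejection sampling
# below is the simplest stand-in. Fields and constraints are hypothetical.
import random

random.seed(1)

def random_transaction():
    while True:  # resample until every constraint holds
        tx = {
            "addr":  random.randrange(0, 1 << 16),
            "burst": random.choice([1, 2, 4, 8]),
            "write": random.random() < 0.5,
        }
        # Constraint 1: a burst must not cross a 4 KB page boundary.
        if (tx["addr"] % 4096) + tx["burst"] * 4 > 4096:
            continue
        # Constraint 2: writes must not target the ROM region.
        if tx["write"] and tx["addr"] < 0x1000:
            continue
        return tx

for _ in range(3):
    print(random_transaction())
```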
Studies have shown that debug is one of the biggest time consumers in hardware development. It’s usually a team sport: verification engineers discover unexpected behaviors, and designers typically get involved in the analysis and debug process, which normally involves waveform analysis at some stage. It’s also the part of the process that engineers get the most excited about: the reward of diagnosing a bug, fixing it, and validating the fix. Every successful debug takes the product one step closer to final release. Given that debug is such an essential and significant part of the process, debug tool effectiveness can have a big impact on time to market. Hardware teams need usable, consistent debug capabilities and a consistent experience across the multiple verification platforms (simulation, static, hardware acceleration).
At the end of the day, you are faced with the “delivery dilemma.” You recognize the “completeness dilemma,” but you need to ship the RTL or proceed to tape-out. How do you assure yourself and your customers that the product is bug-free? The “endless quest” needs to be concluded with confidence that all verification targets have been fully met and that no further effective verification can be done. A methodical and measurable approach is needed, and this can only be achieved through effective verification planning and execution practices. Verification planning (or test planning) is the cornerstone of verification and essential to the verification sign-off process. Detailed feature- and coverage-based verification planning requires engineering experience and detailed design implementation knowledge, but also effective planning and results-management frameworks. Verification is an inherently data-intensive activity, and teams need to manage, analyze, and visualize many verification data sources in a way that allows them to make effective correlations and gain the insights into verification status and completeness required for sign-off.
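At its core, the bookkeeping behind sign-off is a roll-up: merge the coverage collected across many regression runs and measure it against the plan, item by item. Production flows do this with coverage databases and planning tools rather than hand-rolled scripts, but the sketch below, with invented plan items and run data, shows the shape of the computation.

```python
# Sketch of coverage roll-up against a verification plan. Plan items,
# bins, and run results are all invented for illustration.

plan = {  # plan item -> the coverage bins it requires
    "fifo.overflow":  {"full_push", "full_push_pop"},
    "fifo.underflow": {"empty_pop"},
    "bus.burst_len":  {"len1", "len4", "len8"},
}

runs = [  # bins hit by each regression run
    {"full_push", "len1", "len4"},
    {"empty_pop", "len4"},
]

merged = set().union(*runs)  # merge coverage across all runs

for item, needed in plan.items():
    hit = needed & merged
    status = "CLOSED" if hit == needed else f"missing {sorted(needed - hit)}"
    print(f"{item:16} {len(hit)}/{len(needed)} bins  {status}")

total = sum(len(bins) for bins in plan.values())
covered = sum(len(bins & merged) for bins in plan.values())
print(f"overall closure: {covered}/{total} = {covered / total:.0%}")
```

The value of a planning framework over a script like this is everything around the computation: traceability from plan item to requirement, trend analysis across regressions, and an auditable record at sign-off.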
It is often said of the hardware verification problem that there is no “silver bullet,” and this assertion still holds true today. Verification engineers know all too well that they need an integrated platform of tools and methodologies to tackle the problem, each of which can find bugs at different stages of the hardware development lifecycle. Each must be used at the appropriate stage to reach the highest level of confidence and deliver the best possible quality in the face of the aforementioned dilemmas of verification. Moreover, experienced verification leads understand the value of having more than one approach at every stage in the process. As we said earlier, formal verification and simulation are complementary approaches, not competing ones; each can find different bugs. This is one of the facets of verification that makes it such a challenging and compelling career choice: there are many tools to master, many engineering challenges to overcome, and many innovation opportunities for both hardware development teams and EDA solution providers. Over the decades, Synopsys has invested heavily in verification research and development, and today offers the most comprehensive array of integrated verification solutions in the form of the Synopsys Verification Continuum® platform.
The Synopsys Verification Continuum delivers a highly integrated suite of verification solutions enabling hardware developers to choose the optimum solution to meet the four challenges at every stage of the development lifecycle.
Virtual prototyping with Synopsys Platform Architect™ allows architects to get the system architecture correct-by-design and avoid bugs arising from intrinsic design problems before committing to RTL coding.
Bug hunting is supported at multiple stages of the hardware development lifecycle with the Synopsys VC Formal® next-generation formal verification engine (along with multiple formal productivity apps) and Synopsys VCS®, the industry’s highest-performance simulation engine, which does most of the heavy lifting for RTL bug finding and elimination. At the other end of the development lifecycle, the fastest available hardware acceleration technologies allow teams to validate the hardware and software running together before committing to tapeout and first silicon. Synopsys ZeBu® EP1 is the industry’s fastest emulator, delivering emulation speeds of up to 10 MHz for designs of up to two billion gates, while providing full-visibility debug at speed. The Synopsys ZeBu® Empower emulation system delivers power analysis and power sign-off capabilities that allow power bugs to be found and fixed while running realistic software payloads. And the industry’s most capable FPGA prototyping system, Synopsys HAPS®-100, enables development teams to achieve the “deep cycles” that Bryan and Joe talk about, scaling out systems to reach volumes of bug hunting otherwise achievable only post-silicon, while retaining high-performance debug capabilities.
The power of the Verification Continuum also lies in the common parts that run across all of these individual solutions. For example, unified compile (UC) with the VCS solution enables a seamless transition between simulation, static verification, formal verification, emulation, and prototyping environments. Unified bug analysis and debug with Synopsys Verdi® automated debug platform allows engineering teams to find and fix bugs across all domains and abstraction levels for dramatic increases in debug efficiency.
Finally, for bug absence, i.e., sign-off, there are multiple Synopsys VC SpyGlass® solutions for RTL static sign-off, and your entire verification test planning, coverage analysis, and execution management is brought together with Verdi Planner and Synopsys VC Execution Manager, allowing you to plan, execute, and sign off verification once coverage goals are met.
So, while avoiding bugs, finding bugs, and debugging complex hardware systems may feel like an endless quest, the conclusion is clear: even though the challenges are ever-increasing thanks to the complexity, sophistication, and sheer size of modern IP cores and ASICs, corresponding advances in verification tools and methodologies are well placed to scale out with current and future demands.