Should analog/mixed-signal verification be more like digital verification, with separate verification teams, a methodology like the Universal Verification Methodology (UVM), and metric-driven verification (MDV)? Yes, according to three mixed-signal engineers at a panel discussion at the Cadence EDA360 Theater at the Design Automation Conference (DAC) June 8.
The panel was titled, "Stop thinking, start acting - methods to shrink the verification deficit." Panelists were asked to address three questions: why is mixed-signal verification so challenging, how are you addressing these challenges today, and what advancements in EDA tools are needed? I served as moderator, and the panelists were as follows:
- Jonathan David, senior staff engineer, Qualcomm (speaking in photo below)
- Martin Barnasconi, product manager for AMS/RF System Design Methodologies at NXP Semiconductors (seated at left)
- Hao Fang, senior design manager, LSI (seated in middle)
Photo by Joe Hupcey III
Each panelist gave a short presentation about their particular mixed-signal verification challenges, solutions, and desires for EDA support.
Qualcomm: Speed, Power, and Timing Closure
David showed a block diagram typical of the wireless transceivers his group verifies, including elements such as processors, A/D and D/A converters, and filters. He identified several challenges. First, he said, analog/mixed-signal simulators are too slow, especially when RF frequencies are involved. Another challenge is low-power design: analog power intent is captured directly in schematics, while digital simulators expect a separate power format file (such as the Common Power Format, or CPF).
Timing closure is another challenge - most timing analysis tools don't handle mixed-signal netlists well, David said. Automatic test pattern generation (ATPG) tools typically don't handle the non-RTL portions of the design, but there's still a need to generate vectors to test the chip. Finally, the typical "bottom-up" approach to analog design can result in late integration and short test periods.
One solution Qualcomm uses is real number modeling, which represents analog signals as time-varying real values so they can be simulated in a digital environment. This does require some additional resources for model development and verification, however. EDA "opportunities" identified by David include automatic modeling tools, the ability to extract power intent from schematics, mixed-signal parasitic and timing analysis, mixed-signal ATPG, and UVM support in the Virtuoso Analog Design Environment (ADE).
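To make the idea concrete, here is a minimal sketch of what real number modeling buys you. It is plain Python, deliberately simulator-agnostic, and the 8-bit DAC, the `VREF` value, and all names are illustrative rather than anything from Qualcomm's flow: the analog block collapses to a function over real values that an event-driven digital loop can evaluate cycle by cycle, with no circuit equations to solve.

```python
# Illustration of real number modeling: instead of solving
# continuous-time circuit equations, an analog block is reduced to a
# function over real values that a digital (event-driven) simulator
# can evaluate once per clock tick.

VREF = 1.0    # full-scale reference voltage (assumed for illustration)
NBITS = 8     # DAC resolution (assumed)

def dac_real_model(code: int) -> float:
    """Behavioral 8-bit DAC: map a digital code to a real output voltage."""
    return VREF * code / (2 ** NBITS - 1)

def simulate(codes):
    """Digital-style loop: one model evaluation per clock cycle."""
    return [dac_real_model(c) for c in codes]

if __name__ == "__main__":
    # Drive the "analog" block with digital codes; get real-valued samples.
    print(simulate([0, 128, 255]))
```

Because the model is just arithmetic on reals, it runs at digital-simulation speed; the trade-off David mentioned is that someone has to write and verify the model against the transistor-level circuit.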
Making the case for separate analog verification teams, David quoted science fiction writer David Brin, who said that "criticism is the only known antidote to error." This means that someone other than the original designer should review design work, David said. "You don't necessarily want the guys who are really good at analog design to start thinking about object-oriented programming and UVM," he said. "I think it's time to have a separate skill set for analog verification."
LSI: Increasing Digital Circuitry Calls for Changes
First, a bit of background: Hao Fang was a co-author of a DVCon paper on "UVM-MS," presented by Cadence and LSI, which I described in a previous blog post. So, it is not too surprising that his DAC presentation focused on the need to bring digital verification techniques into the analog world.
Fang noted that LSI's hard disk drive technology requires read-channel systems-on-chip (SoCs) along with pre-amps that interface with a customer's transducer. The pre-amps pose many challenges, including multi-Gbit/second data rates, multiple channels, and traces that behave as lossy, non-uniform transmission lines. Fang observed that his company's traditional "Large A, Small D" designs are turning into "Large A, Large D," with increasing digital control circuitry for operation modes, calibration, trimming, power management, and test.
LSI's current solution includes analog block-level simulation with the Cadence Spectre simulator, digital block-level verification with UVM, and chip top-level AMS simulations. Cadence and LSI collaborated to develop a wreal model for fast execution. What's needed now, Fang said, are dedicated digital and analog verification teams, an executable verification plan, a self-checking testbench, analog "coverage," and analog assertions. He noted that the Cadence Accelerated Parallel Simulator (APS) helps reduce run-times by 4X or more.
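For readers unfamiliar with what "coverage" and assertions might mean for a continuously valued signal, here is a toy sketch. It is plain Python, and the voltage limits, bin count, and function names are invented for illustration; they are not LSI's or Cadence's actual mechanism. Coverage records which parts of the legal range a waveform actually exercised during simulation, while the assertion flags any sample that escapes that range.

```python
# Toy "analog coverage" and analog assertion on a real-valued waveform:
# bin the observed samples (coverage) and flag any sample outside a
# legal range (assertion failure). Limits and bin count are assumptions.

LOW, HIGH = 0.9, 1.1   # legal range in volts (illustrative)

def check_waveform(samples, nbins=4):
    coverage = [0] * nbins   # samples observed in each sub-range
    violations = []          # (sample index, value) assertion failures
    for t, v in enumerate(samples):
        if not (LOW <= v <= HIGH):
            violations.append((t, v))   # analog "assertion" fires
            continue
        # Map v in [LOW, HIGH] to a coverage bin index.
        b = min(int((v - LOW) / (HIGH - LOW) * nbins), nbins - 1)
        coverage[b] += 1
    return coverage, violations
```

A verification plan could then demand, for example, that every bin be hit at least once and that `violations` stay empty - the analog analogue of the bin coverage and assertion checks digital teams already use in MDV.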
In the future, Fang said, LSI intends to use UVM-MS to apply MDV to analog. "Instead of a bottom-up approach, we want a more top-down approach from a specification to drive the design and the verification," he said.
NXP: Verification from Chip to System
Barnasconi noted that NXP is verifying not just chips, but networking systems, and must thus consider system and IC verification across analog/digital and hardware/software boundaries. Such systems are likely to include analog/mixed-signal transceiver functions, digital hardware such as controllers, and embedded software. To guarantee a fail-safe system, coverage-based and assertion-based mixed-signal verification techniques are essential, he said.
Barnasconi said that NXP is introducing a "structured" analog/mixed-signal verification methodology, answering the questions "how do I go from a spec to a design, and from a verification plan to a testbench?" NXP uses proprietary verification IP for mixed-signal flow automation and creation of self-checking testbenches.
One challenge today, Barnasconi said, is learning how to map techniques such as UVM, MDV, and assertion-based verification into the analog/mixed-signal domain. Barnasconi, who also chairs the Open SystemC Initiative AMS working group, called for AMS language extensions for UVM, SystemVerilog, and SystemC.
Much Work Ahead
A question-and-answer period revealed many of the challenges of bringing digital techniques into mixed-signal verification, including the skill sets of today's analog designers, the meaning of analog "coverage," the development of analog assertions, mixed-signal simulation language support, and directed versus random testing.
Offering some closing advice, Qualcomm's David noted that if companies try to adopt a new verification methodology too quickly, without enough people, there's a risk of slipping tapeouts without improving coverage. "I want to find the right project where there's not too much risk and I have enough people working on it," he said. "Adopting this methodology on this project should be the right way to mitigate the overall risk for all projects."