Setting up a metrics-driven verification environment isn’t just a matter of tools – it also requires a mindset change along with support from management, according to panelists at the Cadence Ecosystem booth at the Design Automation Conference. Panelists from Broadcom, STMicroelectronics, and Verilab described their challenges and successes in setting up a metrics-driven, or as some say “coverage-driven,” verification flow.
For the first half of the discussion, panelists talked about their own experiences with coverage-driven verification. They discussed topics such as code and functional coverage, Open Verification Methodology (OVM) testbenches, constrained-random simulation, the need for universal coverage metrics, and what kinds of metrics might make sense for formal verification. As moderator, I then posed the question of what it takes to move from a conventional verification methodology to a metrics-driven approach.
“I think the biggest challenge is the mindset,” said Gurvinder Sareen, engineering manager at Broadcom. “You have to have buy-in from the designers and the management. They’re used to a traditional flow driven by ad-hoc mechanisms, directed tests, and scenarios not bound to any metrics. Shifting that mindset is a big challenge.” He added, however, that “we did succeed in doing that a few years back, and now people see the value.”
Gregory Augier, read/write channel verification leader at STMicroelectronics, noted that his group is in the process of moving to metrics-driven verification and has recently decided to use OVM testbenches and functional coverage. While this transition is not yet completed, Augier said he thinks it is an “absolute requirement” for his group.
Initially, he said, there was resistance from engineers and management “because they did not fully understand what functional coverage or metric-driven verification is about, and they wanted to see an environment where they could run any kinds of directed tests. I think now they are starting to realize the value of this methodology, and they realize that starting with a good verification plan is the key to verification success.”
Jason Sprott, CTO of verification services provider Verilab, noted that his company is a heavy user of metrics-driven verification – but there’s some work involved. “The challenge we face, and try to help our clients with, is managing the human aspects of functional verification,” he said. “How do we deal with the fact that people have to specify the metrics we need to collect, and understand the data we need to analyze?”
Photo: Mike Stellfox, Jason Sprott, Gregory Augier, and Gurvinder Sareen (left to right) speak at the verification panel at the Cadence Ecosystem booth.
Panelist Mike Stellfox, principal verification solutions architect at Cadence, noted that many customers are now moving from directed testing to coverage-driven approaches. “There’s a problem, and it’s the metric,” he said. “The metric that management is used to is the test list, and they get really nervous when the DV [design verification] guys get this new constrained-random tool and say ‘don’t worry, you won’t see a test list.’ The metric is no longer how many tests you have, because you’re not writing tests. The metric is now coverage.”
One does, of course, need a tool flow that supports coverage-driven verification, and Stellfox noted some of the things it should include – such as an executable verification plan linked to metrics, a unified coverage database for simulation and formal, and an “open” methodology like OVM that supports multiple languages. The end goal is a “closed loop process” leveraging the coverage metrics obtained from various tools.
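To make the “closed loop” idea concrete, here is a minimal Python sketch of how an executable verification plan might link plan items to coverage goals and keep regressing until those goals are met. All names (the `vplan` structure, the plan items, the bins) are hypothetical illustrations, not any vendor’s API:

```python
import random

# Hypothetical executable verification plan: each plan item maps to a
# set of coverage bins and a closure goal (percent of bins hit).
vplan = {
    "packet_routing": {"bins": {"p0", "p1", "p2", "p3"}, "goal": 100.0},
    "error_handling": {"bins": {"crc_err", "timeout"}, "goal": 100.0},
}
hit = {item: set() for item in vplan}

def run_random_test(seed):
    """Stand-in for one constrained-random simulation run: returns
    whichever coverage bins that run happened to exercise."""
    rng = random.Random(seed)
    all_bins = sorted(b for item in vplan.values() for b in item["bins"])
    return set(rng.sample(all_bins, k=3))

def score(item):
    return 100.0 * len(hit[item]) / len(vplan[item]["bins"])

# The closed loop: run tests, merge coverage back against the plan,
# and stop when every plan item has reached its goal (capped for safety).
seed = 0
while any(score(it) < vplan[it]["goal"] for it in vplan) and seed < 1000:
    covered = run_random_test(seed)
    for it in vplan:
        hit[it] |= covered & vplan[it]["bins"]
    seed += 1

for it in vplan:
    print(f"{it}: {score(it):.0f}% covered")
```

The point of the loop is that the stopping condition comes from the plan’s metrics, not from a fixed test list, which is exactly the mindset shift the panelists described.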
Some other points made during the discussion were as follows:
- Sprott noted that embedded software should be part of the functional verification process because it provides the actual “use cases” under which a chip is exercised. He suggested it might be possible to take a Unified Modeling Language (UML) diagram and generate a coverage model from that.
- It’s a myth you can’t get 100 percent code coverage, Sareen said, because Broadcom has accomplished it. “Of course, that requires a lot of work.”
- Sprott made an important point about functional coverage. “It doesn’t check anything,” he said. “It’s a metric for the completeness of your stimulus.”
- It’s not yet clear what metrics make most sense for formal tools. What people really want to know is which portions of the design have been sufficiently verified by formal tools such that simulation is not needed.
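Sprott’s point that functional coverage “doesn’t check anything” can be illustrated with a toy Python coverage model. The bin names and the `CoverageModel` class are invented for illustration; note that sampling only records which stimulus combinations occurred, with no pass/fail judgment anywhere:

```python
from itertools import product

class CoverageModel:
    """Toy functional coverage model: counts hits per bin, checks nothing."""
    def __init__(self, bins):
        self.hits = {b: 0 for b in bins}

    def sample(self, point):
        # Record that this stimulus combination occurred.
        if point in self.hits:
            self.hits[point] += 1

    def percent(self):
        covered = sum(1 for n in self.hits.values() if n > 0)
        return 100.0 * covered / len(self.hits)

# Hypothetical cross of packet size x destination port: 3 x 2 = 6 bins.
bins = list(product(["small", "medium", "large"], ["port0", "port1"]))
cov = CoverageModel(bins)

# A stimulus run that only exercises two of the six combinations:
for stim in [("small", "port0"), ("large", "port1"), ("small", "port0")]:
    cov.sample(stim)

print(f"{cov.percent():.1f}% functional coverage")  # 2 of 6 bins hit
```

The model reports a stimulus-completeness gap (four bins never hit), but whether the design behaved correctly on any of those runs is the job of checkers and assertions, not of coverage.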
There was much else said during this lively discussion, which was almost entirely Q&A (no presentations), but what sticks with me is the realization that the “people” part of establishing a verification methodology is just as important as the “tool” part. “The mindset switch won’t happen overnight, it will take its time,” Sareen said. “But you will feel very good about it in the end.”