Getting more value from the OVM using Metric-Driven Verification

Filed under: Functional Verification, Testbench Simulation, Verification Methodology, Enterprise Manager, OVM, eRM, SystemVerilog, Coverage-Driven Verification (CDV), Metric-Driven Verification (MDV), Open Verification Methodology

With all of the momentum around the OVM these days, there has been a lot of good discussion about testbench methodology. The OVM provides a great methodology framework for building modular, reusable verification environments, but more is needed to get the full value of adopting an HVL like SystemVerilog or e.

I have seen a lot of people adopt an HVL and continue doing directed testing -- that means a whole lot of work to learn a new language without getting a good return on the investment.

The real value of SystemVerilog and e is in enabling a Coverage-Driven Verification methodology, which is a much more scalable, automated approach to verification.

At Cadence, we have developed a Metric-Driven Verification (MDV) methodology on top of the OVM to help customers move from directed testing to a coverage-driven approach, so that they can get the full value out of adopting an HVL.

Just as the OVM provides a structured, cookbook approach to building verification environments, MDV provides a structured, cookbook approach to taking full advantage of constrained-random stimulus generation, moving from test metrics to coverage metrics, and managing the large amounts of failure and coverage data generated by these automated verification environments.
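To make the constrained-random piece concrete, here is a minimal OVM sequence item sketch. The class name, fields, and constraints are hypothetical and purely for illustration; they are not taken from any particular environment.

    // Hypothetical packet item -- names and constraints are illustrative only
    class packet_item extends ovm_sequence_item;
      rand bit [7:0]  length;
      rand bit [1:0]  kind;      // 0: data, 1: control, 2: error injection
      rand bit [31:0] payload;

      // Constraints steer randomization toward legal (and interesting) traffic
      constraint c_length { length inside {[1:64]}; }
      constraint c_kind   { kind dist {0 := 8, 1 := 3, 2 := 1}; }

      `ovm_object_utils_begin(packet_item)
        `ovm_field_int(length,  OVM_ALL_ON)
        `ovm_field_int(kind,    OVM_ALL_ON)
        `ovm_field_int(payload, OVM_ALL_ON)
      `ovm_object_utils_end

      function new(string name = "packet_item");
        super.new(name);
      endfunction
    endclass

A sequence or test then calls randomize() on items like this instead of hard-coding each stimulus value, which is what lets one environment generate many distinct scenarios.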

Just this past week, we announced significant new advancements to our Metric-Driven Verification solution.

I plan to go into more of the technical details of Metric-Driven Verification in a series of blog entries over the coming weeks, but I will give a short overview here. Metric-Driven Verification begins with a methodology for capturing a hierarchical, feature-based verification plan that focuses on what to verify rather than how to verify it.
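To give a feel for what "hierarchical and feature-based" means, here is a purely illustrative outline for a hypothetical DMA controller; it is not the actual vPlan format, just the kind of structure such a plan captures.

    1. DMA Controller Verification Plan
       1.1 Register interface
           1.1.1 Reset values
           1.1.2 Read/write access to every register
       1.2 Data transfers
           1.2.1 Single and burst transfers
           1.2.2 All legal transfer lengths and alignments
       1.3 Error handling
           1.3.1 Illegal descriptor detection
           1.3.2 Bus error response and recovery

Each leaf feature is later linked to the metric (functional coverage, assertion status, test pass/fail) that will be used to declare it verified.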

Once the plan is captured, the next step is to decide which verification approach is best to apply to each section of the plan. This might include any combination of formal assertion-based verification, SystemVerilog or e coverage-driven verification, and HW/SW verification with hardware emulation.
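For the assertion-based piece, a property like the following is the basic building block; the module, signal names, and timing are hypothetical and only meant to show the flavor. The same property can be targeted by a formal tool or checked in simulation.

    // Hypothetical handshake checker -- module, signals, and timing are illustrative only
    module req_ack_checker (input logic clk, rst_n, req, ack);

      // Every request must be acknowledged within 4 cycles
      property p_req_gets_ack;
        @(posedge clk) disable iff (!rst_n)
          req |-> ##[1:4] ack;
      endproperty

      assert_req_ack: assert property (p_req_gets_ack)
        else $error("req was not acknowledged within 4 cycles");

      // The matching cover property also feeds the assertion coverage metric
      cover_req_ack: cover property (p_req_gets_ack);

    endmodule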

For each of these approaches, there are important verification metrics that can be used to measure progress -- for example, the best metric for SystemVerilog or e coverage-driven verification is functional coverage. These metrics are mapped to the sections of the verification plan in a tool-readable format we call a "vPlan", which is read by the Incisive Enterprise Manager.
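At the code level, functional coverage is typically collected with covergroups like the sketch below; the signals, bins, and cross here are invented to mirror the kinds of features a plan section might call out.

    // Hypothetical coverage collector -- signal names and bins are illustrative only
    module packet_coverage (input logic clk,
                            input logic [7:0] pkt_length,
                            input logic [1:0] pkt_kind);

      covergroup packet_cg @(posedge clk);
        cp_length : coverpoint pkt_length {
          bins small  = {[1:8]};
          bins medium = {[9:32]};
          bins large  = {[33:64]};
        }
        cp_kind : coverpoint pkt_kind {
          bins data    = {0};
          bins control = {1};
          bins error   = {2};
        }
        // Cross coverage captures the feature combinations the plan calls out
        length_x_kind : cross cp_length, cp_kind;
      endgroup

      packet_cg cg_inst = new();

    endmodule

Coverage results collected from groups like this across a regression are what get merged and rolled up against the plan.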

The Enterprise Manager has facilities for running all of the verification jobs, collecting and merging all of the verification metric and test failure data, and providing an engineering cockpit for analyzing that data. With the verification plan captured inside Enterprise Manager, the merged coverage results of a regression run are projected against the plan, giving you dynamic feedback on exactly where you stand in achieving verification closure against your specific verification plan. As we all know, verification is an open-loop problem -- you can never fully verify a design.

With our metric-driven approach, we are trying to bring verification as close to a closed-loop process as possible.

Many customers have found that applying this "Plan to Closure" approach gives them the maximum value when they adopt SystemVerilog or e for verification, or when they combine the several different verification approaches mentioned above.

More details to follow...

Mike
