“Everything Assertion Based” -- Assertion-Based Verification (ABV) Comes of Age for Complete Block-Level Verification

Filed under: Functional Verification, Formal Analysis, coverage driven verification (CDV), PSL, VIP, ABV, MDV, IEV, formal, AMBA, IFV, metric-driven verification, NextOp, assertion synthesis, Zocalo, assertions

Preface: are you having trouble (re-)igniting interest in formal, multi-engine, and Assertion-Based Verification (ABV) among your colleagues and management?  If so, the following article is the perfect primer to share with such skeptics (whose knowledge of ABV might be way out of date).

Like many things in EDA, what I'm about to say isn't conceptually new, but after years of development and promises the technology and methodology are now mature enough to finally declare victory.  In short, I dare to claim that for designers of blocks of fewer than 1 million flops, ABV now provides a complete verification flow, combining the strengths of formal analysis and dynamic simulation. One can describe all elements of verification -- test generation, checks, and coverage -- using assertions, and multi-engine tools like Incisive Enterprise Verifier (IEV) can drive all verification technologies to prove and fulfill every aspect of your assertions. Skeptical?  Heard all this before?  Fair enough -- but give me a chance to make the case, referencing technology and methodologies that are available right now.

Let's take a step back for a brief review to level-set everyone. Assertions are typically written on the signals and wires of a design. The object types used in assertions, any auxiliary modeling code, and the assertions themselves are all synthesizable. The behavior of a design's inputs is captured in assertions known as environment "properties" or "constraints". These are used to generate test vectors that drive the design, helping you understand its behavior and throughput; thus with a little automation you are quickly on the way to understanding and debugging the design. To rephrase: these environment properties or constraints are in effect a very succinct way to write design tests, and (here is the best part) in a way that exactly mirrors the designers' original intent for the behavior and performance.  People skilled in formal verification tend to take this for granted, but the ability to get test vectors from assertions "for free" is something every design engineer and their managers should appreciate.
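To make this concrete, here is a minimal sketch of environment properties in SVA for a simple valid/ready input interface. The module, signal names, and protocol rules are illustrative assumptions, not taken from the post -- the point is only that a formal tool reads these `assume` properties as constraints and generates legal input stimulus from them:

```systemverilog
// Hypothetical environment properties for a valid/ready input interface.
// A formal tool treats these "assume" properties as constraints on the
// inputs, and derives legal test vectors from them "for free".
module input_env_props (
  input logic       clk,
  input logic       rst_n,
  input logic       valid,
  input logic       ready,
  input logic [7:0] data
);
  // Constraint: once asserted, 'valid' must hold until 'ready' accepts it.
  assume_valid_stable: assume property (@(posedge clk) disable iff (!rst_n)
    valid && !ready |=> valid);

  // Constraint: 'data' must stay stable while a transfer is pending.
  assume_data_stable: assume property (@(posedge clk) disable iff (!rst_n)
    valid && !ready |=> $stable(data));
endmodule
```

In practice such a module would be attached to the design under test (e.g., with a `bind` statement), so the same properties serve double duty: constraints for formal analysis and protocol monitors in simulation.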

Once the environment properties are done and basic design bring-up is complete, one can start asking deeper questions in the form of checks.  Here, a check is defined as an assertion that's structured like a "watchdog" -- for example, the assertion will "fire" (output a message) if a certain undesirable condition can occur.  These checks can be proven to never/always hold using formal verification engines alone or combined with dynamic simulation engines.  Either way, the result will be very fruitful bug hunting. One question that is often asked is, "How good are these proofs for checks?"  In short, the quality of a proof depends on the completeness of the constraints or environment properties. Coverage is generated (line coverage, branch coverage, state coverage) to answer this question objectively.
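As a sketch of what such "watchdog" checks look like, here are two hypothetical SVA assertions for a FIFO block (the module, parameter, and signal names are illustrative assumptions). Each assertion fires only if the undesirable condition it guards against is ever reachable:

```systemverilog
// Hypothetical watchdog checks for a FIFO -- names are illustrative.
// A formal engine tries to prove each assertion can never fail; a
// simulation engine flags any cycle in which one does fail.
module fifo_checks #(parameter int DEPTH = 16) (
  input logic                     clk,
  input logic                     rst_n,
  input logic                     push,
  input logic                     pop,
  input logic [$clog2(DEPTH):0]   count
);
  // Check: a push into a full FIFO must never occur (no overflow).
  no_overflow: assert property (@(posedge clk) disable iff (!rst_n)
    !(push && count == DEPTH));

  // Check: a pop from an empty FIFO must never occur (no underflow).
  no_underflow: assert property (@(posedge clk) disable iff (!rst_n)
    !(pop && count == 0));
endmodule
```

If a formal engine proves `no_overflow` holds, that proof is only as trustworthy as the constraints on `push` and `pop` -- which is exactly why the coverage metrics mentioned above matter.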

One of the primary hurdles with checks has been the occasional inconclusive proof. In formal verification terms, the check "explores" -- it is not initially resolved by the tool.  The most basic remedy, typically effective 90% of the time, is simply to extend the time allocated to the solver to prove the check.

Alternatively, and for very stubborn proofs, you can start thinking about the "design scenario" being addressed by the check.  Specifically, a design scenario is like a chain of worst-case steps between point A and point B, and hence the checks reflecting these sub-steps are in some ways analogous to directed tests in traditional simulation.  In this situation you can use a combination of formal analysis and constrained-random simulation to hit intermediate points, then use those to construct the next-level point, and so on.  Each intermediate point can be captured with a cover statement.  Once a meaningful test is constructed, you can see how your checks and covers are doing along this pathway, using formal and simulation in an interleaved way to generate a complete trace.  Using these traces, one can see how all the coverage metrics are being fulfilled.  The bottom line: verification sign-off -- with mathematical certainty, and thus peace of mind -- can be a reality for design blocks verified with checks, covers, and coverage metrics.
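Continuing the hypothetical FIFO example, the intermediate points of such a scenario can be sketched as SVA cover statements (again, all names are illustrative assumptions). Each cover marks a waypoint the engines must reach on the way to the full worst-case trace:

```systemverilog
// Hypothetical cover statements decomposing a worst-case scenario
// (fill the FIFO completely, then drain it) into intermediate points.
module fifo_covers #(parameter int DEPTH = 16) (
  input logic                   clk,
  input logic                   rst_n,
  input logic [$clog2(DEPTH):0] count
);
  // Intermediate point: the FIFO reaches half full.
  cov_half_full: cover property (@(posedge clk) disable iff (!rst_n)
    count == DEPTH/2);

  // Next point: the FIFO becomes completely full.
  cov_full: cover property (@(posedge clk) disable iff (!rst_n)
    count == DEPTH);

  // End-to-end scenario: full, then eventually drained back to empty.
  cov_full_then_empty: cover property (@(posedge clk) disable iff (!rst_n)
    count == DEPTH ##[1:$] count == 0);
endmodule
```

A trace that hits `cov_half_full` can seed the search for `cov_full`, and so on -- the interleaved formal/simulation flow described above stitches these partial traces into one complete scenario.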

There is a caveat to all of this: the technology has faced two primary historical challenges.  The first concerns writing the assertions in the first place.  Despite mature, standard assertion languages (PSL and SVA) and a wealth of Assertion-Based Verification IP (ABVIP) for standard protocols (OCP, AXI, AMBA), manually writing a large volume of high-quality assertions can quickly overwhelm even expert practitioners.  In recognition of this problem, in the past year both existing vendors and new companies have begun to offer users new automation.  My team has been proud to introduce "Automatic Formal Analysis" capabilities that automatically extract assertions from the device under test (recall this recent blog post on the capability).  Additionally, new vendors Zocalo and NextOp are Cadence partners providing tools that help users automate the generation of assertions on an industrial scale.

The second problem in ABV adoption has been scalability -- more accurately, the fear of a lack of scalability.  While there are no hard-and-fast rules, pure formal analysis can certainly run out of steam on blocks that exceed ~1 million flops, especially when the logic harbors a very large state space.  The good news is that users no longer have to rely on pure formal algorithms alone.  As noted above, tools like Incisive Enterprise Verifier (IEV) can automatically mix formal and dynamic simulation technologies so you can thoroughly traverse large, complex state spaces.


To summarize with reference to the above diagram: PSL/SVA-based assertions can effectively comprise all three pillars of verification: the formal testbench (constraints/assume), checks (assert), and functional coverage.  New technologies and methodologies enable users to combine simulation, formal verification, and mixed-engine technology based on assertions in positively reinforcing ways to accomplish the three main processes of verification. The result is a complete block-level verification flow with the unique value of providing mathematically certain results.

Vinay Singh
Cadence R&D

On Twitter: http://twitter.com/teamverify, @teamverify

