The Role of Coverage in Formal Verification, Part 2

Filed under: Functional Verification, Formal Analysis, Coverage-Driven Verification, verification strategy, Model-checking, PSL, SVA, verification, Incisive, ABV, IEV, formal, methodology, IFV, assertions, coverage

As noted in the prior installment of this series, there are three main questions to be answered with coverage in formal verification:

  • How good are my formal constraints?
  • How good is my verification proof?
  • How can I feel confident my verification is complete?

In Part 1 we began to address the first question, and in this post we will continue to discuss it in the context of debugging over-constrained verification environments.

Recall that formal verification exhaustively verifies a user's check. In plain English, this means that all legal input behaviors, or traces, allowed by the specified constraints are mathematically exercised against the user's check.  In other words, a proven check cannot fail on any legal trace permitted by the constraints.

But what if we had over-constrained the analysis, and were thus unintentionally blocking the analysis of legal behavior?  As promised in Part 1, allow me to elaborate on this problem and show how to quickly debug it within Incisive Enterprise Verifier (IEV).  Specifically, IEV drives your design inputs using stimulus generated from PSL/SVA constraints. This trace-generation process can be severely restricted by over-constraining, resulting in truncated legal traces. In fact, severe over-constraining can even truncate all legal traces. Consider an example:

//psl C1: assume always ({a; a[*1:5]; !a && b}) @(posedge clk);

Constraint C1 tries to capture the input behavior: "when 'a' goes high, it should remain high for 1 to 5 cycles, and then go low as 'b' goes high."  However, per the semantics of always {sequence}, a fresh sequence starts every clock cycle. This means multiple sequences in flight must be satisfied together, which requires "a" to be both high and low simultaneously.  This contradiction in the levels of "a" implies that all traces are truncated by constraint C1.
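The "all traces are truncated" claim can be sanity-checked outside the tool. The following Python sketch is a hypothetical bounded model of the constraint semantics (not an IEV feature): it enumerates every assignment to "a" over a short window, fixes "b" high (its most permissive value, since the constraint only ever requires "b" to be 1), and checks whether any trace satisfies a fresh copy of the sequence starting at every cycle, as always {sequence} demands.

```python
from itertools import product

N = 12          # bounded trace length (hypothetical model window)
SEQ_MAX = 7     # longest match of {a; a[*1:5]; !a && b} spans 7 cycles

def seq_matches(a, start):
    """True if {a; a[*1:5]; !a && b} matches at `start` (b fixed to 1)."""
    if a[start] != 1:
        return False
    for k in range(1, 6):                      # the a[*1:5] repetition
        end = start + k + 1
        if end >= len(a):
            break
        if all(a[start + j] == 1 for j in range(1, k + 1)) and a[end] == 0:
            return True
    return False

def satisfies_C1(a):
    # "always {seq}" launches a fresh obligation every cycle; check every
    # obligation that can complete inside the bounded window.
    return all(seq_matches(a, i) for i in range(N - SEQ_MAX + 1))

survivors = [a for a in product((0, 1), repeat=N) if satisfies_C1(a)]
print(len(survivors))   # 0 -- no legal trace exists, matching the vacuity report
```

Every obligation forces "a" high at its own start cycle and the next, so "a" is pinned high across the window, yet the obligation from cycle 0 demands a low within 6 cycles; no trace survives.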

Fortunately, the "all traces are truncated" problem is readily caught by the "Vacuity" check capability in IEV.  Specifically, the following example shows IEV run on the above property, where the vacuity problem is reported on constraint C1 and the clock constraint. In general, the minimum set of constraints involved in the conflict is reported.

FormalVerifier>
FormalVerifier> search 1000
Modeling check mode:
formalverifier: *E,VACCCS: Verification cannot continue as vacuity check has detected conflicting constraints. Remodel your constraints before invoking the tool again.
//psl C1: assume always ({a; a[*1:5]; !a && b}) @(posedge clk);
|
formalverifier: (./test.v,8|0):test.C1 is a conflicting constraint.
formalverifier: test.clk is a conflicting clock constraint.
Session Status: Vacuous
  Conflicting Constraints:
    C1
    clk

How do we start correcting this problem?  First, modify constraint C1 as follows:

//psl C1: assume always ({a} |=> {a[*1:5]; !a && b}) @(posedge clk);

Now there is no vacuity issue -- with this constraint, it's legal for "a" to be low.

However, if you look carefully, some of the traces are still truncated.  Specifically, if "a" is high for one cycle, it must remain high in subsequent cycles: each cycle in which "a" is high requires the sequence "{a[*1:5]; !a && b}" to be satisfied, so the overlapping sequences again require "a" to be both high and low in the same cycle.  How do we stay out of this trap, now that only some traces are truncated by constraint C1?
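The partial truncation can be demonstrated with the same kind of hypothetical bounded model used above (again with "b" fixed high). With the |=> form, every cycle in which "a" is high launches its own copy of the consequent, so a trace where "a" is high for exactly two cycles satisfies the first obligation but leaves the second one unsatisfiable, while a trace with "a" low forever is now perfectly legal:

```python
def consequent_matches(a, s):
    """True if {a[*1:5]; !a && b} matches beginning at `s` (b fixed to 1)."""
    for k in range(1, 6):                      # the a[*1:5] repetition
        if s + k >= len(a):
            break
        if all(a[s + j] == 1 for j in range(k)) and a[s + k] == 0:
            return True
    return False

def satisfies_C1_fixed(a, horizon):
    # {a} |=> {a[*1:5]; !a && b}: every cycle with a==1 (within the horizon
    # where the consequent can still complete) must launch a matching consequent.
    return all(consequent_matches(a, i + 1)
               for i in range(horizon) if a[i] == 1)

pulse = [0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]   # 'a' high for exactly two cycles
print(satisfies_C1_fixed(pulse, 5))             # False -- cycle 2's obligation needs a[3]==1
print(satisfies_C1_fixed([0] * 12, 5))          # True  -- 'a' low forever is legal now
```

The obligation launched at cycle 1 matches (a high at cycle 2, low at cycle 3), but the obligation launched at cycle 2 requires "a" to stay high at cycle 3: the dead end the blog describes.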

IEV's "Assertion Driven Simulation" capability provides a solution to this problem.  Once the built-in dynamic simulation runs into a state where constraints conflict with each other or with the design, IEV stops and outputs a detailed debug report.  This report identifies the conflicting constraints, the design variables that are already assigned, and the inputs that cannot be assigned because of the conflict(s).  It also provides a waveform that takes you right to the dead-end state.  The following console snippet shows such a report for constraint C1 above.

FormalVerifier> search 1000
Modeling check mode:
  Vacuity check finished
Verification mode:
Search Seed :    1
No searchpoints added, only running simulation
formalverifier: *W,SRDEAD: The tool has encountered a deadend state. Use debug -solver -deadend command to view the conflicting constraints.
FormalVerifier> debug -solver -deadend
Deadend-state constraints/conditions
      test.C1, file ./test.v, line 8

Deadend-state assignments
      test.a = unassigned
      test.b = unassigned
      test.clk = 1'b0

Counter-example launched.


Based on the above report, if "a" is high for more than one cycle, the sequence "{a}" matches in consecutive cycles. This launches overlapping "{a[*1:5]; !a && b}" sequences that must all be satisfied, resulting in a dead-end state.

Hence, constraint C1 should be further modified as follows:

//psl C1: assume always ({rose(a)} |=> {a[*1:5]; !a && b}) @(posedge clk);

Now, when "a" goes high, "rose(a)" matches only at the transition of "a" from low to high, so "a" being high for multiple cycles is no longer a problem!  This can be confirmed by viewing the witness traces/waveforms generated by IEV - you will not see any truncation problem now.
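The effect of the rose(a) trigger can be checked in the same hypothetical bounded model ("b" again fixed high): an obligation is now launched only on a low-to-high transition of "a", so the two-cycle pulse that dead-ended under the previous constraint is accepted here.

```python
def consequent_matches(a, s):
    """True if {a[*1:5]; !a && b} matches beginning at `s` (b fixed to 1)."""
    for k in range(1, 6):                      # the a[*1:5] repetition
        if s + k >= len(a):
            break
        if all(a[s + j] == 1 for j in range(k)) and a[s + k] == 0:
            return True
    return False

def satisfies_C1_rose(a, horizon):
    # {rose(a)} |=> {a[*1:5]; !a && b}: only a 0 -> 1 transition of 'a'
    # launches an obligation, so consecutive high cycles add nothing new.
    return all(consequent_matches(a, i + 1)
               for i in range(1, horizon)
               if a[i] == 1 and a[i - 1] == 0)

pulse = [0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]   # dead-ended before, legal now
print(satisfies_C1_rose(pulse, 5))              # True -- one obligation, at the rising edge
```

Only one obligation exists, at cycle 1's rising edge, and it matches ("a" high at cycle 2, low at cycle 3); no overlapping copies are ever created.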

Referring back to our original process diagram below, with all of the over-constrained cases cleared away, you can now proceed to visualize the traces and move on to the "Constraint development and design exploration" phase.

 

Figure 1: Phases of verification and coverage collection

In the next segment of this series we will continue discussing this "Constraint development and design exploration" phase in the context of the second question, "How good is my verification proof?"  Until then, happy verifying!

Vinaya Singh
Architect
Cadence R&D

On Twitter: http://twitter.com/teamverify, @teamverify

 

