There is no doubt in my mind that assertions will, in the near future, play a significant role in analog verification, whether for verifying individual analog blocks or a complete mixed-signal SoC. So yes, it is for real and it is here to stay. I hope to convince you in this blog that you should take a closer look at adopting assertion-based verification (ABV) for your next mixed-signal design. I have worked on assertions extensively over the past several years, and I still remember very well the resistance I faced from customers in early 2004, when ABV was making its official debut in the digital world. I see similarities in the barriers to wide adoption of this powerful verification technique in the analog world today and the digital world a few years back.
Let's go back to the year 2004 for a moment:
Digital engineers had been using procedural languages such as Verilog and VHDL for a very long time to design and verify their chips. Advanced verification languages such as Specman e, VERA, and C++ opened up new dimensions beyond traditional verification methodologies. Metric-driven verification (MDV) improved the productivity of verification engineers by allowing them to test designs exhaustively with random stimulus while finding bugs earlier in the verification cycle. SystemVerilog was evolving as the new standard language.
Around the same time, formal verification was also catching on. Formal tools work by proving design properties; properties describe design functionality in an easily readable form called assertions. A formal tool drives the design through its states exhaustively and proves either that a property always holds or that the design states relevant to the property can never be reached, increasing the verification engineer's confidence. It did not take the methodology leaders very long to leverage the power of assertions in a constrained-random verification environment: the exhaustiveness of random simulation made assertions a perfect fit for MDV. Once written, an assertion hangs around forever, regardless of where you are in the verification cycle. It is like a 24-hour security guard, continuously looking to spot problems.
So assertions were adopted overnight by all digital verification teams across the globe, right? Wrong. Let's look at the barriers faced by assertions in the digital world.
1) Remember, Verilog was (and still is) king. So it was no surprise that a declarative, abstract language feature such as assertions spooked engineers in the beginning. The advantages of a declarative language over a procedural one are obvious by now, but they are worth restating:
- Verilog is verbose, and code for complex verification environments becomes difficult to read and maintain.
- Verilog is not well suited to checking parallel events; if not written carefully, checking code can miss certain triggered events.
- Verilog has no built-in mechanism for functional coverage.
Assertions, on the other hand, are succinct, easily readable, and maintainable; they support spawning multiple threads (for parallel events) and have built-in mechanisms for collecting coverage.
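To make the contrast concrete, here is a minimal SystemVerilog Assertions (SVA) sketch. The signal names (clk, rst_n, req, ack) and the 3-cycle timing window are hypothetical, chosen only to illustrate the style; writing the equivalent check in procedural Verilog would take considerably more code and bookkeeping for overlapping requests.

```systemverilog
// Hypothetical handshake rule: every request must be acknowledged
// within 1 to 3 clock cycles. The property is declarative: the
// simulator spawns a new evaluation thread on each clock edge where
// req is high, so overlapping requests are checked in parallel
// automatically.
property p_req_ack;
  @(posedge clk) disable iff (!rst_n)
    req |-> ##[1:3] ack;
endproperty

a_req_ack: assert property (p_req_ack)
  else $error("ack did not follow req within 3 cycles");

// The very same property doubles as a functional-coverage point,
// recording how often the handshake was actually exercised.
c_req_ack: cover property (p_req_ack);
```

Note how the assert and cover statements reuse one property: that is the built-in coverage mechanism the bullet list above refers to.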
2) Who should write the assertions, the designer or the verification engineer? This is an ongoing debate. The logic is clear, but in our extremely time-constrained world it is logistics (such as available resources) that usually make these decisions. At the block level, the individual designer should write the assertions; it is a great way to document the design functionality before handing the block over to the verification team. The verification engineers then write the interface-level and system-level assertions.
3) Which assertion language should I use? The existence of multiple flavors of assertion languages, such as OVL, PSL, OVA, SVA, and several proprietary languages used by individual semiconductor companies, left EDA vendors and chip companies confused. Even as the dust settled and standards evolved, there was still resistance to wide adoption. Why?
- Lack of reusable assertion IP and real-world examples.
- Lack of a well-defined ABV methodology.
- Lack of sophisticated assertion debugging methodologies.
But as the tools matured, ABV found its home in mainstream functional verification methodology, and it is an integral part of every SoC verification environment today.
Now let us come back to 2011:
The complexity of SoCs has made analog mixed-signal verification a must (Applying Digital-Centric Verification Methodologies to Analog). Excellent progress has been made in the evolution of new language extensions catering to mixed-signal verification. The first and biggest challenge in verifying these complex SoCs is performance, and the introduction of analog behavioral modeling has proved to be a viable solution. Replacing transistor-level analog circuits with functional behavioral models has allowed SoC integration and verification engineers to get their job done in a reasonable time.
Now that we have the analog blocks represented in the top-level SoC, are we done? Is it enough to just verify the interface between the analog and digital blocks? No, absolutely not. We have an opportunity to enhance the entire SoC verification environment by bringing the functionality of the analog blocks into the MDV environment. Wouldn't it be great if the analog blocks contributed to the overall coverage closure of the verification plan? Wouldn't it be great if the analog blocks were verified for their core functionality in the context of the complete SoC verification environment? And what better way to accomplish this than analog ABV?
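As a flavor of what an analog assertion might look like, here is a hedged sketch against a real-valued behavioral model output. Everything here is hypothetical: the signal names (clk, en, vout), the regulator target of 1.8 V with a ±5% band, and the 10-cycle settling allowance are illustrative assumptions, not a prescribed methodology.

```systemverilog
// Hypothetical check on a real-valued behavioral model of a
// voltage regulator: once the enable has been high for 10
// consecutive clock cycles (allowing time to settle), the
// sampled output must sit within +/-5% of the 1.8 V target.
real vout;  // driven by the analog behavioral model

property p_vout_in_band;
  @(posedge clk)
    (en && $past(en, 10)) |-> (vout >= 1.71 && vout <= 1.89);
endproperty

a_vout: assert property (p_vout_in_band)
  else $error("regulator output %0f V out of band", vout);

// Covering the same property lets the analog block contribute
// to the verification plan's coverage closure.
c_vout: cover property (p_vout_in_band);
```

A check like this runs continuously in every SoC-level regression, which is exactly how an analog block's core functionality can feed the overall MDV coverage picture.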
Like most complex methodologies, this is easier said than done. In part 2 of this blog I will discuss the barriers to adoption of ABV in the analog world, the solutions available today from Cadence, and where the industry is headed.
Srikanth V Raghavan