It was not surprising that a customer Q&A panel at the Logic Design Technology Event, held at Cadence last week, would focus almost entirely on functional verification. As one panelist noted, verification consumes over 50 percent of the design effort. What I found interesting was the amount of discussion around static formal checking and equivalence checking, and the extent to which they can replace gate-level simulation.
As moderator, I noted that all three user presentations at the event discussed the use of Cadence Encounter Conformal products, and all cited uses other than the equivalence checking for which Conformal was originally best known. In a presentation before the panel discussion, Arvind Chopra, design manager for the microcontroller product group at NXP Semiconductors, talked about Conformal Low Power. Camille Kokozaki, director of design automation services at Integrated Device Technology (IDT), talked about Conformal Constraint Designer and Conformal Equivalence Checker. Vishvabhusan Pati of Qualcomm, who didn’t join the user panel, talked about automated ECO handling with Conformal ECO Designer.
I asked panelists why they use Conformal. Some answers:
“We use it for equivalence checking and more and more for low power checking. There’s less time to run things at the gate level, hence it’s very important to use other techniques,” Chopra said. He added that “it does reduce the need for simulation at the gate level.”
“There is definitely a reduction in simulation,” Kokozaki said. “Once you change something and you cannot formally prove it is equivalent, guess what – you have to do a lot of simulation.”
Fred Jen, director of physical design at Qualcomm, said that “in the ideal world, we would eliminate all gate-level simulation. It would reduce a huge run time.”
Fred Jen (Qualcomm), Rajiv Parameshwaran (SanDisk), Camille Kokozaki (IDT), and Arvind Chopra (NXP) discuss verification at the Logic Design Technology Event (left to right).
An audience member noted that automatic test pattern generation (ATPG) requires gate-level simulation. “It’s true,” Jen responded. “But there are two parts to it. First you want to make sure your functional design is correct. ATPG is actually straightforward in terms of test.” Kokozaki said ATPG is a “checklist item,” and noted that “the debug and analysis of corner cases is where you spend most of your time. If you prove everything is functionally equivalent, you don’t have to go through that.”
Chopra, however, noted that “we have found some value in gate-level simulation. Sometimes there are things like clock glitches that are very hard to catch through any other means.” Later on, an audience member asked him why formal tools can’t do clock sequencing. Chopra responded that it would probably be feasible to put some assertions into a formal tool that would detect clock sequences that cause glitches.
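To make Chopra's suggestion concrete, here is a purely illustrative sketch of the kind of property such an assertion would encode. In a real flow this would be a SystemVerilog assertion proved by a formal tool; the Python function below (`find_glitches` is a hypothetical name, not from any tool) instead scans a sampled clock trace for pulses shorter than a minimum width, which is the signature a glitch leaves in a waveform.

```python
# Illustrative only: the check a clock-glitch assertion would encode,
# applied here to a sampled 0/1 clock trace rather than proved formally.
# A "glitch" is any interior pulse shorter than min_width samples.

def find_glitches(trace, min_width):
    """Return start indices of runs (high or low) shorter than min_width."""
    glitches = []
    run_start = 0
    for i in range(1, len(trace) + 1):
        # A run ends at a level change or at the end of the trace.
        if i == len(trace) or trace[i] != trace[run_start]:
            # Only interior runs count: runs touching either end of the
            # trace are truncated by the observation window, not glitches.
            if i - run_start < min_width and run_start != 0 and i != len(trace):
                glitches.append(run_start)
            run_start = i
    return glitches

# A clean clock vs. one with a one-sample spike at index 5.
clean   = [0, 0, 1, 1, 0, 0, 1, 1, 0, 0]
glitchy = [0, 0, 1, 1, 1, 0, 1, 1, 0, 0]

print(find_glitches(clean, 2))    # []
print(find_glitches(glitchy, 2))  # [5]
```

A formal tool proving an equivalent assertion would cover all reachable clock sequences, not just one simulated trace, which is exactly the advantage Chopra was pointing at.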
Some other perspectives from the panel:
- To simplify verification, stay with standard interfaces and buy standard test packages (verification IP) rather than writing them yourself (Jen).
- Analog blocks pose a big verification challenge, and one problem is that analog designers rarely have expertise in modeling (Kokozaki).
- Block-level power estimation is easy. Hierarchical power analysis is much harder and needs more support (Rajiv Parameshwaran, staff engineer at SanDisk).
- Nobody uses pure statistical timing analysis. People find violations first and then use statistical timing to see if they go away (Jen).
- Assertions are very helpful, but are not signoff quality (Chopra).
I’ll close with a pointed comment by Chopra – that there is no single tool or methodology that allows you to “sign off” and be assured that the design is completely checked and that it works. All verification approaches have both advantages and limitations. A big part of the verification challenge is finding the right approach at the right time.