User View: How Metric-Driven Verification Improves ASIC and FPGA Quality

Filed under: Industry Insights, ASIC, FPGA, OVM, Metric-driven verification, Incisive, MDV, SystemVerilog, metric-driven, C++, ATE, Teradyne, Kheyfets

To keep bad chips and boards from going into the field, automatic test equipment (ATE) has to be reliable. That's why Teradyne, a major ATE provider, takes verification quality very seriously. With help from Cadence, Teradyne moved from a "home grown" methodology to metric-driven verification (MDV) several years ago.

In a Jan. 10 press release announcing a Cadence Silicon Realization verification capability, a Teradyne representative noted that MDV has helped his company improve predictability and quality for both ASICs and FPGAs. I talked to Vlad Kheyfets, semiconductor design engineer at Teradyne, to get more details about the company's experience with MDV.

From Home-Grown to Standard Methodology

Kheyfets' group is verifying ASICs with up to 10 million gates, as well as the largest currently available FPGAs. These ASICs and FPGAs go into Teradyne's automated test products. The ASICs include some mixed-signal content.

Before the adoption of MDV, Kheyfets said, Teradyne used a "home grown methodology" based on C++ and Verilog, consisting mostly of directed testing. "We created a test plan, and then we created the feature list, and then we mapped the test plan to the feature list manually," he said. "We ran the tests and mapped them back to the features in the test plan."

Teradyne began the adoption of MDV in 2006, with help from Cadence AEs. "We realized our home-grown methodology had some flaws," Kheyfets said. "It is better to go with a standard methodology." These days Teradyne uses MDV with Incisive Enterprise Manager. The company still uses C++ in some projects but uses SystemVerilog in others, along with the Open Verification Methodology (OVM) and pseudo-random test generation.

Teradyne uses MDV in a wide range of verification projects, including large FPGAs. Because of the size of the FPGAs, Kheyfets said, Teradyne uses an ASIC verification methodology. "In fact the metric-driven part almost becomes more important with FPGAs because they're delivered in multiple stages to the lab, as opposed to ASICs, which are delivered in a single tapeout," he said. "The metric-driven approach helps us measure how well we're doing for each of those stages."

How Teradyne Uses MDV

For each ASIC or FPGA, Kheyfets noted, Teradyne engineers develop a verification plan (vPlan). They come up with a feature list and note the metrics needed to "close" verification on every feature. The plan includes both directed and random tests. MDV is especially important with random test generation, Kheyfets said, because "it is not straightforward to see what you have achieved unless you use well-defined metrics."
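
To make that concrete, here is a minimal SystemVerilog sketch (not Teradyne's code; the transaction and covergroup names are invented) of how one vPlan feature can be tied to a metric: the constrained-random fields generate the stimulus, and the covergroup bins define what "closing" the feature means.

    // Hypothetical vPlan feature: "FIFO works across all depth/threshold
    // combinations." The covergroup below is the metric that closes it.
    class fifo_cfg;
      rand bit [7:0] depth;
      rand bit [7:0] threshold;
      constraint legal_c { threshold <= depth; }   // keep random stimulus legal
    endclass

    covergroup fifo_cfg_cg with function sample (bit [7:0] depth,
                                                 bit [7:0] threshold);
      depth_cp     : coverpoint depth { bins shallow = {[0:15]};
                                        bins deep    = {[16:255]}; }
      threshold_cp : coverpoint threshold { bins low  = {[0:127]};
                                            bins high = {[128:255]}; }
      depth_x_thr  : cross depth_cp, threshold_cp;  // bins the vPlan feature links to
    endgroup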

Teradyne engineers use both functional coverage and code coverage. Functional coverage is the more useful of the two, but Kheyfets noted that code coverage "is still a must. If you don't do it, you open up possibilities for bug escapes."
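
A contrived example of why code coverage still matters even when functional coverage looks complete (the RTL below is invented for illustration): a defensive branch that never made it onto the feature list will not appear as a functional coverage hole, but branch coverage will report that no test ever reached it.

    // Hypothetical RTL: the vPlan may never list the overflow corner as a
    // feature, so only code (branch) coverage reveals it was never exercised.
    module fifo_ctrl (
      input  logic clk, rst_n, push, pop,
      output logic overflow
    );
      logic [4:0] count;
      always_ff @(posedge clk or negedge rst_n) begin
        if (!rst_n) begin
          count    <= '0;
          overflow <= 1'b0;
        end
        else begin
          if (push && !pop)      count <= count + 5'd1;
          else if (pop && !push) count <= count - 5'd1;
          // Branch coverage shows whether any test ever pushed into a full FIFO.
          if (count == 5'd16 && push && !pop) overflow <= 1'b1;
        end
      end
    endmodule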

MDV, Kheyfets noted, makes it much easier to report results and communicate goals - especially when dealing with offshore verification teams. It also improves schedule predictability and verification quality. "Our hardware teams [who build the testers] have noticed the quality," he said. "They understand our current techniques are much more thorough."

Looking forward, Kheyfets would like to see some "bug-based" metrics and schedule-based metrics in the verification plan. "We would like to track a lot of data over the life of the project," he said. "A full regression history is important to us." MDV could also help analog/mixed-signal verification, which is accomplished today with completely different tools and methodologies, he said.

Adopting MDV and SystemVerilog

Kheyfets initially had some concerns about SystemVerilog, but said that the language "is better than I thought it would be. Originally I thought it would be a very truncated version of C++, but the more I used it, the more I realized it is good enough for verification." With OVM, he noted, many routine things are implemented in the library classes, and project engineers can focus on defining the scenarios needed to verify a feature.
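
As a rough sketch of that division of labor (class and field names here are hypothetical, not Teradyne's code), the OVM base classes and macros supply the factory registration, field automation, and sequencing machinery, leaving the project code to describe the scenario itself:

    `include "ovm_macros.svh"
    import ovm_pkg::*;

    // Hypothetical transaction: the ovm_* macros handle printing, copying,
    // and factory registration, so only the fields and constraints are
    // project-specific.
    class bus_xact extends ovm_sequence_item;
      rand bit [31:0] addr;
      rand bit [31:0] data;
      constraint addr_c { addr inside {[32'h0000_0000 : 32'h0000_FFFF]}; }
      `ovm_object_utils_begin(bus_xact)
        `ovm_field_int(addr, OVM_ALL_ON)
        `ovm_field_int(data, OVM_ALL_ON)
      `ovm_object_utils_end
      function new(string name = "bus_xact"); super.new(name); endfunction
    endclass

    // Hypothetical scenario: sixteen word-aligned accesses, otherwise random.
    class aligned_burst_seq extends ovm_sequence #(bus_xact);
      `ovm_object_utils(aligned_burst_seq)
      function new(string name = "aligned_burst_seq"); super.new(name); endfunction
      virtual task body();
        repeat (16)
          `ovm_do_with(req, { addr[1:0] == 2'b00; })
      endtask
    endclass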

Adopting MDV involved a learning curve and required help from Cadence application engineers, Kheyfets noted: "it doesn't come for free." Thanks to MDV, he said, Teradyne has been "amazingly thorough at doing verification and amazingly good at coming in on schedule. If predictability is your concern, MDV is the way to go. You just have to buy into the extra ramp time that it's going to take to get the processes in place."

Richard Goering

 
