DVCon 2014 Panel: Did We Create the Functional Verification Gap?

Filed under: DVCon, SoC, Functional Verification, UVM, Stellfox, DVCon 2014, verification gap

Could standardized functional verification approaches actually be an impediment to success? In a panel provocatively titled "Did We Create the Verification Gap?" at the DVCon 2014 conference on March 5, verification experts debated this question and noted the need for new methodologies that address system-on-chip (SoC)-level verification along with embedded software.

Organized by J.L. Gray, senior architect at Cadence, the panel took a deep look into the "verification gap" between what we need to do and what we actually can do to verify large IC designs. While standardized approaches such as the Universal Verification Methodology (UVM) address IP block and subsystem verification, panelists noted, much more needs to be done to support chip-level and system-level verification.

The panelists were:

  • Jim Caravella, vice president of engineering at NXP
  • Harry Foster, chief verification technologist at Mentor Graphics
  • Bill Grundmann, fellow at Xilinx
  • Janick Bergeron, verification fellow at Synopsys
  • Mike Stellfox, fellow at Cadence

Moderator John Blyler, chief content officer at Extension Media, opened with a challenge. "What if the verification gap is occurring as a result of continued adoption of industry standard methods? Are we blindly following industry best practices without keeping in mind the actual point of our efforts, which is to create a product with as few bugs as possible?"

Here are some of the more notable questions and answers from this one-hour panel session.

Q: Is there such a thing as doing too much verification of a design, and is this a problem that contributes to the verification gap?

Bergeron: Verification is an insurance policy against failure. You have to do as much as is necessary to be confident of having a successful product.

Caravella: From my perspective and experience, every chip has bugs. The only question is whether it really impacts the system or the customer. As long as that's not the case, we did enough verification.

Stellfox: In the last few years there's been more focus on systematic approaches like UVM. Those approaches are really focused on bottom-up, exhaustive verification. But more and more product functionality is determined by the software. There needs to be more focus on verifying the product from the perspective of the software and how it could be exercised.

Grundmann: The problem is that there is no metric in the world that says you're done. If you're truly done, the product will never have a bug or a failure. The question is what's the metric for good enough? Is it a measurable thing or a gut feeling?

Q: There's a tendency to add more and more verification. How do you limit it?

Caravella: If we have infinite resources and time, it still doesn't mean a chip is bug free. That's because you are making assumptions about the system that may be wrong. We struggle a lot with system specs that are fuzzy.

Foster: The point is to mitigate risk and minimize bugs, not get them all.

Q: Is there anything the software world can help us with moving forward?

Stellfox: The challenge I see is a divide between the hardware teams and the software teams. It's often a cultural issue. The biggest issue is getting teams together and actually designing the hardware with the software in mind from the beginning—and doing early and often hardware/software integration.

Q: So, is it almost an organizational change?

Grundmann: Possibly. You have some kind of hierarchy and a handoff of information. The handoff is flawed in a lot of cases. Somebody developing an IP block for use in an SoC may write a 150- to 300-page document for the software guys to go through.

Q: Why are there so many verification approaches today, and can they be simplified?

Foster: We're dealing with an NP-hard problem. There is a certain level of complexity, and on top of that there is a power domain, and then security domains, and clock domains. There is not a single solution that can be applied—you've got to have multiple solutions.

Stellfox: The key to having all these different technologies—formal, simulation, constrained random—is the ability to optimize the flow and integrate all the different technologies. With customers, we are trying to focus more on the flow aspect. We see where formal applies best, integrate it around a common plan and a common way to capture metrics, and we track progress against the plan.

Caravella: What I see all the time is that people are immediately jumping into a specific tool and flow without really thinking about what they're trying to do.

Stellfox: People need to be able to apply the technology as opposed to just jumping in and saying, "Let's start building testbenches in UVM." We've seen people adopt UVM and do directed testing. You're learning all this complicated SystemVerilog stuff and you're still doing directed testing!
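To make Stellfox's distinction concrete, here is a minimal SystemVerilog sketch (my illustration, not from the panel; the class and field names are hypothetical). A directed test hard-codes each stimulus value by hand, while a constrained-random test describes the legal stimulus space once and lets the solver generate many scenarios from it:

  // Hypothetical transaction class: the constraint encodes the legal
  // stimulus space instead of one hand-picked case.
  class bus_txn;
    rand bit [31:0] addr;
    rand bit [7:0]  len;
    constraint legal_c { addr inside {[32'h1000:32'h1FFF]}; len inside {[1:16]}; }
  endclass

  module tb;
    initial begin
      bus_txn t = new();

      // Directed style: one fixed scenario, written by hand.
      t.addr = 32'h0000_1000;
      t.len  = 8'd4;
      $display("directed: addr=%h len=%0d", t.addr, t.len);

      // Constrained-random style: many legal scenarios from the same description.
      repeat (5) begin
        if (!t.randomize()) $fatal(1, "randomization failed");
        $display("random:   addr=%h len=%0d", t.addr, t.len);
      end
    end
  endmodule

Teams that adopt UVM but still write only directed sequences are paying the methodology's complexity cost without getting this benefit, which is the point Stellfox is making.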

Q: Why is there no effort to solve the SoC verification problem?

Stellfox: UVM is really designed for IP and subsystem verification from the bottom up. For most companies I see, SoC flows are kind of a wild west right now. There's a big opportunity there and that's where I'm focused.

Q: What will it take to move verification into the design cycle earlier?

Bergeron: The problem is that RTL works well from a designer perspective. It's not painful enough. When it becomes a verification issue, you'll see an impetus for adopting those [system-level] technologies. It will only happen when verification is so painful it forces a change. We're on the cusp.

Foster: In the late 1980s when I was doing design, we had a severe verification gap because gate-level simulation was too slow. There has always been, and will always be, some sort of a gap. We've always solved it. I'm convinced we'll solve it now too.

Richard Goering

Other DVCon 2014 Cadence Blog Posts

Jim Hogan at DVCon 2014: Functional Verification Faces "Abundant Chaos" from New Technologies

DVCon 2014 in Review: Formal Verification, Value Chain, and the Industry's Future

Lip-Bu Tan at DVCon 2014: EDA/Silicon Ecosystem Crucial to Innovation
