Are You Playing with a Full Deck?

Filed under: Functional Verification, OVM, SoC, SystemVerilog, e, Low-power, CPF

A professional gambler confidently places bets because she knows the odds, but she would be crazy to play at a table that didn’t use a full deck, because the odds change in an unknown way.  If you use a simulator that doesn’t enable low-power verification in every test run, you are just as crazy.

Why? Let’s take a look at what gives the verification engineer confidence – finding bugs.  If we have a power-aware structure like the one in Figure 1, we certainly need to verify two aspects – the power management circuitry itself, plus the reaction of the power domain to those control signals.  If we find and fix some errors, we grab an extra donut knowing that we’ve got the bugs beat.  Right?

[Figure 1: power-aware structure. Figure 2: SoC context.]
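As an illustration of that second aspect – the domain’s reaction to its control signals – a regression that runs with low power enabled can carry checks like the sketch below. This is a minimal SystemVerilog assertion sketch, not code from the post: the signal names (pwr_shutoff, iso_enable, periph_ready) and the clamp-to-0 isolation policy are assumptions about the Figure 1 block.

// Minimal power-aware checks -- signal names and clamp polarity are hypothetical.
module pd_periph_checks (
  input logic clk,
  input logic pwr_shutoff,   // 1 = power domain is shut off
  input logic iso_enable,    // 1 = isolation cells are driving clamp values
  input logic periph_ready   // domain output assumed to clamp to 0 when off
);

  // The power management circuitry must enable isolation no later than shutoff.
  property iso_before_shutoff;
    @(posedge clk) $rose(pwr_shutoff) |-> iso_enable;
  endproperty
  a_iso: assert property (iso_before_shutoff)
    else $error("Domain shut off without isolation enabled");

  // While the domain is off, its outputs must hold the clamp value.
  property outputs_clamped_when_off;
    @(posedge clk) pwr_shutoff |-> (periph_ready == 1'b0);
  endproperty
  a_clamp: assert property (outputs_clamped_when_off)
    else $error("Output not clamped during shutoff");

endmodule

Checks like these are only meaningful when the simulator actually models the shutoff and isolation behavior during the run – which is why low power needs to be enabled on every test, not just the directed ones.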
Not if we ignore the bluff.  What if the structure in Figure 1 is a peripheral in the SoC of Figure 2?  How do we know that it will function properly in the rest of the circuit?  Certainly, we simulated the directed test that ramps the supply voltage in response to an external stimulus. However, if that stimulus can originate in both hardware and software, then we need to validate both sources.  Furthermore, the voltage ramp in this block may occur asynchronously to other functions in the SoC, but those functions must be able to handle the appearance and disappearance of the resources associated with this block.  If you are running thousands of system regressions but only a handful of directed “low-power tests,” how can you be sure the directed tests cover all of the situations in which the random tests might encounter a power mode?

This is just like a poker player who thinks he knows his opponent’s tell: “I only have to worry about his hand when he stacks his chips.”  Unfortunately for that player, fate will always call that bluff, and there will be hands where he didn’t expect a good hand but one is played.  For low-power designs, the engineer may know the cases that should cause power transitions, but what happens if there is a bug that causes an unexpected transition?  If you don’t run with low power enabled, that bug will go undetected.
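One way to make every regression hit those windows is to let the testbench inject power-mode requests at random points alongside the normal stimulus. The following is a minimal sketch in plain SystemVerilog, not code from the post: the pd_ctrl_if interface, the shutoff_req signal, and the delay ranges are all assumptions.

// Hypothetical random power-mode injector -- runs alongside normal traffic so that
// every regression, not just the directed "low-power tests", exercises power modes.
class power_mode_injector;
  rand int unsigned idle_cycles;   // cycles of normal traffic before the next request
  rand int unsigned off_cycles;    // cycles to keep the domain shut off
  constraint c_idle { idle_cycles inside {[50:5000]}; }
  constraint c_off  { off_cycles  inside {[10:500]}; }

  // pd_ctrl_if is an assumed interface carrying clk and shutoff_req.
  task run(virtual pd_ctrl_if vif);
    forever begin
      void'(this.randomize());
      repeat (idle_cycles) @(posedge vif.clk);
      vif.shutoff_req <= 1'b1;     // request a power-down in the middle of traffic...
      repeat (off_cycles) @(posedge vif.clk);
      vif.shutoff_req <= 1'b0;     // ...then bring the domain back up
    end
  endtask
endclass

Started in a fork...join_none next to the main test sequence, an injector like this keeps power transitions asynchronous to the rest of the SoC traffic – exactly the situations that a fixed set of directed tests cannot enumerate.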

The trouble is that some verification engineers are willingly not playing with a full deck! Their simulators rely on a PLI application to validate power-aware structures, slowing their test runs so much that they choose to run power-aware simulation on only a limited number of directed tests.  Only the Incisive Enterprise Simulator validates power-aware structures natively, which can actually result in faster test runs when blocks of the design are shut down.

In poker, it’s okay to lose a hand or two, but in chip design can you afford to lose a bug or two?

=Team genIES
