Constraint Layering - Fine Tuning Your Environment - Part 2

Filed under: Functional Verification, Testbench simulation, Verification methodology, verification strategy, IES, e, Specman, IEEE 1647, Aspect Oriented Programming, AOP

In my last post, I talked briefly about constraint layering and gave an extremely simple example of how users can layer constraints onto an existing base environment to change how that environment behaves, all without touching the base code.

Of course, real verification tasks are more complicated than that simple example. So in this edition, I want to go over some additional features that help address that extra complexity and further support constraint layering.

Soft Constraints
What happens when you want your "foundation" to have some "default shape"? This is one role of soft constraints. With soft constraints, the writer of the base code can provide limits that keep generation from running completely wild; later, a user of that base code can constrain the environment further to fit a more specific situation. Imagine the following simple example:

 

<'
//file: base_packet_def.e
struct packet_s {
    packet_addr : uint(bits:16);
};
'>

 

Many of you know that randomization in e is what is referred to as "infinity minus". This means that a user does not have to specify which fields are randomized; rather, every field is automatically randomized to its fullest extent by default. That scheme usually requires the user to specify limits on the randomization to make the values realistic.
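
To make "infinity minus" concrete, here is a minimal sketch (the config_s struct and its fields are hypothetical, not from the post) showing that every field is generated automatically unless it is explicitly excluded with the do-not-generate prefix (!):

<'
struct config_s {
    mode    : uint(bits:2);   // generated automatically over its full 0..3 range
    length  : uint(bits:8);   // generated automatically over its full 0..255 range
    !status : uint(bits:8);   // '!' excludes this field from generation
};

extend sys {
    run() is also {
        var c : config_s;
        gen c;   // mode and length receive random values; status keeps its default of 0
        out("mode=", c.mode, " length=", c.length, " status=", c.status);
    };
};
'>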

In the context of the packet_s example above, this means that packet_addr can take on any 16-bit value. What happens if that full range does not make sense for the design being tested? Suppose, for example, that the address is only legal between 0x1000 and 0x1FFF.

One way to implement this is to have the base code writer set soft default limits that keep the address in the specified range. Here is some example code illustrating this concept.

 

<'
//file: base_packet_def.e
struct packet_s {
    packet_addr : uint(bits:16);
    keep soft packet_addr in [0x1000..0x1FFF]; // default legal range; tests may override
};
'>

 

Now, when the struct is used without further constraints, the packet address stays within the valid range specified above.
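
As a quick sanity check, here is a hedged sketch of what that looks like at run time (the file name soft_demo.e and the sys extension are illustrative, not part of the original example):

<'
//file: soft_demo.e  (hypothetical)
import base_packet_def;

extend sys {
    run() is also {
        var p : packet_s;
        for i from 1 to 5 do {
            gen p;
            // every generated value falls in the soft default range 0x1000..0x1FFF
            outf("packet_addr = 0x%x\n", p.packet_addr);
        };
    };
};
'>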

You might be asking yourself, "Why use a soft constraint and not a hard one?" The answer comes back to the concept of layering. We said that the address is only legal in a certain range; however, as verification engineers we may also want to test illegal cases. If the base code used hard constraints, a test writer would have to change the base code just to set the address to a value outside that range.

Furthermore, all of the "legal" tests would then have to include a constraint to keep away from the newly added illegal value. With soft constraints and constraint layering, the following example provides a much more elegant solution.

 

<'
//file: base_packet_def.e
struct packet_s {
    packet_addr : uint(bits:16);
    keep soft packet_addr in [0x1000..0x1FFF];
};
'>

<'
//file: env1_test.e
extend packet_s {
    keep packet_addr == 0x3500; // hard constraint: an illegal address, outside the soft range
};
'>

 

In this example, the test writer was able to force an illegal value without changing the base code. Note also that the rest of the environment still adheres to the soft constraints and generates legal addresses without any further effort.


reset_soft()
Many of you have probably played with the soft constraint mechanism described above. If so, you have undoubtedly run into another real-world situation that applies to our discussion: what happens when those default limits are no longer valid or have changed? Here again there is an opportunity to leave the base code alone and layer on additional code that sets a new default range. For this example, assume that the address range has changed to 0x3000 through 0x3FFF. We keep the existing base code file from above, but this time add a configuration file to change the defaults, and adjust the illegal-value test accordingly.

 

<'
//file: base_packet_def.e
struct packet_s {
    packet_addr : uint(bits:16);
    keep soft packet_addr in [0x1000..0x1FFF];
};
'>

<'
//file: env2_packet_config.e
extend packet_s {
    keep all of {
        packet_addr.reset_soft();             // discard the soft constraint inherited from the base code
        soft packet_addr in [0x3000..0x3FFF]; // layer on the new default range
    };
};
'>

<'
//file: env2_test.e
extend packet_s {
    keep packet_addr == 0x4500; // illegal under the new 0x3000..0x3FFF default range
};
'>


In this example, the environment configuration writer (first layer) was able to "remove" the soft constraint from the base environment and apply a new one. The test writer (second layer) was then able to apply the specific constraint needed to produce an illegal test.
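
One way to picture the layering is as the load order of the files. A minimal sketch of a hypothetical top file (env2_top.e is not from the original post) might look like this:

<'
//file: env2_top.e  (hypothetical)
import base_packet_def;    // foundation: soft default range 0x1000..0x1FFF
import env2_packet_config; // first layer: reset_soft() plus the new 0x3000..0x3FFF defaults
import env2_test;          // second layer: hard constraint forcing the illegal 0x4500
'>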


Hopefully this post has inspired you to think about more areas where constraint layering can facilitate your verification code reuse. As I mentioned in the last post, please feel free to get involved in this blog and send in any comments, questions, or topic ideas you may have. This blog is for you, the user community, and I hope you will get involved and help shape it going forward.

Until next time!

 

Brett Lammers
Advanced Verification Core Competency Team Member
Cadence Design Systems, Inc.

Comments(2)

By bk on March 7, 2012
Seems like with the latest IES 10.2, you don't need this explicit reset_soft(). Can you please clarify/confirm?

By hannes on March 9, 2012
Well, actually, this has nothing to do with any recent version; the semantics of soft have not changed. I think the example was not that clear and could have been better.
Yes, you are correct that in the above example the reset_soft() is not required. However,
imagine the following:
struct foo {
   a1: byte;
   keep soft a1 in [5..10];
};
extend foo {
   keep soft a1 in [1..15];
};
In this case, since the two soft constraints overlap, the second one will not override the first, and you'll end up with values between 5 and 10. So in this case the reset_soft() does make sense.
