Using Scoreboards and Virtual Platforms for Software Verification

Filed under: Incisive, ISX, posedge, linux, virtual platforms, virtual prototypes, system verification, software verification, scoreboards

Today I'm running a guest article written by Henry Von Bank of Posedge Software, a Cadence Verification Alliance Partner. For some background, see the interview I did with Henry back in November 2008. Henry has been working on advanced system verification using Incisive Software Extensions (ISX) and virtual platforms, and he shares some of his ideas and techniques below.

Jason Andrews

At Posedge Software, one of the things we have often considered is whether tools or methodologies from one discipline would work well in another. Having worked in both hardware and software development, I have definitely found software development tools that I like better than their hardware counterparts. In fact, one of the first projects I worked on was a code editing tool for hardware languages that provides features available in many software development environments.

In hardware verification, one of my favorite concepts is the scoreboard. For anyone not familiar with the concept, a scoreboard is a data structure or class that tracks the expected state of a device and checks that, for a given input, the observed output matches the prediction (often also checking that outputs arrive in the correct order). There are a number of examples out there for creating a scoreboard with UVM. I find this to be a very intuitive way of organizing a verification environment (VE), and wondered how it would work for software verification. My research turned up very little on using scoreboards for software verification, although similar concepts are probably in use under different terminology.
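
To make the idea concrete, here is a minimal in-order scoreboard sketched in C++. None of this comes from a particular library; the RegWrite transaction and the expect/observe/drained interface are just illustrative names for the pattern described above.

    // Minimal in-order scoreboard (illustrative names, no library assumed).
    #include <cstdint>
    #include <cstdio>
    #include <deque>

    struct RegWrite {
        uint32_t addr;   // peripheral register address
        uint32_t data;   // value expected to be written
        bool operator==(const RegWrite& o) const {
            return addr == o.addr && data == o.data;
        }
    };

    class Scoreboard {
        std::deque<RegWrite> expected_;  // predicted writes, in order
    public:
        // Reference model side: predict an upcoming write.
        void expect(const RegWrite& w) { expected_.push_back(w); }

        // Monitor side: check an observed write against the head of the list.
        void observe(const RegWrite& w) {
            if (expected_.empty() || !(expected_.front() == w)) {
                std::printf("ERROR: unexpected write addr=0x%x data=0x%x\n",
                            (unsigned)w.addr, (unsigned)w.data);
                return;
            }
            expected_.pop_front();  // matched in order; retire it
        }

        // End-of-test check: every predicted write must have been seen.
        bool drained() const { return expected_.empty(); }
    };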

When performing software testing or verification on a PC application, or on the target embedded hardware, it would be difficult to implement a scoreboard directly. You would need to modify the target software extensively to catch the relevant events, which could add significant overhead, not to mention the risk that the instrumentation itself alters the behavior of the program. With a virtual platform, however, this type of VE becomes much easier to implement.

Open Virtual Platforms (OVP) includes a library of processor models and allows you to integrate these models into SystemC virtual platforms to model a variety of systems. One of my favorite features of OVP, and one of the most powerful, is intercepts. An intercept is a mechanism that transparently adds instrumentation to your environment, so that you are notified when a particular function is called, or when a certain variable, register, or memory location is read or written. OVP can also be integrated with Incisive Software Extensions (ISX) to create a complete HW/SW verification environment.
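
The sketch below shows how such an intercept might feed the scoreboard from the earlier sketch. To be clear, platform_add_write_watch() is a hypothetical stand-in, not the actual OVP registration call; the point is only that the simulator invokes a callback on every write to the watched register window, with no changes to the target software.

    #include <cstdint>

    extern Scoreboard g_scoreboard;  // the scoreboard from the sketch above

    // Hypothetical registration call (stands in for the platform's real
    // intercept API): invoke cb on every write in [lo, hi].
    void platform_add_write_watch(uint32_t lo, uint32_t hi,
                                  void (*cb)(uint32_t, uint32_t, void*),
                                  void* user);

    // Bus monitor: invoked by the simulator when a watched register is written.
    static void on_register_write(uint32_t addr, uint32_t data, void*) {
        g_scoreboard.observe({addr, data});
    }

    void install_bus_monitor(uint32_t base, uint32_t size) {
        platform_add_write_watch(base, base + size - 1,
                                 on_register_write, nullptr);
    }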

So what would a software verification scoreboard implemented with OVP and ISX look like? The block diagram below shows a verification environment that was used to test a device driver. The hardware design is modeled as a virtual platform, while the testbench is implemented as an e verification environment. ISX and OVP intercepts are used for the stimulus and the various monitors.

[Figure: block diagram of the device driver verification environment]

For most software testing, the stimulus is a call to a function, and driving such calls is the primary purpose of ISX. In this example, it might be a call to the driver initialization function, which would in turn set the values of some variables in the software and write to registers in the peripheral. A sequence of events like the following might occur (the sketch after the list ties the steps together):

  1. Stimulus generator calls the driver_init() function on the target through ISX, while also calling the compute_transactions() method in the testbench.
  2. The compute_transactions() method determines which peripheral registers should be written to (along with variables that will be updated in the software) for the driver_init() function, and pushes these to the scoreboard.
  3. In the scoreboard, the register writes are pushed into a list in the order in which they are expected to occur.
  4. The driver_init() function begins executing, and runs until a peripheral register write occurs.
  5. An intercept is triggered when the register is written, which notifies the bus monitor.
  6. The bus monitor updates the scoreboard with the register write. If the write matches the next transaction on the expected list, it is removed from the list; if not, an error is reported.
  7. The test continues to run in a similar fashion.
  8. Upon completion of the test, the scoreboard checks that the list is empty and all writes occurred. If not, an error is reported.
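
Here is the same flow as code, reusing the Scoreboard and install_bus_monitor() sketches above. The register addresses, the compute_transactions() predictions, and run_target_until_done() are all hypothetical; in the real environment the stimulus call is made through ISX and the reference model lives in the e testbench.

    #include <cstdio>

    Scoreboard g_scoreboard;

    // Hypothetical stand-in for "run the target until driver_init() returns"
    // (in the real flow, ISX calls driver_init() on the target).
    void run_target_until_done(const char* target_function);

    // Step 2: the reference model predicts, in order, the register writes
    // that driver_init() should perform (example addresses/values only).
    static void compute_transactions() {
        g_scoreboard.expect({0x40000000u, 0x1u});   // e.g., enable bit
        g_scoreboard.expect({0x40000004u, 0xFFu});  // e.g., interrupt mask
    }

    int main() {
        install_bus_monitor(0x40000000u, 0x100u);  // steps 5-6 fire via callback
        compute_transactions();                    // steps 2-3
        run_target_until_done("driver_init");      // steps 1, 4, 7
        // Step 8: every predicted write must have been observed.
        if (!g_scoreboard.drained())
            std::printf("ERROR: not all expected register writes occurred\n");
        return 0;
    }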

The above case is very similar to how scoreboards are used in hardware verification, and the concept is already being used in some system-level verification environments. However, this approach could also be applied further up the software stack. For example, the Linux kernel contains a number of subsystems, such as virtual memory, schedulers, or hardware subsystems like USB, which have a relatively small number of entry points. These may be good targets for this type of verification, as they can often be abstracted down to a relatively simple behavioral model.

Another area that may be a good fit for scoreboarding is concurrency in multi-threaded and multi-processor applications. This parallelism is a major challenge in software development, but it is something the hardware world has been dealing with for a long time, in part by using methodologies and mechanisms like the scoreboard.

Henry Von Bank

 
