Many engineers today build testbenches for hardware verification using C software running on an embedded processor model. This "software-driven verification" technique is an ad-hoc methodology that often relies on home-grown tools. But it's something you may hear more about in 2013, as software becomes a more and more important part of the overall SoC development process.
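To make the idea concrete, here is a minimal sketch of what such a C test might look like. The register map, bit fields, and the "doubler" block are all hypothetical; in a real flow the registers would be memory-mapped into the processor's address space and updated by the RTL or TLM model, whereas here a stub function stands in for the hardware so the sketch is self-contained.

```c
#include <stdint.h>

/* Hypothetical register map of the block under test (offsets are
 * illustrative only). In simulation or emulation these would be
 * memory-mapped addresses; here a plain array models them. */
enum { REG_CTRL = 0, REG_DATA_IN = 1, REG_STATUS = 2, REG_DATA_OUT = 3 };
#define CTRL_START  0x1u
#define STATUS_DONE 0x1u

static volatile uint32_t regs[4];

/* Stand-in for the hardware: an imaginary block that doubles its input.
 * In a real environment the design under test would drive these registers. */
static void hw_model_step(void)
{
    if (regs[REG_CTRL] & CTRL_START) {
        regs[REG_DATA_OUT] = regs[REG_DATA_IN] * 2u;
        regs[REG_STATUS] |= STATUS_DONE;
        regs[REG_CTRL] &= ~CTRL_START;
    }
}

/* The software-driven test: program the block, poll for completion,
 * and check the result. Returns 1 on pass, 0 on fail. */
int run_doubler_test(uint32_t value)
{
    regs[REG_STATUS]  = 0;
    regs[REG_DATA_IN] = value;
    regs[REG_CTRL]    = CTRL_START;          /* kick off the operation */
    while (!(regs[REG_STATUS] & STATUS_DONE))
        hw_model_step();                     /* on hardware: just poll */
    return regs[REG_DATA_OUT] == value * 2u;
}
```

The test reads much like driver code, which is exactly the point: the same style of software can later become the basis of diagnostics or drivers.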
In a column posted just before Christmas 2012 at Electronic Design magazine, Frank Schirrmeister (right), product marketing director for the System Development Suite at Cadence, wrote that "it is time for software-driven verification to be adopted at a faster rate in 2013." He noted that many customer projects use processors to execute test software in conjunction with the hardware block that's being verified. In some cases, teams use a dedicated separate processor for those tests, which functions much like a built-in self test (BIST) capability in silicon.
Intrigued, I talked with Schirrmeister about software-driven verification and its advantages and challenges. He noted that there are three aspects to software-driven verification. One is getting hardware and software to work together. Another is using software to develop testbenches for hardware. A third is using software to model the environment in which the chip resides.
Since a testbench may have as many bugs as the design itself, developing a clean testbench early in the verification cycle is very valuable. But the real advantage of software-driven verification, Schirrmeister said, is the ability to reuse the testbench in all the different phases of design and development. This includes RTL simulation, acceleration, emulation, FPGA prototyping, and even the silicon itself.
The testbench stays at the C level, but you add more detail to it as it moves through the various stages of verification. You can start with transaction-level modeling (TLM) and run a number of scenarios. Then, you can refine the testbench to run at the register-transfer level. Or, keep the software at the TLM level so it runs faster, and use transactors to connect it to RTL verification. In any case, emulation with a platform such as the Cadence Palladium XP makes it possible to add even more detail to the testbench. Finally, you can run the testbench in an FPGA-based environment such as the Cadence Rapid Prototyping Platform.
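One common way to get this reuse in C is to put a thin access layer between the test and the platform, so only the backend changes as the design moves from TLM to RTL, emulation, FPGA prototype, and silicon. The sketch below is a hypothetical illustration of that pattern, not a Cadence API; the in-memory backend stands in for whatever a given stage would provide (a TLM model, a transactor into RTL, or real registers).

```c
#include <stdint.h>

/* Hypothetical access layer: the test only ever calls reg_write/reg_read
 * through this table, so the same C test can be re-bound to a different
 * backend at each verification stage. */
typedef struct {
    void     (*reg_write)(uint32_t addr, uint32_t val);
    uint32_t (*reg_read)(uint32_t addr);
} hw_backend;

/* Backend for this sketch: a tiny in-memory model. At other stages this
 * would instead drive a TLM model, a transactor, or memory-mapped I/O. */
static uint32_t mem[16];
static void     model_write(uint32_t a, uint32_t v) { mem[a & 0xFu] = v; }
static uint32_t model_read(uint32_t a)              { return mem[a & 0xFu]; }
static const hw_backend model_backend = { model_write, model_read };

/* One test, reusable unchanged across every backend. Returns 1 on pass. */
int test_loopback(const hw_backend *hw)
{
    hw->reg_write(0x4, 0xCAFEu);
    return hw->reg_read(0x4) == 0xCAFEu;
}
```

Because the test never touches the platform directly, promoting it from simulation to emulation to silicon is a matter of supplying a new `hw_backend`, not rewriting the test.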
"If you have an e testbench, you need Specman to run on a host to drive the verification signals," Schirrmeister said. "If you have software running on a processor doing the same thing, that helps you with verification at all levels." However, one disadvantage of a software-driven testbench is that you cannot probe every signal directly, as you generally can with e or SystemVerilog. Thus, the types of tests may be different - in the case of software-driven verification, there may be more tests from the "outside looking in."
Modeling the System Environment
Another aspect of software-driven verification is the use of software to model the system environment. This is especially helpful if a chip has a lot of interfaces to the outside world, such as MIPI, PCI, or USB. When you're running a regression with Palladium, Schirrmeister noted, you don't want to ask an engineer to plug five different kinds of memory sticks into a Cadence SpeedBridge rate adapter. "That's something you want to virtualize, and that's where software-driven modeling of the environment comes in with transaction-based acceleration," Schirrmeister said.
Of course, while virtualization is great for regression automation, the connection to the "real thing" - like a PCIe interface or an actual memory stick - should be verified before tapeout. That's why both virtualized connections and real connections using rate adapters like SpeedBridge have their place in the verification flow.
While most software-driven verification environments are home-grown, tool support is starting to emerge. One pioneering product is Cadence Incisive Software Extensions. Working with the Cadence Incisive Enterprise Simulator and the Palladium XP Verification Computing Platform, it gives the verification testbench access to software executing on processor models. It lets teams apply metric-driven verification to system-level behavior including software processes, function calls, and variables. Users can build software tests that automate stimulus generation, including constrained-random variability. Finally, the offering provides post-process software debug.
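Constrained-random stimulus of the kind mentioned above can also be sketched directly in C test software. The example below is a hypothetical illustration (not the Incisive Software Extensions API): it generates DMA-style stimulus whose length and alignment obey stated constraints, much as a SystemVerilog testbench would express them in `constraint` blocks. A small self-contained generator replaces the library's randomization engine.

```c
#include <stdint.h>

/* Hypothetical stimulus item: a DMA transfer with a constrained length
 * (1..256 beats) and a word-aligned address below 0x1000. */
typedef struct { uint32_t addr; uint32_t len; } dma_stimulus;

/* Simple linear congruential generator so the sketch is self-contained
 * and reproducible from a seed (a real tool would supply this). */
static uint32_t next_rand(uint32_t *state)
{
    *state = *state * 1664525u + 1013904223u;
    return *state;
}

/* Draw one stimulus item satisfying the constraints. */
dma_stimulus random_dma(uint32_t *seed)
{
    dma_stimulus s;
    s.len  = 1u + next_rand(seed) % 256u;          /* 1..256 beats   */
    s.addr = (next_rand(seed) % 0x1000u) & ~0x3u;  /* word-aligned   */
    return s;
}
```

A regression would then draw many such items from different seeds, apply each to the design through the register interface, and record which constraint corners were hit - the raw material for metric-driven verification.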
Cadence Incisive Software Extensions
More information about Incisive Software Extensions is available in a detailed whitepaper.
Over time, EDA software will provide more automation for software-driven verification. Possibilities include software verification IP, transactors for TLM-to-RTL connections, and advanced debuggers. Verification operating systems are also starting to appear, such as VTOS (Verification and Test OS) from Kozio. Software-driven verification will be an interesting "space to watch" in 2013!