The EDA industry is all abuzz over the new vision paper "EDA360 - The Way Forward for Electronic Design," and for good reason: in 2010 the electronics world is finally starting to transform in ways long anticipated by Specmaniacs and our "Trailblazer" program partners. Specifically, quoting this document:
"EDA360 helps close the profitability gap [i.e. the difference between what you can make, and what you can make money on] through integration ready IP-creation, IP integration, and optimization. Given that embedded software can take up half the cost of SoC development, EDA360 also supports hardware/software integration and verification. It thus expands the scope of EDA well beyond its original boundaries."
Sound familiar? This statement should resonate with many of you, since Specmaniacs and Trailblazers like you were the ones who inspired it! Furthermore:
EDA360 supports three important capabilities: System Realization is the development of a complete hardware/software platform that will provide all necessary support for end-user applications ... SoC Realization is the completion of an individual SoC (or alternative packaging choice, such as 3D IC). Along with the integration of silicon IP developed through Silicon Realization, SoC Realization includes "bare-metal software" such as drivers and diagnostics. ... Silicon Realization represents everything it takes to get a design into silicon ... an analog or digital IP block for an SoC, an IP subsystem, or a complete IC without embedded software.
So here is the $64 question: how do Specman and e relate to all of this? What follows is Team Specman's take on this question (we're very eager to hear your thoughts - please comment below, or speak to us at DAC and other forums!). For starters, we need to take a quick step back to the original design philosophy behind the e language itself.
Generic Verification Platform
When the e language was created it was specifically designed to be a generic verification platform. That is, e testbenches are designed to be totally independent of the format of the DUT. Specman (whether stand-alone or as part of IES-XL) has faithfully implemented this philosophy: we've seen customers connect DUTs composed of Verilog, SystemVerilog, VHDL, C, C++, SystemC, Matlab, AMS models, pure SPICE models, Xtreme and Palladium systems, FPGA prototypes, post-silicon boards, test instruments, and countless simultaneous combinations of all of the above.
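As a small illustration of this DUT-format independence, consider how an e signal map declares abstract ports and binds them to the DUT only through HDL path constraints. The sketch below is illustrative only - the unit name and paths are invented, not taken from any real design - but it shows how the testbench code itself never names the DUT's language:

```
<'
// Hypothetical signal map: the e code never references the DUT's
// language, only abstract ports bound to HDL paths.
unit xbus_signal_map_u {
    clk  : in    simple_port of bit is instance;
    rst  : in    simple_port of bit is instance;
    data : inout simple_port of uint(bits: 32) is instance;

    keep bind(clk,  external);
    keep bind(rst,  external);
    keep bind(data, external);

    // The hdl_path() strings are the only DUT-specific detail;
    // retargeting to a different DUT is just a constraint change.
    keep clk.hdl_path()  == "~/top/clk";
    keep rst.hdl_path()  == "~/top/rst_n";
    keep data.hdl_path() == "~/top/dut/data_bus";
};
'>
```

Whether "~/top/dut" resolves to RTL in a simulator, a netlist on an emulator, or a SystemC model, the e code above is unchanged.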
The reason for this flexible approach should be obvious to any verification professional: in a nutshell, you never know what the designers are going to throw at you. Additionally, e's creators realized that they could not predict which design formats and permutations would stand the test of time. Thus, to ensure that e would be relevant forever, they deliberately made it a generic solution with unrestricted extensibility (see "Aspect-Oriented Programming," or "AOP").
When you compare e/Specman's design philosophy to the three levels of EDA360 -- System Realization, SoC Realization, and Silicon Realization -- it should be apparent that e/Specman is a relevant verification solution for all three levels of abstraction. Indeed, beyond the many users of e/Specman at the "Silicon Realization" level, with the introduction of Incisive Software Extensions and the transaction-based acceleration support introduced for Xtreme and Palladium several years ago, we have seen more and more customers use e/Specman at the SoC Realization level. Of course, as ESL solutions like C-to-Silicon and the Palladium XP verification computing platform continue to prosper, we expect this trend to snowball.
That said, despite having a platform that's compatible with any type of verification, there is one significant operational requirement that's critical to address. In order to support verification at ever higher levels of abstraction, the given verification solution has to be within an order of magnitude of the DUT's wall clock speed. Anything less will not suffice.
Team Specman is well aware of this challenge, and we are tackling it head on. As many of you will see in this year's "ClubTs" and other events, improving Specman's wall clock performance (and its evil twin, memory consumption) to support the high-throughput verification that these higher levels of abstraction require is at the core of the Specman development roadmap. Specifically, a Specman "Advanced Verification Option" is planned for the next release that will leapfrog over your current regression simulation run times to provide you considerable verification time savings.
Serving IP Integrators
But what about the emerging class of pure "integrators?" The EDA360 vision paper cannily observes that:
"Some innovators will redefine themselves as integrators. They will integrate at the silicon, SoC, and system levels. They will make heavy use of externally designed silicon and software intellectual property, they will tend to stay at mature process nodes, and they will invest heavily in embedded software development. They will become application-focused platform providers, not 'chip' providers."
As Mike Stellfox wrote in a recent blog post on the "UVM - 10 Years in the Making", from the beginning eRM was, and now UVM is, all about creating integration-ready Verification IP. It bears repeating that the key methodology concepts that are delivered by UVM include:
- Interface verification components as the base reusable building blocks
- A compartmentalized verification component architecture that separates the BFM/Driver from Sequence_driver/Sequencer and monitor, enabling scalability and easy programmatic control
- Common configuration options like active/passive agents, signal port maps, etc.
- Sequences for building constrained-random, reusable stimulus, with a common test-writer interface
- Debug messaging and logging and control of messages
By adhering to these principles, Verification IP creators can be confident of giving IP integrators the modular, reusable, scalable, machine-controllable, "plug and play" verification platform they must have in order to be successful.
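In e, those methodology concepts map onto a compact agent skeleton. The sketch below is illustrative only - the unit names are invented, and it assumes the uvm_e base types - but it shows the active/passive configuration knob gating the stimulus side of the agent while the monitor is always present:

```
<'
// Illustrative UVM e agent skeleton (unit names are invented).
unit my_agent_u like uvm_agent {
    // Common configuration knob: drive stimulus, or only observe
    active_passive : erm_active_passive_t;

    // The monitor exists in both modes, so a passive instance can
    // still check and cover traffic it did not generate.
    monitor : my_monitor_u is instance;

    // The BFM and sequence driver are instantiated only when
    // ACTIVE, keeping the passive footprint small and the
    // component truly "plug and play" for integrators.
    when ACTIVE'active_passive {
        driver : my_sequence_driver_u is instance;
        bfm    : my_bfm_u is instance;
    };
};
'>
```

An integrator flips a single constraint on active_passive to reuse the same component as a stimulus generator in block-level testing or as a pure checker at the SoC level.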
Finally, whether it's in the Comments section below, on Twitter, on Facebook, or at the many events throughout the year (like the upcoming DAC, June 14-17 in Anaheim, CA), Team Specman is eager to hear your feedback on EDA360, its relationship to your verification challenges, and how you see e/Specman serving you today and for all time.
Joe Hupcey III
For Team Specman
P.S. The philosophy of taking a generic approach to verification challenges is carried through to other aspects of the e language. One example is the e language's "infinity minus" approach to stimulus creation. Assuming advance knowledge of where the bugs are is one of the leading causes of verification escapes. Thus, in e all stimulus to the DUT is randomized unless otherwise specified, so you can unearth unimagined side effects and bugs.
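As a hypothetical illustration of "infinity minus," the packet below constrains only what is known to be illegal; every other field value and combination remains open for the constraint solver to explore:

```
<'
// Hypothetical packet: every field is random by default.
struct packet_s {
    kind : [SHORT, LONG];
    addr : uint(bits: 16);
    data : list of byte;

    // "Infinity minus": carve out only the known-illegal space,
    // rather than enumerating the cases you expect to matter.
    keep data.size() in [1..64];
    keep kind == SHORT => data.size() <= 8;
};

extend sys {
    run() is also {
        var p : packet_s;
        gen p;   -- everything not constrained above is randomized
        print p;
    };
};
'>
```

Contrast this with directed testing, which starts from zero stimulus and adds only the cases someone thought of - the exact opposite starting point.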