One of the things I learned when Verisity purchased Axis was the difference in mindset between verification using emulation and verification using simulation. Emulators generally cost more, so companies have fewer of them; logic simulators cost less, so companies have more of them. Each technology has its pros and cons that I don't want to get into today, but one thing I learned from my Verisity friends was that verification could be improved by running massively parallel simulation. Sure, most projects have a suite of thousands of tests that run every day to make sure nothing is broken when changes are made, but that is not what they were talking about. Running more tests only makes sense if there is a way to create different results, and a way to gather all the results, identify the bugs, and measure what really happened across all of the simulations. The improvement came from actually finding new bugs that were triggered by a smarter verification environment. Verification tools like Specman and Enterprise Manager made all this possible. One result of the transition to constrained random stimulus generation, checking, and functional coverage has been a digital verification flow that typically uses a farm of machines to run many simulations in parallel, not just to cut down the time needed to run the same tests over and over, but to automatically create new tests that hit new bugs. Even small companies run hundreds of simulations at a time, and larger projects may run thousands. Recently, I heard one company is running hundreds of thousands of simulations in parallel. That is a very high license-to-engineer ratio.
Analog design and verification is something I know almost nothing about, but it seems to include simulation and a lot of inspection of waveforms. This has led to a flow where each engineer sits at the screen, simulates, inspects, and iterates until the design looks right. Under this flow the license-to-engineer ratio is one.
Recently, I was talking to someone who has been around Virtual Platforms for software development for a long time. He was lamenting that the Virtual Platform has not progressed from the analog model to the digital model described above. His vision for Virtual Platforms has always been massively parallel simulation. Features like checkpointing, dynamic scripting, reverse execution, and easy migration of simulations from one machine to another should all support this vision. My conclusion is that both analog and embedded software would like to move to the digital verification model, but neither is there yet. In the embedded software space, the same types of tools that enabled the transition in digital verification are needed: constrained random stimulus generation, checking, code coverage, and functional coverage.
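To make the methodology concrete, here is a minimal sketch of what constrained-random stimulus generation plus functional coverage looks like in miniature. This is purely illustrative Python, not any real verification tool's API; the constraints, coverage bins, and function names are all hypothetical. The key idea is that each "simulation" runs with its own seed, and coverage from many seeded runs is merged, just as a farm of parallel simulations would merge results.

```python
import random

def constrained_random_txn(rng):
    """Generate one random bus transaction under simple constraints.
    (Hypothetical constraints chosen for illustration only.)"""
    length = rng.choice([1, 4, 8, 16])           # constraint: legal burst lengths only
    addr = rng.randrange(0, 0x1000, 4)           # constraint: word-aligned addresses
    kind = rng.choices(["read", "write"], weights=[3, 1])[0]  # bias toward reads
    return {"length": length, "addr": addr, "kind": kind}

def run_test(seed, num_txns=100):
    """One 'simulation': generate stimulus and record functional-coverage bins."""
    rng = random.Random(seed)                    # unique seed per parallel run
    coverage = set()
    for _ in range(num_txns):
        txn = constrained_random_txn(rng)
        coverage.add((txn["kind"], txn["length"]))  # cross-coverage bin: kind x length
    return coverage

# Merging coverage from many seeds mimics gathering results from a farm
# of machines running simulations in parallel.
total = set()
for seed in range(20):
    total |= run_test(seed)

all_bins = {(k, l) for k in ("read", "write") for l in (1, 4, 8, 16)}
print(f"coverage: {len(total)}/{len(all_bins)} bins hit")
```

Each additional seed is a new test that may hit bins the others missed, which is why running more simulations genuinely finds more: the stimulus differs per run, and the merged coverage measures what actually happened.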
There is a great opportunity for Virtual Platforms to make a significant contribution to embedded software, just as the logic simulator did for digital design, but without a verification methodology the best the Virtual Platform can do is the one-per-engineer model, where every software engineer sits by the machine, runs and debugs code, and manually inspects the results until happiness is achieved. I'm sure a lot of the original Specman salespeople can tell stories about visiting engineers who would say, "I have a Verilog simulator and a good waveform debugging tool; what else could I possibly need?" It appears the Virtual Platform is playing the role of the logic simulator for embedded software engineers, but we know from history that having a simulator is not enough. It's the foundation on which everything else is built (as I describe in the Verification Hierarchy of Needs), but an entire verification ecosystem around the Virtual Platform is needed for it to reach its full potential.
Why am I bringing this up? Because I work for a tool vendor that needs to sell more licenses? Not really. Sure, more licenses are good, but customers aren't really interested in licenses; they are interested in results. It's intriguing that both the analog world and the embedded software world seem to be pushing in the same direction. While reviewing the similarities I came across a paper titled "Coverage-driven verification for mixed-signal systems". Some of the challenges relate to checking that an analog waveform or a C function is doing what it should. It's probably not as easy as checking digital signals for 1's and 0's or buses for hex values, but I'm confident these challenges can be overcome and that embedded software on the Virtual Platform will adopt the digital verification flow. I also hope the same thing happens to analog design and verification, but for now I'll keep focusing on the Virtual Platform and leave the analog part to my more than competent Cadence colleagues.