Early software development on software virtual prototypes is
a great capability, but at some point hardware/software integration requires
the accuracy that only real hardware can bring. When that occurs, there are
three choices - acceleration, emulation, and FPGA prototypes. Even though we
are accustomed to seeing acceleration and emulation as almost the same thing,
each of these three choices is different, and each has a distinctive role to
play.
I have these thoughts as Cadence rolls out the Palladium XP,
a verification computing platform that unifies acceleration capabilities from
the Xtreme product line with Incisive
Palladium emulation, incorporating some of the strongest capabilities of each
platform. You can read
the press release here, but in this blog I'll look at the larger story
behind the announcement. Why put acceleration and emulation in a single
environment? What role does either play in hardware/software integration? And
how do we define "acceleration" and "emulation," anyway?
I put these questions to Ran Avinun, product management
group director for system design and verification at Cadence. The definitions are not
as straightforward as one might think. Ran described them as follows:
With acceleration, the design is typically running on the hardware while a
simulation testbench runs on the workstation. If the testbench is
transaction-based, the result is transaction-based acceleration, or as
some say, co-emulation.
With emulation, the entire design and verification environment is generally
running on the hardware. In addition to the hardware in the emulation box,
portions of the design or the testbench may also be running on external
target hardware through in-circuit emulation.
You can see from the above that there's a fine line between
these definitions. But there's still a distinction. With acceleration, you are
speeding up a simulator, and you should have most or all of the benefits of the
simulator. For example, the Palladium XP, which is optimized to accelerate the
Incisive Enterprise Simulator (but can support other simulators as well),
supports such features as executable verification plans (vPlans), metric-driven
verification with OVM-based testbenches, pseudo-random test generation, and
coverage metrics in acceleration mode. Metrics can be collected and analyzed by
the Incisive Enterprise Manager.
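To make the transaction-based acceleration idea concrete, here is a minimal conceptual sketch in Python. The testbench side (stimulus generation, self-checking, coverage collection) runs on the workstation, while a plain function stands in for the design executing in the accelerator. All class and function names here are illustrative inventions, not Cadence or OVM APIs:

```python
import random

class WriteTxn:
    """One whole transaction, the unit exchanged between testbench and hardware."""
    def __init__(self, addr, data):
        self.addr = addr
        self.data = data

def dut_proxy(txn, memory):
    """Stand-in for the design running in the accelerator: apply one write."""
    memory[txn.addr] = txn.data
    return memory[txn.addr]

def run_testbench(num_txns=100, seed=1):
    """Workstation-side testbench: pseudo-random stimulus plus coverage."""
    random.seed(seed)                 # pseudo-random test generation
    memory = {}
    covered_addrs = set()             # a toy functional-coverage metric
    for _ in range(num_txns):
        txn = WriteTxn(addr=random.randrange(16), data=random.randrange(256))
        # Self-checking: the response must match what was driven.
        assert dut_proxy(txn, memory) == txn.data
        covered_addrs.add(txn.addr)
    # Fraction of the 16 address bins hit by the random stimulus.
    return len(covered_addrs) / 16

coverage = run_testbench()
```

The point of the sketch is the division of labor: because only whole transactions cross the workstation/hardware boundary, the testbench keeps its randomization and coverage machinery even though the design itself runs elsewhere.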
Emulation generally provides a self-contained verification
environment. It provides extremely fast, testbench-independent verification
speeds - up to 4 MHz with the Palladium XP - and offers fast bring-up times. It
lets you plug in real-world hardware and run real-world applications. Emulation
does not, however, provide the stimulus randomization capability of simulation
and acceleration, and coverage metrics are limited to the design since the
testbench is ported into the hardware.
In the hardware/software integration process, therefore, you
might start with a software virtual prototype, and then find you need more
accuracy to debug hardware/software interactions or run a power analysis. You
can first move some portions of the design into hardware while running the
verification environment on a workstation. This is "acceleration."
The next step would be to move everything including the
testbench into hardware, at which point you are running "emulation." Now you
have fast execution speeds and the ability to directly run applications on
target hardware. You can also run the environment in a hybrid (acceleration plus
emulation) mode in which a design or testbench runs on the workstation while
the emulator is connected to a target. A continuum flow that combines
acceleration and emulation is thus an important feature.
What about FPGA prototypes?
Can FPGA prototypes replace accelerators and emulators? FPGA
prototypes are, after all, much faster. But as Ran noted in a recent
Q&A interview, bring-up time is unpredictable and accuracy is
questionable, since the design may change as it's mapped to FPGA hardware.
Design changes are often required because partitioning and clocking are not fully
automatic processes. Turnaround time and debug may also be poor with FPGA
prototypes.
The diagram below shows how virtual prototypes,
acceleration/emulation, and FPGA prototypes address hardware/software
development and integration. Virtual prototyping is most useful for high levels
of the software stack that have little or no hardware dependency. FPGA-based
prototyping makes it possible to move further down the stack.
Acceleration/emulation is the best solution for the lowest parts of the stack,
which typically have a strong hardware dependency. Accuracy and short bring-up
times, which are the strong points of acceleration/emulation platforms, are
critical at this level.
However, emulation and acceleration may also be
used earlier in the process, while FPGA-based
prototyping is used later in the process, once most of the hardware bugs have
been found and fixed.
A verification computing platform that brings together the best
capabilities of both acceleration and emulation is thus an important tool for
hardware/software integration.