System design and verification are part
of the RTL flow today, but a higher level of abstraction is now poised to enter
the IC design mainstream, according to Ran Avinun, marketing group director for
system design and verification at Cadence. In this interview he discusses
trends in hardware/software integration, prototyping, and transaction-level
modeling (TLM), and offers a perspective on the recent merger activity in the
virtual platform market. He also provides a preview of Cadence involvement in
next week's DVCon 2010 conference.
Q: Ran, how do you define "system"
design and verification?
A: There are two definitions for "system" in the electronics industry. One has to do with
a higher level of abstraction beyond RTL. The second definition points to system-on-chip
[SoC] and system-above-chip integration, including hardware and software.
Q: Why is it so important to bring
software into the design and verification process?
A: Recently there was a Toyota Prius problem where the antilock braking system [ABS]
gave drivers an "inconsistent feel" when
braking over rough or bumpy surfaces. The core problem is the electronic
interface (hardware and software) between the ABS and the regenerative braking
system, according to reports. This is just one example. In general our devices
are getting more complex, and each one of them has at least one processor, or perhaps
other programmable components. Each one of those processors comes with embedded
software. If you ignore software, you basically just do half of the job.
In the past, many devices or boards were designed only by hardware engineers, and the
software was not part of the hardware design. Integration was done after
tapeout when the SoC or board was shipped to the system integrator. This worked
well when you had 2-3 years in your design cycle and the design was relatively
simple. But when designs became complex, and design cycles shortened to 6-9
months, it didn't work well.
One of the
paradigm shifts we see is that more and more semiconductor and IP companies are
delivering the hardware together with the software to the system integrator. An
IP company needs to think about all the different configurations in which the
software will interact with the hardware and create regression tests to test
those scenarios. It creates a new set of challenges.
Q: In the hardware design world, we
have a metric-driven verification methodology along with formal verification
techniques. Can this type of formalized methodology be applied to software?
A: Software verification techniques today are very primitive, and are based on
pure code coverage or on methods used in hardware design 15 or 20 years ago. I
think there is a place for advanced verification techniques, but they probably
will not evolve directly in the pure software world. Instead, they will be
created and executed by the hardware developer who is integrating the embedded
software. There is room for growth in tools that address the creation of scenarios or use
cases to improve up-front verification for hardware and software. Our Incisive
Software Extensions product lets users take advanced verification and power
shut-off techniques that were previously applied only to hardware, and begin applying them
to both hardware and software.
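As a rough illustration of the idea (a generic sketch only, not the Incisive Software Extensions API; the class and scenario names are hypothetical), a metric-driven view of software means recording which software-visible scenarios a regression actually exercised and reporting the gaps, much as functional coverage is used on the hardware side:

    #include <iostream>
    #include <map>
    #include <string>

    // Conceptual sketch only; not a real product API.
    class ScenarioCoverage {
        std::map<std::string, int> hits_;   // scenario name -> times exercised
    public:
        void define(const std::string& name) { hits_[name]; }     // scenario of interest
        void sample(const std::string& name) { ++hits_[name]; }   // called when a test hits it
        void report() const {
            std::map<std::string, int>::const_iterator it;
            for (it = hits_.begin(); it != hits_.end(); ++it)
                std::cout << it->first << ": " << it->second
                          << (it->second == 0 ? "  <-- never exercised" : "") << "\n";
        }
    };

    int main() {
        ScenarioCoverage cov;
        cov.define("dma_transfer_during_power_shutoff");
        cov.define("interrupt_while_entering_low_power");
        cov.sample("dma_transfer_during_power_shutoff");  // only one scenario was hit
        cov.report();  // the uncovered scenario points to a missing regression test
        return 0;
    }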
Q: What do you see as the respective
roles for virtual platforms, FPGA-based prototypes, and emulation, and why is
there a need for all three?
A: While different products have different software stacks, the lower levels of the
stack normally have a higher hardware dependency. Acceleration and emulation do
a good job of handling the lower layers of the stack where you have this hardware
dependency. With FPGA-based prototyping, you start to address higher layers of the software stack. Virtual
platforms enable you to run higher-level applications.
On the hardware side there are three tradeoffs customers need to make - speed,
accuracy, and bring-up time. In many cases, bring-up time will be a function of
turnaround time. Acceleration and emulation address accuracy and bring-up time
very well, since they are based on existing RTL models and provide very good
automation and compile time.
FPGA-based prototyping provides better performance, but still has issues with bring-up
time, which is unpredictable. To some extent, FPGA prototyping is inaccurate -
in many cases it requires you to make changes to your design as you map it to
the hardware. Turnaround time and debug are also poor with FPGA-based
prototyping. Virtual prototyping provides high speed but not accuracy. Bring-up time is not good for
new designs, but for derivative designs where you already have the models, it
can be faster. The main issue with virtual prototyping is that you never know
if what you build represents your real design and implementation. This is an
issue that the industry and EDA vendors
will need to address in the future. While today most customers are using one or
two of these platforms, I expect all three will be used by customers in the future.
Q: There's been some recent merger
activity in the virtual platform market. Synopsys bought Vast and CoWare, and
Intel bought Virtutech. What's your perspective about this activity?
A: These acquisitions confirm that hardware/software integration is a significant issue
facing our customers. These acquisitions, however, only focus on one small
piece of the total solution. The hardware/software integration challenge will
not be solved by amassing a segregated collection of point tools.
Our approach is to address the problem by connecting the hardware and software
worlds to the design, verification and implementation methodologies and flows.
We provide a comprehensive TLM [transaction level modeling] driven design and
verification solution, including high-level synthesis and full system
integration using acceleration and emulation. Our flows and methodologies are
based on standards. As part of this flow, we provide customers with a single
verification environment, verification IP and a metric-driven verification
methodology for TLM, RTL simulation, acceleration, emulation, and prototyping.
Q: Some people may be hesitant to invest
in acceleration/emulation. Why is it so important for system design and verification?
A: Although many new IP blocks are created in
TLM, when it comes to SoC integration, most designers are still relying on their
RTL. If you want to bring up your system including hardware and software at RTL
pre-silicon, there is only one way to do that - hardware-assisted
verification. If you need accuracy for
hardware, and you need to bring up new complex designs in a short time frame,
emulation will definitely provide the fastest bring-up. If you don't have
acceleration/emulation you may miss the target because it takes you 6 or 9
months to bring up the environment.
FPGA prototyping is a good hardware/software integration solution for a sub-system,
but is not scalable. Once you get into larger designs, the time spent mapping and
bringing up the design is very long, and also the performance will degrade
significantly, maybe even to the point where you're running slower than
emulation. Also, debug visibility into the chip is very limited.
Many companies are using a combination of emulation and FPGA prototyping. We see
this happening in two ways. "Dual mode" is where they use each platform at
different phases of the design. "Hybrid mode" is where they're connecting FPGA
prototyping to acceleration/emulation and running them together at the same time.
Q: Cadence is putting a lot of
emphasis on a TLM-driven design and verification flow. TLM describes hardware.
Does TLM modeling help with software verification?
A: Absolutely. You can use a TLM model as a single source, and apply it to both a
virtual platform and to high-level synthesis in order to link to
implementation. It might be that as you do this, you generate two models - one
will be faster, another will have more details about implementation, but the
point is that you start from a single model that you can continue to update as
your golden source.
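As a simple illustration of the single-source idea, the sketch below is a generic SystemC TLM-2.0 model, not the Cadence flow itself; the module and register names are made up. The same behavioral body can be called directly by a virtual platform and, assuming it stays within a tool's synthesizable subset, handed to high-level synthesis:

    #include <systemc>
    #include <tlm>
    #include <tlm_utils/simple_target_socket.h>

    struct TimerModel : sc_core::sc_module {
        tlm_utils::simple_target_socket<TimerModel> socket;
        unsigned int count_reg;   // behavioral state shared by both uses

        SC_CTOR(TimerModel) : socket("socket"), count_reg(0) {
            socket.register_b_transport(this, &TimerModel::b_transport);
        }

        // Blocking transport: the protocol-level interface that software models
        // and testbenches call, instead of driving pin-level signals.
        void b_transport(tlm::tlm_generic_payload& trans, sc_core::sc_time& delay) {
            unsigned int* data = reinterpret_cast<unsigned int*>(trans.get_data_ptr());
            if (trans.get_command() == tlm::TLM_WRITE_COMMAND)
                count_reg = *data;               // register write
            else if (trans.get_command() == tlm::TLM_READ_COMMAND)
                *data = count_reg;               // register read
            delay += sc_core::sc_time(10, sc_core::SC_NS);  // approximate timing
            trans.set_response_status(tlm::TLM_OK_RESPONSE);
        }
    };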
Q: How does transaction-level
modeling help with the design and verification challenge?
A: TLM gives you a productivity improvement in both hardware design and verification.
With the transition to TLM, our goal is to provide our customers with a 3-10X design
productivity improvement, 2X shorter verification cycle, and 10X faster
exploration and architectural trade-offs. As you write a smaller amount of code
and keep a separation between your functionality and design constraints, your
initial design and especially IP re-use become faster. When it comes to
verification, you can simulate your design and debug your problems faster,
because you're debugging at the protocol level, and there are likely to be fewer
bugs since you're writing less code.
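For example, a transaction-level test can issue a read or write as a single function call instead of driving pins cycle by cycle. The hypothetical initiator below (generic SystemC TLM-2.0; the names are illustrative, not a specific Cadence component) shows what debugging at the protocol level looks like:

    #include <systemc>
    #include <tlm>
    #include <tlm_utils/simple_initiator_socket.h>

    struct TestInitiator : sc_core::sc_module {
        tlm_utils::simple_initiator_socket<TestInitiator> socket;

        SC_CTOR(TestInitiator) : socket("socket") {
            SC_THREAD(run);
        }

        void run() {
            unsigned int value = 42;
            tlm::tlm_generic_payload trans;
            sc_core::sc_time delay = sc_core::SC_ZERO_TIME;

            // One write transaction replaces many cycles of pin-level stimulus.
            trans.set_command(tlm::TLM_WRITE_COMMAND);
            trans.set_address(0x0);
            trans.set_data_ptr(reinterpret_cast<unsigned char*>(&value));
            trans.set_data_length(4);
            trans.set_streaming_width(4);
            trans.set_response_status(tlm::TLM_INCOMPLETE_RESPONSE);

            socket->b_transport(trans, delay);   // debug at the transaction level
            sc_assert(trans.is_response_ok());
        }
    };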
Q: What's needed to bring TLM into
the design and verification mainstream?
A: TLM is moving from the early adopter or missionary deployment into the mainstream, and
I think right now we're at the inflection point. I think there were two issues
that didn't allow this to happen in the past. One is that TLM models were not
created to be synthesized and implemented -- they were created only for
simulation and modeling, and therefore resulted in a discontinuity between TLM and RTL.
In order to
change this, TLM needs to have a link to mainstream logic design and
implementation. We're doing that with C-to-Silicon
Compiler by embedding RTL Compiler into it, and building a TLM to GDSII
flow. Now we need IP providers to start delivering IP in TLM. We see that
today, but only for modeling and simulation, not for implementation. I think
that will start to happen as the methodology becomes part of the mainstream flow.
Another point is that verification engineers were not part of this [TLM] environment. They
waited until the architecture was done and an RTL description was ready. We
have recently started to see large companies preparing their infrastructure in
order to extend their verification environment to TLM. This is a good sign,
showing us that we are at the tipping point of the change.
Q: How will Cadence participate in DVCon 2010, Feb. 22-25?
A: There is a SystemC day on Monday, February 22, and we will participate in multiple ways. Mike
McNamara [Cadence] and Mike Meredith of Forte will give a tutorial on the
SystemC synthesizable subset. On the same day, Brian Bailey will give a presentation
on TLM-driven design
and verification based on the Cadence methodology.
An OVM tutorial [Feb.
23] will show, for the first time, our OVM acceleration methodology applied to
our Incisive Palladium
products. I'm also going to participate in a panel [Feb. 25] that
will discuss how to minimize verification
time and effort. And Lip-Bu Tan [Cadence CEO] will be the keynote speaker [Feb.
24] for DVCon, and will talk about the new directions Cadence will bring to the industry.
[For further information about Cadence participation in DVCon, click here.]