Has ESL – meaning Electronic System Level, not English as a Second Language – outlived its usefulness as a label that supposedly describes the next step forward for IC and systems design? “ESL” has become a vague term that applies to many different things. A more specific term, such as transaction-level modeling (TLM), gives us something we can understand and evaluate.
While I have written many articles about ESL over the years, I would be hard-pressed to define it. Some would say that ESL involves hardware/software co-development. But the best practical definition I can come up with, considering its use over the years by multiple vendors, is “anything that takes place above RTL.” ESL has grown to encompass such tools and technologies as:
- Algorithmic development tools that were there all along (SPW, Matlab)
- Virtual platforms for software and/or architectural development (Vast, Virtutech, CoWare, others)
- High-level synthesis (such as Cadence C-to-Silicon Compiler)
- Hardware/software co-verification tools (such as Cadence Incisive Software Extensions)
- SystemC TLM modeling for implementation and/or verification
All these technologies are important, especially as software development looms as the biggest single obstacle to getting electronic products out the door. While interrelated, all are different. There is no one ESL market or methodology. There are different markets with different users, many of whom still think “ESL” stands for English as a Second Language.
For many years ESL – and its predecessor, if anyone remembers ESDA – were handy ways of referring to a diverse set of technologies aimed at raising the abstraction level of IC design. But now that the underlying technologies are starting to take hold, it’s time to look underneath the all-encompassing “ESL” label.
Cadence this week is rolling out a TLM-driven design and verification solution. It includes enhancements to existing tools such as C-to-Silicon Compiler and the Incisive Enterprise Simulator, upcoming methodology guides and manuals, and services. Unlike ESL, TLM is a clearly understood term – there are even standard definitions provided by the Open SystemC Initiative. Many design and verification teams already use TLM in some form.
Not only is TLM easy to understand, but its benefits are clear and, in some cases, quantifiable. A TLM-based flow promises faster design creation and bug fixing, much faster simulation, fewer bugs, and better support for hardware/software co-verification. But perhaps the most compelling benefit is IP reuse; if you design IP at the transaction level, it is much easier to port to different micro-architectures.
Early attempts at ESL all too often tried to impose a new methodology or non-standard language from the top down, with little or no connection to the downstream design flow. The nice thing about TLM is that it’s an evolutionary step up that leverages the strengths of today’s design environments, rather than trying to replace those environments.
While incremental, the move to TLM puts us in a better position to drive innovations in other technologies that have fallen under the ESL umbrella – including algorithmic tools, virtual platforms, high-level synthesis, and hardware/software co-design and co-verification. For example, transaction-level models can help build virtual platforms. They can also be used in hardware/software co-verification. And high-level synthesis is a critical enabler of the TLM-based flow.
The move upwards in abstraction can start right now, with TLM-driven design and verification. All technologies identified as “ESL” will benefit. But as we move up, let’s speak in clear terms engineers can understand – there is no need to invent a “second language” to describe a natural, evolutionary process.