Back to School and Back to the Embedded Software Challenge

Filed under: System Design and Verification, virtual platform, DAC

The kids have a week of school in the rear view mirror and it's time to get back to the embedded software challenge.

Remember when every EDA vendor started saying "Verification is taking 70% of the time on every chip design project"? It seemed like every paper, article, and presentation started with this assumption. I don't know when it started, but it was already heavily used by 2004. Using your favorite search engine you can find all kinds of papers that open with something like this (I'm not sure if you will find any that I wrote, but you might). Grant Martin even wrote a Mythbuster blog entry on the topic last year that drew many responses.

I'm starting to think the next such cliché is upon us.

At DAC 2009, and in multiple places since, a graph has been making the rounds showing the rising design cost of embedded software in SoC projects. There are probably many references, but I'll share a few and readers can add more.

The first reference I remember was an article back in May by Richard Goering titled Meeting the Embedded Software Challenge. Richard cited data from International Business Strategies (IBS) showing how the overall dollars spent on a project split between hardware and software, and projecting that at 22 nm three quarters of the cost will go to software development. The article also referenced the International Technology Roadmap for Semiconductors (ITRS) 2007 report, which projects roughly the same split in 2012: three quarters of the dollars spent on software and one quarter on hardware.

I have seen multiple instances of the graph in question used in presentation materials. All are similar in that they show the rapidly increasing cost of embedded software in the design process. I don't want to bother with permission and copyright issues, so I'm not posting a picture of the graph here, but a public article is available titled Software-to-silicon verification at 45 nm and beyond by Tom Borgstrom and Badri Gopalan of Synopsys, published on EE Times July 13. The graph shows IBS 2009 as the source.

I'm not sure what the cliché tied to this graph will turn out to be; perhaps something like "embedded software is responsible for 70% of the cost of every chip design project".

To make real progress we need to dig deeper and see what this cost is made up of. Just showing the graph and assuming users need our tools to solve the rapidly growing embedded software challenge is not good enough. Mapping the money spent on embedded software onto the tasks engineers actually do would be far more helpful; a small code sketch after the list illustrates a couple of them. Things like:

  • coding new applications 
  • writing new device drivers
  • porting applications from other platforms 
  • writing diagnostic programs to test the hardware
  • testing that a device driver works with one or more hardware configurations
  • writing hardware abstraction layer code  
  • porting an OS to a new device
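
To make a couple of these concrete, here is a minimal sketch, in C, of the kind of code behind "writing hardware abstraction layer code" and "writing diagnostic programs to test the hardware." The device, its register offsets, and the expected ID value are all invented for illustration, and a plain array stands in for the memory-mapped hardware so the sketch compiles and runs anywhere; on a real SoC the HAL accessors would dereference volatile pointers at the device's base address.

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

#define REG_ID      0x00u      /* hypothetical device ID register           */
#define REG_SCRATCH 0x04u      /* hypothetical read/write scratch register  */
#define EXPECTED_ID 0xC0DEu    /* hypothetical ID value for this device     */

/* Stand-in for the hardware; on real silicon this would be the device itself. */
static uint32_t fake_device[64] = { [0] = EXPECTED_ID };

/* HAL layer: the only code that knows how registers are reached. */
static uint32_t hal_read32(uint32_t offset)              { return fake_device[offset / 4]; }
static void     hal_write32(uint32_t offset, uint32_t v) { fake_device[offset / 4] = v; }

/* Diagnostic: the kind of sanity test run first, ideally on a virtual platform. */
static int diag_check_device(void)
{
    if (hal_read32(REG_ID) != EXPECTED_ID) {
        printf("FAIL: unexpected device ID 0x%08" PRIx32 "\n", hal_read32(REG_ID));
        return -1;
    }
    hal_write32(REG_SCRATCH, 0xA5A5A5A5u);   /* write a pattern and read it back */
    if (hal_read32(REG_SCRATCH) != 0xA5A5A5A5u) {
        printf("FAIL: scratch register readback mismatch\n");
        return -1;
    }
    printf("PASS: device responds as expected\n");
    return 0;
}

int main(void)
{
    return diag_check_device() ? 1 : 0;
}

Keeping the HAL accessors separate from the diagnostic is exactly where the porting cost in the list comes from: when the hardware configuration changes, only the HAL layer should have to change, and the diagnostic should run unmodified on a virtual platform, an FPGA board, or silicon.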

Once a breakdown of the tasks engineers do is identified, we need to identify the main challenges they face; in other words, what would make them more productive if they had it. Things like:

  • Having a Virtual Platform earlier in the project
  • Using a better debugging tool to get to the cause of problems more quickly
  • Doing metric driven verification to better quantify verification and find more bugs, earlier
  • Analyzing how software and hardware interact to tune performance and identify bottlenecks

As I go about my daily work, I look for ways to learn more about what embedded software engineers are doing and the challenges they face. Only time will tell how EDA companies like Cadence can contribute to solutions that drive down the cost of embedded software, but next time you see this graph showing increases in software cost, ask questions (or give answers) about why the costs are increasing. Is it because software development lacks automation in some area and the manual tasks do not scale as the design size grows? Is it because multi-core design is making things more complex, and the added work of designing and testing multi-core software does not scale? Is it something else?

Remember, to improve we need to avoid general statements that are not true for all cases, like "all dogs have four legs", and work together on the specifics of how to improve productivity and save money.

Jason Andrews
