Should the end user's application experience, rather than "cool" features in silicon hardware, be the primary focal point of the system development process? That idea was explored earlier this year in the EDA360 vision paper, and it has become a hot topic of discussion. The discussion reached a new level at last week's ARM Technology Conference (ARM Techcon), where panelists tackled some challenging questions related to application-driven system development.
The panel was entitled "Are System Developers Ready for Applications to Drive and Define a New World Order?" It was moderated by Ron Wilson of UBM Electronics (EE Times). The panelists, left to right in the photo below, were:
- Ashok Mehta, senior manager, TSMC
- Peter Ryser, director of system integration and validation, Xilinx
- David Rusling, CTO, Linaro
- Vahid Ordoubadian, senior manager, Engineering Mobile Platform Group, Broadcom
- Mike McNamara, vice president and general manager, Cadence
Here are some of those challenging questions, and some answers:
Question 1: If the whole point of systems development is the user's application experience, are we evolving to the point where there is no reason to spend money on differentiating the underlying hardware, and where we'll produce some small, generic hardware platforms and focus all our efforts on software and firmware?
(Note: This is the same question that arose at the ARM-Cadence "fireside chat," as I described in a blog post last week.)
Hardware differentiation will not go away, Mehta said. "Dynamic power can be managed by software -- but leakage, not quite so," he said. "At that level foundry process improvements will be required." Another way hardware provides differentiation is through the use of sensors.
Software development is very important, but underlying differentiation in hardware is still needed, Ryser said. "If you can do that on the fly in a running system that is not fully fixed, I think you have a big advantage."
If we end up with a single hardware platform, the result will be software "with no trace of innovation," Rusling said. "The embedded and mobile space is one of incredible diversity and innovation, and from a software perspective you need to exploit all that diversity."
Mehta said he had a "little bit of a concern that if we are moving towards a software-driven world, there may be a scenario in which in order to be portable, software can only use the very common parts of the hardware. That will defeat the purpose of having advanced hardware." McNamara concurred: "The worst kind of flow is a least common denominator. If we say, 'I won't use this [hardware] feature until every single possible application supports it,' that's the worst case."
Question 2: If the object is to support the user's application experience, what does that mean for integration and verification?
"Today we have the ability to put together systems that model the environment we're going to get -- a virtual model of the end device," McNamara said. He noted the importance of modeling the entire system. You may have a component that can deliver 24 frames/second video on its own, but once it's placed into a system where it must share a small memory, it can no longer provide a good viewing experience.
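McNamara's point can be made concrete with a rough back-of-envelope sketch. All of the numbers below (frame size, bandwidth figures, competing traffic) are illustrative assumptions, not figures from the panel -- the point is only that a component rated for 24 frames/second in isolation can fall well short once other subsystems consume shared memory bandwidth:

```python
# Hypothetical sketch: a video block rated for 24 frames/s in isolation
# can be throttled once it shares memory bandwidth with other subsystems.
# All numbers below are illustrative assumptions.

def achievable_fps(frame_bytes, total_bw, other_traffic, standalone_fps=24.0):
    """Frames/s the video block can sustain from the leftover bandwidth.

    frame_bytes   -- bytes the block reads and writes per frame
    total_bw      -- total memory bandwidth of the system (bytes/s)
    other_traffic -- bandwidth consumed by other subsystems (bytes/s)
    """
    leftover = max(total_bw - other_traffic, 0.0)
    return min(standalone_fps, leftover / frame_bytes)

# A 1080p frame at ~4 bytes/pixel, read plus write: ~16.6 MB per frame
frame_bytes = 1920 * 1080 * 4 * 2

# In isolation (no competing traffic) the block easily hits its rated 24 fps.
print(achievable_fps(frame_bytes, total_bw=1e9, other_traffic=0))    # 24.0

# With 700 MB/s of competing traffic, it drops to roughly 18 fps.
print(achievable_fps(frame_bytes, total_bw=1e9, other_traffic=7e8))
```

This is exactly the kind of interaction that only shows up when the whole system -- not the component in isolation -- is modeled.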
"Models are great, but a lot of the time you still need physical interfaces you can validate," Ordoubadian said. There are "situations in which you want some kind of emulation environment, where you can run your code and make sure the way you were thinking is how it is going to work."
Ryser pointed out the difficulty of debugging end applications that are built on top of layers of hardware and software. "When the application developer has a problem, how does he figure out what the real problem is?" The solution, Ryser said, is to "build on top of components that are very well defined."
Question 3: How can we balance everyone's love of diversity and competition versus the fact that every new design is going to cost something like $50 million to launch and 90 percent are going to fail?
Mehta put it bluntly. "If there is no diversity there will be no innovation. The bottom line is that we have to reduce the $50 million to $100 million cost to develop a new system." He added that "at every startup I ever worked at, I had to reinvent the wheel. There are no industry standard methodologies you can pick up and use. If we want diversity, we have to reduce that $50 million cost to $20 million, and I can assure you that you just spent $20 to $30 million redoing what every other startup company is doing next door."
Question 4: When applications rather than cool hardware features drive business, are there fundamental changes in how you do business?
"The cost of software development is now much higher than hardware because the teams are bigger," Ordoubadian said. "What we have learned is that when you're dealing with complex SoCs, you can have a ratio between hardware and software that ranges from 1-to-4 to 1-to-10."
Mehta said it isn't strictly true that applications drive hardware, or that hardware drives applications -- "it's a two-way street." If you put a sensor on a piece of silicon, you enable hundreds of new applications, and in this instance hardware is driving the applications. On the other hand, Intel's MMX instruction set extensions were driven by a demand for the processor to handle multimedia, so "in that case the application drove the hardware."
This is exactly the kind of discussion the industry needs to be having. Thanks to ARM Techcon for providing this opportunity.
Other Cadence blog coverage of ARM Techcon:

Steve Leibson, EDA360 Insider:
- Realizing the ARM Cortex-A15: What does the road to 2.5GHz look like?
- ARM and Cadence: Playing with fire
- ARM Cortex-A15 - does this processor IP core need a new category...Superstar IP?
- The era of superintegration: The Marvell and ARM story - more than one billion chips served

Richard Goering, Industry Insights:
- ARM Cadence Fireside Chat: Hardware Differentiation in an Apps-Driven World
- ARM Techcon: IBM Speaker Outlines Path to 22nm and Beyond