Device driver software is an essential part of any system-on-chip offering. But who develops and verifies this software, and what tools and methodologies do they use? This is an increasingly vexing question for many design teams, but it's absolutely critical as the industry moves toward application-driven design.
Why are drivers so important? Because they provide the link between a software application and the hardware it runs on, allowing the OS and the application to manage hardware resources. Application developers today don't want to have to know about special features in the hardware. They rely on a hardware abstraction layer that includes drivers to hide those details. If the drivers don't present hardware features accurately to the OS and the applications, the applications can't use those features, and all the hard work that went into differentiating the hardware is wasted.
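To make the abstraction idea concrete, here is a minimal sketch in C of how an OS-facing driver interface typically works. All names (`char_driver_ops`, `hw_uart_write`, and so on) are invented for this illustration, and the register access is replaced by a byte counter so the sketch is runnable; the point is only that applications call through a generic operations table and never touch hardware registers directly.

```c
#include <stddef.h>
#include <stdint.h>

/* The abstraction layer: the OS and applications see only this table.
   (Hypothetical names, loosely modeled on common character-driver designs.) */
struct char_driver_ops {
    int (*open)(void);
    int (*write)(const uint8_t *buf, size_t len);
};

/* Device-specific implementation. A real driver would write to
   memory-mapped registers here; this sketch just counts bytes. */
static size_t bytes_sent;

static int hw_uart_open(void) {
    bytes_sent = 0;
    return 0;
}

static int hw_uart_write(const uint8_t *buf, size_t len) {
    (void)buf;
    bytes_sent += len;   /* real driver: push bytes into the TX FIFO register */
    return (int)len;
}

/* The driver exposes the hardware's capabilities through the generic table. */
static const struct char_driver_ops uart_driver = {
    .open  = hw_uart_open,
    .write = hw_uart_write,
};

/* An application (or OS layer) writes through the ops table, with no
   knowledge of the underlying device. */
static int app_send(const struct char_driver_ops *dev,
                    const uint8_t *msg, size_t len) {
    if (dev->open() != 0)
        return -1;
    return dev->write(msg, len);
}
```

If `hw_uart_write` fails to expose some differentiating feature of the hardware (say, a DMA fast path), nothing above it in the stack can ever use that feature, which is exactly the risk described above.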
Drivers Tacked On Later
So how are software drivers typically developed? The EDA360 vision paper notes that hardware is typically built first and drivers are tacked on later. This leaves two bad choices. One is an expensive, custom driver development effort, typically by a software person with limited hardware knowledge (or a hardware person with limited OS knowledge). The other choice is the procurement of a generic device driver that won't reflect the unique capabilities of the hardware.
For these reasons, the vision paper suggests that device drivers be provided as part of an IP "stack" that also includes design IP, verification IP, and design constraints. Optimally, the drivers are specified and designed in tandem with the rest of the IP stack. This eases IP integration, avoids redundant driver development efforts, and makes it easier for the OS to directly control hardware resources such as memory bandwidth to meet the needs of the application.
Verifying Device Drivers
Who verifies the drivers, and how, is another tough question. Techniques such as metric-driven verification (MDV), coverage metrics, and constrained-random test sequence generation are widely used in hardware verification, but are virtually unknown in the software world. However, these techniques would be extremely helpful in tracking down tough bugs that may have their origins in the interface between hardware and software.
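For readers coming from the software side, here is a minimal sketch in C of what two of those hardware-verification techniques look like when applied to driver testing: constrained-random stimulus (random operations, restricted to legal sequences) and coverage metrics (bins that record which behaviors the test actually exercised). All names are invented for this illustration, and the "device" is reduced to a single open/closed state flag; real MDV flows use far richer constraint solvers and coverage models.

```c
#include <stdbool.h>
#include <stdlib.h>

/* Hypothetical driver operations to randomize over. */
enum op { OP_OPEN, OP_WRITE, OP_CLOSE, OP_COUNT };

static bool covered[OP_COUNT];   /* coverage metric: one bin per operation */
static bool dev_open;            /* device state used as a constraint */

/* Constrained-random generation: pick a random op, but reject illegal
   ones (no WRITE/CLOSE before OPEN, no double OPEN). */
static enum op next_op(void) {
    for (;;) {
        enum op op = (enum op)(rand() % OP_COUNT);
        if (op == OP_OPEN && !dev_open)
            return op;
        if (op != OP_OPEN && dev_open)
            return op;
    }
}

static void apply(enum op op) {
    covered[op] = true;              /* record coverage as stimulus is applied */
    if (op == OP_OPEN)  dev_open = true;
    if (op == OP_CLOSE) dev_open = false;
    /* a real test would also call the driver-under-test here and
       check its responses against a reference model */
}

/* Run one random sequence and return how many coverage bins were hit. */
static int run_sequence(unsigned seed, int steps) {
    srand(seed);
    dev_open = false;
    for (int i = 0; i < OP_COUNT; i++)
        covered[i] = false;
    for (int i = 0; i < steps; i++)
        apply(next_op());
    int hits = 0;
    for (int i = 0; i < OP_COUNT; i++)
        hits += covered[i];
    return hits;
}
```

The "metric-driven" part is the feedback loop: if `run_sequence` reports unhit bins, the constraints or seed are adjusted until coverage closes, rather than declaring victory after some fixed number of directed tests.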
A newly-published Cadence technical paper suggests a new approach. It shows how Incisive Software Extensions can extend the Cadence Incisive simulation environment for driver verification and debugging, bringing techniques such as MDV and random sequence generation into the software domain. One result is a unified hardware/software debugging environment that can be used by both hardware and software engineers.
The real point of this post, however, is that software device driver development and verification should no longer be thought of as something that is "outside EDA." It is very much a part of silicon design. In fact, if it's not done well, all those neat features you put in silicon won't matter much at all.