Remember DFT? “Design For Test” faded into the background in recent years as the industry turned its focus to DFM, but if anything test is an even larger concern than it was 10 or 15 years ago. That’s because test is becoming more difficult and expensive at nanometer process nodes, especially with the drive for low-power design and the increasing prevalence of on-chip analog and mixed-signal circuitry.
A recent discussion with Sanjiv Taneja, vice president for Encounter Test at Cadence, convinced me that the traditional way of evaluating test costs is far too limited. Test cost has typically been calculated in terms of capital costs and operating costs. Test engineers focus on optimizing throughput in order to minimize the time each IC spends on the tester. One way to do that is to minimize test data volume.
While minimizing time on the tester is still important, Sanjiv notes that there are additional criteria that must be considered to evaluate the true cost of test. These include:
Impact of power on yield and test cost. If a test program turns on all the power modes during test, it can result in very high switching activity and excessive IR drop. Good chips can fail on the tester, reducing yield and increasing costs.
Integration of DFT with design implementation flow. Synthesis tools should optimize for testability as well as area, timing and power. Otherwise, test structures can impact routability and timing closure. Poor design/test integration will decrease productivity and thus increase costs.
Analog/mixed-signal test. If analog IP takes up 20 percent of a system-on-chip, it probably accounts for over 50 percent of the test cost. Analog test often requires an expensive, manual approach. Sometimes built-in self test (BIST) is used, but this requires extra work and planning.
Cost of escaped defects. If you think test costs are high, what does it cost for a defective chip to escape detection until system test, or until it’s out in the field? Shipping bad parts to customers can not only kill budgets – it can kill companies.
Ramping to volume production. Process interactions and process variability make it difficult to ramp to volume production at 45 nm and below. Given today’s time-to-market concerns, a delayed yield ramp can be a huge expense.
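The criteria above suggest that a realistic cost model has to fold in more than tester time. As a rough illustration only (the function, parameter names, and numbers below are my own assumptions, not a Cadence formula), a per-die cost estimate might combine tester time, yield loss from test-induced failures, and the expected cost of escaped defects:

```python
# Hypothetical sketch of a "true cost of test" model. All names and
# numbers are illustrative assumptions, not an industry-standard formula.

def true_cost_of_test(
    tester_seconds_per_die,   # time each die spends on the tester
    tester_cost_per_second,   # amortized capital + operating cost
    good_die_fail_rate,       # good chips failing from test-induced IR drop
    die_value,                # value of a good die lost as false yield loss
    escape_rate,              # defective parts slipping past production test
    escape_cost,              # cost of a field return or system-test failure
):
    time_cost = tester_seconds_per_die * tester_cost_per_second
    overkill_cost = good_die_fail_rate * die_value
    escapes = escape_rate * escape_cost
    return time_cost + overkill_cost + escapes

# Toy numbers: escaped defects can dominate even when tester time is cheap.
print(round(true_cost_of_test(2.0, 0.03, 0.01, 5.0, 0.0005, 500.0), 3))  # 0.36
```

With these made-up numbers, the escape term (0.25) swamps the tester-time term (0.06), which is the point Sanjiv makes: optimizing throughput alone misses most of the cost.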
All of the challenges listed above impact designers, and all can be alleviated through EDA tools. For example, power-aware automatic test pattern generation (ATPG) can make the right tradeoffs between test power reduction and test time reduction. DFT automation tools can hook up BIST engines to a chip test interface and translate BIST set-up and run-time sequences to test interface ports. Advanced fault modeling and test-point insertion techniques can reduce the risk of escaped defects. And diagnostic EDA tools can speed yield ramps by figuring out the root causes of failures.
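To make the power-aware ATPG idea concrete: one well-known low-power test technique is to reorder test vectors so that consecutive patterns differ in as few bits as possible, which cuts switching activity (and hence IR drop) during scan. The greedy nearest-neighbor sketch below is only a toy; a production ATPG engine does far more, and the pattern strings here are invented for illustration.

```python
# Toy illustration of one power-aware test idea: reorder scan patterns so
# consecutive vectors differ in few bits, reducing switching activity on
# the tester. Greedy nearest-neighbor ordering; patterns are made up.

def hamming(a, b):
    """Number of bit positions in which two patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def reorder_low_power(patterns):
    remaining = list(patterns)
    ordered = [remaining.pop(0)]          # start from the first pattern
    while remaining:
        prev = ordered[-1]
        # Pick the unused pattern with the fewest toggles vs. the last one.
        nxt = min(remaining, key=lambda p: hamming(prev, p))
        remaining.remove(nxt)
        ordered.append(nxt)
    return ordered

patterns = ["0000", "1111", "0011", "1100", "0001"]
print(reorder_low_power(patterns))
# → ['0000', '0001', '0011', '1111', '1100']
```

In this toy case the reordered sequence needs 6 bit transitions between consecutive patterns instead of the original 15, without dropping any pattern, which is the kind of power/time tradeoff a power-aware ATPG flow automates.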
Cadence offers such capabilities in the Encounter Test product line, which is being shown at this week’s International Test Conference (ITC) in Austin, Texas. The ITC program, meanwhile, has a strong DFT emphasis. It includes a keynote and a plenary invited address that focus on the integration of design and test, as well as panel discussions on DFT for analog and low-power design. Cadence has representatives on both panels.
DFT has been around for a long time. I started writing about it in 1984 for Computer Design magazine, well before the term “EDA” was even invented. Here we are now, 25 years later, and it turns out that DFT is more important than ever. Some things never go out of style.