Charlie Huang is chief strategy officer and acting CTO at Cadence Design Systems. In this Q&A interview, he talks about his background and his role at Cadence, discusses technology development in such areas as verification, low power, ESL, and mixed-signal design, and explains Cadence’s emphasis on R&D.
Q: What was your background before joining Cadence?
A: I joined Cadence through Cadence’s acquisition of CadMOS in 2000. I was one of the co-founders there, and our products were in the signal integrity area. The flagship product was CeltIC, which is still in widespread use today.
Before that, I was the R&D manager for PathMill at Synopsys. I joined Synopsys through its acquisition of Epic in 1997. I joined Epic in 1990, right out of school, and led the development of three of its products – PowerMill, TimeMill, and PathMill. I received my PhD from Carnegie Mellon in 1990. My thesis was on model order reduction, which is widely used to deal with very large interconnects.
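The model order reduction he mentions replaces a huge linear network (such as an extracted RC interconnect) with a much smaller state-space model that matches the original's behavior near a frequency of interest. A minimal sketch of the idea in Python, using Krylov-subspace projection on an invented toy system (the matrix values and sizes are illustrative only, not any Cadence algorithm):

```python
import numpy as np

def arnoldi(M, v, k):
    """Orthonormal basis for span{v, Mv, ..., M^(k-1) v} (modified Gram-Schmidt)."""
    V = np.zeros((len(v), k))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(1, k):
        w = M @ V[:, j - 1]
        for i in range(j):
            w -= (V[:, i] @ w) * V[:, i]
        V[:, j] = w / np.linalg.norm(w)
    return V

# Toy RC-ladder-like model: x' = A x + b u, y = c @ x  (values invented)
n = 200
A = (-np.eye(n)
     + 0.3 * np.diag(np.ones(n - 1), 1)
     + 0.3 * np.diag(np.ones(n - 1), -1))
b = np.zeros(n); b[0] = 1.0
c = b  # driving-point response, as in interconnect analysis

# Moment matching at s = 0: project onto the Krylov subspace of
# (A^-1, A^-1 b) so the reduced model matches the leading moments
# of the transfer function H(s) = c (sI - A)^-1 b.
k = 10
Ainv = np.linalg.inv(A)
V = arnoldi(Ainv, Ainv @ b, k)
Ar, br, cr = V.T @ A @ V, V.T @ b, V.T @ c

def tf(Am, bv, cv, s):
    """Evaluate the transfer function cv (sI - Am)^-1 bv at frequency s."""
    return cv @ np.linalg.solve(s * np.eye(len(bv)) - Am, bv)

s = 0.01j
print(abs(tf(A, b, c, s) - tf(Ar, br, cr, s)))  # tiny mismatch near s = 0
```

The payoff is that downstream analysis works with a 10-state model instead of a 200-node network while preserving accuracy in the frequency band that matters.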
Q: What is your role at Cadence today?
A: As chief strategy officer, I help [CEO] Lip-Bu Tan and other management team members make sure that what we’re doing is coordinated and balanced and fits a strategy we can all agree upon. As acting CTO, I look at advanced technology and work with Andreas Kuehlmann, head of Cadence Research Labs, to determine whether a particular technology has a place at Cadence and assess its risks. Part of my work with [former CEO] Mike Fister was to assess risks in technologies, which combines a technical assessment with a view of market or business risks.
Q: What are the most important technology development initiatives underway at Cadence today?
A: One is low power. The low power effort is typical of the initiatives we plan in that it brings various parts of Cadence together to weave a comprehensive, integrated solution. Low power requires such a coordinated approach because power must be dealt with on an integrated basis – it’s no longer about which tool runs how fast. We have proven to ourselves that those “silos” people think Cadence has can come together to provide an integrated solution.
Q: What other technology efforts are important?
A: Enterprise verification is a major effort. Our view is that verification will grow faster than other segments of EDA. This includes traditional logic-level verification as well as the move up to system-level verification. This is not a “my Verilog is faster than your Verilog” type of thing – it’s about the whole methodology. There is no silver bullet.
We have lots of tools in the toolbox, such as high-speed multi-language simulation, advanced testbench technology, mixed-signal verification, and formal and semi-formal methods. We’re looking at how to do verification when there are many pieces of pre-built IP, how to deal with problems when IP blocks are ill-defined, and how to deal with system specifications that keep changing. Since verification is, in all practical terms, an infinite problem, how do you know where you are and when you’re done? We believe this space will grow quickly, even if the number of IC design starts continues its downward trend. In a sense, this is our biggest theme.
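The “how do you know when you’re done?” question is usually answered with functional coverage metrics driving constrained-random stimulus: random tests are generated under legality constraints, and coverage bins measure which behaviors have been exercised. A minimal, hypothetical Python sketch (the bus interface, constraints, and bins are all invented for illustration):

```python
import random

# Hypothetical bus-command interface: two operations, three legal burst lengths.
OPS = ["READ", "WRITE"]
BURSTS = [1, 4, 8]

def random_txn(rng):
    """One constrained-random transaction: legal op/burst, word-aligned address."""
    op = rng.choice(OPS)
    burst = rng.choice(BURSTS)
    addr = rng.randrange(0, 2**16, 4)  # constraint: 4-byte aligned
    return (op, burst, addr)

def run_until_covered(goal_bins, seed=1, max_txns=10_000):
    """Drive random transactions until every (op, burst) cross bin is hit."""
    rng = random.Random(seed)
    hit = set()
    for n in range(1, max_txns + 1):
        op, burst, addr = random_txn(rng)
        hit.add((op, burst))  # record which coverage bin this transaction hit
        if hit >= goal_bins:
            return n, hit
    return max_txns, hit

goal = {(op, b) for op in OPS for b in BURSTS}  # 6 cross bins
n, hit = run_until_covered(goal)
print(f"covered {len(hit)}/{len(goal)} bins in {n} transactions")
```

Coverage closure of this kind gives a quantitative “where are we” answer, even though it can never prove the infinite verification problem fully solved.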
Q: What about mixed-signal design and verification?
A: We have a very strong analog/mixed-signal design, analysis and verification capability. What we want to explore is how to synergistically integrate that with our SoC solution. Encounter and Incisive have their share of competitors, but our uniqueness is our very strong analog offering. There is no SoC today without analog content, and no system that doesn’t include some mixed-signal parts that need verifying.
Q: Do you envision a single system for analog and digital design?
A: I don’t. I think we will continue to have two flavors of platforms. Having a single cockpit is not something that adds tremendous value. But having two platforms that can share component technologies with each other is of value.
Q: There is a lot of competition today in the analog/custom design space. What is Cadence’s edge and how is Cadence responding?
A: Cadence’s edge, and burden, is its vast installed base. Having a large installed base allows one to fend off competitors easily, but it makes it harder for us to introduce new capabilities. On a niche or segment basis, there will be competitors who will try to exploit some weakness or special need. We need to either fight back or learn to co-exist.
We are putting a lot of effort into circuit simulation, including both fast MOS simulation and multi-threaded circuit simulation. In the last two years there’s been some strong competition. Instead of raising the white flag and rolling over, we gave the R&D guys a challenge, a budget, and a timeline within which things needed to get better. With the work we’ve done on Spectre Turbo and APS [Accelerated Parallel Simulator], we have managed to get back to par, or better, and be competitive in these fields again.
Q: How would you describe Cadence’s strategy in ESL?
A: ESL is an overloaded term. There have been many, many false starts in ESL. People may be more receptive this time, because traditional design methods are finally hitting the wall in terms of both design and implementation costs. If the industry can step up and reach the next level of innovation, designers will be able to design complex chips with 500 million or 1 billion gates.
We have a design front end with C-to-Silicon Compiler. We want to make that a strong solution for both control and dataflow logic. We want to use it as a launch point into both implementation and verification. We happen to have a fairly complete path from C to GDSII.
In verification, we need to be able to co-simulate with IP blocks at different levels. We want to bridge system design with hardware/software co-design and co-verification. We are investigating how prototypes, be they FPGA-based or virtual platforms, can be developed and linked with our Palladium and Incisive RTL verification.
Q: What are the primary problems at 32 nm and below, and how will Cadence respond?
A: Traditional pain points related to DFM may undergo non-trivial changes. We see more use of double exposure, and we see more use of regular design structures. Both will cause implementation tools to change. We also need to scale to multicore for capacity and run time, or adopt a hierarchical methodology, which will be very challenging for implementation tools.
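The multicore and hierarchical approaches he contrasts share a structure: partition the design into blocks, analyze each block independently (on a separate core, in the multicore case), then combine the block-level results at the top. A toy Python sketch under invented assumptions (the block names, delay numbers, and “top-3 delays as a path proxy” metric are all made up for illustration; real tools parallelize heavy C/C++ kernels, not Python):

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_block(block):
    """Stand-in for independent per-block analysis: a crude worst-path proxy
    built from the block's three largest cell delays."""
    name, cell_delays = block
    return name, sum(sorted(cell_delays, reverse=True)[:3])

# Invented partition of a design into blocks with made-up cell delays (ns).
design = {
    "cpu_core": [0.12, 0.30, 0.08, 0.25],
    "mem_ctrl": [0.40, 0.10, 0.22],
    "serdes":   [0.05, 0.15, 0.09, 0.31, 0.18],
}

# Fan the independent block analyses out across workers, then merge at the top.
with ThreadPoolExecutor() as pool:
    results = dict(pool.map(analyze_block, design.items()))

worst_block = max(results, key=results.get)
print(worst_block, round(results[worst_block], 2))
```

The hard part, which the sketch glosses over, is exactly what he flags: making the per-block abstractions accurate enough that the top-level combination step is still trustworthy.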
Q: Cadence just opened a new R&D center with around 700 engineers. Why make this kind of investment in such tough economic times?
A: We think we’re in a recession, not a permanent depression, and that our customers and industry will come back. We need to make sure we have the ability to serve our customers and maintain market leadership. R&D is the hardest to turn on and the hardest to turn off.
Q: Cadence took some criticism in the past for acquiring technology rather than developing it in-house. What’s the situation today?
A: We have proven to ourselves that when we face a tough and focused competitor, the answer is not to panic and necessarily do an acquisition. Given time and resources, we can fight back and prevail. Circuit simulation is an example. But, if we do see a very strong technology and it will take a long time to get there, we will make a make-versus-buy decision. There is no religious bias against “buy.” Last year we realized that verification IP was an important place to be, so we picked up the assets of three companies and quickly built critical mass for a VIP portfolio.
Q: Looking outside Cadence, what’s exciting in CAD research right now?
A: There is good research in SoC co-design and co-verification. It used to be very academic, but some of it is becoming more relevant. We see research institutes and customers investing in ground-breaking technologies like 3D ICs and FinFETs. Also, a lot of people are focusing on how to go beyond 22 nm. We spend a good amount of time tracking these issues. I think a lot of the problems people see today will become easier to solve.