2012 CES: Top 3 Trends Impacting EDA This Year

Filed under: Formal Analysis, DAC, ARM, DVcon, formal, EDA360, Joe Hupcey III, 14nm, CES, apps, OLED, Intel, LG, OLED 3D, Consumer Electronics Show, TV, CES2012

For years now, consumer electronics have driven (nay, saved) the EDA industry.  Hence, many events at last week's annual Consumer Electronics Show (CES) in Las Vegas can be read as leading indicators for the EDA business.  While I couldn't personally attend CES this year, I had two trusted agents on the ground (specifically, Unified Communications (UC) expert David Danto of Dimension Data, and Joseph Hupcey Jr., video & communications systems architect and father of yours truly) to field-check the myriad reports streaming in from legacy and new media.  Thus, allow me to highlight three trends from CES 2012 that I suggest will have a big impact on EDA this year.

1. TV's rapid evolution: much like the equally dramatic and lucrative transition from feature phones to smartphones, virtually all TVs shown at CES were "smart."  Specifically, as the EDA360 vision paper foretold, it's no longer enough for TVs to support passive visual entertainment.  Instead, the TV of today must support a library of interactive apps, coupled with some sort of digital video recording (DVR) solution, to be viable in all but the most niche markets.  That said, EEs and physicists alike should take comfort that raw hardware can still deliver its own differentiating punch: my agents reported several very good "no glasses" 3-dimensional displays, with best of show going to LG's prototype OLED 3D display.  Quoting David Danto's eyewitness account:

"Hands down - no contest, my pick hit of the show is LG's new 55 inch OLED 3D display. ...  I've been evaluating displays for most of my 30 year career and I've never seen anything that looked as good as this.  First of all, it uses LGs passive glasses technology.  No electronic shutters, no charging, no feeling like a reject from Tron.  Then, the actual images sacrifice nothing to achieve the 3D effect - bright, vivid, lifelike colors, deep blacks, true whites.  Then, to top it all off, it is so thin that from the side the thing is practically invisible."

(Note: from my experience, I agree that no photo or contemporary video can do a well-sourced OLED justice -- to give you some idea of the bold visual character of an OLED's imagery, try to imagine a living magazine animated with the rich, saturated colors you get from the best dye-sublimation printers.)

So what's the bottom line for EDA from all this TV innovation?  In a word, growth.  How?  This new generation of TVs delivers a visibly different experience than the best "dumb" TVs of even a few years ago -- they will inspire new sales and upgrades.  Granted, you might be able to extend the "useful" life of a good-yet-dumb screen via web-enabled boxes like TiVo DVRs and Blu-ray players (as I'm doing with my four-year-old Sony Bravia LCD).  But either way -- via new TVs or new set-top boxes -- many more SoCs and peripheral ICs will be shipped.

2. ARM vs. Intel competition: While ARM is justifiably proud of the plethora of sockets it supported across the show floor, Intel fired a substantial broadside with the announcement that it will supply Atom-architecture processors to Motorola Mobility (read, "soon to be Google," and all the Android market reach that implies) and Lenovo (read, "China and rest-of-world!").  In a provocative article, "How Intel's Medfield Will Dismantle ARM," author Sebastian Anthony asserts that these announcements are only the opening salvo, and that Intel will be able to leverage its research and manufacturing strength to "physically" outpace competitors stuck on bigger nodes.  Indeed, while ARM has a lot of momentum, to say the least, Intel's tantalizing hints last December that it has 14 nm devices working in the lab give credence to Anthony's aggressive theories.  As for me, I'll make the oh-so-daring prediction that whoever can deliver the best MIPS/watt by a margin of 50% or more will be the ultimate victor.
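
(For concreteness, here is a minimal sketch of how that "50% or more" MIPS/watt criterion could be checked.  The chips and numbers below are purely hypothetical illustrations, not benchmark data from any CES announcement.)

    # Illustrative only: hypothetical (MIPS, watts) figures, not real benchmark data.
    def mips_per_watt(mips, watts):
        """Performance-per-power figure of merit used in the prediction above."""
        return mips / watts

    def wins_by_margin(challenger, incumbent, margin=0.5):
        """True if the challenger beats the incumbent's MIPS/watt by at least `margin` (50%)."""
        return mips_per_watt(*challenger) >= (1.0 + margin) * mips_per_watt(*incumbent)

    chip_a = (20000, 1.0)   # hypothetical SoC: 20,000 MIPS at 1.0 W  -> 20,000 MIPS/W
    chip_b = (33000, 1.2)   # hypothetical SoC: 33,000 MIPS at 1.2 W  -> 27,500 MIPS/W

    print(wins_by_margin(chip_b, chip_a))  # False: 27,500 < 1.5 * 20,000

In other words, being merely better isn't enough under this prediction; the winner has to clear the incumbent's efficiency by half again as much.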

3. The future of CES itself, and lessons for DAC: despite record-breaking attendance, my agents confirm that there was a palpable undercurrent of dissent -- fed by Microsoft's announcement that this was its last CES -- that the value of CES has diminished and/or is no longer aligned with the needs of the industry.  This frustration over the misalignment between show management and attendees triggered a personal flashback to the very last COMDEX (despite it being a very productive show for my company, where we sold hundreds of eyemodules in the temporary Handspring store between the exhibit halls).

Suffice it to say there has been a lot of punditry about the future of the Design Automation Conference (DAC) over the past several years.  Aside from concerns about maintaining a forum for startups to introduce themselves, what seems to be missing are customers coming forward en masse to demand that the show go on.  To wit, in past years major semiconductor and systems houses would send dozens of front-line engineering managers and developers, CAD staffers, and R&D executives.  Now you are lucky to see those same companies send a lone VP accompanied by one or two CAD managers.  And from the vendors' point of view, is it all worth it, in terms of the return on marcom dollars or the more abstract "branding" value?  Time will tell ...

Joe Hupcey III

On Twitter: http://twitter.com/jhupcey


P.S. Speaking of trade shows, in the verification space the annual DVCon's clear focus on functional verification technology and methodology has made it a growing, high-value technical and trade forum.  Hence, my colleagues and fellow bloggers will be there in force!  In particular, I welcome you to join me at the tutorial I'm hosting, entitled "Using 'Apps' to Take Formal Analysis Mainstream," on Thursday, March 1 from 8:30 am to noon (coffee and lunch included).  Register today!

 

Comments (2)

By Anu Bohra on January 17, 2012
Hi Joe,
Also, while following EDA industry trends I have understood that, on the chip design and innovation side, the focus over the last 5 years has shifted more toward chips used in smartphones and tablets than toward laptops and desktops.  That is one of the reasons Qualcomm is doing well.  Any comments?
Further I read the abstract of the tutorial that you are hosting. Great going with collaboration with Nextop and Oski!
Cheers
Anu

By Joe Hupcey III on January 26, 2012
Hi Anu,
I completely agree that smartphones and tablets are a huge, if not the #1, driver of semiconductor consumption in general, and will outpace "traditional" laptops and desktop computers.  However, since so much has already been written on that, I wanted to highlight new TV tech as an equally impressive growth driver.
Thanks for the note on the upcoming DVCon tutorial -- I hope you can attend!
Joe
