Contributing Editors: Peggy Aycinena, Richard Goering, Geoffrey James, Gary Smith
Editor-in-Chief: Gabe Moretti
Vol. 3 / Issue 2 -- October 4, 2007
IN THIS ISSUE:
Richard Goering talks about the multicore wave
DAC is not just an event: it is a continuing process. As one conference ends, another begins. Committees are reconstituted, post-event evaluations are made, a new budget is planned, and work immediately begins toward the next conference. Each DAC is the product of new industry conditions and past experience. The professionals and volunteers who work on DAC face the challenge of keeping what works well, avoiding what did not, and finding solutions to the new set of challenges ahead.
Gary Smith shows that the industry has been only partially successful in providing a design environment that supports the joint architecting of hardware and software blocks. More needs to be done, including finding a way to provide a common communication and networking environment for professionals in both fields. DAC needs to attract more software vendors and more software engineers in order to establish a discourse that will yield a common design environment. We are also seeing leading-edge companies apply architectural techniques to analog design issues when targeting very deep submicron process nodes. Although some tools to enable such planning exist, more are needed. The job of developing and implementing ESL is not finished, and the DAC environment is ready and able to help.
Richard Goering's article identifies the most salient aspects of the multicore design challenge. Although parallel programming has been part of computer science curricula for many years, the development of computing products has, with few exceptions aimed at the small supercomputing segment, been directed until very recently at improving the speed and capacity of sequential digital processing engines. Creative programming teams have adopted available tools to develop parallel execution architectures, but they represent a very small portion of both systems and application programming professionals. New tools are required to foster productive multicore development methods. The 45th DAC can provide a unique forum to explore issues of multicore systems design and use.
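To make that sequential-to-parallel shift concrete, here is a minimal sketch of our own (not drawn from Goering's article): the same vector sum written as an ordinary loop, then split across cores with an OpenMP directive. The array size and the choice of OpenMP are illustrative assumptions.

    /* Minimal sketch: a vector sum, first filled serially, then summed
     * in parallel across cores with OpenMP. Build with: cc -fopenmp sum.c */
    #include <stdio.h>
    #include <omp.h>

    #define N 1000000

    static double a[N];

    int main(void) {
        double sum = 0.0;

        for (long i = 0; i < N; i++)
            a[i] = 1.0;                 /* fill with known values */

        /* A sequential engine walks this loop one element at a time;
         * a multicore part can split the iterations across cores.
         * The reduction clause keeps the partial sums race-free. */
        #pragma omp parallel for reduction(+:sum)
        for (long i = 0; i < N; i++)
            sum += a[i];

        printf("sum = %.0f using up to %d core(s)\n",
               sum, omp_get_max_threads());
        return 0;
    }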
DAC and the DACeZine are tools you can use to grow your awareness of the industry and the possibilities it offers. I hope the publication can stimulate new ideas and new approaches. Continue to write: your feedback helps us improve. Send your letters to: email@example.com.
And tell your friends to subscribe to DACeZine as well -- it is a very good way to get ready for the next DAC in Anaheim. They can do so by visiting the www.dac.com web page.
Another vote for support of software development, verification and integration.
Hi Gabe et al.,
Great to see the DACeZine! The articles and layout look good.
I am also glad to see Richard as a contributor. I do, however, have an issue with “intelligent testbenches.” I do not believe that the goal should be tools that help assemble, run, and show coverage of testbenches. Our goal should be to get software running on the hardware/system as soon as possible. Having the software run as intended is often more important than a bug-free design.
Our EDA industry likes to stop at the chip/board delivery stage. Yet a product is a combination of software and hardware, and programming is the critical path. We must facilitate the product, not just the verification.
The software industry has been trying to shorten its schedules, and its efforts seem to parallel this intelligent-testbench concept. It is called “programming by intention,” and the software industry has been trying to achieve this programming model for years. I personally believe we will never achieve this goal. As Bjarne Stroustrup said, “Programming is a human activity. Forget that and all is lost.” With due respect to Ruby on Rails, the software industry has not achieved “intention-based programming,” and we in the EDA industry should learn from that.
So what to do? As Frederick Brooks said, “There is no silver bullet.” The answer lies somewhere in the connection between software and hardware simulation/emulation and, even more importantly, in the connection between the hardware and software teams.
Let's Not Miss the Multicore Wave
The move to multicore ICs with dozens or hundreds of processor cores may be the biggest single design challenge of the next ten years, but is the design automation community ready? Presentations at the recent Design Automation Conference and elsewhere suggest there is much to do – and yet visible action is still lacking.
At a session on thousand-core chips at this year's DAC, Intel's Shekhar Borkar noted that ICs will have an integration capacity of 100 billion transistors by 2015. This will easily allow chips with a thousand processor cores, he said, with the potential for near-linear performance speedup and substantial power savings.
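As a back-of-envelope check of that claim (ours, not Borkar's), Amdahl's law gives the speedup on n cores as 1 / ((1 - p) + p/n), where p is the fraction of the workload that runs in parallel. The sketch below tabulates it for an assumed p of 99.9 percent, showing that "near-linear" holds only while the serial fraction stays tiny.

    /* Back-of-envelope check (ours, not Borkar's) of the "near-linear
     * speedup" claim, using Amdahl's law:
     *     speedup(n) = 1 / ((1 - p) + p / n)
     * where p is the parallelizable fraction of the workload. */
    #include <stdio.h>

    int main(void) {
        const double p = 0.999;          /* assumed: 99.9% parallel */
        const int cores[] = {2, 16, 128, 1000};

        for (int i = 0; i < (int)(sizeof cores / sizeof cores[0]); i++) {
            int n = cores[i];
            double s = 1.0 / ((1.0 - p) + p / n);
            printf("%4d cores -> %6.1f x speedup\n", n, s);
        }
        return 0;
    }

Even at 99.9 percent parallel, a thousand cores deliver roughly a 500x speedup, which is why the serial fraction, not the raw core count, becomes the limiting design concern.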
In the same session, IBM's John Darringer spoke about the design automation challenges posed by multicore ICs. Noting the requirement for innovation in system-level design, Darringer spoke about the need for three enabling technologies: physical architecture design, integrated early analysis, and multicore verification.
To facilitate chip integration, Darringer said, physical architecture must become more automated, borrowing techniques from "extended synthesis." Designers also need early analysis tools to determine which cores, accelerators, interconnect schemes, and memory hierarchies to employ in a multicore system-on-chip (SoC). These tools need better links to physical design. And system verification gets more complex when you bring in multiple cores, asynchronous links, and memory and network contention. Multicore verification requires a high level of specification and a strong reuse environment.
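To see one flavor of the contention problem in concrete terms, here is a toy illustration of our own (not from Darringer's talk): two threads increment a shared counter with no synchronization, a data race whose result varies from run to run and which only surfaces once multiple cores genuinely execute concurrently.

    /* Toy illustration (ours, not Darringer's) of multicore contention:
     * two threads do unsynchronized read-modify-write on one counter.
     * Build with: cc race.c -lpthread */
    #include <pthread.h>
    #include <stdio.h>

    static long counter = 0;             /* shared and unprotected */

    static void *bump(void *arg) {
        (void)arg;
        for (int i = 0; i < 1000000; i++)
            counter++;                   /* not atomic: load, add, store */
        return NULL;
    }

    int main(void) {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, bump, NULL);
        pthread_create(&t2, NULL, bump, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);

        /* Expected 2000000; with a real race the printed value is
         * usually smaller and changes from run to run -- behavior a
         * single-core model would never expose. */
        printf("counter = %ld\n", counter);
        return 0;
    }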
What I found most interesting about Darringer's talk is that I haven't heard established EDA vendors say much about these challenges. You'd think it would be an opportunity for some retooling, or at least an impetus for some new development in areas like electronic system level (ESL) design or verification. Meanwhile, there is work going on in defining new interconnect schemes like network-on-chip, but it's not being done by traditional EDA or silicon IP companies. Rather, it's being undertaken by a new breed of companies, such as Sonics, Arteris, and Silistix, whose business models seem to include elements of both EDA and IP.
Lessons Learned from the 44th DAC
When I left the semiconductor industry to become an EDA analyst, I was struck by two things. The first was the professionalism of the PR firms handling the EDA accounts, and the second was DAC.
As a semiconductor executive I would have given my right arm for a conference like DAC. What we had to deal with was a hodgepodge of regional shows that sucked up time and resources with an often-questionable return on investment. (By the way, in those days regional meant Chicago, Denver, Boston, Atlanta, and Dallas, not the somewhat romantic trips to Paris, London, Tokyo, and Hong Kong.) DAC, on the other hand, had become the linchpin of the design world. It was Christmas and New Year's all rolled into one. For Christmas you received the new EDA tools needed to solve the problems generated by Moore's Law's merciless march. That was followed by the parties, the New Year's part of DAC, which announced the beginning of another design year. Everyone was there: designers and CAD managers. And of course, if you were an EDA vendor past the very early start-up stage, your absence from the exhibit floor was tantamount to announcing that you were going out of business. It was an EDA marketing manager's dream. The conference set the rhythm of the EDA industry.
Today, the entire EDA communications infrastructure is in question: EDA analysis, EDA coverage in the press, and even our conferences. At the 43rd DAC the air was filled with talk about the lack of commitment from the major vendors, especially Cadence, to the conference. There was the specter of all the EDA vendors going out on their own, where those with the biggest marketing communications budgets would win, often over those with the best tools. The most refreshing take-away from the 44th DAC is that the conference is doing quite well, thank you. Cadence had a significant presence, and the other large EDA vendors showed no signs of abandoning the show floor.
Still, we have other problems. Above all, we are going through the ESL/DFM Inflection Point. I think we are handling DFM pretty well. Talk that DFM is really a semiconductor-equipment market sector, or possibly a market to be served by mask-making vendors, has dissipated, if not disappeared. Speculation that the large semiconductor IDMs or consortia would enter the IC CAD market and monopolize the research needed to reach the next manufacturing nodes has also subsided, especially following Cadence's new emphasis on DFM.
Submit a Paper or Proposal and Be Visible at the 45th Design Automation Conference!
Mark your calendars! Submission deadlines for the 45th DAC are fast approaching, the first being Thursday, November 1, for panel, special-session, and tutorial proposals. Others soon follow: the regular paper and Wild and Crazy Ideas (WACI) proposal deadline on November 19, then the Student Design Contest deadline on December 5. Proposals for Hands-on Tutorials are due December 14, while workshop and collocated-event proposals are due February 15.
If you are an academic researcher looking for industrial collaboration and support, or if you are a graduate student about to enter the job market, you should present your work at DAC. It is a unique venue to present your paper since your audience will be a mix of academic peers and representatives from across the design and design automation industry.
Verification: It's the Engineers
The exhortations of verification tool vendors aside, even today's unbelievably complex designs somehow get verified. That new cell phone, laptop, flat-panel TV, or even sports car you're digging? The designers were up against the wall to get the chips done by the deadline so you could have that thing in your pocket, living room, or garage. And they got it done. They might be fat, pale, and divorced, but they got it done. They got it done not just because they had to, but because they're good at it.
Verification gets done, not by tools, but by engineers -- as surely as a house gets built not by hammers and saws but by carpenters. For those of us in the business of supplying the tools, it is tempting to over-emphasize the tools and the techniques that they automate, claiming that we deliver intact some sort of "methodology" that makes the work easier. We do not.
Imagine that you are a carpenter. Do you go down to Home Depot and say "give me everything I need to build a house" and expect that you can load it all in your pickup and somehow get the job done? You do not. You spend years refining your techniques. You build up a kit of tools that fit your hand, and you learn how to use them. You're constantly shopping, evaluating new tools – will this new power saw help you build faster, or this new plane help you be more precise? These decisions are not taken lightly.
The process is of course somewhat different with EDA tools, as semiconductor companies are spending millions to equip large teams with effective tools. They understandably want to put together the most cost-effective tool chest they can. But the fact remains that there is no shortcut to an effective, state-of-the-art verification process or flow. The practice of applying the tools in innovative, productive ways is up to the verification teams that use them. Indeed, the best teams constantly study and refine their verification methods. They do not adopt a "standard flow" or a vendor-specific "methodology."
In fact, the very notion that a "methodology" is a way to apply tools is flawed. The suffix "-ology" denotes "the study of." Thus, methodology is truly the practice of constant study and refinement of the ways in which verification tools and techniques are applied.
Sometimes verification engineers refine the ways in which they or their teammates construct or represent the design in order to enhance verification productivity. This we can call "design for verification," although I don't really like the term.