Why Is Analog So Difficult?
EDA vendors and customers square off on what needs to be done to automate analog design flows; custom designs still plague the sector.
DACeZine sat down with James Lin, VP of the technology infrastructure group at National Semiconductor; Steven Lewis, director of custom IC solutions marketing for analog, RF and mixed signal at Cadence Design Systems; Shrenik Mehta, senior director of frontend technologies and the OpenSPARC program at Sun Microsystems, and the current chairman of the Accellera standards group; and Sandipan Bhanot, president and CEO of Knowlent. What follows are excerpts of that conversation.
by Ed Sperling
Q: There’s a perception across the electronics industry that analog tools aren’t successful. Why not?
Lin: People use the tools, but they don’t meet our expectations. And the tools are not making any progress. There are no major breakthroughs. The Cadence design framework has been there for a long time. They have integrated more tools together and made it more user friendly, but the fundamental theory remains the same. There is no breakthrough technology.
Lewis: The question is whether you can do revolutionary things in analog like you can in the digital world, where you go from 65 nanometers to 45 nanometers to 32 nanometers. Those kinds of breakthroughs don’t happen in the analog world. We have a lot of customers who still make a good living at 0.35 microns, 0.25 microns, and 0.18 microns.
They’re not interested in going further than that because for power and current needs and everything else, those are the right nodes for them to work on.
Q: But some of those customers are now at 0.13 microns, right?
Lewis: Yes, and Virtuoso has the whole gamut. The question is whether there’s something new and revolutionary. People have talked about analog synthesis for a long time, yet that’s never quite jelled. Is it a tool issue, or is it that the technology doesn’t lend itself to synthesis? If you talk to analog folks, even though they have tools for doing optimization or doing statistical analysis, there’s still a thought of, ‘Well, I think my brain is better than the tools would be.’ It takes a long time for us to prove in the tools space that we can do a good job. I don’t see analog tools being revolutionary, but I also don’t see that as a problem. There are always things people wish tools did, but when you do take steps it takes a long time for people to use them. Are the tools driving the analog guys to be better or are the analog guys driving the tools to be better?
Bhanot: I think you need to qualify this. The implementation side changes all the time. There are timing, DFM, and statistical process variation issues. There is also the verification side. If the digital designers did verification the way analog designers do verification, no chip would ever tape out.
Q: Why is that?
Bhanot: That’s one place where revolutionary technology is needed. There are people designing blocks that are purely analog, and people trying to integrate blocks that have analog and digital portions. This mixed signal verification is very different. At the block level, are you doing all the checks that the spec says you should? You’d be surprised at how many companies—even leading-edge companies—don’t do it correctly. There is no easy infrastructure to say it’s done.
Mehta: The real question is whether analog design is an art or a science. On the digital side, having Verilog and VHDL as standards enabled an ecosystem where people could develop around those standards, whether it’s implementation flows or verification flows. That has allowed leading-edge practices to be replicated by a large number of designers. Ten years ago there was formal verification, but it didn’t get practically applied. Standards did help out. With systems on chip, there is a requirement that you put in an analog portion for networking, communication, a lot of the I/Os and the high-speed memory interfaces. You have to improve the analog if you want to get the SoCs out in a timely manner and also have them functional.
Q: What pieces of the analog tools flow work and what pieces don’t?
Bhanot: These standards are good when you’re trying to integrate vertical blocks. You extrapolate up and try to simulate the chip or a big block. But there are no standards for basic analog verification. Everybody is doing their own thing. There is no standard language. That may be part of the reason why test benches and the verification methodology are not portable across simulators, across companies and across flows.
Mehta: That’s an opportunity for a standards organization.
Lin: Having standards is good, but the first things that need to be fixed are the tools. If you look at [digital] synthesis, what became a standard was based on technology from only one company. Because it was successful, other companies adapted their tools to work with it, so they all work very smoothly. We need to fix the tools first. Once the tools are there, that can be the standard.
Bhanot: I agree. The way to drive standards is not by developing a standard first. Whatever people are using becomes the de facto standard.
Q: Standards follow the technology, right?
Lewis: That’s correct. The reason that Verilog and VHDL are successful is that you can standardize the digital flows. They lend themselves to standardization and synthesis. The analog side is custom. It’s either custom analog or custom digital or custom mixed signal. It’s fine tuning and realizing, ‘Oh, that didn’t quite fit,’ or ‘That isn’t exactly where I need it to be.’ You can find that through some standard testing methods, but you can’t create a standard for analog. In theory, SPICE was the standard language for analog, but there is an alphabet of SPICE. They’re not all exactly alike. What it’s coming down to is whether there’s a way to standardize the interface between this custom stuff and the stuff that is standardized. But what the analog guys do best is tinkering, and I don’t know how you standardize tinkering.
Bhanot: That argument works well on the implementation side, where you’re tweaking transistors. But on the verification side, there is a way to standardize. Eventually, when you have silicon, it is going to be put on the same testers to make sure it works. There’s no reason why you can’t bring it up to the SPICE level and the simulator level.
Q: Is the problem more of a business issue than a technology issue? Should the chip development be molded around the existing tools rather than the other way around?
Lin: I don’t think so. At National we tried to change from chip provider to solutions provider. If you look at a system-level solution from a design point of view, you should be able to run system-level simulation. But in the analog world, you can’t use the current design methodology. You need a system-level design methodology. Today you can design a chip, but when you plug it into the system the chip may not be optimized—especially the power consumption. The analog may perform well, but it also may suck up all the power, which is not good. We need the capability to build optimized systems.
Bhanot: And a system is not just a board. It could be on a chip. Everyone is scrambling right now to solve those kinds of verification problems.
Q: It’s more than just verification. Isn’t it the entire flow?
Lewis: Yes. It’s whether we know what the top-down system is going to look like so we can architect a chip with all of its various components—standard cells, custom cells, analog, the boundaries between them and even RF. The industry is slowly changing to that, but it’s only when people hit real pain points. If you’re making good money, there’s no reason to change. If there’s a way to make more money—enough to compensate for the pain of changing—people will do it. But you have to find the business justification.
Q: Is there an opportunity for bottom-up design in analog, cobbled together from a variety of startups and some of the big EDA vendors?
Bhanot: It has to be both startups and established vendors.
Lin: We quit thinking you can only have one design flow.
Bhanot: The technology is not there for a synthesis tool. We’re not at the point where you push a button and you have a functional analog design or analog block that satisfies all the constraints.
Lewis: Bottom-up is the tradition today. That’s the way designs have been done. You have the analog guys in one place doing their thing, and the digital guys in another place doing their thing. Someone, in the end, has to figure out how to bring it all together. The question is whether that’s an efficient way to do design. On the surface, the answer is no, because you’re burning a lot of cycles at the end and having to retrofit stuff. So how do you find a way to bring these together? You can’t just swap from one to the other, but there are some real efficiencies in top-down design. You need to find a way to build the bridges. When you’re talking about custom and analog, it’s not revolutionary. It’s evolutionary.
Q: Let’s drill down into this. What are the real pain points for designers?
Lin: Our biggest problem is simulation. If you use the current technology, it takes a long time to complete a simulation job.
Q: How long?
Lin: It depends on the circuits. The shortest ones could be a few hours. The longest ones could take weeks. Because of that, designers don’t have a chance to finish all the simulations. That adds risk. A second problem is parasitics. More and more material will be embedded in the analog process in the future. MEMS (microelectromechanical systems) is one example. That’s not silicon. It’s a different material. You need to be able to simulate that. Another example is SIP—system in package. It’s a very popular technology. You combine two different chips together, and they could be from different processes. One could be 90 nanometers, the other one could be 0.5 microns. You need to be able to verify that. The way analog design is heading, you need to find a way to simulate that as fast as possible.
Lewis: We get that comment from National and other customers. In terms of simulation and verification, it’s a matter of gobbling down the information as fast as possible. As the nodes reduce on the digital side, there’s more information to take from them. As new materials are added on the analog side, there will be new information you have to gobble down there, too. The traditional way of dealing with this was SPICE, but that was too low-level. Next we swung to behavioral modeling, but that required people to learn it. Some people take to it, while for others it’s anathema. Then came something in between, which was Fast SPICE. The approach was to not do it at the device level. You had to abstract up, but not figure out behavioral languages. That worked great for a whole class of circuits, namely custom digital. But you couldn’t use it at the device level because it’s not accurate enough. Now the Fast SPICEs are trying to give you the accuracy of the SPICE level with higher performance. The latest solution is multicore hardware, so you can run simulators in parallel. This is the same kind of hardware acceleration we’ve seen on the digital side. There’s a lot of activity on simulators. But what doesn’t work is when you tell an analog designer, ‘You can only do it this way.’ What Cadence has been doing is blending the technology, but the tools need to progress to the point where they can use a common language.
Q: Is this problem solvable?
Lewis: As an industry, we have already started to solve this problem. Most of the improvements in simulation are hardware-related. You can run them on multithreaded machines or multicore machines. It’s not a panacea, but it is an improvement.
Mehta: For a long time, people used to do custom digital design. The trend was to build bigger and bigger designs, and the question was whether you could do it faster. It was not optimized, but it was good enough for the business requirements. Some of the early adopters were people who did ASICs. Today, much of our custom is gone. It’s timing, power or some other constraint that we cannot meet, so we have to go customize it. If the pain point goes higher, people are going to ask whether they need every transistor in the analog design to be optimized. The trend is that there will be more and more opportunity for analog. There also is more opportunity for standardization.
Bhanot: As the speeds of interfaces on analog go up, the business issue is that analog has to budget for five or six re-spins. The chip comes back, you put it in the lab, and it doesn’t work. I know of IP companies where it took them seven re-spins and 1-1/2 years to get it right. Silicon has become the validation vehicle for analog, and that’s a problem. There are ways to catch it much sooner in the design flow. If there is a way to verify everything before you tape out, that would be good. Analog design methods have to meet time-to-market pressures.
Q: Given all of this, are time-to-market deadlines realistic for analog?
Lin: We do a lot of mixing on design and we use an analog and digital flow. But I always feel like the analog side is on a critical path. When the chip comes back because it doesn’t work, it’s always the analog part. You have to re-spin the silicon again. We do feel the time-to-market pressure, but a lot of problems are being solved because we have very good analog engineers. But in the future, if we want to improve time-to-market we will have to improve the tools. We cannot depend on the experienced designers forever.
Bhanot: It’s very hard to completely quantify designers' experience, but if we can capture that in tools and technology, both analog design and EDA will benefit greatly.
Q: Because of these issues, is there a possibility there will be more digital chips replacing analog in the future?
Lewis: Yes. If you don’t have to do it custom, why would you do that? There have been things that in the past were entirely analog, but now are custom digital or in some way are more digital because it’s more standardized.
Lin: I have a different view on that. In the past, you tried to realize a function using a digital circuit as often as you could. That’s not true anymore, especially with the power conservation trend. Digital sucks in a lot of power. In order to save power, you need analog technology. For example, a cell phone base station is always on because it has to sense the signal all the time. There is an analog way to do the same thing without turning on the circuits all the time. With digital signal processing, we use digital to do the processing. Analog was there a long time ago. We know that analog signal processing saves a lot of power. People are going to have second thoughts about digital and go back to using analog.
Q: Is there a possibility for new competition in this market with standardized analog chips that may have 80 percent functionality?
Bhanot: I don’t think so. This is a community where people have been working for 20 to 25 years. If you try to hire analog-centric people overseas, it’s much harder and the quality is not even close to what it is here.
Lewis: If I can have 100 percent, will I accept 80 percent? Would you accept a telephone that is 80 percent as good?
Q: Don’t they already accept a phone network with 20 percent dropped or inaudible calls?
Mehta: Years ago Sprint advertised land-line quality where you could drop a pin and hear it. Today if you get a bad connection, your immediate reaction is, ‘I’ll call you back.’
Bhanot: But if there is a business opportunity for that, it’s more likely to come from a company here than somewhere else.
Q: Let’s go back to the original premise, though. If you can’t get the tools from EDA companies, do you develop your own?
Lin: Yes. We prefer not to do it, but sometimes we are forced into it.
Q: What percentage of National’s tools are custom?
Lin: About 10 percent.
Q: And which area falls into that 10 percent?
Lin: The tool to analyze the reliability of the chips. Cadence had a device checker two or three years ago. We had that capability seven or eight years ago.
Q: So for Cadence is this an opportunity?
Lewis: There are always opportunities. For us, from the Virtuoso perspective, if we’re talking to 100 customers, they will all have things that are unique to them. It’s what gives them their secret sauce. For us, the opportunity is to find what’s similar for 50 of them. Can we develop and standardize that kind of thing?
Q: Because you need economies of scale for your own development?
Lewis: Exactly. There will always be things our customers will build for themselves because they need it faster than we can deliver it or because they can do it better than we can. They’ve got their process in front of them. They know exactly what they need and what their designers want. We made our name with the philosophy that we’ll build 80 percent with the ability to customize the last 20 percent. Some of that has changed, some of it hasn’t. But there will be some element that doesn’t make economic sense for Cadence and our customers will have to build it for themselves.
Q: Is that the same for verification?
Bhanot: We have standard suites for things like PCI Express. But even then, customers want to do things beyond the standards.
Q: From a standards perspective, does Accellera want to go after where the majority of designs are going or where it’s customized?
Mehta: We want to go after the majority.
Q: So it sounds like there are limits to how much will get automated in the analog design world, right?
Bhanot: I wouldn’t say that’s just for analog. It’s true in digital, in DFM, and in analog. There will always be some level of customization in analog.
Q: But isn’t that level higher in analog than digital?
Bhanot: Perhaps. But it will change.
Mehta: You always add new features. Once people are comfortable with all the standards, they ask you to enhance the standards.