Verification: It's the Method-OLOGY, stupid!
President and CEO
Novas Software, Inc.
The exhortations of verification tool vendors aside, even today's unbelievably huge, complex designs somehow get verified. That new cell phone, laptop, flat-panel, or even sports car you're digging? The designers were up against the wall to get the chips done by the deadline so you could have that thing in your pocket, living room, or garage. And they got it done. They might be fat, pale, and divorced, but they got it done. They got it done not just because they had to, but because they're good at it.
Verification gets done, not by tools, but by engineers -- as surely as a house gets built not by hammers and saws but by carpenters. For those of us in the business of supplying the tools, it is tempting to over-emphasize the tools and the techniques that they automate, claiming that we deliver intact some sort of "methodology" that makes the work easier. We do not.
Imagine that you are a carpenter. Do you go down to Home Depot and say "give me everything I need to build a house" and expect that you can load it all in your pickup and somehow get the job done? You do not. You spend years refining your techniques. You build up a kit of tools that fit your hand, and you learn how to use them. You're constantly shopping, evaluating new tools – will this new power saw help you build faster, or this new plane help you be more precise? These decisions are not taken lightly.
The process is of course somewhat different with EDA tools, as semiconductor companies are spending millions to equip large teams with effective tools. They understandably want to put together the most cost-effective tool chest they can. But the fact remains that there is no shortcut to an effective, state-of-the-art verification process or flow. The practice of applying the tools in innovative, productive ways is up to the verification teams that use them. Indeed, the best teams constantly study and refine their verification methods. They do not adopt a "standard flow" or a vendor-specific "methodology."
In fact, the very notion that a "methodology" is a way to apply tools is flawed. The suffix "ology" denotes "the study of". Thus, methodology is truly the practice of constant study and refinement of the ways in which the verification tools and techniques are applied.
Verification teams that study their methods discover where they have spent too much time or made mistakes – and they make changes. They evaluate emerging tools, tactics, and techniques to find out how they might improve the flow. And they innovate – they don't accept cookie-cutter solutions dropped on them by EDA sales and marketing folks.
Sometimes verification engineers refine the ways in which they or their teammates construct or represent the design in order to enhance verification productivity. This we can call "design for verification," although I don't really like the term.
All these "design for something" terms seem to imply a certain tool set or flow. I think this started with Design For Test (DFT), which came to mean insertion of scan chains so that Automatic Test Pattern Generation (ATPG) tools could be used. There is no doubt that scan chains and ATPG have had a dramatic impact on productivity. But is that the end of the road for DFT? Finding the improvements in the methods is the –ology I'm talking about! And it takes hard work by all involved.
I hope that Design For Verification comes to mean refinement of the design process to facilitate easier, faster verification. And I hope that engineers will come to see "methodology" as the constant study of design and verification practices to continually improve them, instead of using the term to mean "the set of methods I'm using right now."
We're at a point of rapid evolution of verification techniques. There is in fact a lot of "method-ology" going on. New languages and abstractions are being adopted. Design for Verification is not just about the design, but also includes modeling of the "rest-of-the-world" environment, otherwise known as the "testbench."
Object-oriented programming (OOP) techniques are the most important element adopted in recent years, and the practice of applying them to logic verification continues to evolve rapidly. OOP allows higher levels of abstraction to hide detail, making large, complex designs and the surrounding environment easier to understand. OOP techniques also help to bridge the divide between hardware and software.
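As a rough illustration of the point, here is a minimal sketch in Python of how OOP hides pin-level detail behind an abstract interface in a testbench component. The class and method names (`Driver`, `SpiDriver`, `send`) are hypothetical, invented for this example; they are not drawn from SystemVerilog or from any of the libraries discussed below.

```python
# Hypothetical sketch of an OOP testbench component hierarchy.
# All names here are illustrative, not from any real verification library.

class Driver:
    """Base class: exposes an abstract 'send' and hides encoding detail."""
    def send(self, payload):
        frame = self.encode(payload)   # subclass supplies protocol encoding
        self.drive_pins(frame)

    def encode(self, payload):
        raise NotImplementedError      # protocol detail lives in subclasses

    def drive_pins(self, frame):
        # A real testbench would wiggle signals here; we just record the frame.
        self.last_frame = frame

class SpiDriver(Driver):
    """Derived class: only the protocol-specific detail is added here."""
    def encode(self, payload):
        # Serialize each byte MSB-first into a list of bits.
        return [int(bit) for byte in payload for bit in format(byte, "08b")]

# A test writer works at the abstract level ("send this payload")
# without ever touching the bit-level detail.
drv = SpiDriver()
drv.send(b"\x0f")
print(drv.last_frame)  # -> [0, 0, 0, 0, 1, 1, 1, 1]
```

The design choice is the same one SystemVerilog class-based testbenches make: test writers program against the base-class interface, so swapping in a new protocol driver doesn't disturb the tests.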
One of the key questions facing design and verification teams today is whether the system software is part of the design or part of the testbench. The lines are blurring. So while platform-based design aims to simplify verification by reusing the underlying hardware as much as possible to leverage the effort across more chips and end products, the necessary inclusion of the software with the platform makes verification harder, not easier.
The emergence of SystemVerilog, the Verification IP market, and the release and refinement of the VMM/AVM/OVM libraries and accompanying books are great steps forward in the normalization of the testbench environment. They have the potential to free engineers to take their study of verification methods to new heights. Instead of puzzling over how to create the basic building blocks, engineers can think instead about higher-level matters such as how to verify the software. Again, SystemVerilog and the various libraries are not a "methodology," they are "methods" – and it is up to the engineers to supply the "ology."