Contributing Editors: Peggy Aycinena, Richard Goering, Geoffrey James, Gary Smith
Editor-in-chief: Gabe Moretti
Vol. 3 / Issue 1 | September 6, 2007
Verification gets 'intelligent' at DAC
The Design Automation Conference usually brings with it some "hot topic" in terms of startup activity and technology announcements. This year, I'm going to suggest, the hot topic wasn't a new or emerging EDA market segment. Instead, I think some of the most interesting technology on the exhibit floor and in the technical program had to do with bringing more intelligence and automation to IC functional verification.
Verification is an issue that's been around as long as there's been a Design Automation Conference. So why would functional verification be a hot topic now? The reason is that as process geometries shrink to 65 nm, 45 nm and below, and transistor counts climb accordingly, the functional verification problem grows exponentially. Functional verification already consumes much, if not most, of the design cycle, and verification accounts for much, if not most, of the RTL code that's written these days.
The entire verification process has become a bottleneck that's desperately in need of intelligence and automation. Designers need to be able to quickly generate real-world test scenarios, put those scenarios into increasingly complex testbenches, determine how to test various blocks in the design, control numerous simulation runs, apply formal verification, and use hardware-assisted verification with transaction-level interfaces. They also need better coverage metrics and quality control so they can determine when the job is done.
Some years ago, analyst Gary Smith put forth the vision of an "intelligent test bench" that would evaluate a design and apply the correct verification engines to various design blocks. As such, the intelligent test bench would provide an integrated, easy-to-use verification test suite that would avoid duplication and overlap. It couldn't be done from the register-transfer level alone, however, because it requires electronic system-level (ESL) information to work.
"This was a breakthrough DAC as far as functional verification was concerned," Smith said. "For the first time, we're seeing tools that can really carry out the full intent of the intelligent test bench." This is largely driven, Smith said, by the fact that verification is moving up to the transaction level, thanks to ESL-based methodologies. "Some companies popped up at DAC that are taking a look at how to do that," he said.
Among them was startup Certess Inc. (http://www.certess.com), which bills itself as a company focused on the "functional qualification" of ICs. Certess argues that a key problem in verification is the lack of objective quality assurance. Current coverage tools aren't good enough, the company argues, because they give no visibility into whether a bug can propagate to an observable location or whether it can be detected there.
Certess defines functional qualification as the ability to certify that "if there was a bug in the design, it could have been found," and that's just what the company promises to do with Certitude, a product announced just prior to DAC. It's claimed to be the first industrial implementation of mutation-based analysis, an active topic of academic research in software validation for many years.
Certitude profiles tests, injects faults, reruns the tests, and determines whether the injected faults would have been detected. It gives a metric score for the overall quality of the verification environment, along with separate scores for activation, propagation and detection. In addition to its "metric" mode, it offers a "verification improvement" mode that lets users evaluate IP blocks and connections between blocks.
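The core idea of mutation-based analysis can be shown with a minimal sketch: deliberately inject faults ("mutants") into a design and score how many the existing tests actually catch. Everything here is illustrative; the design, the mutants and the `qualify` function are invented for this example and are not Certitude's actual implementation or API.

```python
# Minimal sketch of mutation-based analysis (functional qualification).
# All names are hypothetical, invented for illustration.

def design(a, b):
    """Reference 'design': a trivial adder block."""
    return a + b

# Mutants: faulty variants of the design, standing in for injected
# RTL faults such as a wrong operator or a stuck output.
MUTANTS = {
    "plus_to_minus": lambda a, b: a - b,
    "stuck_at_zero": lambda a, b: 0,
    "off_by_one":    lambda a, b: a + b + 1,
}

# The 'testbench': stimulus paired with expected results at an
# observable point (the checker).
TESTS = [((1, 2), 3), ((0, 0), 0), ((5, -5), 0)]

def qualify(mutants, tests):
    """Return the fraction of mutants the test suite detects."""
    detected = sum(
        1 for mutant in mutants.values()
        if any(mutant(*args) != expected for args, expected in tests)
    )
    return detected / len(mutants)

print(f"mutation score: {qualify(MUTANTS, TESTS):.2f}")  # prints 1.00
```

A surviving (undetected) mutant would flag a hole in the testbench: a fault that could exist in the design without any test noticing, which is exactly the visibility Certess argues coverage metrics alone don't give.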
Another step towards the intelligent test bench comes from startup Breker Verification Systems (http://www.brekersystems.com), which introduced itself and its Trek product just prior to DAC. Trek claims to be the first commercial graph-based functional test synthesis tool. It lets users create a visual graph-based verification plan from which functional test vectors are automatically generated.
According to the company, Trek can reduce verification effort in terms of head count, code size and time spent by a factor of ten, while providing 100 percent verification plan coverage. In a case study on a 9-processor chip, the company claims, 12 engineers using a traditional hardware verification language (HVL) approach came up with 355,000 lines of code and took 6 months to build a verification environment. A single person using Trek for the same design generated 17,000 lines of code and was finding bugs within two weeks.
Breker's founder and CEO, Adnan Hamid, first worked with graph-based approaches while he was developing a verification infrastructure for AMD's first x86 microprocessor. While early attempts at using graphs in functional verification failed because the graphs exploded in size and complexity, Breker claims to have solved this problem with a dependency resolution engine that provides a more compact way to analyze verification plans.
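The graph-based idea can be sketched briefly: model the verification plan as a directed graph whose nodes are scenario steps, then synthesize tests by walking paths from entry to exit. The plan graph and step names below are invented for illustration and have nothing to do with Trek's actual input format.

```python
# Minimal sketch of graph-based test synthesis: each start-to-leaf
# path through a verification-plan graph becomes one generated test.
# The plan and step names are hypothetical.

PLAN = {
    "start":   ["reset"],
    "reset":   ["cfg_dma", "cfg_irq"],   # branch: two configurations
    "cfg_dma": ["xfer"],
    "cfg_irq": ["xfer"],
    "xfer":    ["check"],
    "check":   [],                        # leaf: scenario complete
}

def enumerate_tests(graph, node="start", path=None):
    """Depth-first walk yielding every start-to-leaf path as a test."""
    path = (path or []) + [node]
    if not graph[node]:
        yield path
        return
    for nxt in graph[node]:
        yield from enumerate_tests(graph, nxt, path)

for test in enumerate_tests(PLAN):
    print(" -> ".join(test))
```

Even this toy graph shows why naive enumeration explodes: every branch multiplies the path count, which is the scaling problem Breker says its dependency resolution engine addresses by analyzing the plan in a more compact form.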
Nusym Technology (http://www.nusym.com) also appears to have a significant piece of the intelligent testbench puzzle. Nusym is a stealth-mode startup that has made no announcements, and had only a demo suite at DAC. Still, the company has attracted significant interest, along with several detailed engineer reviews published in John Cooley's recent Verification Survey (http://www.deepchip.com/items/dvcon07-06.html). Nusym was also featured on Smith's "What to see at DAC" list under the "intelligent testbench" category.
Reviewers state that Nusym's technology automates both coverage and test generation. They say it can build a testbench from scratch, or it can work with an existing verification environment to determine how coverage can be improved. It also reportedly has a number of debugging aids, providing an ability to recreate and rerun bugs with short and specific tests as opposed to running simulation for hours.
Smith also placed Axiom Design Automation (http://www.axiom-da.com) on his "What to see at DAC" list in the "intelligent testbench" category. Axiom provides testbench automation, assertion-based verification, debugging, and functional and code coverage with a parallel processing, multiple-CPU capability. Just prior to DAC Axiom bought SysChip Design Technologies, adding protocol coverage technology to Axiom's MPSim verification suite.
Speeding up verification
Many other providers rolled out new functional verification technology at DAC. Among these is startup GateRocket Inc. (www.gaterocket.com), which demonstrated RocketDrive, a combined hardware/software system for accelerating FPGA verification.
Physically, RocketDrive is a box that contains either an Altera Stratix II or Xilinx Virtex-4 FPGA. It hooks up to a Linux PC over a PCI cable. Designers load portions of their FPGA design into the box using their FPGA implementation tools, and then link it to their existing simulation platform. They can then use the simulator's debugging capabilities, and according to GateRocket, can obtain a verification speedup of 10 to 100 times compared to simulation.
Startup ForteLink Inc. (www.fortelink.com) provides Gemini, a hardware-assisted verification system that runs in several modes. With its in-circuit emulation/prototyping mode, it interacts with an external target environment at speeds up to 200 MHz, according to the company. With its co-simulation mode, it allows a device under test to be driven by a Verilog or SystemC testbench.
Co-simulation is, of course, much slower than in-circuit emulation, but it provides better debugging support. To bring out the best of both worlds, ForteLink offers a "simice" mode in which a design connected to a target can be run for a given period of time, and then stopped for debugging. Verification engineers can thus debug while connected to a real system environment.
Established vendors also showcased new IC verification technology. For example, OneSpin Solutions brought out a formal verification solution for multiple configurations of IP blocks. ArchPro Design Automation, acquired post-DAC by Synopsys, offered a multi-voltage verification solution that generates assertions and tracks coverage. Mentor Graphics showed off the transaction-level interface with its Veloce line of emulators. And Cadence Design Systems demonstrated a new "hot swap" capability between simulation and acceleration.
Users tackle verification
Several papers in the DAC program showed how large user companies are tackling IC functional verification challenges. In one paper presentation, Itai Jaeger of the IBM Research Laboratory in Haifa, Israel, described an approach to system-level test generation that uses an intelligent test generator to dynamically interleave test scenarios. A video and presentation for the paper are available at the DAC web site (http://videos.dac.com/44th/48_1/48_1.html).
Jaeger noted that system-level verification must validate the integration of several previously verified cores. To do so, engineers must run test "scenarios" that take an abstract view of such interactions as memory reads and writes. Once interaction-based scenarios are created, engineers can stress the system by combining them. For example, read/write interactions can be run in parallel with DMA operations or interrupts, or engineers can create address contention between DMA and processor accesses.
Jaeger's paper showed how X-Gen, a system-level test case generator that interleaves interaction-based test scenarios, was used by IBM for verification of the Xbox 360 chip and for power management of the Cell processor.
In another paper available at the DAC web site (http://videos.dac.com/44th/48_4/48_4.html), Wilson Snyder of SiCortex Inc. outlined a successful verification effort run by his company on a 198-million-transistor, 6-CPU compute node. The approach combined C/C++ with RTL verification, used fast behavioral and cycle-accurate models for early debugging, and employed open-source productivity tools. Snyder said his company was able to control the speed, accuracy, and cost of the verification process, while ending up with only a few minor bugs in silicon, all of which could be fixed with workarounds.
Functional verification was, of course, not the only topic of interest at DAC 2007. But on a purely practical level, what stood out most for me was the thought that maybe we can get some real improvement in what many people see as the biggest single bottleneck in IC hardware design today.