Hardware Verification Technologies
The Hardware Verification Technologies department develops methodologies and tools that assist in the verification of IBM's hardware designs. Our technologies encompass both complementary approaches in this domain, i.e., simulation-based and formal verification. The department specializes in developing solutions for high-end designs, focusing on the verification of microprocessors, multi-processor designs, and large systems.
Our principal assets are in the domains of random test generation and model checking. In these areas, we have several well-known, state-of-the-art solutions. These technologies have been the main focus of our department since its inception, and they continue to represent the cornerstone of our activities today:
- Genesys-Pro: Random test generator for processor level verification
- RuleBase: Model checker for formal verification
Over the years, our department has developed several generations of these basic technologies, each time proposing a new paradigm and each time yielding state-of-the-art solutions that surpass the industry standard. Presently, we are working on the two most recent versions of these technologies:
- TPA (Test Plan Automation) is a methodological layer built on top of Genesys-Pro for automating test-plan verification. Technology-wise, this new approach represents a significant upgrade of the Genesys-Pro platform, primarily the transformation of the basic generation scheme from instruction-centric to scenario-centric.
- RuleBase SixthSense edition represents the merger of two tools developed independently within IBM over the last two decades: RuleBase Parallel Edition, which was developed in this department, and SixthSense, developed in the Austin EDA lab, initially as a tool embracing the semi-formal domain, which in turn grew into a powerful full-blown model checker featuring the transformation-based approach. By exploiting the relative strengths of both tools, RuleBase SixthSense edition takes model checking into a new realm in terms of speed and capacity, the two key challenges of this area.
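The shift from instruction-centric to scenario-centric generation mentioned above can be pictured with a toy sketch. All names below are hypothetical illustrations, not the actual Genesys-Pro or TPA API: instead of drawing instructions one at a time, a scenario-centric generator expands whole scenario templates, so that related events land in the stream together.

```python
import random

# Hypothetical sketch of the two generation schemes; none of these
# names come from the actual Genesys-Pro/TPA tools.

ISA = ["load", "store", "add", "branch"]

def instruction_centric(n, rng):
    """Draw n instructions independently, one at a time."""
    return [rng.choice(ISA) for _ in range(n)]

# A scenario is a template of related events; expanding it yields a
# coordinated sequence rather than independent draws.
SCENARIOS = {
    "load_use_hazard": ["load", "add"],     # consumer right after producer
    "store_forwarding": ["store", "load"],  # load hits a pending store
}

def scenario_centric(scenario_names, rng):
    """Expand whole scenarios, with random filler between them."""
    stream = []
    for name in scenario_names:
        stream.extend(SCENARIOS[name])
        stream.append(rng.choice(ISA))  # filler instruction
    return stream

rng = random.Random(0)
print(instruction_centric(4, rng))
print(scenario_centric(["load_use_hazard", "store_forwarding"], rng))
```

The point of the contrast: with independent draws, a targeted event such as a load-use hazard occurs only by chance, whereas expanding scenarios guarantees it.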
These two technologies are fully generic and can thus be applied to any hardware design or architecture. While the model checking technology is generic by nature, such genericity is achieved in the area of random test generation by applying a model-based approach. Through this approach, the tool maintains a clean separation between the architecture-dependent and architecture-independent parts, thereby making it applicable to any architecture. While this enables quick customization and exploitation for any new IBM design, it also opens the door for usage by other companies. Indeed, these technologies are licensed to several external customers, with a current focus on (but by no means a limitation to) the ARM architecture.
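The model-based separation can be sketched as a generic engine that consumes a pluggable, declarative architecture model. This is a minimal illustration under assumed structure; the real Genesys-Pro models are vastly richer:

```python
import random

# Minimal sketch of the model-based idea: the engine is architecture-
# independent, and all ISA specifics live in a pluggable model.
# These classes are illustrative only, not the actual tool's design.

class ArchModel:
    """Architecture-dependent part: registers and instruction formats."""
    def __init__(self, name, registers, instructions):
        self.name = name
        self.registers = registers
        self.instructions = instructions  # mnemonic -> operand count

class Engine:
    """Architecture-independent part: the generic generation loop."""
    def __init__(self, model, seed=0):
        self.model = model
        self.rng = random.Random(seed)

    def generate(self, n):
        test = []
        for _ in range(n):
            mnem = self.rng.choice(list(self.model.instructions))
            ops = [self.rng.choice(self.model.registers)
                   for _ in range(self.model.instructions[mnem])]
            test.append(f"{mnem} " + ", ".join(ops))
        return test

# Swapping the model retargets the same engine to a new architecture.
toy_arm = ArchModel("toy-arm", ["r0", "r1", "r2"], {"ldr": 2, "add": 3})
toy_ppc = ArchModel("toy-ppc", ["g0", "g1"], {"lwz": 2, "addi": 3})
for model in (toy_arm, toy_ppc):
    print(model.name, Engine(model).generate(2))
```

Retargeting to a new architecture then amounts to writing a new model, with no change to the engine, which is what makes quick customization and external licensing practical.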
Spawning from these two basic technologies, many additional verification technologies have been developed in the department. One of the most fundamental is X-Gen, our random test generator for system-level verification. Borrowing the model-based approach from its Genesys-Pro cousin, X-Gen has been (and still is) applied to great advantage to all large IBM systems for more than 10 years.
At the core of these test generation tools lies a constraint satisfaction engine that solves the multitude of often competing constraints which naturally emanate from user requests, legality requirements, and biasing. Since constraint solving is such a central topic in test generation, we have developed our own tool for this purpose, called GEC, over the last decade.
In addition, the department also has projects in the areas of soft-error analysis (InsightER), pervasive logic (Sequence verification), floating-point verification (FPgen), and designer-level verification (Diver). It is worth noting that FPgen, which fully supports the ubiquitous IEEE floating-point standard, is also licensed to external customers.
We develop solutions for the post-silicon stage as well. Our main tool in this domain is a bare-metal exerciser called Threadmill, which is used by P-Series.
In parallel, and in a sense above most of these activities, we are also investing significant effort in the Verification Cockpit project, which is developing a platform that helps verification leads conveniently oversee and control the whole verification process. The basic observation is that this process produces massive amounts of data. Our platform will assist in visualizing the process and analyzing its contents, trends, and correlations, thereby supporting the decision-making process and ultimately enabling the automation of some of these decisions.
Lastly, following the idea that verification becomes simpler when the design itself is cleaner, we have over the years also invested in developing tools for logic-design assistance. Currently, our main activity in this domain is the development of a tool called Tiger for timing analysis.
These various projects are distributed into the following three groups:
Our verification technologies are focused on hardware. Applications of our verification assets to domains beyond hardware (software, systems engineering, etc.) are developed in the Emerging Quality and Design Technologies Department.