Verification Cockpit Platform

Modern verification is a highly automated process that involves many tools and subsystems. These tools produce large amounts of data that are essential for understanding the state and progress of the verification process. The complexity of the process, the volume of data it produces, and the intricate relations between the data sources demand sophisticated data science techniques, including statistics, data visualization, data mining, and machine learning, to extract the essence of the input data and present it to users in a simple and clear manner.

The Verification Cockpit (VC) provides the platform and means to collect, process, and analyze verification data. Data is collected primarily from the project’s version control systems (git, svn), test submission and failure tracking system (hdwb), bug tracking tool (ClearQuest), coverage tool (BugSpray), and planning tool (RTC).
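To make the collection step concrete, here is a minimal sketch of how commit records might be extracted from a git data source. The delimiter-based format and the field names (`sha`, `author`, `subject`) are illustrative assumptions, not the actual VC extraction code.

```python
def parse_git_log(text):
    """Parse `git log --pretty=format:%H|%an|%s` output into record dicts.

    Field names here are hypothetical; the real VC presumably uses its
    own schema when loading commits into the data warehouse.
    """
    records = []
    for line in text.strip().splitlines():
        # Split on at most 2 pipes so commit subjects may contain "|".
        sha, author, subject = line.split("|", 2)
        records.append({"sha": sha, "author": author, "subject": subject})
    return records
```

In practice such a parser would be fed by running `git log` (or an equivalent svn command) inside each project repository on a schedule.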

The VC involves many technologies, including:
  • Extract, Transform, and Load (ETL) – These processes extract data from each source, transform it into the VC unified data model, and load it into the VC data warehouse.
  • VC Configuration Hub – This web-based utility allows the user to configure rules that affect how data is translated into the unified data model, along with many other aspects of the VC.
  • Data Warehouse – A large relational database optimized for data analytics, where a typical query selects and manipulates a portion of the data.
  • IBM Cognos – The main reporting engine, used for all kinds of descriptive analysis.
  • VC Advanced Analytics Engines – These engines use the data in the data warehouse to provide complex predictive and prescriptive analysis.
  • VC REST API – Exposes the data to the entire design and verification community via a standard REST API that encapsulates the underlying data model.
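The transform step of the ETL flow can be sketched as a mapping from heterogeneous source records into one unified model. The source names, field mappings, and unified fields below are assumptions for illustration only; the real VC unified data model is not described in this text.

```python
# Hypothetical field mappings from two source schemas into a unified
# record shape {source, id, when, summary}. Names are illustrative.
FIELD_MAP = {
    "git":        {"id": "sha",       "when": "commit_date", "summary": "subject"},
    "clearquest": {"id": "defect_id", "when": "submit_date", "summary": "headline"},
}

def to_unified(source, record):
    """Transform one raw source record into the unified model."""
    m = FIELD_MAP[source]
    return {
        "source": source,
        "id": record[m["id"]],
        "when": record[m["when"]],
        "summary": record[m["summary"]],
    }
```

A load stage would then insert the unified records into the warehouse, where reporting tools such as Cognos and the analytics engines can query them uniformly regardless of origin.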


Raviv Gal, Manager Verification & Quality Analytics, IBM Research - Haifa