At DVCon Europe, Oct 24-25 in Munich, we presented "Enabling Visual Design Verification Analytics – From Prototype Visualizations to an Analytics Tool using the Unity Game Engine", where we showed how the bug reports generated by PinDown, our automatic debugger for regression tests, can be visualized in a cool way that enables an analytical view. It lets you see which areas of the design are error-prone and need some extra attention, and it also lets you identify areas of the design that lack test coverage. Here is a demo. We explored this field earlier this year, but this time we went deeper into the filtering aspects of the visualizer: you can set the time frame and filter on activity level/fault ratio to make the data easier to analyze.

 

 

We spent most of our time in the booth talking to verification engineers about how PinDown uses machine learning to predict bugs before verification even starts. This speeds up PinDown's own debug process, and it also allows you to run regressions from a risk point of view (a large test suite for risky commits, a small test suite for safe commits).
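As a rough sketch of what running regressions "from a risk point of view" can look like, the snippet below picks a test suite from a predicted commit risk score. The thresholds, suite names and the 0-1 risk scale are placeholders of ours, not PinDown's actual selection logic.

```python
# Illustrative sketch only: the risk thresholds, the 0-1 risk scale and the
# test suites below are placeholders, not PinDown's actual selection logic.

FULL_SUITE  = ["smoke_boot", "alu_random", "cache_stress", "interrupt_storm", "dma_corner"]
SMOKE_SUITE = ["smoke_boot"]

def select_tests(commit_risk: float) -> list[str]:
    """Pick a regression suite based on a predicted commit risk between 0 and 1."""
    if commit_risk >= 0.7:      # risky commit: run the large test suite
        return FULL_SUITE
    if commit_risk >= 0.3:      # medium risk: run a reduced suite
        return FULL_SUITE[:3]
    return SMOKE_SUITE          # safe commit: a quick smoke run is enough

if __name__ == "__main__":
    for risk in (0.9, 0.5, 0.1):
        print(f"risk={risk:.1f} -> {select_tests(risk)}")
```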

 

 

At DAC55, June 25-27 at Moscone Center West, San Francisco, we announced that the latest release of PinDown is out. This release uses machine learning to automatically debug regression failures much faster. The bug prediction is also available to users immediately, even before verification starts, which means you can direct verification and debug effort towards the riskier commits instead of using the standard brute-force approach of running as much verification as possible on everything.

 

 

 

As expected with a hot topic like machine learning, this new release generated more interest at our booth at DAC than we have ever seen before.

 

 

 

There was a lot of traffic to our booth at DVCon in San Jose, February 26 - March 1, 2018, where we presented the new PinDown release that will perform automatic debug using machine learning, which will dramatically speed up debug of regression failures. We also presented a new way of displaying test results that allows you to identify coverage holes and error-prone areas.


A lot of interest at our booth at DVCon

Machine Learning

The coming PinDown release will have an improved prediction model that identifies the risk level of each commit in the revision control system, based on analysis of the revision control history as well as the code itself. We have used data that PinDown stores locally in its database on site at customers to train the prediction model to recognize high-risk commits. Debugging commits in order of risk allows PinDown to find the bugs much faster.
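To give a feel for the general idea of predicting commit risk from revision-control history, here is a minimal sketch using scikit-learn. The features (lines changed, files touched, the author's recent failure rate), the toy training data and the model choice are illustrative assumptions, not PinDown's trained model.

```python
# Conceptual sketch of predicting commit risk from revision-control features.
# The features, toy training data and model choice are illustrative assumptions;
# they are not PinDown's actual prediction model.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per commit: [lines changed, files touched,
# author's recent failure rate]. Label: 1 if the commit later turned out to be buggy.
X_train = np.array([
    [500, 12, 0.30],
    [ 20,  1, 0.05],
    [350,  8, 0.25],
    [ 10,  2, 0.00],
    [800, 20, 0.40],
    [ 40,  3, 0.10],
])
y_train = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Score unseen commits and debug them in order of predicted risk, highest first.
new_commits = {"abc123": [600, 15, 0.35], "def456": [15, 1, 0.02]}
risk = {sha: model.predict_proba(np.array([feats]))[0, 1]
        for sha, feats in new_commits.items()}
for sha in sorted(risk, key=risk.get, reverse=True):
    print(f"commit {sha}: predicted risk {risk[sha]:.2f}")
```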


PinDown Using Machine Learning

Showing Test Results as a Cityscape

We also presented a poster titled "An Analytical View of Test Results Using Cityscapes", which shows a cool new way of analyzing test results. The intent of this view is to give verification leads and managers a helicopter view of the status of the project: it shows test coverage holes and error-prone areas, which can then be addressed. Here is a demo video:

 


Test Results as a Cityscape
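To make the cityscape metaphor a little more concrete, here is a minimal sketch of how per-module test results could be turned into "buildings". The specific mapping (height from test activity, colour from fault ratio, grey for untested modules) and the module names are our own assumptions for illustration, not necessarily how the Unity tool renders the data.

```python
# Illustrative sketch: turning per-module test results into cityscape "buildings".
# The mapping (height ~ test activity, colour ~ fault ratio, grey for untested
# modules) is an assumption for illustration, not the tool's actual rules.
from dataclasses import dataclass

@dataclass
class ModuleResult:
    name: str       # design module (hypothetical names)
    tests_run: int  # how many tests exercised this module
    failures: int   # how many of those tests failed

@dataclass
class Building:
    name: str
    height: float   # proportional to test activity
    colour: str     # green = healthy, red = error-prone, grey = coverage hole

def to_building(r: ModuleResult) -> Building:
    if r.tests_run == 0:
        return Building(r.name, height=0.5, colour="grey")   # coverage hole
    fault_ratio = r.failures / r.tests_run
    colour = "red" if fault_ratio > 0.2 else "green"
    return Building(r.name, height=float(r.tests_run), colour=colour)

results = [
    ModuleResult("cpu_core", tests_run=120, failures=30),
    ModuleResult("dma_ctrl", tests_run=40, failures=1),
    ModuleResult("debug_if", tests_run=0, failures=0),
]
for b in map(to_building, results):
    print(b)
```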