The HDL Complexity Tool
The HCT is continuously evolving. We are starting with McCabe cyclomatic complexity analysis to measure branch complexity, and we will refine it over time with more sophisticated complexity scores calibrated against real defect data.
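For orientation, McCabe's metric is M = E - N + 2P over the control-flow graph, which for a single structured block reduces to one plus the number of decision points. The following is a minimal sketch of that counting approach in Perl (HCT's implementation language). The script, its keyword list, and its output format are illustrative assumptions for this page, not HCT's actual code.

    use strict;
    use warnings;

    sub estimate_complexity {
        my ($source) = @_;

        # Strip comments so branch keywords inside them are not counted.
        $source =~ s{/\*.*?\*/}{}gs;   # block comments
        $source =~ s{//[^\n]*}{}g;     # line comments

        # Verilog constructs that add a decision point. This is a rough
        # estimate: a case statement is counted once rather than once per
        # branch, so real tools will report different numbers.
        my $decisions = () = $source =~ /\b(?:if|case|casex|casez|for|while|repeat)\b|\?/g;

        return $decisions + 1;
    }

    my $file = shift @ARGV or die "usage: $0 <file.v>\n";
    open my $fh, '<', $file or die "cannot open $file: $!\n";
    my $src = do { local $/; <$fh> };
    close $fh;
    printf "%s: estimated cyclomatic complexity %d\n",
        $file, estimate_complexity($src);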
Good background on this topic is available in "Measuring the Complexity of HDL Models" by Michael Schafers. A few ideas from that paper serve as design criteria for HCT.
That paper defines a number of HDL complexity factors, introduces the idea of the psychological complexity of HDL, and analyzes what software and hardware design complexity have in common as well as where they differ. Its central point is that a good complexity score will adhere to six rules.
Those six rules are lofty goals once you start to think about them, and the transitivity they imply is going to be tough to achieve. Nevertheless, we are shooting for them and use them as a guiding light.
The HCT development team uses feedback from users like you, working on real projects, to identify shortcomings in our parser and to iteratively refine our scoring system and metrics. A pure and simple generic parsing and computation framework is key to that iteration, so we strive for an elegant software architecture that leverages the right mix of ingredients from our chosen implementation language, Perl. A sketch of this plug-in style follows.
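As a rough illustration of that separation of concerns, a metric can be written as a subroutine registered in a table, so new calibrated scores plug into the same parse-then-compute pipeline. This is a hypothetical sketch of the idea, not HCT's actual architecture; every name in it is made up for this example.

    use strict;
    use warnings;

    # Hypothetical registry: metric name -> scoring subroutine. New
    # metrics plug in here without touching tokenizing or reporting.
    my %metrics = (
        decision_points => sub {
            my ($tokens) = @_;
            return scalar grep { /^(?:if|case|casex|casez|for|while|repeat)$/ } @$tokens;
        },
        token_count => sub { scalar @{ $_[0] } },
    );

    # Placeholder tokenizer standing in for a real HDL parser.
    sub tokenize {
        my ($source) = @_;
        $source =~ s{/\*.*?\*/}{}gs;
        $source =~ s{//[^\n]*}{}g;
        return [ $source =~ /\w+/g ];
    }

    sub score_file {
        my ($path) = @_;
        open my $fh, '<', $path or die "cannot open $path: $!\n";
        my $src = do { local $/; <$fh> };
        close $fh;
        my $tokens = tokenize($src);
        return { map { $_ => $metrics{$_}->($tokens) } keys %metrics };
    }

    my $scores = score_file(shift @ARGV // die "usage: $0 <file.v>\n");
    printf "%s: %d\n", $_, $scores->{$_} for sort keys %$scores;

Keeping the tokenizer, the metric table, and the reporting separate is what lets a single run score one design with several metrics at once and makes each new metric a small, isolated addition.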
The end goal is to accurately score the psychological complexity of hardware blocks in any HDL and to use that score to predict defect rates and schedule risk.
All of these addresses are at lists.sourceforge.net. Note that in practice most of them go to the same small group of people, so please be patient and helpful, and please write your mail in English.