Date: 18 May 93 23:42:50 GMT
From: sparky!dwb@uunet.uu.net (David Boyd)
Subject: Re: McCabe package for Ada?
Message-ID: <1993May18.234250.18247@sparky.imd.sterling.com>

In article <1993May18.192447.9259@saifr00.cfsat.honeywell.com> shanks@saifr00.cfsat.honeywell.com (Mark Shanks) writes:

>Who is using McCabe, and why?

We are. We are currently using the tools for two purposes. One is to gather complexity and quality metrics for software; these metrics can be used to support costing and estimating when inheriting legacy software. The other is the Tuning phase of an Evolutionary Prototyping Lifecycle: high values for the various metrics tend to indicate modules that should be examined for tuning. Several organizations are using the metrics to support structured testing and to examine the path coverage of their tests. Other uses are selective re-engineering (modules with high complexity tend to have high error rates) and redundancy elimination (modules with similar values tend to be similar).

>What values are construed as "acceptable",
>"requires further investigation", and "this code is rejected"?

This depends entirely on your organization and what limits you want to set. I am of the belief that some algorithms or functions are inherently complex (the example I use is regular expression compilation/matching). Some constructs, such as consecutive case statements, generate abnormally high cyclomatic complexities. My general feeling is that these metrics are primarily indicators of where to concentrate reviews/walkthroughs. The following are the ranges I used on a recent project for cyclomatic, essential, and design complexity.
All of these were given with the caveat that, after inspection, an informed cost/benefit analysis would decide whether or not the module would be rewritten.

               Cyclomatic   Essential   Design
Good               <10          <5         <8
Acceptable         <50         <10        <16
Marginal          <100        <100       <100
Unacceptable      100+        100+       100+

>How were these values derived? Enquiring minds want to know...

The best reference for this is the National Bureau of Standards publication NBS Special Publication 500-99, "Structured Testing: A Software Testing Methodology Using the Cyclomatic Complexity Metric". It explains the theory and math behind cyclomatic and essential complexity. Design complexity is derived from cyclomatic complexity and is the count of basis paths through a module which interact with other modules.

--
David W. Boyd              UUCP:     uunet!sparky!dwb
Sterling Software IMD      INTERNET: dwb@IMD.Sterling.COM
1404 Ft. Crook Rd. South   Phone:    (402) 291-8300
Bellevue, NE. 68005-2969   FAX:      (402) 291-4362
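[Editor's sketch follows.] The cyclomatic metric discussed above is V(G) = E - N + 2P for a control-flow graph with E edges, N nodes, and P connected components. A minimal Python sketch, assuming a simple edge-list graph representation; the threshold bands are the ones quoted in the post, and the function names are illustrative, not from any McCabe tool:

```python
# McCabe's cyclomatic complexity, V(G) = E - N + 2P,
# computed from a control-flow graph given as an edge list.

def cyclomatic_complexity(edges, num_nodes, num_components=1):
    """V(G) = E - N + 2P."""
    return len(edges) - num_nodes + 2 * num_components

def classify(v):
    """Map a cyclomatic value onto the bands quoted above."""
    if v < 10:
        return "Good"
    if v < 50:
        return "Acceptable"
    if v < 100:
        return "Marginal"
    return "Unacceptable"

# A module whose body is a single case statement with four
# alternatives: entry node 0, branch node 1 with four outgoing
# edges, alternative nodes 2-5 each flowing to a common exit 6.
edges = [
    (0, 1),                          # entry -> case
    (1, 2), (1, 3), (1, 4), (1, 5),  # four alternatives
    (2, 6), (3, 6), (4, 6), (5, 6),  # each joins the exit
]
v = cyclomatic_complexity(edges, num_nodes=7)
print(v, classify(v))  # -> 4 Good
```

Each additional case alternative adds one edge-minus-node to the graph and so raises V(G) by exactly 1, which is why a long run of case statements inflates the number even when the logic is flat and easy to read.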