From: Ken Garlington
Subject: Re: Ada Core Technologies and Ada95 Standards
Date: 1996/04/01
Message-ID: <315FD5C9.342F@lfwc.lockheed.com>
References: <00001a73+00002c20@msn.com> <828038680.5631@assen.demon.co.uk> <828127251.85@assen.demon.co.uk>
Organization: Lockheed Martin Tactical Aircraft Systems
Newsgroups: comp.lang.ada
X-Mailer: Mozilla 2.01 (Macintosh; I; 68K)

Robert Dewar wrote:
> John McCabe said
>
> > I was obviously thinking of validation of Ada compilers in the same
> > way that _my_ software is validated - i.e. a full set of test cases
> > proving that _all_ requirements have been met. If I cannot prove
> > this, my software is not accepted by my customer.
>
> 100% reliability via testing is only achievable for very simple tasks
> that can be fully specified formally, and for which the number of
> possible independent tests is finite.

Since I often find myself expressing the same sentiments as Mr. McCabe,
I thought I'd add my two cents: I can't disagree with anything in your
response. However, when my company does testing, several things happen,
and I suspect some of them happen in Mr. McCabe's shop as well:

1. We have a requirements specification that uniquely identifies each
requirement.

2. We have a test, or set of tests, that can be traced back to each
requirement.

3. We consult with the end user of the system to see whether the tests
are adequate and reflect the actual usage of the system.

4.
In addition to functional tests, we may also have other tests designed
to meet certain criteria (particularly for safety-critical software).
These criteria might include measures of statement/branch/path coverage
and/or measures of data coverage.

5. Beyond the use of "tests" in the narrow sense of throwing inputs at
the software and looking at the outputs, we can also apply other
analytical tools to software quality, such as peer reviews of the
design and implementation of the compiler, static analysis tools, etc.

6. Not that it happens much in my systems, but if a deficiency is found
in a product after release, a test that checks for that deficiency is
added back into the test suite.

It's probably just ignorance on my part about the ACVC process, but I
don't get the same sense of rigor from the ACVC design. Much of what's
known about good software-testing process (as documented by Beizer and
others) isn't apparent from what little I've heard about the ACVC, or
from reading the old ACVC 1.0 tests. I know that NPL sells a tool that
tests Ada compilers for bugs, which apparently provides much more
coverage than the ACVC. Why should such a tool exist outside of the
validation/certification process?

Ada provides some wonderful technology for building dependable systems,
but (and this sounds harsher than I intend) it's not clear that the
compiler vendors always "practice what they preach." It would seem to
me that one of the most dependable systems of comparable size should be
an Ada compiler, since Ada encourages the development of dependable
software. The presentation of the Verdix Ada vs. C++ compiler at
TRI-Ada aside, does this generally appear to be the case?

Maybe it's just perception that's at issue here. When someone says,
"ACVC doesn't say anything about usability for a particular purpose,"
I understand why that's said, but my heart sinks nonetheless.
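As an aside, the requirements-traceability practice in items 1 and 2 above amounts to a simple completeness check: every uniquely identified requirement must trace to at least one test. A minimal sketch of that check (all names and requirement IDs here are invented for illustration, not taken from any real project):

```python
# Hypothetical sketch of a requirements-traceability check:
# every identified requirement must be covered by at least one test.

def untraced_requirements(requirements, trace_matrix):
    """Return the set of requirement IDs with no associated test.

    requirements -- iterable of requirement IDs (e.g. "SRS-001")
    trace_matrix -- dict mapping test name -> list of requirement
                    IDs that test covers
    """
    covered = {req for reqs in trace_matrix.values() for req in reqs}
    return set(requirements) - covered

# Invented example data:
reqs = ["SRS-001", "SRS-002", "SRS-003"]
traces = {
    "test_parse_input": ["SRS-001"],
    "test_range_check": ["SRS-002"],
}

print(sorted(untraced_requirements(reqs, traces)))  # -> ['SRS-003']
```

A nonempty result flags requirements with no test at all, which is the kind of gap a rigorous validation suite would be expected to close.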
Why not an attitude of, "Even though we can't guarantee 100% correctness, we will by God use every tool at our disposal to identify deficiencies"? As it stands now, I get to do that particular task (in parallel with every other user who needs a reliable product)...