From: dewar@cs.nyu.edu (Robert Dewar)
Subject: Re: Ada Core Technologies and Ada95 Standards
Date: 1996/03/28
Organization: Courant Institute of Mathematical Sciences
Newsgroups: comp.lang.ada
References: <00001a73+00002c20@msn.com> <828038680.5631@assen.demon.co.uk>

John McCabe said:

"Yes. but at the moment the validation suite only consists of those parts
of Ada that are common between Ada 83 and Ada 95 does it not. The fact is
that the full validation suite including all the Ada 95 features won't be
available until sometime in 1997. Hopefully the fact that the language has
been divided into the Core language and the specialised needs annexes will
help to ensure that Ada 95 validation is superior to Ada 83 validation.
From my experience, Ada 83 validation didn't appear to prove much!"

Wrong! The validation suite does contain tests for all parts of Ada 95,
including all the specialized needs annexes, and this is true "at the
moment" (where DO these rumours come from? :-) It is certainly true that
the initial release of ACVC 2.0, and now 2.0.1, does not thoroughly cover
all new parts of the language, but as any Ada 95 compiler implementor can
tell you, the tests are definitely non-trivial, and any compiler passing
all or nearly all of them is a pretty complete Ada 95 compiler.

As for Ada 83 validation not proving much: if you feel this way, you
probably had completely unrealistic ideas of what validation was supposed
to prove. For example, some people, surprisingly, thought that validation
would guarantee full compliance. Gosh! We are all in the software business;
you would think that everyone knows that testing alone cannot guarantee the
absence of bugs. Still other people thought that validation would guarantee
a usable compiler, which is even more surprising! One would have thought
that the widely known fact that the Ada/Ed Semantic Specification of Ada
was validated would have tipped people off that this might not be the case
(the ACVC was never, and still is not, a performance analysis suite).

What does validation do? It makes sure that the vendor has implemented the
entire language without significant gaps, and that the vendor has
implemented large parts of the language (those parts tested) accurately. As
a result, it is a good guarantee that the vendor understands the language
completely and thoroughly.

Can a test suite do more than this? No! Can it do a better or worse job of
this? Sure. We think the ACVC 2.1 suite will turn out to be more effective,
because we have learned something in 12 years! In particular, we (the ACVC
team and the reviewers) believe that the orientation toward more
user-oriented testing will be helpful in this regard. Compare some typical
2.0 tests with 1.11 tests, and you will see that the 2.0 tests are much
more like real programs -- the test writer testing a particular feature
thinks "how would this feature be used in a real program?", and constructs
a real program to answer that question.
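
To make the contrast concrete, here is a rough sketch of the style of test
being described. This is NOT an actual ACVC test; the names
(User_Oriented_Sketch, Counter, Worker) are invented for illustration.
Rather than poking at a feature in isolation, it exercises an Ada 95
protected object the way a real program would -- several tasks updating
shared state -- and then checks the observable result.

   with Ada.Text_IO;

   procedure User_Oriented_Sketch is

      --  The Ada 95 feature being exercised: a protected object
      --  providing mutually exclusive access to shared state.
      protected Counter is
         procedure Increment;
         function Value return Natural;
      private
         Count : Natural := 0;
      end Counter;

      protected body Counter is
         procedure Increment is
         begin
            Count := Count + 1;
         end Increment;

         function Value return Natural is
         begin
            return Count;
         end Value;
      end Counter;

      --  Use the feature the way an application would: several tasks
      --  all updating the shared counter.
      task type Worker;
      task body Worker is
      begin
         for J in 1 .. 100 loop
            Counter.Increment;
         end loop;
      end Worker;

   begin
      declare
         Workers : array (1 .. 5) of Worker;
      begin
         null;  --  this block does not exit until all Workers terminate
      end;

      if Counter.Value = 5 * 100 then
         Ada.Text_IO.Put_Line ("PASSED");
      else
         Ada.Text_IO.Put_Line ("FAILED");
      end if;
   end User_Oriented_Sketch;

An old-style, feature-oriented check might do little more than declare the
protected object and call Increment once; a test shaped like the program
above stands a much better chance of catching the kind of interaction
problems that real users actually hit.
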
HOWEVER, although the suite will, we believe, be even more effective than
the 1.11 suite, no one would claim that it guarantees 100% conformance or
usability. If you hear anyone saying this, beware! They do not know what
they are talking about.

There are many ways to evaluate a compiler. GNAT is validated, but it has
also been in wide use by thousands of users, in all sorts of different
settings, from ingenious academic tests of the outer reaches of the Ada 95
language to large (>500,000 lines) real-world applications. Frankly, I
think this real-world testing of GNAT (or any other compiler) is probably
worth more than the validation if I had to choose, but I don't have to
choose, and it is nice to have both. The validation procedures against 2.0
certainly turned up some problems that are non-obvious and had escaped the
vigilance of our thousands of users. I would guess that all other Ada
vendors have similar experiences.