From: Ken Garlington
Subject: Re: Ada Core Technologies and Ada95 Standards
Date: 1996-04-26
Organization: Lockheed Martin Tactical Aircraft Systems
Newsgroups: comp.lang.ada

Robert Dewar wrote:
>
> I guess that neither you nor Ken have paid any attention to what is going
> on with ACVC 2.1, but the whole idea of this effort is to make sure that
> the ACVC suite better reflects actual user usage. I would be interested
> in comments from either of you on this reformulation, and in your reaction
> to the new tests, but to make useful comments you will have to spend more
> time actually studying the tests and the ACVC process.

OK, well, I'll restate one comment I made AFTER studying the new ACVC
process and the tests (so far as the documents on the AdaIC server would
permit):

If the writers of the ACVC tests are expected to write tests that reflect
how Ada is really used, how do they gain this information? For that matter,
when I look at a particular test, how do I judge whether it reflects common
Ada usage? I know how I use Ada, but how do I know whether my usage is
common or "marginal"?

-- 
LMTAS - "Our Brand Means Quality"