From: Mike Cordes
Subject: Re: Ada Core Technologies and Ada95 Standards
Date: 1996/05/07
Message-ID: <4mo6su$s8o@cliffy.lfwc.lockheed.com>
References: <00001a73+00002c20@msn.com> <315FD5C9.342F@lfwc.lockheed.com> <828474655.17825@assen.demon.co.uk> <829673790.5774@assen.demon.co.uk> <830205885.24190@assen.demon.co.uk> <317CB211.3DBA@lmtas.lmco.com> <3180C57E.630C@lmtas.lmco.com> <4m2ke4$rg8@cliffy.lfwc.lockheed.com> <831410273.2370.0@assen.demon.co.uk>
Organization: Lockheed Martin Tactical Aircraft Systems
Newsgroups: comp.lang.ada

john@assen.demon.co.uk (John McCabe) wrote:

>Just as a brief example, here is a program I emailed to you which
>proves the existence of a bug in my MIL-STD-1750A implementation (at
>the time you declined to comment on it). I've removed all the comments
>it originally had to get an idea of how much Ada is involved :-

>This is the typical size of the "vast majority" of examples I have
>provided in bug reports to compiler vendors. As you can obviously see,
>it is very small, and _very_ non-proprietary. In actual fact, when I
>experimented further I found it could be made even smaller, but I left
>it this way to try to give the compiler extra work to do.

John,

How do you verify the existence/correction of compiler bugs? I.e., do
you include command scripts with the Ada source which verify the
behavior, or is it simply general practice to have the compiler
generate an assembly listing and have the examiner check the generated
code?

In the example you provided, it would be *simple* (but not *automatic*)
to check the generated code for correct behavior. An automated test is
possible if you write a command script which compiles (thus identifying
the exact set of switches used), links, and executes the example on a
debugger/simulator. The automated tests turn out to be larger (i.e.,
more files, not more SLOCs), but the time to verify is almost nil. This
is a *HUGE* advantage when you are talking about running a large set
(several hundred or more) of tests.

After years of interfacing between Ada developers and Ada compiler
vendors, I am still forced to tell the developers that "examine the
assembly code for correct behavior" is not a satisfactory verification
criterion for compiler error reports.

(I'm not going to touch the subject of verification of optimizations
and enhancements here ;)

Mike
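
P.S. To make concrete what I mean by "automated": the test itself can
check its result at run time and print PASSED or FAILED, so the command
script only has to compile, link, run the image on the simulator, and
scan the output. The sketch below is only illustrative; the type, the
expression, and the expected value are placeholders, not John's actual
1750A example.

   --  Rough, self-checking shape for a bug-report test (placeholder
   --  type and values; not the original 1750A example).
   with Text_IO;
   procedure Bug_Check is
      type Word is range 0 .. 65_535;
      X : Word := 16#00FF#;
      Y : Word;
   begin
      Y := X * 2;   --  the construct suspected of being miscompiled
      if Y = 16#01FE# then
         Text_IO.Put_Line ("PASSED");
      else
         Text_IO.Put_Line ("FAILED");
      end if;
   end Bug_Check;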