From: dewar@cs.nyu.edu (Robert Dewar)
Subject: Re: The Ada Compiler Evaluation System
Date: 1996/04/21
Organization: Courant Institute of Mathematical Sciences
Newsgroups: comp.lang.ada

Ken Garlington said

>If it's OK for users to pay for quality improvements, why not have
>users pay for AVO support? If they want NIST certification, fine. If
>not, they can use something else. It appears to be the stock answer
>for every (other) potential standard method of measuring/improving
>compiler quality. Why not apply it to ACVC?

I think you answered your own question. The point is really quite
simple. (Again I remind you that we are talking here only about
internal DoD policy.) The DoD should require of all Ada vendors only
those quality criteria which arguably apply in a cost-effective manner
across the board to all DoD users of Ada.

Why should we do even that much? That's actually not entirely clear.
If all procurement officers in the DoD were to require all vendors to
do XXX for their project in order to sell their compiler, that would
in practice have the same effect as a mandate, i.e. vendors who want
to sell to the DoD would have to do XXX. Of course, as is the case
even with the "requirement" of validation, if you don't want to sell
to the DoD, you can ignore even validation. I guess the argument in
favor of doing that much is not so much to control vendors as to
control procurement officers.
If the DoD has determined that for its purposes all Ada compilers
should satisfy XXX, then presumably it has an interest in mandating
XXX and taking away the freedom of procurement officers to say "I
don't care about XXX, I would rather save money." That seems to be the
motivation behind requiring validation. The requirement for validation
does of course increase compiler costs (e.g. in comparison with C++),
but if the DoD determines that this cost increase is commensurate with
the quality improvement, then the general requirement makes sense
(although in that case, the DoD should not complain that Ada is more
expensive than C++ to the extent that this increase in cost is caused
by the extra requirement in the Ada case -- in practice I think this
increase in cost is fairly minimal; yes, validation is expensive, but
the expense is not huge in proportion to other factors -- speaking
from one vendor's point of view anyway).

Going back to the original issue: why not do more? That is, why not
have the DoD universally require more than ACVC testing? There are two
subquestions:

1) would requiring more increase quality for all possible applications
   of Ada in the DoD?

2) even if the answer to 1) is no, should it be done anyway?

Let's take 1) first. If the answer is yes, the DoD still needs to
assess whether it would increase the cost unacceptably. This is
certainly difficult to assess across the board. The current statement
of policy is a judgment that either the answer to 1) is no, or that if
it is yes, then the cost would be too high.

The answer to 2) is almost certainly no. The issue is that if you
mandate extra testing or certification that does NOT help quality for
certain applications, it will most certainly DECREASE quality for
those applications by diverting resources.

It's quite interesting that no one else has hopped into this thread.
Certainly I assume that all the vendors and users reading this thread
are interested in improving quality.

Q. Are you interested in improved quality?
A. Of course.

Q. Do you think all compiler vendors should be required by the DoD to
   do XXX?
A. Hmmm! Not so simple to answer; we would have to look at XXX very
   closely.

Ken worries that if every user does their own evaluation, that is
inefficient. Sure, that is true, but no centralized DoD testing is
going to relieve you of that requirement. Early on, some naive people
thought that validation was some marvellous panacea that would relieve
them of the need to carefully evaluate competing compiler products.
What Ken seems to be looking for at times is some other formula that
would provide that panacea, and I am afraid it is not possible. Even
if all sorts of extra requirements were placed, it would not relieve
someone starting out on a major project from the need to evaluate
whether the tools to be used met the project's needs.

On the other hand, you certainly don't have to do the work yourself.
You can make the vendor do it. No one suggests that every user do
their own ACES testing. Let's suppose that you have determined that
ACES testing is useful. In that case you should simply look only at
products for which ACES testing results are available (this has
certainly been done in some cases). Similarly, if you think ISO 9000
certification is useful, then by all means require it (other users of
Ada have done so; Alsys did not get ISO 9000 certification for its own
internal requirements, but because customers required it). Ken, in the
case of your project, were extra requirements like this set down? If
not, why not?

In the commercial world, there is no central body making requirements
on, e.g., C or COBOL compilers with regard to testing. Individual
users can and do make decisions to acquire only compilers that meet
certain requirements -- for example, it is common to require NIST
certification for C and COBOL, common enough that in practice almost
all C and COBOL compilers are NIST certified.
However, that particular marketplace (C and COBOL compilers) does not
seem to have any general agreement on other testing methods. Ken,
what's so terrible about letting the marketplace decide what
requirements to place on the tools that are acquired, instead of
concentrating on having the DoD make general requirements that might
or might not be suitable for your particular project?

Yes, I know, you keep asking why, if this argument is valid, it should
not be applied to the ACVC itself. The answer is that there is a
general consensus in the DoD that this is a good idea. Furthermore, it
is not unusual historically: the GSA very early on required the
equivalent of NIST testing for all COBOL compilers. The basic idea is
that the ACVC measures conformance to the standard. The DoD
requirement is that you use an Ada compiler in certain circumstances,
and the ACVC testing essentially defines what constitutes an Ada
compiler. There is no mandate to use Ada compilers with certain other
characteristics, "quality" or otherwise. Ken wants the mandate
strengthened in this area, but it is not clear there is any consensus
for such strengthening.

P.S. Ken, you recommend improving the quality of the ACVC suite. Fine,
but can we have some specific constructive suggestions? The ACVC
development effort is quite open, with intermediate results available
for public review. There is a review committee, representing vendors,
language designers and users, that evaluates the project on an ongoing
basis and is open to suggestions from the entire community. It may be
that Ken knows better than other people on the committee what should
go into the ACVC tests and should have been chosen as a committee
member instead of me or some other member. I would be more convinced
of that possibility if I saw specific suggestions from Ken based on
careful examination of the existing 2.1 suite as it develops.
In fact, I rather suspect that Ken's reaction to the ACVC is based
entirely on version 1.11, which is at this stage very out of date,
both in terms of language features covered and in testing philosophy.
suggestions have bee