From mboxrd@z Thu Jan 1 00:00:00 1970
X-Spam-Checker-Version: SpamAssassin 3.4.4 (2020-01-24) on polar.synack.me
X-Spam-Level: *
X-Spam-Status: No, score=1.2 required=5.0 tests=BAYES_00,FROM_WORDY, INVALID_MSGID autolearn=no autolearn_force=no version=3.4.4
X-Google-Language: ENGLISH,ASCII-7-bit
X-Google-Thread: 103376,66752102482bbdca
X-Google-Attributes: gid103376,public
From: "Ken Garlington"
Subject: Re: Required Metrics
Date: 2000/05/03
Message-ID: #1/1
X-Deja-AN: 618658719
References: <5DDO4.2237$wb7.194854@news.flash.net> <8ek4ea$5ta$1@nnrp1.deja.com> <390DC8AD.59B5EBEE@averstar.com> <8ep0k3$jlr$1@nnrp1.deja.com>
X-Priority: 3
X-MimeOLE: Produced By Microsoft MimeOLE V5.00.2919.6600
X-Complaints-To: abuse@flash.net
X-Trace: news.flash.net 957356126 216.215.85.193 (Wed, 03 May 2000 07:15:26 CDT)
Organization: FlashNet Communications, http://www.flash.net
X-MSMail-Priority: Normal
NNTP-Posting-Date: Wed, 03 May 2000 07:15:26 CDT
Newsgroups: comp.lang.ada
Date: 2000-05-03T00:00:00+00:00
List-Id:

"Robert Dewar" wrote in message news:8ep0k3$jlr$1@nnrp1.deja.com...
> The trouble with documentation requirements is that the
> standard, by necessity, does not define exactly what the
> meaning of these requirements is, or even what documentation
> consists of.
>
> In the case of typical implementations on top of an operating
> system, about the best documentation of the conventional kind
> that you could give for tasking metrics would be to clearly
> document the sequence of operating systems calls that is made
> for any particular tasking language construct.
>
> Our view in the case of GNAT is that the sources of the run-time
> which are an integral part of our complete technical
> documentation contain precisely this information, and so far
> we have not had any instances of users requiring the information
> in any other form.
I don't have any problem with interpreting the term "formula" as a sequence of operating system calls, but I don't think the GNAT document set (again, as an example only) does this. Looking in the most logical place (the GNAT Reference Manual), it says "Information on metrics is not yet available." This implies, to me at least, that this information is *not* available elsewhere in the "documentation". If this statement were completely absent, and/or if there were a general statement somewhere that the source code was also considered part of the documentation, I think I'd have an easier time accepting that the requirement was met.

Why wouldn't the reference manual, at a minimum, refer to the applicable run-time sources (making them clearly part of the "documentation")? This is done in other cases (e.g. for storage pools), and it certainly would have helped me when I needed this information! Considering all the effort that has gone into good error messages and the like, this sort of easily corrected "gap" in addressing a *language requirement* (not, I want to point out, merely "desirable documentation") is surprising.

> I don't think this is so surprising. We could of course run
> benchmarks on particular machines under particular conditions
> and publish numbers that appear to meet the requirements of
> the RM. Why "appear"? Because in practice they won't be useful
> for users.

It's perfectly OK to think that a requirement is not useful, and in fact I agree with you to some extent. However, that is beside the point. I get a queasy feeling when it seems that the Ada vendor community may not be addressing requirements it doesn't like. First, it doesn't seem to be a very "software engineering" oriented approach, given that Ada users are supposedly more "software engineering" oriented than users of other languages.
Second, how can anyone claim that an advantage of Ada is standardization if vendors don't have to follow the parts of the standard that are hard to implement, or that they simply don't like? I'd hope we all agree that a compiler that passes the validation suite, but can't handle *any* other valid Ada program, is also not useful for users in practice, and should be universally denounced as non-compliant with the standard. However, once we start down the path of ignoring the requirements we don't like, we lose any right to complain about such a case.

It seems to me that in the Ada83 days, AIs were used to develop and document consensus on clarifications, etc., to the standard. Is this mechanism no longer used?