From: Robert Dewar
Subject: Re: Required Metrics
Date: 2000/05/05
Message-ID: <8eukm0$ssm$1@nnrp1.deja.com>
References: <5DDO4.2237$wb7.194854@news.flash.net> <8ek4ea$5ta$1@nnrp1.deja.com> <390DC8AD.59B5EBEE@averstar.com> <8ep0k3$jlr$1@nnrp1.deja.com> <8es5fv$4ov$1@nnrp1.deja.com> <_HnQ4.7884$wb7.550012@news.flash.net>
Organization: Deja.com - Before you buy.
Newsgroups: comp.lang.ada

In article <_HnQ4.7884$wb7.550012@news.flash.net>,
  "Ken Garlington" wrote:

> > I think that's the wrong take entirely. In the case of metrics
> > (see Bob Duff's post), there is a broad feeling that these are
> > completely bogus requirements for modern compilers.

> Here's the first problem I have with this statement. Either
>
> (a) it's not true, and so this is a real issue

It's true, but always remember that the issue here is not whether
the documentation might or might not be useful, but whether it
belongs in a language standard.

> (b) it is true, and has been true for some time, in which case
> how did it manage to get into the standard?

It has always been true that documentation requirements like this do
not belong in the standard. As to why they got in, I think your posts
have vividly illustrated the thinking of those who argued for putting
them in. To me this is a classic case of confusing desirable
implementation features with formal semantic requirements.
> (c) something changed between 1994-ish and now (really well
> before now,

No, nothing has changed.

> since apparently no one has ever met this part of the
> standard),

You keep saying this, which means you still do not understand the
issue here, which is that we are talking about formal semantic
requirements. On the contrary, everyone has met it, since from a
formal point of view, "documentation" is not defined, and therefore
can be defined any way you want. As Bob Duff points out, you can
define the compiler itself as an operational definition of its own
behavior, and therefore it constitutes documentation in a formal
sense. Providing the sources is a more realistic form of this
approach to documentation, but it still may not meet your
*non-formal* implementation requirements.

> and if so,
> what changed?

I think you could argue that things have indeed changed from the
point of view of informal documentation requirements here. That
probably accounts for the overwhelming lack of interest in getting
more accessible forms of this documentation, since it was (like much
of the RT annex) framed with bare-board, single-processor
implementations in mind, whereas the modern reality is more likely
implementation over an OS or a third-party RTOS. But that does not
affect the discussion about whether such requirements should be
formal semantic requirements in the language. The answer to that is,
and has always been, no. The fact that such requirements got in was
just a compromise necessitated by a standards process involving many
people, not all of whom understand what formal language requirements
are about, and who don't mind having undefined, woolly requirements
in what should be a precise specification.

> Here's the second (and larger) problem, as I've mentioned
> before. Why should
> "feelings" be related to requirements?

They should not be.
The problem with these requirements was precisely that the "feeling",
expressed forcefully by you, that Ada implementations should be
"required" to provide useful documentation got translated into
requirements.

> Why go through the motions to
> formally establish the requirement and get a consensus documented and
> approved, if there's not an equal intent to do something formal (an AI, at
> least) to remove such a "bogus" requirement?

Lots of us knew these requirements were bogus, including key people
on the design team (e.g. Bob Duff). You fight many battles in a
consensus process. This one was not worth fighting at the time, since
the effect of these requirements is negligible in practice. We knew
they would end up being useless, of course, but we also knew they
would not be harmful.

> It appears like the classic
> case of people writing down a requirements document just
> because the
> customer says you have to do one,

Well, you paint with too broad a brush. This particular phenomenon
(documentation requirements) was indeed a matter of responding to
customers who did not really understand the issue. But the great
majority of the Ada RM is not even vaguely related to this process.

> but promptly ignoring it through the rest
> of the life cycle. Regardless of the facts of the matter,
> should the Ada
> community give such an appearance?

Of course not, but as you have made very clear, there are significant
players in the Ada community who do not understand that requirements
like this do not belong in the RM, so even now it might be a
difficult battle to win, and frankly it is not worth it, since it is
unimportant.

> > If they
> > ever had any meaning it would only be for bare-board compilers
> > for very specific architectures.

Note that when I say this, I am talking about the value of these
items as pragmatic implementation features, not as formal
requirements.
> However, I'd claim that there are still a significant number
> of real-time systems that do, in fact, get implemented on
> bare-board systems.

It's certainly a dying breed. We see almost no demand for this; not
zero, but barely on the radar screen. The trend is clearly toward
third-party RTOS-based systems (VxWorks, and also see Green Hills'
new Integrity push).

> I don't
> know which are the "specific architectures," but certainly real-time systems
> use a variety of them. Since we're talking about the real-time annex, then
> surely the environments in which real-time systems operate should be a
> strong consideration for the appropriateness of such a requirement.

I think if you are in such an environment, you should require your
implementor to provide whatever information you need. The RM will be
no help in this task, since the issue is not formal requirements, but
rather providing information in the manner YOU require.

> For example, is there a GNORT target for which the metrics are
> meaningful?

Seeing as GNORT excludes tasking and the real-time annex in its
entirety, no :-)

> > It is not that the vendor "does not like" this particular
> > requirement in our case, it is simply that
> >
> > a) it is meaningless, but we do meet the requirement anyway,
> > perhaps not in the most useful way, but since it is pretty
> > much useless, there is not much point in trying to do
> > useless things in a useful manner!

> Again, the issue of whether it's an appropriate requirement is
> interesting, but it's not the real issue. I'm still not
> convinced that you meet the
> requirement in the current release, for that matter.

Well, try to prove this formally. First you will need a formal
definition of "documentation" *derived from the RM*. That will be
hard. For example, are you sure that this documentation must be in
English? What if I provide it in Navajo or Klingon, is that
sufficient? Or, more to the point here, what if I provide it in
Ada 95?
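As an aside, a user who really needs a number can often just measure
it on the target rather than hunt for it in vendor documentation.
Here is a minimal sketch (mine, not from the Annex D metrics text)
that empirically observes the tick of Ada.Real_Time.Clock and
compares it with the declared constant Tick; the program name and
harness are hypothetical:

```ada
--  Sketch: observe the effective resolution of Ada.Real_Time.Clock
--  empirically, then compare with the implementation's declared Tick.
--  The packages and operations used are standard Ada 95 (Annex D);
--  the harness itself is an illustrative assumption.
with Ada.Real_Time; use Ada.Real_Time;
with Ada.Text_IO;   use Ada.Text_IO;

procedure Measure_Clock is
   T1, T2 : Time;
begin
   T1 := Clock;
   loop
      T2 := Clock;
      exit when T2 /= T1;   --  busy-wait until the clock advances
   end loop;
   Put_Line ("Observed tick:"
             & Duration'Image (To_Duration (T2 - T1)) & " s");
   Put_Line ("Declared Tick:"
             & Duration'Image (To_Duration (Ada.Real_Time.Tick)) & " s");
end Measure_Clock;
```

Of course a measurement like this characterizes one target and one
run, which is exactly why such numbers make poor formal requirements.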
Furthermore, if your idea is that you want to see metrics in
milliseconds, then this is clearly infeasible in general when working
over third-party systems, so the conditions of RM 1.1.3(6) would
apply:

   6  Contain no variations except those explicitly permitted by
      this International Standard, or those that are impossible or
      impractical to avoid given the implementation's execution
      environment;

> More importantly, since
> this is not about a specific compiler, I'm not convinced
> _anyone_ has
> bothered to meet the requirement.

You still think of this as a requirement that can be met or not met,
and it is this invalid thinking that led to having these kinds of
requirements in the RM. Since the requirement is semantically
vacuous, I think it is clear that all compilers meet it.

> I don't have proof that this is the case,
> but it concerns me. Most importantly, it seems this argument
> can be used to
> justify all sorts of mischief.

That's FUD unless you can back it up. I see no possible spillover
from requirements that are meaningless to the handling of
requirements that are well stated and semantically meaningful.

> > b) our customers have zero interest in us doing any more
> > than we do now.
> >
> > As we often find, CLA readers and contributors seem to have
> > rather different priorities from actual Ada users when we get
> > to discussing various theoretical issues surrounding the RM :-)

> Again, I find the argument of user priorities a very slippery
> slope, which can be used to invisibly drop any number of
> requirements.

No, that's completely wrong, and I begin to think that it is a
hopeless task to convince you. No requirements have been dropped by
anyone here. The only issue is that you have in your mind some
interpretation of these requirements that is not in the RM, and you
note that compilers do not follow Ken Garlington's interpretation of
these requirements. Very possibly so, but that has nothing to do
with whether the requirements are followed.
Now take the visibility rules of Ada in chapter 8. Here we don't care
what Ken thinks about them, because

  a) they are well defined

  b) it is well defined whether a compiler implements them correctly.

A totally different situation. It is really critical in a formal
language standard that all requirements be objective and well
defined. Once again, requirements for "good" documentation are as
silly as requirements for good performance. As I mentioned before,
there were actually people seriously asking for quantitative
requirements on the performance of tasking constructs in terms of
processor clock cycles. I know, it's hard to believe that anyone
could be that misguided, but it's true. The undefined metrics
requirements were a compromise to deal with this unreasonable
extreme. Since they are undefined and harmless, they were a way of
getting things done without running into a roadblock.

> It's useful for
> discussing the options for implementing a requirement (e.g. a fancy
> calculator for computing the metric vs. "here's the sources, have at it"),
> but good software engineering would say that failing to meet a
> requirement
> due to perceived user desire

The trouble with these undefined requirements is that the only
POSSIBLE meaning is in terms of "perceived user desires". In other
words, if we try to ask whether a compiler meets certain
documentation requirements, we cannot look to any formal definitions
and we cannot look to the RM; the ONLY way of determining whether
such requirements have been met is to interrogate "perceived user
desire". Yes, that's a horrible state of affairs, and that's what
this thread is all about, but it is really still a tempest in a
teapot in terms of overall significance.

By the way, the ACVC review board discussed this briefly. They
immediately agreed that these requirements were too ill-defined to
even consider including them in conformance assessment.
There was no controversy in this decision; it was clear to this
committee that these "requirements" have to be treated differently,
namely ignored completely when it comes to formal conformance
testing.

Sent via Deja.com http://www.deja.com/
Before you buy.