Date: Wed, 25 Feb 2009 00:01:36 +0100
From: Georg Bauhaus
Reply-To: rm.tsoh+bauhaus@maps.futureapps.de
Newsgroups: comp.lang.ada
Subject: Re: Invade wikipedia!
Message-ID: <49a47c51$0$30227$9b4e6d93@newsspool1.arcor-online.net>

Hibou57 (Yannick Duchêne) wrote:
>> But then, when implementation matters, one could always
>> show how Ada helps with getting details right and readable.
> This is not just a matter of implementation, it is also a matter of
> precision.

The word "precision" can be misused. The precision of an algorithm is
measured against what? More than some typical level? Less than it?
What's typical? And why?
That is, should Wikipedia, in an algorithms book, be so precise that it
illustrates certain CPU-specific integer overflow detection mechanisms
in the examples?[*] Or should it delegate discussions at this level of
precision to some book about implementation strategies?

The following premises produce a partial answer. Any choice of notation
(code or pseudo code) has to start from the writer's intentions:

1 - What do you want to explain?
2 - Who is your audience?

Suppose you want to express an algorithm that sorts a thousand whole
numbers of any size.[**] If you start from just about any Ada-like
programming language, you are in trouble already: the fundamental type
system is not capable of expressing types of arbitrarily large whole
numbers, because (a) the fundamental type system is really built around
the idea that whole numbers basically fit in register words, and (b) the
language definitions reflect CPU properties, by requirement or by advice
to *of* *course* take advantage of hardware properties (quick addition
in inner loops; an overflow trap). These constraints add real things,
but they do not help with achieving the clearly specified goal of
explaining the algorithm.

When I want to explain just how to sort a thousand arbitrarily large
whole numbers, there is no need to refer to the limits of a machine
model. First, a finite machine may not be relevant to my audience. They
just want the sorting of the numbers they already know. Second, coding
this algorithm in Ada or some other "real" language, I *have* to go
outside the fundamental type system, because it does not have large
numbers. Worse, I start waving hands: "You know, just assume these
magic types I gave you can deal with large whole numbers fine. I'm
omitting the details of how this is possible, etc. etc." What you lose,
then, is what you wanted to add: a precise explanation of nothing but
the algorithm!
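For illustration only (my sketch, not part of the argument above): in a
language whose built-in integers happen to be unbounded, such as Python,
the thousand-large-numbers sort can be written without ever leaving the
fundamental type system, which is exactly the hand-waving an Ada-like
rendering would have to do via some bignum library:

```python
# Insertion sort, CLRS-style, over arbitrarily large whole numbers.
# Python's built-in int is unbounded, so no register-word limit
# intrudes on the explanation of the algorithm itself.
def insertion_sort(a):
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a

huge = [10**100 + 3, 5, 10**100 + 1, 2**200]
print(insertion_sort(huge))  # sorted; the word size never mattered
```

The algorithm text is identical to what one would write for machine-word
integers; only the (absent) type constraint differs.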
> If this kind of pseudo code was sufficient to really and precisely
> express algorithms, it would be a real language.

Pseudo code of the kind in question is a real language. By analogy, no
one writes Turing machine programs in commercial applications[***].
Still, his fictional(!) machine is entirely programmed using "pseudo
code"[****]. A Turing machine is defined precisely, and its languages
are real programming languages.

An algorithm, for example a shortest path procedure, can well be
explained, formally and precisely, in lines and characters drawn using
a pencil. A picture shows the current state of the graph traversal.
Each picture has circled letters, some shading, some straight lines.
Each is like a picture in a slide show: you follow the pictures as the
algorithm steps forward; if necessary, you can look back at an earlier
picture. That is, you follow the snapshots of an algorithm forwards and
backwards. The objects of the pictorial algorithm are well defined, as
are the steps. To explain it, no greater amount of precision is needed.

There is a *machine* *model*. Obviously, it's not that of a PC. The
machine *may* be a sequential digital RAM of the kind we think we know.
If this is important, it will be instructive to *add* more aspects to
the algorithm. Can we afford large numbers? What *is* a large number,
by the way? A real programming language will become important insofar
as it reflects the constraints imposed by the digital RAM and the PL
definition.

> There is an exception about declarative languages, like part of the
> mathematical one, because these ones express properties.

Properties of what? Try it: I discovered many details that need to be
mastered before the expressions become clear. They are just not stated
in the declarations, but in preparatory texts and preparatory exercises.

> pseudo-code is a language, beside any other one, and I
> do not see a way it is better than any other one at expressing
> algorithms.
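On the point that a precisely defined "pseudo" notation is already a
real, executable language: here is a minimal sketch (my illustration;
neither Turing's original notation nor anything from CLRS) of a Turing
machine interpreter, where the transition table *is* the program:

```python
# A minimal Turing machine interpreter. The transition table is the
# "pseudo code" program, yet it executes precisely.
# table[(state, symbol)] = (new_symbol, move, new_state)
def run(table, tape, state="q0", halt="halt", steps=10_000):
    cells = dict(enumerate(tape))  # sparse tape, blank = "_"
    pos = 0
    while state != halt and steps > 0:
        sym = cells.get(pos, "_")
        new_sym, move, state = table[(state, sym)]
        cells[pos] = new_sym
        pos += 1 if move == "R" else -1
        steps -= 1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, "_") for i in range(lo, hi + 1)).strip("_")

# A one-state program that flips every bit of its input, then halts.
flip = {
    ("q0", "0"): ("1", "R", "q0"),
    ("q0", "1"): ("0", "R", "q0"),
    ("q0", "_"): ("_", "R", "halt"),
}
print(run(flip, "1011"))  # 0100
```

The table above is exactly as formal as any "real" program; nothing
about it is hand-waving.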
Example: the basic idea of a 1-dimensional array in a conventional
programming language may be expressed, WLOG, as a sequence of items
that can be addressed directly, using index numbers.[*****] When
explaining an array algorithm, you can do so with this or that (better
or worse) reification of the array idea that some programming language
offers. Sure, there are array reifications that produce a lot of
overflow errors in real programs. Yes, Ada's tight coupling between the
array object and its index type makes the language stand out. Yet any
special features of arrays need to be explained first, unless they are
self-explanatory. The subject, however, is algorithms, not array
reifications. OTOH, the CLRS pseudo code notation has array functions
similar to Ada's 'Length, but simpler. It seems to work in teaching.

> And indeed, it is a good idea to not talk about algorithms using a
> particular "implementation language", because algorithms are better
> discussed using declarative means (implementation decisions, as well,
> should be discussed using declarative means).

That algorithms are better discussed using declarative means is a claim
frequently made (and hardly ever shown to be correct) by
mathematicians. Statistically, there is a remarkable correlation
between those who make this claim and their profession (mathematics).
I'm stating this as an observation; no comment is intended. (Still,
when I see equational function assemblies of an absurd combinatorial
complexity at the source level, I start to think about blind spots in
FP.)

When we try to understand a declaration or equational definition, our
brain does operate(!). It does not just denote, if it denotes at all.
In order to really understand what a "functional equation" means, you
have to perform(!) some backtracking:

  f(0) = 1
  f(t + 1) = "*"(t + 1, f(t))   for all t in N

To understand what is going on here, you need to follow the recursion.
This process(!)
is never emphasized in a functional showcase.[******] Why not?
Proceeding on the declarative route, you arrive at assembling(!)
functions in tricky ways in order to achieve an efficient evaluation
strategy. Isn't this totally obscuring the explanatory goal?

_____
[*] For sure, detecting overflow would largely remove the need for all
these "security patches" that revolve almost exclusively around the
very subject of overflows...
[**] Not unrealistic; we do have libraries for these kinds of numbers
for a reason.
[***] other than programs demonstrating Turing machines, maybe.
[****] barring toilet rolls and human hands and heads of state...
[*****] Dmitry, could you resist explaining why arrays should really be
interfaces? They are, BTW, in CLRS pseudo code, I should think :-)
[******] The Little Schemer (etc.) was an exception.

Vista is driving me mad. It's switching locale settings at unforeseen
moments, makes me start tzping funnz words. Do thez ever use anzthing
but Office inside MS?