Date: Fri, 02 Sep 2011 12:37:17 +0200
From: Georg Bauhaus
Newsgroups: comp.lang.ada
Subject: Re: Overloading parentheses and type expectations

On 02.09.11 09:54, Dmitry A. Kazakov wrote:

> Rubbish, mathematical notation is taught in school. It is easy to show that
> all known alternatives (e.g. Polish expressions) are incomprehensive for
> normal people.

Still, I hesitate to call empirical findings "rubbish".

Mathematical notation may, and does, use a circled ⊕, not +, when the
writer wants to emphasize that she isn't referring to some "known" +.
Also, dot minus, ∸, is sometimes used for a subtraction that never goes
below 0.

The use of ∫ is common for indicating an integral.  TTBOMK, the symbol
isn't used to denote anything but integrals, unlike the way () has many
uses in Ada.  And ∫ is not used in Ada (or similar languages), but + is.
Why?  Is there anything about + that makes its meaning more special than
that of ∫?  More worthy of being included in the language?  Is + better
adjusted to how our brain works than ∫, and is it therefore a privileged
symbol?  Or is it habit + keyboard?  I guess it is.  No need to refer to
cognitive psychology.

When computers had fewer characters for syntax, and programmers typed

   <<< x : x in S .ST. P(x) >>>

or similar instead of {x : x in S | P(x)}, they quickly gave up
overloading <, etc. when {} and | became available.  Why?  Because
(a) they are easier to type, and (b) they were expected in the first
place.

There is sanity in finding a break-even point between time spent
deciphering ASCII symbolism at one extreme and symbolitis at the other.
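To make the ∸ point concrete in Ada terms (a sketch of my own, with
invented package and type names): Ada lets me redefine the operator
symbols it already has, like "-", but it will not let me introduce
∸ or ⊕, so a subtraction that never goes below 0 has to reuse "-" or
hide behind an identifier:

   package Dot_Minus is
      type Count is new Natural;

      --  "Dot minus": subtraction that never goes below 0.  Since ∸
      --  is not available, the predefined symbol "-" is reused.
      function "-" (Left, Right : Count) return Count;
   end Dot_Minus;

   package body Dot_Minus is
      function "-" (Left, Right : Count) return Count is
      begin
         if Left > Right then
            --  Convert to the parent type so the predefined "-" is
            --  used here, not this redefinition.
            return Count (Natural (Left) - Natural (Right));
         else
            return 0;
         end if;
      end "-";
   end Dot_Minus;

With this in scope, C1 - C2 is simply 0 whenever C2 >= C1, and nothing
in the expression itself tells the reader that this is not the
predefined "-".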
>> Why would a designer want entirely different things in a language
>> and then use the same notation for all of them?
>
> See above. This is the way human brain works. And the things are not
> entirely different, they are instances of some class. The language captures
> this by using "+" everywhere "addition" is meant.

In order to capture the meaning of classes, one first has to learn what
a class is, become aware of it, and become familiar with it, before even
arriving at an understanding of the class at hand.

Those not familiar with CS abstractions, and maybe even those who are,
have not found it obvious that a pointer dereference written p() is the
same thing as an array indexing operation written a(), or as a function
call written f().  Cute as the uniform () may seem, common as it may
seem, conceptually sound as it may seem, expressing the differences
using .all, [], or () seems to help programmers understand programs.
And then you would have to go the other way, too, from the common aspect
and the identical syntax back to the different things, when debugging
and trying to determine what kind of () is causing trouble.

Proof: the lengthy and recurring discussion of a[i] being the same as
*(a + i) in C does not take place when explaining Ada arrays; but Ada
programmers will need to do some overload resolution work when seeing
x(y).  (A small sketch at the end of this message makes that concrete.)

>> First, infix expressions are a (dis)service offered to programmers
>> whose wish is less to program a computer,
>
> I'd like to see a hard proof that, for instance, Forth is more productive,
> safer, easier to understand, etc than Ada.

(Von Neumann called early Fortran notation a waste...)

OK, I'm assuming people insist that infix really is a must; although ...
only with elementary math operators ...  Oh, and with logical operators.

One ingredient of productive programming is the principle of least
surprise.  Overflow is one of the surprises:

   : 1 Max_Int + N - ;
   (- (+ 1 Max_Int) N)
   1 + Max_Int - N;
   (1 + Max_Int) - N;

Presuming familiarity with the respective language, why would the human
brain be better at detecting the error on line 3 and not on lines 1, 2,
and 4?  Even though, if I understand correctly, the brain isn't aided by
distinct syntax when breaking things down?  Note that the parens above
have only one meaning.

> Whether "+" is used as an
> infix operation or as a function call, its semantics remains exactly the
> same.

The phrase "its semantics" stipulates a specific interpretation of
binary "+"; Smalltalk assigns a different meaning to "+", or at least
arrives at it using very different reasoning (to the untrained).  But
you do write x + y ...

Meaning is also unclear when languages have rules (or don't) explaining
the order of evaluation of arguments, or the effects of aliasing in the
presence of optimization.  I'm saying this not because Ada doesn't care
(it does), but because none of this is visible to the reader studying
the expression, infix or not.  The human brain needs to learn, and then
to rummage through, a mountain of information about the expression and
its meaning in order to see, for example, where this strange and
irregularly occurring difference in executing x + y should stem from,
for what look like the same x and y.  (A timing-dependent update of a
shared variable through y, say.)

> Information unnecessary to understanding the program is noise. When also
> required by the compiler, it constitutes a language deficiency. (There
> could be some exceptions from this rule)

Understanding is on a scale.  This is why we say "easily understood".
This is why I say that I do not want to be forced into being an
inference machine without need, just to understand something that would
be totally obvious and consistent given some little additions.  I'm fine
with those additions, yes, even with redundancies.  And redundancy is
subjective, of course.
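Here is the sketch promised above (mine, not anything from the thread;
all names invented): the same text Thing (2) is a function call, an
array indexing, and a dereference plus indexing, depending only on which
declaration of Thing happens to be in scope.

   with Ada.Text_IO;  use Ada.Text_IO;

   procedure Same_Parens is
      subtype Index is Integer range 1 .. 3;
      type Int_Array is array (Index) of Integer;
   begin
      declare  --  Thing (2) is a call (Ada 2012 expression function)
         function Thing (I : Index) return Integer is (10 * I);
      begin
         Put_Line (Integer'Image (Thing (2)));
      end;

      declare  --  Thing (2) is an indexed component of an array
         Thing : constant Int_Array := (10, 20, 30);
      begin
         Put_Line (Integer'Image (Thing (2)));
      end;

      declare  --  Thing (2) is an implicit dereference, Thing.all (2)
         Thing : constant access Int_Array := new Int_Array'(10, 20, 30);
      begin
         Put_Line (Integer'Image (Thing (2)));
      end;
   end Same_Parens;

The statement part of each block is character for character the same;
only the declarations decide which kind of () the reader is looking at.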