From: "Tim Behrendsen"
Subject: Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
Date: 1996/08/21
Message-ID: <01bb8f18$713e0e60$32ee6fce@timhome2>#1/1
References: <31FBC584.4188@ivic.qc.ca> <01bb83f5$923391e0$87ee6fce@timpent.airshields.com>
 <4uah1k$b2o@solutions.solon.com> <01bb853b$ca4c8e00$87ee6fce@timpent.airshields.com>
 <4udb2o$7io@solutions.solon.com> <01bb8569$9910dca0$87ee6fce@timpent.airshields.com>
 <4urqam$r9u@goanna.cs.rmit.edu.au> <01bb8b84$200baa80$87ee6fce@timpent.airshields.com>
 <4vbbf6$g0a@goanna.cs.rmit.edu.au>
Organization: A-SIS
Newsgroups: comp.lang.c,comp.lang.c++,comp.unix.programmer,comp.lang.ada

Richard A. O'Keefe wrote in article <4vbbf6$g0a@goanna.cs.rmit.edu.au>...
> "Tim Behrendsen" writes:
>
> >Yes, but you can use the "get a better compiler" argument to
> >justify anything.  Real programs run on real computers using
> >real compilers.  The "Super-Duper Ivory Tower 9000 Compiler"
> >just doesn't exist.
>
> This is a bogus argument, because the better compilers *I* was talking
> about ACTUALLY EXIST.  As a particular example, GCC does self-tail-call
> optimisation and SPARCompiler C 4.0 does general tail-call optimisation.
> Compilers for other languages doing very well indeed with procedures
> include Mercury and Stalin.
>
> It does nobody any good to pretend that good compilers do not exist.

They may exist, but do they exist universally?  The point is that going
out of your way to rely on left-field optimizations is just bad
portability practice.  For example, if someone does this:

    void sub(char *s)
    {
        ...
        if (strlen(s) == 0) {    /* check for empty string */
            ...
        }
    }

and they know it's stupid, but they also know the compiler just happens
to have an optimization for it, should they still be shot?

"Diana, get me my rifle."

> >> Interestingly enough, the APL sorting primitive DOESN'T move the data
> >> at all.  It returns a permutation vector.  The APL idiom for sorting is
> >>        X[.GradeUP X]
> >> where the necessary movement is quite visible.
>
> >But the reality is *not* visible.  What has the student really
> >learned?
>
> Scuse please?  WHAT reality?  The reality in this case is that you can
> in fact in just about any reasonable programming language sort data
> WITHOUT moving it.  Indeed, for a number of statistical and numeric
> calculations, the permutation vector is more useful than moving the
> data would be.  The reality could well be a sorting network made of
> comparators and wires.  On the machine I'm posting from, sorting
> _could_ be done using parallel processes (not concurrent, parallel;
> the machine has more than one CPU).  And of course there have been
> at least two machines built that directly executed APL; APL _was_
> their assembly language.

Wait, hold the phone!  "Sorts the data without moving it"?  What, is
APL's sorting algorithm O(1)?  Yes, the data may not actually get
sorted until it gets printed, but that's irrelevant to the fact that
it eventually gets sorted.
> In the case of APL, the student has learned to express his/her intent
> in a concise notation with a clear mathematical semantics (this applies
> to APL primitives only, alas; the rest of APL is rather murkier) which
> permits effective reasoning at a much higher level.
>
> "Mathematical skills" came into it somewhere.

Indeed, and I think APL is kind of a neat language to learn.  But it
hides so much about the internals of the computer that I think it would
give a student too many false impressions about how things really work.

> [snip]
> The ability to do "algebraic" reasoning about a program is quite as
> important as the ability to do assembly level thinking (which I agree
> is important).  When you really need big improvements in performance,
> you get it from the high level thinking.

I actually quite agree with this; it's how you arrive at the "high
level thinking" that I think is the issue.

Even in a mathematical proof, you are talking about a sequence of
micro-steps.  Yes, most proofs are built from other proofs, but I think
this is more of a "proof macro language" than a "high-level mathematical
language" (whatever that means).  My "proof" of this is the fact that
when a proof is used within another proof, you can pretty much do a
straight insertion of the lower-level "macro proof" into the text of
the proof to be proven.  There is no "proof compilation", so to speak.

(I don't know if that last made *any* sense :-) )

-- Tim Behrendsen (tim@airshields.com)