From: ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe)
Subject: Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
Date: 1996/09/13
Message-ID: <51b8i4$m9m@goanna.cs.rmit.edu.au>
References: <01bb8df1$2e19d420$87ee6fce@timpent.airshields.com> <515o3b$d7h@goanna.cs.rmit.edu.au> <01bb9fe6$7299d800$87ee6fce@timpent.a-sis.com> <5182u7$17b@goanna.cs.rmit.edu.au> <01bba125$4e8c4ca0$32ee6fcf@timhome2>
Organization: Comp Sci, RMIT, Melbourne, Australia
Newsgroups: comp.lang.c,comp.lang.c++,comp.unix.programmer,comp.lang.ada

"Tim Behrendsen" writes:

>Does it fail?  I suggest that your machine is equivalent to a
>CPU with a single instruction.  The single instruction calculates
>a particular mathematical formula.  I could simulate this using
>pencil and paper.

Does the word *continuous* convey nothing to you?  No, you *couldn't*
simulate the machine using pencil and paper.  An optical computer works
in the *real* (\br{R}) world.

>Note that it does *not* say that I have to do the operations using
>exactly the same method, only the same operations.  Your light
>based algorithm is simply a mathematical operation, which I
>can do using pencil and paper quite easily.

Here you are making a claim to be able to compute non-computable
functions.  I don't believe you.
>This is like saying that since most computers are based on the
>flow of electrons through gates, I have to simulate the electron
>flow in order to prove Quicksort is an algorithm.  Totally false.

>Try this on for size:
>1. There are two ways an algorithm can be expressed:
>   a. Using a methodology that expresses a path of execution (aka
>      procedural languages).
>   b. Using a methodology that does *not* specify a path of
>      execution (aka non-procedural languages).

This is false.  If I write a Horn clause program, one person will look
at it and say "ah hah, that is a Prolog program, it specifies a path of
execution, it is procedural", while another will look at it and say
"ah hah, that is a logical formula, it expresses a timeless truth which
can be computed in many ways."  They are both right.

Take for example a Definite Clause Grammar written in pure Prolog.
ONE AND THE SAME FORM OF EXPRESSION may behave
 - as a top-down recursive descent parser (Prolog execution)
 - as a left-corner parser (I've got parsers running this way;
   this is not a theoretical point)
 - as a chart parser (XSB execution).

Take another example.  There is a programming language called Euclid.
Euclid is in the Pascal family.  It's full of records, arrays,
pointers, assignment statements, all that stuff.  But there was a paper
from Xerox showing how a Euclid program could be *naturally* and
*directly* viewed as a pure functional program.  ONE form of
expression, but BOTH a "procedural" reading AND a "declarative"
reading.  Heck, it's more than 30 years since de Bakker showed how to
view an Algol 60 "procedural" program as a lambda-calculus
"declarative" formula.

For another example, you accepted SQL as a non-procedural form of
expression, but I countered that I can "see" an imperative reading.
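The "one form of expression, several execution strategies" point can be
sketched outside Prolog too.  Here is a toy illustration in Python (my
own sketch with made-up names, not the DCG machinery itself): a single
grammar, written once as data, is driven both top-down (recursive
descent, the "Prolog execution" style) and bottom-up (reduction, a
different strategy entirely).  The grammar itself never says which.

```python
# One grammar, written once as data -- the "timeless" form of expression.
# S -> 'a' 'b' | 'a' S 'b'     (the language a^n b^n, n >= 1)
GRAMMAR = {"S": [["a", "b"], ["a", "S", "b"]]}

def topdown(sym, s, i):
    """Top-down recursive descent: try each alternative left to right.
    Naive (no re-trying of earlier nonterminal choices), which is
    fine for this toy grammar."""
    if sym not in GRAMMAR:                      # terminal symbol
        return i + 1 if i < len(s) and s[i] == sym else None
    for alt in GRAMMAR[sym]:
        j = i
        for part in alt:
            j = topdown(part, s, j)
            if j is None:
                break
        else:
            return j
    return None

def recognise_topdown(s):
    return topdown("S", s, 0) == len(s)

def recognise_bottomup(s):
    """Bottom-up: repeatedly reduce any substring matching a rule body.
    Every body here has length >= 2, so each reduction shortens the
    sentential form and the search terminates."""
    forms = {tuple(s)}
    while forms:
        new = set()
        for f in forms:
            if f == ("S",):
                return True
            for lhs, alts in GRAMMAR.items():
                for alt in alts:
                    n = len(alt)
                    for i in range(len(f) - n + 1):
                        if list(f[i:i + n]) == alt:
                            new.add(f[:i] + (lhs,) + f[i + n:])
        forms = new
    return False
```

Both recognisers accept "aabb" and reject "aab"; the procedural/
declarative distinction lives in the driver, not in the grammar.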
The point that I am making is that the distinction you are trying to
make is _not_ a distinction that is present in the forms of expression
themselves; the procedural and declarative views are ways that human
beings _choose_ to read texts which are themselves neutral.

I must stress that this is not a theoretical point.  There are a
couple of companies making what seemed to me at the time "big bucks"
taking COBOL programs, *viewing* them as declarative expressions, and
emitting new COBOL code with the same _declarative_ meaning but
different _procedural_ form and behaviour.  And there are people using
functional languages to design VLSI circuits.

>2. Algorithms can execute *only* using a time-based mechanism.

You are taking "we can only _tell_ that an algorithm has executed by
contrasting a later situation with an earlier one" for a property of
algorithms rather than a property of human perception and embedding in
the world.  You may be right, but we can only notice *anything* by
contrasting earlier and later situations (I am using "situations" in a
Barwisian sense here).  That doesn't mean that it is the most fruitful
way to think about everything.  One very important computing
technique, after all, is converting time to space.  (What else is
parallelism?)

You are also not confronting your assumption that the most important
thing to understand about an algorithm is its _execution_.

>Now, as for education, I submit that at the beginning it is more
>important to focus on what execution means

I _really_ don't get you here.  The phrase is actually ambiguous, and
I don't understand either of the readings.  If you mean "it is
important for students to understand what it is like when a machine
executes a program", fine, we agree that it is important, but why is
it the _most_ important thing?

Case in point: I learned matrix algebra before I learned about
machines.
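The "converting time to space" remark can be made concrete with a toy
sketch (Python, my own illustration, not from the post): a sequential
running sum is N dependent steps strung out in time, while a tree
reduction trades that chain for about log2(N) layers of mutually
independent additions, which hardware laid out "in space" could
perform simultaneously.

```python
def seq_sum(xs):
    """N dependent steps: each addition must wait for the previous one."""
    total = 0
    for x in xs:
        total += x
    return total

def tree_sum(xs):
    """About log2(N) layers; the additions inside one layer are
    independent of each other, so they could all happen at once."""
    xs = list(xs)
    while len(xs) > 1:
        if len(xs) % 2:
            xs.append(0)          # pad so the pairs divide evenly
        xs = [xs[i] + xs[i + 1] for i in range(0, len(xs), 2)]
    return xs[0] if xs else 0
```

Same answer either way; only the shape of the computation (a chain in
time versus a tree in space) differs.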
I know what the execution of a Gaussian elimination "means" because I
know matrix algebra, not because I know machines (though I do know
both).  If I just knew machines, I wouldn't have the faintest notion
what a Gaussian elimination meant.

>Certainly
>you need some form of expression, but it should be kept as
>simple as possible to get the student brain into the general
>concept of a flow of execution without the distraction of
>methodologies.

"Methodologies" is not a word that I use; it is pretentious.

Do we teach children about the mechanics of speech before they learn
to talk?  When I was taught to play the piano as a child, I was taught
about forms of expression (how to read music), not about how to build
a piano or what happens when you press a key.  When my wife bought a
knitting machine as a teenager, she was taught how to read patterns,
and how to control the machine, not how the machine is constructed or
operates.  When you are taught to drive a car, you are taught the
rules of the road and a few heuristics for operating the machine.  You
are not taught the theory of combustion; you are not shown the wiring
diagram of the car.  Heck, people these days don't even learn what
gears _are_, let alone how they work.  I got a ham license without
having to learn what actually goes on inside a valve or transistor.
What is so different about computing?

>In fact, the algorithm definition you posted above is a good
>one.  Do you use that in your classes?

As I don't teach first year, no.  It is, of course, Knuth, The Art of
Computer Programming, Volume 1.

>Well, I have to admit that when I think about an algorithm,
>I visualize data flowing through the algorithm undergoing
>transformations until it's what I want.  I start with the
>highest level "black box" transformations, and then gradually
>break down each black box in greater and greater detail until
>I have C code (or whatever).
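The Gaussian-elimination point can be put in code terms.  Below is a
minimal Python sketch of my own (not from the post): the procedural
steps only "mean" anything because matrix algebra guarantees that each
row operation leaves the solution set of A x = b unchanged; knowing
the machine alone tells you nothing about that.

```python
def solve(A, b):
    """Gaussian elimination with back substitution for small systems.
    Each row swap and row subtraction is justified declaratively:
    it preserves the solution set of A x = b."""
    n = len(A)
    # build the augmented matrix [A | b]
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        # partial pivoting: take the largest pivot for stability
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # back substitution on the resulting upper-triangular system
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n]
                - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x
```

For 2x + y = 3 and x + 3y = 4 this returns x = y = 1; the matrix
algebra is what makes that sequence of row operations an "elimination"
rather than arbitrary arithmetic.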
>This is certainly not the only way to look at problems, but it
>is the closest to the reality of how algorithms execute.  And
>I submit that it is the most straightforward to understand.  I
>admit, though, that I have a natural bias toward this point of
>view.

Just at the moment I'm playing around with trying to speed up
back-propagation neural nets.  A student here has NN programs that
regularly take 4 hours.  So far I've worked on just one stage, but
already that stage goes 5 times as fast.  (That's the practical
ceiling for that stage.)  What's the trick?  To step *BACK* from the
procedural view; with considerable effort, to figure out that the bulk
of the work goes on in what _should_ have been two calls to SGEMV and
one call to SGER (level-2 BLAS).  (Well, actually I'm using APL
notation, not calls to the BLAS.)  Then you notice that there are a
SAXPY and an SDOT lying around that could be fused with the SGER and
one of the SGEMVs (making this decision _solely_ on the timeless
mathematics), and then you see that two loops can be fused, making it
possible to eliminate a vector store and reload, and before you know
it, almost all of the branches have gone.  The bit I've looked at is
one of the SGEMV steps, and the speed of that chunk has gone from
8 Mflops to 40 Mflops on the same machine.  (To start with, one branch
per 50 instructions.)

I am already aware of one bottleneck that makes it unlikely that the
whole NN training phase will speed up that much.  (Anyone know a
really fast way to compute a reasonable approximation to a logistic
function?)

The point here is that people before me had already tried to write
good code, but their minds were straitjacketed by thinking the way you
want them to think.  I had to think at the (timeless) mathematical
level to see how to reorganise this code.

>Hmm; I'm not sure if I view it the same way or not.  If I
>visualize Quicksort, for example, I visualize a big block of data
>that I divvy into two parts.
>Each of those parts get divided,
>and so on.  To map into the computer domain, I think about a
>black box that does the divvying process, and then each piece
>gets put back through the black box.  I suppose my core metaphor
>is also spatial, but I'm not sure if it's the same way.

>The point is that I (at least) don't view Quicksort as a
>static "hall of mirrors", everything reflecting back and forth
>with no sense of execution.  I see a "movie" of the data breaking
>down section by section, and I can see the pattern between
>each break and generalize it.

Done much parallel programming yet?  You are describing an approach to
understanding algorithms which makes parallel programming _very_
difficult.

-- 
Australian citizen since 14 August 1996.  *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.