From: ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe)
Subject: Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
Date: 1996/09/12
Message-ID: <5182u7$17b@goanna.cs.rmit.edu.au>
References: <01bb8df1$2e19d420$87ee6fce@timpent.airshields.com> <515o3b$d7h@goanna.cs.rmit.edu.au> <01bb9fe6$7299d800$87ee6fce@timpent.a-sis.com>
Organization: Comp Sci, RMIT, Melbourne, Australia
Newsgroups: comp.lang.c,comp.lang.c++,comp.unix.programmer,comp.lang.ada

"Tim Behrendsen" writes:

>Time is extremely useful in understanding how something works,
>although you don't normally invoke its name. Even in your
>optical computer, you have light-encoded information flowing
>through mirrors, etc. *This is time*. If you are explaining
>to a student how it works, your finger will trace a path
>through the machine.

This is simply false. When I was taught about these beasts, that is
not what happened. To the extent that there was a progression of
_ideas_, the progression went from the _output_ back to the input.
No doubt other teachers than mine have other methods of presentation,
but that's not how it was for me. What's more, I was introduced to
Feynman's treatment of electromagnetic radiation, which involves a
wave propagating _backwards_ in time.

>My entire point before we went down this road is that I think
>many teachers take it for granted that cause-and-effect and
>procedure are so obvious they don't need to be learned, and
>that's not true.

There are some people who believe that backwards causation is
self-contradictory, while there are others, like me, who think that
Goedel's rotating universe model has shown backwards causation to be
compatible with the presently accepted laws of physics ("Tipler
machines" are basically inside-out Goedel universes). Then there are
the old distinctions between cause and ground, and between formal,
material, and final causes. Millennia of philosophy have shown the
concept to be very tricky indeed.

But what has that to do with computing? If I write a constraint
logic program, you cannot point to any part of the program and say
"this causes that", but it is computer programming for all that. (A
small sketch of what I mean appears at the end of this passage.)

And I certainly agree that "procedure" is not an obvious concept. I
have just conducted a thorough search of my office. Three English
dictionaries agree that "procedure" means something like "rules of
behaviour", in the bureaucratic sense. I have quite a few books about
algorithms, algorithmics, foundations of computing, &c. All of *them*
basically boiled down to "procedure, see subroutine".

>> Yes, but fundamental to what everyone in computing except you means by
>> "procedure" is "a discrete sequence of steps".

>No, fundamental to *you*, perhaps, but I have never heard this
>definition before.

As I said, the English dictionaries are talking about bureaucratic
procedures, which are often written down in rule books. Even when
they are not, to follow a (bureaucratic) procedure means to engage in
a sequence of behaviours regulated by rules, and those behaviours are
conceptualised as discrete.
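Since "you cannot point to any part and say this causes that" may
sound mystical, here is a minimal sketch of the non-directional idea.
It is my own toy illustration in Python, not anyone's actual
constraint system, and the name "plus" is invented: one statement of
truth, x + y = z, answers three different questions, with no fixed
direction of cause and effect.

    def plus(x, y, z):
        """The relation x + y = z.  Pass the unknown as None; the
        same statement is "run" in whichever direction is needed."""
        if x is None:
            return (z - y, y, z)     # x determined from y and z
        if y is None:
            return (x, z - x, z)     # y determined from x and z
        if z is None:
            return (x, y, x + y)     # z determined from x and y
        assert x + y == z            # all known: just a check
        return (x, y, z)

    print(plus(3, 4, None))   # -> (3, 4, 7)
    print(plus(3, None, 7))   # -> (3, 4, 7)
    print(plus(None, 4, 7))   # -> (3, 4, 7)

A real constraint logic system generalises this to whole networks of
such statements, but the toy shows why "this causes that" has no
foothold in any single statement.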
In computing, as I said above, I have been unable to find any
definition of "procedure" that distinguishes it from a Fortran
subroutine or Pascal procedure. In fact, that is why the "procedural"
languages are so called: a procedural program is organised as a
(possibly structured) collection of procedures in the
Fortran/Algol/Pascal sense. *This* is the meaning that leaps to mind
for everyone who has posted in this thread except you.

I have been able to locate a definition of algorithm. I wonder if you
will recognise it:

    The notion of an _algorithm_ is basic to all of computer
    programming, so we should begin with a careful analysis of this
    concept. ... So this is an algorithm. The modern meaning for
    algorithm is quite similar to that of _recipe_, _process_,
    _method_, _technique_, _procedure_, _routine_, except that the
    word "algorithm" connotes something just a little different.
    Besides merely being a finite set of rules which gives a
    sequence of operations for solving a specific type of problem
    [%], an algorithm has five important features:

    1) Finiteness. An algorithm must always terminate after a finite
       number of steps. ... (A procedure which has all of the
       characteristics of an algorithm except that it possibly lacks
       finiteness may be called a "computational method." ... [%])

    2) Definiteness. Each step of an algorithm must be precisely
       defined; the actions to be carried out must be rigorously and
       unambiguously specified for each case.

    3) Input. An algorithm has zero or more inputs ...

    4) Output. An algorithm has one or more outputs ...

    5) Effectiveness. An algorithm is also generally expected to be
       _effective_. This means that all of the operations to be
       performed in the algorithm must be sufficiently basic that
       they can in principle be done exactly and in a finite length
       of time by a man using pencil and paper. ...

[%] These statements make it clear that the distinguished computer
scientist who wrote this important textbook, whose emeritus chair is
in fact named after the book, considers "procedures" and "algorithms"
to be pretty much the same kind of thing, except that "procedure" is
a _slightly_ looser term. This is actually the same book containing
"procedure, see subroutine" in the index, and every other book in
this office agrees with it. (Euclid's algorithm, the book's own first
example, is sketched below.)

>Of course, your optical computer fits this
>definition, since it has a discrete sequence of steps (mirrors,
>etc.), and you claim that *it* doesn't.

No, the mirrors are not steps. It is in principle possible to build
an optical computer as a single translucent block with varying
refractive index. You don't need a crisp metal/air interface to get
reflection; a continuous change in refractive index (I did say my MSc
was in underwater acoustics) will do the job nicely. The holographic
transformation of an optical signal _certainly_ takes place in a
continuously varying medium; that's why I chose that example.

Let's check a 2-dimensional optical signal propagating through a
medium with a predetermined static spatial modulation of its
refractive index against the five criteria:

1. Finiteness: not necessary for a procedure.
2. Definiteness: the transformation is precisely defined by partial
   differential equations.
3. Input: the incoming stream of photons.
4. Output: the outgoing stream of photons.
5. Effectiveness: fails miserably. The relevant number here is the
   power of the continuum.

That was the point of selecting a model where _continuously_ varying
media are involved.
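For contrast with the optical medium, which fails effectiveness, here
is a minimal sketch of something that passes all five tests: Euclid's
algorithm, the book's own first example. The Python rendering is mine,
not the book's.

    def gcd(m, n):
        """Greatest common divisor of two positive integers."""
        while n != 0:            # Finiteness: n strictly decreases
            m, n = n, m % n      #   each trip, so the loop must end.
        return m                 # Definiteness: every step is exact.

    # Input: two positive integers.  Output: their gcd.
    # Effectiveness: compare, remainder, and swap can all be done
    # with pencil and paper in a finite length of time.
    print(gcd(119, 544))         # -> 17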
>No, the definition is very clear.

You may _say_ that the definition is very clear, but you still
haven't told us what your definition of a procedure _is_, or why such
an extremely general redefinition of "procedure" is _relevant_ to CS
education.

>The point is that non-procedural
>*expressions* are an abstract concept.

So are *any* expressions. So are "so", "are", "any", and
"expressions". What is the _point_? Causality is (or are) an abstract
concept. Effect is an abstract concept. Number is a highly abstract
concept.

>I have no problem with
>saying that algorithms can be abstractly expressed non-procedurally,
>but they cannot be implemented non-procedurally.

I have said this before: according to your definition, EVERYTHING is
procedural. Clouds, cockroaches, ink drops swirling in milk, alpha
particles tunnelling out of nuclei: they are all in the real world,
and according to you _everything_ in the real world is procedural.
If you are right, a non-procedural expression is simply an
impossibility, because strive as hard as we may, we end up with
something in the real world, and you say "no, *everything* in the
real world is procedural".

>Now you're arguing worthless semantics. Obviously that's not what
>I'm saying.

No, it is *not* obvious.

>How about "No algorithm implementation is not
>procedural", which is what you know that I meant.

No, I did *not* know that you meant that. But *any* method of
expressing an algorithm (and as soon as you say "algorithm" you are
conceding me finiteness, definiteness, input, output, and
effectiveness) is an implementation.

This is my claim: any intelligible expression of an algorithm _is_
an implementation. Proof: if a human being can determine that
something _is_ the expression of an algorithm, then that human being
has determined that s/he could at least in principle execute the
algorithm with pencil and paper (the definition of effectiveness).
Anything for which no human being can figure out a method of
calculating with pencil and paper (in principle) cannot be the
expression of an algorithm. Since any human-intelligible expression
of an algorithm _is_ by its very nature an implementation of that
algorithm, and since you claim that no algorithm implementation is
not procedural, every human-intelligible expression of an algorithm
MUST, if you are correct, count as procedural.

>A mathematical expression is non-procedural, because it is a
>statement of truth, and not an algorithmic sequence.

_I_ have no difficulty with this, because to me most mathematical
expressions are not algorithms. But _you_ say that _everything_ is
procedural. If mathematical expressions are part of the real world,
and if everything in the real world is procedural, how come you now
surprise us with the claim that a mathematical expression is not
procedural? And if a mathematical expression is not procedural, how
is a constraint logic program, in which everything is a
non-directional statement of truth, procedural?

>An SQL expression is not procedural, because it is not an algorithmic
>sequence, it is a mathematical transformation.

Here I disagree: I think an SQL expression _is_ procedural. Is an SQL
expression an algorithm? Let's see:

1. Finiteness: yes.
2. Definiteness: yes.
3. Input: yes (the extensional data base).
4. Output: yes (the computed relation).
5. Effectiveness: yes. There is an obvious naive pencil-and-paper
   method for computing the result of an SQL formula.
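To make "obvious naive method" concrete, here is a sketch of that
pencil-and-paper evaluation in Python. The tables and column names
are made up for illustration; a real DBMS would not work this way,
which is exactly the point of the next paragraph.

    # Evaluate, by nothing cleverer than enumeration:
    #   SELECT e.name FROM emp e, dept d
    #   WHERE  e.dept = d.id AND d.city = 'Melbourne'

    emp  = [("Lee", 1), ("Kim", 2), ("Sam", 1)]     # (name, dept)
    dept = [(1, "Melbourne"), (2, "Sydney")]        # (id, city)

    result = [name
              for (name, d) in emp                  # enumerate emp
              for (i, city) in dept                 # enumerate dept
              if d == i and city == "Melbourne"]    # apply WHERE

    print(result)    # -> ['Lee', 'Sam']
    # Finite tables, exactly defined steps, input, output, and every
    # step doable with pencil and paper: all five criteria hold.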
In fact, when students are taught SQL, they are sometimes required to
evaluate SQL expressions by hand in their exercises. So an SQL
expression passes all the tests for being an algorithm. Why do you
say it isn't?

Sure, a computer program may compute the value of an SQL expression
some other way (a second sketch below makes this concrete), but that
is equally true of every programming language. The student machine
here has asynchronous loads: execute a load instruction, and the
result might not arrive until 30 cycles later, during which time more
than 100 other instructions may have completed. (It's an UltraSPARC.)
The sequence _as written_ is not the sequence _as executed_, so we
can't let _that_ be the criterion for whether something is procedural
or mathematical.

>In other words, procedural mechanisms have a start, middle, and
>end. Non-procedural expressions do not.

Hang on: you _denied_ that "steps" were an important part of a
procedure, and now here you are listing three steps that must be part
of every procedure.

>> In short, the discussion with you has been a waste of time, because
>> you were never making any claim with empirical content, only playing
>> language games.

>Well, I'm sorry you feel that way, but you are wrong.

1. You use a covert redefinition of "procedure" (you _still_ have not
   supplied a clear explanation of what you mean by the term) which
   conflicts with one of _the_ most influential books in CS since
   1968.

2. You use a redefinition in which _everything_ is procedural, so
   there is simply no contrast. If everything has the Buddha nature,
   then to say that something has the Buddha nature tells you
   _nothing_ about it.

3. You use a redefinition which makes no empirical claim. If a
   digital computer with discrete components and discrete states is
   no more and no less "procedural" than an optical computer with
   continuous components and continuous states, then we are left with
   no way to predict which things you will call procedural and which
   things you will not.

4. You miss an *extremely* important point about computational
   mechanisms. The definition of "algorithm" given above, and the
   definition of "procedure" that goes with it, has an incredibly
   important and rather shocking consequence: anything which cannot
   be expressed in terms of properties of polynomials with integer
   variables and integer coefficients cannot be computed. This is
   even true of quantum computers. Surely a definition of "procedure"
   or "procedural" which is to be _relevant_ to computing must be
   bounded by "procedures" (whatever they are) that can be followed
   by _computing_ mechanisms. This says to _me_ that the majority of
   processes occurring in the real world are not computable, so any
   definition of "procedure" that encompasses them is NOT RELEVANT TO
   COMPUTING.

>You prove my point that programmers take the procedural nature
>of the computer as so obvious as to be beneath discussion, but
>it's not. I cannot stress this enough: THIS MUST BE LEARNED BY
>STUDENTS. This is the primary, fundamental axiom of computers.

No, I do not prove your point. No, we do _not_ take the nature of the
computer as being obvious. We *DO* teach students about
computability; we *DO* teach students that every known computing
mechanism has no more and no less power than a Turing machine; we
*DO* teach students about computer architecture, at say the level of
Hennessy & Patterson and perhaps a little lower. But we sure as heck
don't tell them that _everything_ in the real world is procedural.
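As promised above, here is the same toy query computed "some other
way": index the inner relation and probe it, instead of enumerating
the cross product. The sequence of operations is quite different, but
the relation that comes out is the same. Again my own illustration,
not any particular query optimiser.

    # Same toy query as the earlier sketch, different procedure:
    emp  = [("Lee", 1), ("Kim", 2), ("Sam", 1)]     # (name, dept)
    dept = [(1, "Melbourne"), (2, "Sydney")]        # (id, city)

    city_of = dict(dept)                  # index the inner relation
    result = [name for (name, d) in emp   # one pass over emp
              if city_of.get(d) == "Melbourne"]

    print(result)    # -> ['Lee', 'Sam'], exactly as before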
>How many questions do we get in this newsgroup where a student
>simply didn't follow the flow of the program to see what happens?
>This is so obvious to you and me that we don't think about it,
>but *they didn't*! Because they have only a vague feeling of
>flow, and are still looking at the program as a kind of
>weird combination of a solid object and something with a flow
>of time.

There is a fatal flaw in your argument. Students don't think of
looking in the index of their textbook. Students are capable of
staring at a GNAT error message, "foobar.adb must be recompiled", and
wondering what to do about it. Students are capable of handing in an
Ada program starting like this: "procedure 8puzzle is" (an Ada
identifier cannot begin with a digit). That happened yesterday.
Amongst the students I have to deal with, some of the _native_
speakers of English have only the shakiest grasp of English grammar.
It used to be that if I saw "Chinglish" comments I knew I was dealing
with an Asian student. Not any more. There is a sort of "learned
helplessness" around. It's not that they don't think of following the
flow; it's that a lot of them don't think of _anything_; they wait
for someone to _tell_ them.

There is another flaw in the argument: the most powerful way of
understanding a program that I know, and I am not a bad programmer,
is to see it as a web of relations and constraints. When _I_ have a
problem with an Ada program, I don't start playing computer; I look
for relations that I have not made explicit and for constraints that
I can check.

>Take recursion. How can you not understand recursion if you
>understand in your soul that computers execute a flow of
>instructions? You can't, and that's the point. Understanding
>the time axis is the key.

There is spatial recursion as well as temporal recursion. I am
completely comfortable with recursion in programming and have been
since I met it. But I _understand_ recursion the same way I
understand mathematical induction, and the train of thought when I
write, analyse, or debug recursive code is at best indirectly related
to the "flow" in the program. The core metaphors I use to grasp
recursion are primarily spatial.

-- 
Australian citizen since 14 August 1996. *Now* I can vote the xxxs out!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.