From mboxrd@z Thu Jan 1 00:00:00 1970
From: "Tim Behrendsen"
Subject: Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
Date: 1996/09/18
Message-ID: <01bba504$8bfa27a0$87ee6fce@timpent.a-sis.com>
References: <01bb8df1$2e19d420$87ee6fce@timpent.airshields.com> <515o3b$d7h@goanna.cs.rmit.edu.au> <01bb9fe6$7299d800$87ee6fce@timpent.a-sis.com> <519i0o$mgb@solutions.solon.com>
Content-Type: text/plain; charset=ISO-8859-1
Organization: A-SIS
MIME-Version: 1.0
Newsgroups: comp.lang.c,comp.lang.c++,comp.unix.programmer,comp.lang.ada

Peter Seebach wrote in article <519i0o$mgb@solutions.solon.com>...
> In article <01bb9fe6$7299d800$87ee6fce@timpent.a-sis.com>,
> Tim Behrendsen wrote:
> >The original progenitor of this thread is my assertion that
> >the most important thing a student can learn is the fundamental
> >procedural nature of the computer.
>
> I would probably disagree. I don't consider the implementation to be
> nearly as important as the design. Students should be learning the
> principles of proof and design.

Agreed that proof and design are important, but we are talking about
first principles. We want to get them thinking in "normal" computer
terms first, so that they are capable of understanding what they
learn later on.
These are the students I'm getting: all memorized knowledge and no
understanding. Perhaps in the future we will have "non-procedural"
languages that are useful over a broader range of problems, and
learning these concepts will be less important. At this time in
computer history, however, it's critical.

> >Time is extremely useful in understanding how something works,
> >although you don't normally invoke its name. Even in your
> >optical computer, you have light-encoded information flowing
> >through mirrors, etc. *This is time*. If you are explaining
> >to a student how it works, your finger will trace a path
> >through the machine.
>
> But that explanation is an explanation of *another way to think
> about it*, not of how it works.
>
> We frequently use the procedural view because it's the most natural
> for languages such as English to describe; we don't have a
> "simultaneous" tense.
>
> This is frequently a downright *bad* way to model a problem, or to
> think about the model.

There may be some problems that are easier to understand or model
using a non-procedural view, but they are usually very specialized
and trivial. And the execution is always a mechanistic model.

> >My entire point before we went down this road is that I think
> >many teachers take it for granted that cause-and-effect and
> >procedure are so obvious they don't need to be learned, and
> >that's not true.
>
> They are important, but not necessarily the most important.

But the point is, these concepts are frequently neglected as a
subject in their own right. Given that the procedural style is by
far the dominant implementation style, I don't think it's a stretch
to say it is the most important thing to learn.

> >> Yes, but fundamental to what everyone in computing except you
> >> means by "procedure" is "a discrete sequence of steps".
>
> >No, fundamental to *you*, perhaps, but I have never heard this
> >definition before. Of course, your optical computer fits this
> >definition, since it has a discrete sequence of steps (mirrors,
> >etc.), and you claim that *it* doesn't.
>
> That's what I'd mean by a procedure; if they aren't separate,
> they're only one step. If there's only one step in the whole
> process, it's not much of a procedure.
>
> How to build a house:
>
> Step 1. Assemble the materials in the form of a house.
>
> This is *not* a procedure. :)

Sure it is; it's just a big procedure. :) Every step is composed of
sub-steps; it just depends on how far you want to break them down.
I execute a printf; this would commonly be referred to as one step.
However, it invokes a great deal of assembly language to create the
action. Taking it further, it causes a lot of gates within the CPU
to open and close. Taking it still further, it causes various
quantum changes within the transistors within the gates. You'll
note that what you call a "step-by-step" process is actually
composed of a continuous flow of electrons.

> I would not think of an optical computer in terms of steps happening
> over time, unless it was a very *large* optical computer. In
> practice, it will act as though everything happens all at once and
> continuously. This is harder to think about, but much more useful to
> understanding it, and less prone to making the wrong kinds of design
> decisions.

Ridiculous! Even if things happen very, very fast, it still requires
time for the light to travel through the various gates. In practice,
a modern CPU can add an array of a hundred numbers "as though it
happens all at once and continuously".

> >No, the definition is very clear. The point is that non-procedural
> >*expressions* are an abstract concept. I have no problem with
> >saying that algorithms can be abstractly expressed non-procedurally,
> >but they cannot be implemented non-procedurally.
> So we should avoid procedural things and implementations until
> students are comfortable with the abstractions, since abstraction
> is much more important to an understanding of *anything* than any
> procedure or implementation.

I strongly disagree. Abstraction is extremely difficult to
understand without an understanding of cause-and-effect and
procedure. Look at the questions we get in this newsgroup. I would
wager that 90% of them are simply that the student doesn't follow
the flow of the program (e.g. uninitialized variables).

> >You prove my point that programmers take the procedural nature
> >of the computer as so obvious as to be beneath discussion, but
> >it's not. I cannot stress this enough: THIS MUST BE LEARNED BY
> >STUDENTS. This is the primary, fundamental axiom of computers.
>
> Nonsense. It's completely irrelevant to a good understanding of many
> aspects of computers, and downright misleading for whole families of
> programming languages.

I should have said that the *execution* of programs must be
procedural; my phrasing was not very good.

> It is very important that they learn to use this view of computers;
> it is neither desirable nor necessary that they believe it is the
> only view, or that it is the most correct view, or any such.
>
> >How many questions do we get in this newsgroup where a student
> >simply didn't follow the flow of the program to see what happens?
>
> Quite a lot. We also get a fair number where the flow of the program
> is not relevant to an understanding of the problem. We even get
> problems where an attempt to think of the "flow" of the program will
> bite the student, because they're compile-time problems, not
> run-time problems.
>
> >This is so obvious to you and me that we don't think about it,
> >but *they didn't*! Because they have only a vague feeling of
> >flow, and are still looking at the program as a kind of
> >weird combination of a solid object and something with a flow
> >of time.
>
> It is.
It is, in the sense that each line of code is a solid object, but
not in the sense of the overall flow. Yes, there are certain
languages that do not have a programmer-specified flow, but we are
talking about first principles.

> >Take recursion. How can you not understand recursion if you
> >understand in your soul that computers execute a flow of
> >instructions? You can't, and that's the point. Understanding
> >the time axis is the key.
>
> I disagree. I think I understand recursion, and I don't think of it
> in time at all; I think of it in space. Recursion doesn't move
> forward in time, it moves down in space. I tend to look at the naive
> recursive quicksort as a tree of sorts, not as a sequence of sorts.
> This makes it much more obvious how it *works*.

But moving in space *is* time; you can't get away from it. Time is
only one element; data movement is certainly important as well. The
point is, execution requires time, which is why teaching this
fundamental point is so important.

> Not how it's implemented, mind you; how it *works*.
>
> The flow in a recursive algorithm is up-down, not forward-back.

Well, all algorithms are up-down if they have any looping in them.
:) To tell you the truth, I'm not sure what "forward-back" means
vs. up-down.

-- 
Tim Behrendsen (tim@a-sis.com)