From: bokr@accessone.com (Bengt Richter)
Subject: Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
Date: 1996/08/22
Message-ID: <4vgs4p$evl@news.accessone.com>
References: <01bb846d$c6c01780$87ee6fce@timpent.airshields.com> <840288278snz@genesis.demon.co.uk> <01bb8c89$9023e3e0$87ee6fce@timpent.airshields.com>
Newsgroups: comp.lang.c,comp.lang.c++,comp.unix.programmer,comp.lang.ada

"Tim Behrendsen" wrote:

>Lawrence Kirby wrote in article
><840288278snz@genesis.demon.co.uk>...
>> In article <01bb846d$c6c01780$87ee6fce@timpent.airshields.com>
>> tim@airshields.com "Tim Behrendsen" writes:
>>
>> >Who's talking about showing them? I would suggest that if
>> >they wrote a quicksort in assembler, they will have a much
>> >better "feel" for the algorithm, than if they wrote it in C.
>>
>> They might have a good feel for how to implement quicksort in that
>> particular machine code, however it would be so wrapped up in
>> implementation details that they wouldn't have a good understanding
>> of the algorithm itself, at least not in the same timescale.

>And they don't get wrapped up in implementation
>details if they write it in C? My primary point is
>when it's implemented in assembly, and you have to
>manually move the bytes from one location to another
>rather than have the compiler "carry them for you",
>you get a better feel for not only what the
>algorithm is doing, but what the computer itself is
>doing.

What if computers are totally irrelevant? What if an algorithm is an
abstract procedure for manipulating symbols, or for accomplishing some
mathematical purpose? Then any computerized implementation is just one
instance of a broad class of implementation possibilities, and
assembler is just one of a number of possible intermediate code
representations on the way to an executable for a particular
platform/OS environment.

If you are more fluent in assembler than in the native terms/notation
of the original abstraction, then you may look at the assembler code
and say, "Aha, that's what that meant," and be helped in understanding
the original algorithm ... if the latter was relatively trivial. But I
don't think looking at assembler is a good way to understand the
system-level workings of things, even though it may help with a given
primitive.

Even C can be too low a level. How do you see object hierarchies in
assembler? Or even in C++ source code -- you need a way of seeing the
woods as well as the trees. E.g., the important aspects of OpenGL are
not the language it's been ported to, though you can be sure there are
assembler-coded primitives supporting any given implementation. An
annotated dataflow diagram supplemented with some equations etc. may
be one of the best aids to arriving at understanding. A riveter sees
the importance of rivets everywhere, but it's a limited view.
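To make the quicksort point concrete, here is a minimal sketch of my
own (in C++ for definiteness; any language would do, which is rather
the point). Read as an algorithm, quicksort is just "partition about
a pivot, then recur on the two halves"; the only data movement is an
abstract swap, with no byte-carrying in sight:

    #include <utility>   // std::swap
    #include <vector>

    // Sort a[lo..hi] in place: partition about a pivot, then recur.
    void quicksort(std::vector<int>& a, int lo, int hi)
    {
        if (lo >= hi) return;            // 0 or 1 element: already sorted
        int pivot = a[hi];               // pivot choice is a detail
        int i = lo;                      // boundary of the "< pivot" region
        for (int j = lo; j < hi; ++j)    // the partition step
            if (a[j] < pivot)
                std::swap(a[i++], a[j]);
        std::swap(a[i], a[hi]);          // pivot lands in its final place
        quicksort(a, lo, i - 1);         // recur on the two halves
        quicksort(a, i + 1, hi);
    }

The same half-dozen sentences could just as well be realized in Lisp,
Ada, or microcode.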
>The most important thing a student can learn about
>computers is the fundamentally simple nature. The

I think the "fundamentally simple nature" you are talking about is a
simplified *view/abstraction* of an older-architecture computer. If
you focus on a small enough detail, it becomes "simple," e.g., a bus
transaction that moves a word from memory to the CPU chip, but to
keep it simple you may have to ignore caches, burst modes, split
address/data transactions, multiple bus masters, etc., not to mention
multiple CPUs, each with multiple execution units, pipelines, and so
on. Perhaps what is needed is a standard virtual machine for teaching
purposes.

>mystery of execution *must* be broken down in order for

This mystery sounds suspiciously like step-by-step execution, i.e., a
rule of sequential evaluation. I think a lot of these things benefit
from being considered in the abstract (assuming one is comfortable in
that domain). A nice book is "Anatomy of Lisp" by John Allen, (c)
1978 by McGraw-Hill, Inc., ISBN 0-07-001115-X. I don't think they'll
mind if I quote a quotation the author put at the head of the
preface:

    "... it is important not to lose sight of the fact that there is
    a difference between training and education. If computer science
    is a fundamental discipline, then university education in this
    field should emphasize enduring fundamental principles rather
    than transient current technology."
                        -- Peter Wegner, Three Computer Cultures

As true now as then, I think. A particular assembler for a particular
CPU/OS environment is "transient current technology." A well-chosen
VM might not be. An object-oriented implementation of an emulator for
such a VM might be an interesting multi-perspective project (a
minimal sketch of what I mean appears a bit further on).

>the student to begin to think like a programmer. Of

Smalltalk/LISP/CLOS, or C++/Delphi, or assembler/Fortran/C? ;-)

>Once they can think like a programmer, all the rest
>of the knowledge they learn becomes trivial.

A terrifying prospect! After 70k+- hrs of OJT as a programmer, I
still run into stuff that seems non-trivial to me, thank goodness.
But then, I have been trying to get beyond my assembler origins ;-)
BTW, what about thinking like a mathematician ;-) Or a
problem-solver? Or an artist?

If you are teaching programming, how much "transient current
technology" is it worthwhile to teach, projecting trends to
graduation time? Just enough to use it for lab work (education), or
in depth as case studies exemplifying the principles you are trying
to teach, or focused on a job market (training)? I don't think there
is a one-size-fits-all answer.

If you want students to know about *computers*, why not teach them a
little logic design and electronics, so they have an idea of how data
actually moves around in that minitower? Hm. Should we leave caches
out of it? ISA vs VLB vs PCI? VME as an example of non-synchronous
alternatives? Do they need to know how to write a BIOS? What does the
power-on reset do to all those chips on all those boards that it
reaches?

Oh, and how about the OS? After all, we don't usually run an
assembler on the bare metal. So, to have a decent concept of what is
really going on, don't they have to know how an OS boots up and gets
to the point where it's able to run a shell? And how that shell gets
its input and interacts with OS services to allocate memory and load
and start execution of the assembler, and how the assembler is going
to get your code from a file and translate it to object form in
another file? How about the linker and that whole story?
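Which brings me back to the teaching VM floated above. Here is the
sort of thing I mean by an object-oriented emulator for one -- a
minimal sketch of my own, with a four-opcode stack machine whose
instruction set is invented purely for illustration (it is not any
real VM):

    #include <cstddef>
    #include <cstdio>
    #include <map>
    #include <memory>
    #include <vector>

    struct Machine {                 // the visible state of the VM
        std::vector<int> stack;
        std::vector<int> code;
        std::size_t pc = 0;          // program counter
        bool halted = false;
    };

    struct Op {                      // each opcode is an object
        virtual void exec(Machine& m) = 0;
        virtual ~Op() {}
    };

    struct Push : Op {               // PUSH n: operand follows in code
        void exec(Machine& m) override { m.stack.push_back(m.code[m.pc++]); }
    };
    struct Add : Op {                // ADD: pop two, push their sum
        void exec(Machine& m) override {
            int b = m.stack.back(); m.stack.pop_back();
            m.stack.back() += b;
        }
    };
    struct Print : Op {              // PRINT: show top of stack
        void exec(Machine& m) override { std::printf("%d\n", m.stack.back()); }
    };
    struct Halt : Op {               // HALT: stop the cycle
        void exec(Machine& m) override { m.halted = true; }
    };

    enum { PUSH, ADD, PRINT, HALT }; // invented opcode numbering

    int main()
    {
        std::map<int, std::unique_ptr<Op>> ops;
        ops[PUSH]  = std::make_unique<Push>();
        ops[ADD]   = std::make_unique<Add>();
        ops[PRINT] = std::make_unique<Print>();
        ops[HALT]  = std::make_unique<Halt>();

        Machine m;
        m.code = { PUSH, 2, PUSH, 3, ADD, PRINT, HALT };  // computes 2+3

        while (!m.halted)            // the whole "mystery of execution":
            ops[m.code[m.pc++]]->exec(m);  // fetch, decode, execute
        return 0;
    }

It prints 5. The "rule of sequential evaluation" is right there as an
ordinary loop over data, and a student can add opcodes, a cache
model, or a second Machine without touching the rest.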
Should we be satisfied with knowing how all that OS and toolchain
machinery works with DOS as the OS? "Puh-leez!" It's maybe a little
much to go into both Unix and WinXX (and System 7?), so what to
choose?

OK, when are they going to be ready to learn about continuations? Or
the meta-object protocol? Or unification? Or CORBA? What is helpful?
What is a scenic detour?

IMHO, over-emphasis on the sequential-execution aspect of programming
represented by assembler may well leave students so used to one view
of the optical illusion that they cannot easily snap their mental
image into the other, just-as-valid form. I.e., they may not "get" OO
design very easily.

My 2¢

Regards,
Bengt Richter