From: ok@goanna.cs.rmit.edu.au (Richard A. O'Keefe)
Subject: Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
Date: 1996/08/14
Message-ID: <4urp88$pqa@goanna.cs.rmit.edu.au>
References: <31FBC584.4188@ivic.qc.ca>
Organization: Comp Sci, RMIT, Melbourne, Australia
Newsgroups: comp.lang.c,comp.lang.c++,comp.unix.programmer,comp.lang.ada

There's a debate full of sound and fury going on. So far I haven't noticed anyone else make these points:

(a) I personally don't know any really good programmers who don't understand "assembler". Understand != like; understand != can write large volumes of code; understand != normally uses assembly-level concepts. What's really involved here is a grasp of computer _architecture_. The 8086, 68000, ns32000, HP-PA, SPARC, Power-PC, /370, PDP-11, even the KA-10 have a *heck* of a lot in common. Having used a B6700 in my undergraduate years, I find these machines all look pretty much the same to me: untagged, byte-addressed register machines with a linear address space (in the case of the 8086, there is a linear address space _there_; you just have to do strange things to get to it).
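The 8086 parenthetical can be made concrete: in real mode, a 16-bit segment register and a 16-bit offset combine into a 20-bit linear address, so the flat space really is there, just awkwardly reached. A minimal sketch (my illustration, not part of the original post; the A20 wrap-around detail is ignored):

```python
def linear_address(segment: int, offset: int) -> int:
    """Real-mode 8086 address formation: the segment register selects a
    16-byte-aligned 'paragraph' in the linear space, and the offset is
    added to it, yielding a 20-bit linear address."""
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    return (segment << 4) + offset

# Many different segment:offset pairs alias the same linear byte,
# which is one of the "strange things" about getting at the flat space:
assert linear_address(0xB800, 0x0000) == linear_address(0xB000, 0x8000)
```

The aliasing is the point: the hardware gives you one linear space, and the segmentation is just an odd way of naming bytes in it.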
But even the B6700 fits into the general
 - linear physical address space
 - load, store, operate
 - fixed set of data sizes
 - approximate arithmetic uses (hardware or software) floating point
   instead of any of the several alternatives that have been proposed
mould, well described by the (abstract) RAM cost model within reasonable limits.

People who say "but if you learn an assembler, you think in terms of that machine for the rest of your life" are not talking about *good* programmers; they are talking about the trained seals who will write C for the rest of their lives whatever programming language they use. A *good* programmer is no more limited by the first machine s/he learns (in my case I learned the IBM 650 from a book shortly before I learned the IBM /360 from another book; I have seen a 650 in a museum but never programmed one) than s/he is limited by the first programming language s/he learns (in my case, Fortran and Algol 60, more or less at the same time).

(b) To me, one of the main advantages of knowing about computer architecture is that I have rock-solid confidence that all these towers of abstraction actually bottom out in things I understand. Well, I've forgotten most of the semiconductor physics they taught me, but give me enough 74xx chips and I could build a computer if I had to. The point here is my psychology-of-learning: I am comfortable with abstraction, but I have to know that there is something at the bottom. I'm quite happy to work with the lambda calculus, for example, but I know how to implement it on hardware that I understand, so I proceed in the *confidence* that it is something I could take apart in as much detail as I wanted, given time. Similarly, I was able to learn Prolog fast, because I had previously written a theorem prover in a lisp-like language that I had implemented myself. So I *knew* for certain, without any worry, that this was the kind of thing that *could* be implemented on a computer.
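The point that the lambda calculus is implementable can be illustrated with a toy substitution-based reducer (my sketch, not the author's code; it is naive about variable capture, so it is only safe on terms whose bound names are all distinct):

```python
# A term is a nested tuple: ('var', x), ('lam', x, body), or ('app', f, a).

def subst(t, x, v):
    """Substitute term v for free variable x in t. Naive: assumes the
    terms involved share no bound-variable names, so no capture occurs."""
    kind = t[0]
    if kind == 'var':
        return v if t[1] == x else t
    if kind == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, v))
    return ('app', subst(t[1], x, v), subst(t[2], x, v))

def normalize(t):
    """Reduce to normal form, leftmost redex first (may loop forever
    on terms with no normal form, e.g. Omega)."""
    if t[0] == 'app':
        f = normalize(t[1])
        if f[0] == 'lam':
            return normalize(subst(f[2], f[1], t[2]))
        return ('app', f, normalize(t[2]))
    if t[0] == 'lam':
        return ('lam', t[1], normalize(t[2]))
    return t

I = ('lam', 'y', ('var', 'y'))
TWO = ('lam', 'f', ('lam', 'x',
       ('app', ('var', 'f'), ('app', ('var', 'f'), ('var', 'x')))))

# Church numeral 2 applied to the identity reduces to the identity:
assert normalize(('app', TWO, I)) == ('lam', 'x', ('var', 'x'))
```

Real implementations use de Bruijn indices or closures to dodge the capture problem, but even this toy shows the calculus bottoming out in plain data structures and recursion, which is exactly the kind of reassurance the paragraph above is about.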
I didn't have to understand how the Prolog system I was using actually worked to be sure that Prolog was the kind of thing that _could_ work.

Let's face it: how many of you were taught about integration (a high-level abstraction) before you had some practice with counting rods (the natural numbers are an abstraction, but counting rods are solid wood that you can hold in your hands and *see* that 1+2 = 2+1 = 3)? One of the problems with "New Math" was excessive talk about abstractions (commutativity and so on) before children had thoroughly mastered the concrete experiences (counting, addition) the abstractions are based on. So,
 - some people don't handle abstraction well,
 - some people handle abstraction very well, but require an understanding
   of what's underneath the layers of abstraction for comfort, even if
   they don't _use_ those layers while manipulating a particular
   abstraction,
 - some people handle abstraction well, and don't mind not understanding
   what's underneath the abstraction.

What proportion of the general population belongs to each group is an empirical question about which I have no information. What proportion of the population of presently or potentially good programmers falls into each group is another empirical question, about which I have only anecdotal information. My personal experience has been that people in the first group do not become good programmers, and that people in the third group write papers, not programs. But of course my experience is limited to a few hundred people from a narrow range of ages and cultures.

(c) The conclusion we draw from this is that different people are likely to require different approaches (most of the PC computer books are clearly aimed at people who have no ability with abstraction at all; in consequence I find them frustratingly unreadable), and this includes different initial languages. Some people will learn best by starting with Scheme.
Some people will learn best by starting with transistors, going on to gates and flip-flops, moving on to ALUs, then learning say the DEC-10 with its assembler, then Pascal, then Lisp, then ML. Me, I learned bottom up and I'm glad of it. (For the record, I use the highest-level languages I can get my hands on. But I still have to check assembly code from time to time.)

(d) The *real* problem we have trying to teach some of this stuff is that there just isn't enough *time*. There are so *many* things our students "need" to know: user interfaces, relational data bases, a couple of programming languages, software engineering basics, you name it. RMIT actually has a reputation for graduating employable students, and I honestly don't know how we manage it when I think of all the knowledge I deploy in programming and the tiny fraction of it they get. I am still rather unhappy about letting anyone use floating point arithmetic without a semester of numerical analysis, so they at least know where the pitfalls are and when to call for help. There are no royal roads, and there isn't enough time.

Now for something rather different. I am co-supervising a masters student who is trying to get a handle on *measuring* the effect of first year language. This is actually part of RMIT's Quality Control program, and Dr Isaac Balbin came up with the idea of getting someone to see if we could actually _measure_ whether selecting a particular first year language (in our case, Ada) is or is not having the intended educational results. The student has conducted a literature survey, and has found a couple of survey papers, but they seem to be weak on actually measuring *outcomes* (other than whether the students _liked_ it or not, which is important, but not everything). So can anyone point me to some empirical results, where someone has done some before-and-after educational measurements, to see what effect a new first year language (e.g.
switching from Pascal to Miranda, or Fortran to Modula-2, or whatever) has actually had? Unpublished results would be fine.

--
Fifty years of programming language research, and we end up with C++ ???
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.