From: seebs@solutions.solon.com (Peter Seebach)
Subject: Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
Date: 1996/08/02
Message-ID: <4ttksk$9lt@solutions.solon.com>
References: <01bb7da1$323102a0$96ee6fcf@timhome2> <4ttdlg$2pr@news.ida.org>
Organization: Usenet Fact Police (Undercover)
Reply-To: seebs@solon.com
Newsgroups: comp.lang.c,comp.lang.c++,comp.unix.programmer,comp.lang.ada

In article <4ttdlg$2pr@news.ida.org>, David Wheeler wrote:

>That's pretty clueless.  However, depending on how well they
>understood their programming language, they might still produce
>a reasonable result.  Why?  Because, although they may not realize
>that some of these "conversion" operations have no run time impact,
>they may know how to combine the features of their language to
>produce solutions.

And, if their language doesn't specify a binary representation, they may
get an algorithm right on a weird machine that still uses BCD or decimal
internally.

>Any professional developer must understand several assembly languages,
>and how they work in general.

I think this is no longer true.  Understanding an assembler buys you
nothing; all it tells you is that at least one machine had a certain
kind of semantics inside it.  You *don't care*.
If you're writing a *solution* to a *problem*, those are the only things
you need to be working with.  If you try to make sure your solution fits
your imaginings of the underlying machine, you constrain it unnaturally.
Leave the unnatural constraints for the rare occasions when your code
actually has a detectable performance problem that can't be quickly
traced to a broken algorithm.

>The language you use also determines the amount of detail
>you need to know about the underlying machine abstraction.

Quite true.

>When you're using C or C++, you really need to know how the
>underlying assembly language works.

Quite untrue.

>Why?  Because pointer arithmetic
>makes little sense if you don't, and tracking down rogue pointers
>overwriting your local stack values and other nasties is impossible if
>you don't understand how underlying machines work.

This is simply nonsense.  Professionals and competent engineers do not
need to be told how it is that rogue pointers work; they need to be told
what kinds of operations produce unpredictable results.

I have never even tried to understand an assembly language.  Yet, I am
mysteriously able to use pointer arithmetic correctly, and I have
tracked down any number of bugs involving rogue pointers.

Pointer arithmetic is defined in terms of objects in C.  You don't need
to know what the underlying processor is doing, because you never see
it.  You work in terms of sizeof(), and you know that sizeof(char) is 1.
You know that pointers move and increment by the size of the thing
pointed to.  None of this depends on assembly.  (Conceptually, that is.
It is often implemented in machine code, but you don't need to know that
to make it work, or to use it correctly.)

>Most of the
>basic C/C++ operations are defined as "whatever your local machine does",
>so it's really part of the C/C++ definition.

Will you folks *please* stop discussing "C/C++"?  I'm discussing C here,
because C++ is a vastly different language, and I don't know it.
You have attained only the second level of enlightenment.

The newbie uses a machine, and to the newbie, that is all machines; the
newbie's code runs only on that machine.

The experienced programmer uses several machines, and has learned how
each behaves.  The experienced programmer's code runs on several
machines.

The true programmer uses no machines, and uses the specifications
granted by the language.  The true programmer's code runs on any
machine.

The C definition explicitly *DOES NOT* include the semantics of the
local machine; rather, it says that each implementation must document
these things, but a portable C program *can not use any of them*.

Further, it defines quite a few semantics explicitly.  For instance,
unsigned integers are handled modulo N, for some suitable N.  Always.
Everywhere.  No C compiler may signal overflow on unsigned arithmetic.
When a program depends on the way the local machine handles a basic
operation, that C program is either OS or library code, or broken.

The semantics guaranteed by the language are quite strong enough for 99%
of most real projects.  The other 1% is why people ask what your
experience is, specifically, when hiring a programmer.  Many programmers
seem to be able to push that number down, by finding more and more ways
to depend on things they don't need.  This is stupid.

>A professional developer using other languages should still know this
>level of detail, but it's frankly less important in Java, Ada 95, Eiffel,
>and Smalltalk (among others).  Who cares that "if" becomes a branch
>instruction?  Unless you call outside the language, you won't even get
>to see how some capabilities are implemented in some languages.  The
>result: the detail you need to know to solve a problem shrinks.  Thus,
>you can concentrate on learning how to abstract - the real problem.

A C programmer who cares whether an "if" becomes a branch or not is
wasting his own, and everyone else's, time.  That's not part of the
language, nor should it be.
>It's not clear it's REALLY taught that way.  Lots of schools just give
>a quick intro to a programming language or two, and then show other
>people's abstractions (instead of teaching how to abstract).  Besides,
>many students just "get by" instead of learning, and many people
>find abstraction really hard to grasp.

Especially because unenlightened ones keep telling them that they should
care how or why their code works, rather than telling them to study the
specifications.

If I had a dollar for every student who's been burned, and badly, by the
infamous advice "try it on your compiler and see what happens", I
wouldn't need to work for a living.

>The answer: because you don't REALLY need to know that level of detail;
>an abstraction of a computer is good enough.

Empirically, the abstraction of "a C implementation" is good enough for
most code.

>What I do need to know is how to abstract a problem
>into parts so I can solve it.  I agree with Mr. Dewar - abstraction
>is most important, worry about that first.

This is quite true.

>Thus:  If solving problems is the key skill, start with a high-level
>language that lets you concentrate on learning how to solve problems.

This is good advice.  I'm inclined to think that, properly taught, C can
be such a language.  However, I'd rather start people in Icon or Perl,
myself.

-s
-- 
Peter Seebach - seebs@solon.com - Copyright 1996 - http://www.solon.com/~seebs
Unix/C Wizard - send mail for help, or send money for consulting!
The *other* C FAQ, the hacker FAQ, et al.  See web page above.
Unsolicited email (junk mail and ads) is unwelcome, and will be billed for.