From: smosha@most.fw.hac.com (Stephen M O'Shaughnessy)
Subject: Re: What's the best language to start with? [was: Re: Should I learn C or Pascal?]
Date: 1996/08/05
Organization: MESC
Newsgroups: comp.lang.c,comp.lang.c++,comp.unix.programmer,comp.lang.ada

My response to an e-mail from Patrick Horgan.

Pat, I tried to e-mail this back to you and it bounced, so I am posting it here on CLA.

At 04:41 PM 8/4/96 -0700, you wrote:
>
>I think you and most of the people in this discussion err in thinking that
>there's this artificial dichotomy between specific and abstract knowledge,
>and that there's some reason one is better to learn first.  In the real
>world both are learned together, watch any kids.  We're always learning
>specifics and abstracting generalities.

I disagree. I think we learn generalities first and, as we continue learning, we pick up the specifics. Continued learning is the learning of the specifics.

>
>> tie our shoes without knowledge of knot theory.  We learn to read
>> without a knowledge of grammar.  We learn addition, subtraction, multi.
>> and division without any formal mathematical proofs.  In fact learning basic
>> arithmetic is an abstraction, as most school children learn by counting
>> pennies and dimes.  This prepares us for 90% or more of the problems we
>
>Apparently you haven't heard of the research work of Dr. Maria Dominguez of
>Venezuela.  She felt that mathematics was taught all wrong, starting with
>concrete-seeming pieces and eventually learning enough abstractions at the
>college level (if you stuck it out that far) to understand the whys and hows
>of math.  She taught what was essentially an Abstract Algebra course to
>preschool children and then built up through a ten-year study.  She ended
>up with kids at the eighth grade level that were handling doctoral level
>mathematics...not just a few of the bright kids, but almost all of the kids.
>It seemed that understanding a conceptual framework to fit math in instead
>of memorizing rules made it easier for them...go figure:)
>

Though I am not familiar with this research, your brief description sounds like support for my assertions. We do not teach children number theory; we teach them to count. And not to count numbers: they count apples and oranges and fish. "One fish, Two fish; Red fish, Blue fish." When we teach math we step back from the details (the concrete) to a higher level, a set of apples in front of the student. Memorizing rules is not an abstraction; that is the concrete. Applying those rules to everyday situations, without direct reference to those rules, is the abstraction.

>> you encounter.  (I do not consider assembly a language.  Despite its title
>> of assembly *language*, it is a code not far enough removed (i.e. abstracted)
>> from the underlying hardware.)
>
>That's a bit arbitrary;)  Most machine code is abstracted from the hardware
>at least a level or two through microcoding.

You support my point. We don't agree on how much abstraction is desirable, but machine code is abstracted from the hardware through microcode; to which I add that assembly code is abstracted from machine code and higher level languages are abstracted from assembly code. This higher level of abstraction is, I believe, the best place to start when learning programming.

>> I am not saying that the basic principles are not important.  If one is going
>> to make a career of programming, these principles are crucial.  But I don't
>> believe one can recognize the underlying principles without the *shroud* of
>> abstractions to frame them.
>
>Eh?  I thought the basic principles were the abstractions and their expression
>in a language was the concrete?  I'm completely unclear what you mean here.

No, the basic principles of hardware, microcode and machine code (assembly) are the concrete I was referring to. In the real world it is electrons moving through circuits that are at work at the most basic level. It is from there that we must abstract to higher levels to solve real world problems. Looking at it from the problem space, yes, the language in which we express the problem is less abstract, more concrete. I was responding to other posters who claimed that a fundamental understanding of the hardware, microcode and assembly language was necessary for a *first* understanding of the problem. I was coming from the other direction: abstracting up from the hardware to the problem space, not, as I think you are doing, moving from an abstract problem to a more concrete solution. I feel we are both correct, just arguing different points.

>>
>> problem: Add two numbers
>> Q: How?
>> A: Put them in registers A and R5 then do ADD A,R5
>> Q: What is a register?
>> A: A place to hold data.
>> Q: What is data?
>> A: A collection of 8 bit bytes.
>> Q: What is a bit/byte?
>> A: ...
>>
>> OR
>>
>> problem: Add two numbers
>> Q: How?
>> A: C := A + B
>
>You forgot to ask all the questions for this one.  If you did it would go much
>farther and be a lot more complicated.  General numeric expressions are much
>more complicated than hardware level expressions of them.
>Q: What's a number
>Q: What's add
>Q: What's =

These are valid questions. I left them out because they apply to both situations. My point was that if I need to add the sales tax to a purchase total, I really don't care what register the result is stored in or how many bits it takes. I want to abstract away from those details. At some point in my career I will need to know more detail about data types, but for the beginning student that level of detail is unnecessary.

From the book "Programming in Ada" by J. G. P. Barnes, pg 3: "The first advance occurred in the early 1950s with high level languages such as FORTRAN and Autocode which introduced 'expression abstraction'.  It thus became possible to write statements such as X=A + B(I) so that the use of machine registers to evaluate the expression was completely hidden from the programmer."
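To make that concrete, here is a minimal Ada sketch of the sales tax example above. The names (Sales_Tax, Subtotal, Tax_Rate, Total) are my own, invented purely for illustration; the point is that no register and no bit count appears anywhere in it.

   with Ada.Text_IO;

   procedure Sales_Tax is
      Subtotal : constant Float := 19.95;
      Tax_Rate : constant Float := 0.07;
      Total    : Float;
   begin
      --  Expression abstraction: which registers hold the intermediate
      --  results is the compiler's business, not the student's.
      Total := Subtotal + Subtotal * Tax_Rate;
      Ada.Text_IO.Put_Line ("Total is" & Float'Image (Total));
   end Sales_Tax;

The beginning student can read that as arithmetic and nothing more.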
>> So first teach them to express these problems with a computer language.  Ada,
>> with its strong typing, maps very well to real world objects.  Strong typing
>> is about abstraction, and that enables the hiding of irrelevant details.
>
>This just isn't true, strong typing has nothing to do with abstraction.

We are obviously not in agreement about the definition of abstraction, or perhaps we are applying it differently.

"The brightest highlights of Ada95 are its inherent reliability and its ability to provide ABSTRACTION through the package and PRIVATE TYPE."  Ada95 Rationale, Introduction, pg II-1, first paragraph, first sentence.  (The emphasis is mine.)

"The emphasis on high performance in Ada applications, and the requirement to support interfacing to special hardware devices, mean that Ada programmers must be able to engineer the low-level mapping of algorithms and data structures onto physical hardware.  On the other hand, to build large systems, programmers must operate at a high level of ABSTRACTION and compose systems from understandable building blocks.  The Ada TYPE system and facilities for separate compilation are ideally suited to reconciling these seemingly conflicting requirements."  Ada95 Rationale, Overview of the Ada Language, para. III.1.5

"The Ada language goes further, providing for the definition of TYPEs that can only be manipulated according to the type's abstract operations.  Thus the Ada language provides direct support for the idea of an abstract data type."  Norman H. Cohen, Ada as a Second Language, pg 64.

Patrick, typing has everything to do with abstraction. (I have sketched at the end of this post what the package and private type mentioned above look like.)

>> This is so important when we are trying to learn something.  (That something
>> we are trying to learn is how to solve real world problems with a computer,
>> not how a computer works.)
>
>No, learn both, learn everything, don't be prejudiced against some types of
>learning.  You don't know what you're missing.  Understanding things from
>both the top down and the bottom up adds a richness to your ability to
>deal with abstract concepts in your head, and figure out some concrete (i.e.
>programatic) expression of an abstract solution.
>

I agree with you. I said we must learn both. My point is that we can't learn both at the same time. I believe it is best to start at the higher level, to get a good feel for what the problems are and how to solve them, and then work our way down to the more basic levels of the *machine* to see if there might be better ways to solve specific problems on specific machines.

With all due respect to Dr. Dominguez, I don't believe you can teach students PhD-level mathematics without first teaching them to count. From my point of view, counting, while a basic math skill, is an abstraction away from the details that PhD-level students study. Perhaps counting, because it is so basic, is a poor example, but I hope you see my point. In the realm of mathematics, counting abstracts away number bases, number theory, addition, etc. When the child has a firm grasp of this, then we can show her more of the details: that addition is counting by ones, that zero is just a place holder, that multiplication is addition in fixed steps (i.e. 2*2 is 2+2, 2*3 is 2+2+2, 2*4 is 2+2+2+2, ...).

We are not in agreement as to where the abstraction is taking place and where assumptions about a solution are being made.

>> No.  Learn Ada
>
>No, it just doesn't really matter compared to the real issue...just keep learning:)
>
>Most people aren't going to learn a number of languages though, and the original
>question was specifically about what would make him marketable.  Ada won't cut
>it since it's not used outside of the defense industry and the pay scale in
>defense is low and the work environment is fraught with risk.
>

You are correct in recommending C for the widest choice of jobs. You are wrong in assuming Ada is a niche language with no future.
More than half the projects being written in Ada today are outside the defense industry. As a result of many of its side benefits, such as readability and maintainability, as well as the fact that it is the only recognized, standardized object-oriented language, Ada is rapidly becoming a language of choice for large projects.
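P.S.  For readers in the other groups who may not have seen Ada, here is roughly what the "abstraction through the package and private type" quoted above looks like. The package (Counters) and its operations are my own invention, just for illustration; the point is that client code can only use Increment and Value and can never reach inside the hidden record.

   package Counters is
      type Counter is private;        -- clients see only the name

      procedure Increment (C : in out Counter);
      function  Value     (C : Counter) return Natural;
   private
      type Counter is record          -- representation hidden from clients
         Count : Natural := 0;
      end record;
   end Counters;

   package body Counters is
      procedure Increment (C : in out Counter) is
      begin
         C.Count := C.Count + 1;
      end Increment;

      function Value (C : Counter) return Natural is
      begin
         return C.Count;
      end Value;
   end Counters;

A client holding a Counter cannot, for example, set Count to some nonsense value behind the package's back. That hiding of irrelevant (and dangerous) detail is exactly what I mean by abstraction.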