From: John G. Volan <John_Volan@dayton.saic.com>
Subject: Re: Side-effect arithmetic again [was: Ada ... in embedded systems]
Date: 1996/03/30
Message-ID: <4jkfpj$jl1@dayuc.dayton.saic.com>
References: <4iq4k7$1p6@fozzie.sun3.iaf.nl>
Organization: Science Applications International Corp. (SAIC)
Newsgroups: comp.lang.ada

Robert A Duff <bobduff@world.std.com> writes:

>In article <4jchbi$ep0@dayuc.dayton.saic.com>,
>John G. Volan wrote:
...
>>Then why did we ever bother with any of the identifiers on the left
>>below, when the more terse identifiers on the right would have done
>>"just as well" ... as long as you happened to be a member of that
>>select coterie known as "professional programmers":
>
>Because using zillions of strange abbreviations makes the code harder to
>understand, FOR PROFESSIONAL PROGRAMMERS.

On this, we seem to be in violent agreement. :-)

>My point is just that this
>has nothing to do with making the code understandable by random folks
>off the street -- that's clearly impossible.  You cannot understand any
>substantial Ada program if you don't know Ada -- and using nice
>un-abbreviated identifiers won't change that fact.

I take your point, and I'm sorry I glossed over it before -- I thought
Tore Joergensen had already done a good job of responding to it.  I
agree with Tore that (1) yes, you're right, it does take more than just
a working knowledge of English to fully understand an Ada program, or
at least to understand it well enough to maintain it; but (2) not
everyone involved in the development of software is necessarily a
professional programmer who's doing the actual coding.  IMHO, the fewer
obstacles to understanding we put in the way of these associated folks,
the better things will be all around.

But perhaps that's a minor point.  Let me turn your argument around
another way: even a professional programmer doesn't start out as one.
There is some point in the education of every professional programmer
when a "random person off the street" gets their first introduction to
programming concepts and computer languages.  Usually, this happens at
least a decade after that person first learns to read and write their
own natural language.  (At least, at this stage in human history.)
IMHO, anyone with reasonably good literacy skills should have a decent
chance of becoming a software engineer.  But I'm afraid that, as you so
aptly put it, "zillions of _strange_ abbreviations" [my emphasis] do a
lot to get in the way of that happening.  I suspect that many
intelligent people who might have made important contributions to our
field were turned off to the whole idea, simply because of the
obscurely cryptic nature of so much of the code that's written these
days.  This is a shame, because I strongly believe that this ad hoc
crypticness is totally gratuitous: there is nothing inherent in
software that requires it to be obscure and obfuscated; it tends to be
that way for _cultural_, not _technical_, reasons.
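(A contrived illustration of my own -- the names are invented, not from
any real program -- of the same thing written both ways:)

   type Degrees_Celsius is digits 6;
   type Sensor_Readings is array (Positive range <>) of Degrees_Celsius;

   --  Readable by anyone with reasonably good literacy skills:
   procedure Compute_Average_Temperature
     (Readings : in     Sensor_Readings;
      Average  :    out Degrees_Celsius);

   --  The very same declaration, "for professional programmers only":
   --
   --     procedure CmpAvgTmp (R : in SR; A : out DC);

The compiler is equally happy with either; only the humans suffer.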
So the playing field is ceded to two groups of individuals: (1) those
who eagerly embrace and perversely thrive on ad hoc crypticness, in the
extreme case even _promoting_ it as a cynical guarantee of their job
security -- to wit, hackers (spit, spit); and (2) those stalwart few
who can overcome the hurdles set up by the hackers, and through
fortitude and perseverance learn to talk to computers without
forgetting how to talk to people.  Things are getting better, but I'm
afraid the latter group is still a minority.

...
>I want Ada to be readable so that people
>who *do* know it can read it.  (And, of course, it's always a good idea
>if the language is simple enough, so that people who *don't* know it can
>*learn* it.)
...

Hmm ... in retrospect, I think you actually agreed with the point I
just made! :-)

...
>Using Increment instead of "++" isn't going to make the program horribly
>verbose.  My complaint is that people are suggesting a generic
>instantiation, which is a bunch of useless verbosity, to replace
>something that's very simple in C.
...
>>Hey, don't look at me.  My generics were about as inanely
>>direct as I could make them.
>
>My point is that they're not (and cannot be) "inanely direct" *enough*.
>If you try to convince a C programmer to change "foo[i]++;" to
>"Increment(Foo(I));", that's fine.  But to require a generic
>instantiation in addition would make the C programmer laugh, and rightly
>so.  I think we'd be better off just admitting that there's no direct
>replacement for ++ in Ada, and telling the programmer, "Sorry, but you
>have to write 'Foo(I) := Foo(I) + 1;'."

Oh, I don't think it's as bad as all that.  Instantiating a generic
isn't all _that_ big a deal, and for beginners it can simply be an
incantation that they come to understand later.  But perhaps you have a
point that something as "simple" as side-effect arithmetic shouldn't
warrant a generic -- then again, where do you draw the line at
"simple"?  Is I/O too "simple" to warrant having to instantiate
Text_IO's generic subpackages; should we have had something analogous
to printf instead?  Is deallocating a heap object too "simple" to
warrant having to instantiate Unchecked_Deallocation, or should we have
had a built-in free or delete operation?  Is breaking the strong typing
system, when you need to, too "simple" to warrant having to instantiate
Unchecked_Conversion, or should Ada have just allowed arbitrary type
conversions?  These are all things an arrogant C fanatic might
conceivably laugh at, but I think we Ada programmers ought to have
thick enough skins not to let that bother us.

...
>>But you don't like a generic as a workaround at all?  Alright, suggest
>>something better, maybe even an Ada0X improvement, if you think it's
>>warranted.
>
>I did suggest a different workaround, using derived types.

Yes, that's a good technique, one that I've actually used in my own
code, now that I think about it.  More people should realize that any
sort of type, even a scalar type, can be endowed with new _primitive_
subprograms, which can then be _inherited_ by derived types.  This was
true even in Ada83.  For all that Ada83 was branded an "object-based"
rather than an "object-oriented" language, people should give it credit
for actually having _inheritance_, even though it lacked Ada95's
mechanisms for type extension and classwide polymorphism.
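For concreteness, here's roughly what's at stake -- a sketch of my own,
with invented names, not code from any actual library:

   --  A reusable generic, written once in some shared library:
   generic
      type Num is range <>;   --  works for any signed integer type
   procedure Increment (Item : in out Num);

   procedure Increment (Item : in out Num) is
   begin
      Item := Item + 1;
   end Increment;

   --  A client pays one extra declaration per type -- the
   --  instantiation Bob objects to:
   with Increment;
   procedure Demo is
      type Tally is range 0 .. 1_000_000;
      procedure Inc is new Increment (Tally);
      Foo : array (1 .. 10) of Tally := (others => 0);
      I   : constant Integer := 3;
   begin
      Inc (Foo (I));   --  where a C programmer would write foo[i]++;
   end Demo;

   --  Bob's derived-type workaround needs no generic at all, and
   --  worked even in Ada83:
   package Counters is
      type Counter is range 0 .. 1_000_000;
      procedure Increment (C : in out Counter);   --  primitive operation
   end Counters;

   package body Counters is
      procedure Increment (C : in out Counter) is
      begin
         C := C + 1;
      end Increment;
   end Counters;

   --  Elsewhere, a derived type inherits Increment automatically:
   --
   --     type Score is new Counters.Counter;
   --     S : Score := 0;
   --     ...
   --     Increment (S);

One declaration per type with the generic, versus none at all with the
derived type; whether that one line is "useless verbosity" is, I
suppose, exactly where we differ.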
...
>>...Just don't tell me the answer is: "Either thou must accept
>>the received perfection of Ada95 as it is, or thou mayst as well become
>>a C programmer".  Sheesh indeed.
>
>OK, I won't tell you that -- I don't believe it.

I'm glad of that! :-)

------------------------------------------------------------------------
Internet.Usenet.Put_Signature
  (Name                => "John G. Volan",
   E_Mail              => "John_Volan@dayton.saic.com",
   Favorite_Slogan     => "Ada95: The *FIRST* International-Standard OOPL",
   Humorous_Disclaimer => "These opinions are undefined by SAIC, so " &
                          "any use would be erroneous ... or is that a bounded error now?");
------------------------------------------------------------------------