From: dewar@merv.cs.nyu.edu (Robert Dewar)
Subject: Re: Ada and Automotive Industry
Date: 1996/11/15
Organization: New York University
Newsgroups: comp.lang.ada,comp.realtime
References: <3280DA96.15FB@hso.link.com> <1996Nov6.210957.3070@ole.cdac.com> <1996Nov8.183051.21638@ole.cdac.com>

One more thought on

   for i in 1 .. 10 loop

It is almost always a mistake in a programming language like Ada to introduce shortcut notations that are supposedly helpful. In the short run they may save a little writing and make the language a little easier to pick up, but in the long run they cause confusion.

Another example in Ada is

   x : array (1 .. 10) of ...

which again defaults the index to type Integer. Given that we generally accept that it is a bad idea for programs to use the built-in Integer type, it is quite inappropriate to have this kind of default. Typing is pretty central to Ada, and anyone who refuses to type things is going to be unhappy; their unhappiness will not be assuaged by kludges of this kind.

With respect to the array notation, it is ALWAYS a bad idea to take advantage of this default. If for some reason you really want type Integer, say so:

   x : array (Integer range 1 .. 10) of ...

This immediately draws attention to the fact that you are introducing a potentially implementation-dependent type into your code.

For an example of similar thinking from another language, consider Algol-68. The notion of references and the unification of variables and pointers are central to the design of that language (some love it, some hate it, but that is a different story). The point is that you cannot write Algol-68 without a clear understanding of the reference concept. This means that an ordinary integer variable is in fact of "reference to integer" type (ref int in algolese). Consequently, one would expect the normal way of declaring an uninitialized variable to be

   ref int x;

but in a fit of trying to be more accommodating to people used to Pascal or Algol or Fortran or gosh-knows-what, the Algol-68 design uses the shorthand

   int x;

to declare x as a variable of type ref int. This saves a bit of ink and makes the program superficially more similar to other familiar languages, but in the long run it caused huge confusion and was a mistake!

Trying to make language X friendly to foreign syntactic and semantic thinking imported from language Y is a risky occupation (substitute X = C++ and Y = C for a more familiar example, or X = Java and Y = C++).
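To make the contrast concrete, here is a small, compilable Ada sketch of the two styles discussed above; the names Ranges_Demo, Slot, Implicit, Explicit and Explicit_Integer are invented purely for illustration and do not come from the original post:

   procedure Ranges_Demo is
      --  Implicit form: this array's index range, like the bare loop
      --  parameter below, silently defaults to the predefined Integer type.
      Implicit : array (1 .. 10) of Natural := (others => 0);

      --  Explicit form: name the index type you actually mean, so the
      --  implementation-dependent Integer never sneaks in unnoticed.
      type Slot is range 1 .. 10;
      Explicit : array (Slot) of Natural := (others => 0);

      --  And if you really do want Integer, say so, as argued above.
      Explicit_Integer : array (Integer range 1 .. 10) of Natural :=
        (others => 0);
   begin
      for I in 1 .. 10 loop            --  I defaults to Integer
         Implicit (I)         := Natural (I);
         Explicit_Integer (I) := Natural (I);
      end loop;

      for I in Slot loop               --  I is of the problem-specific type
         Explicit (I) := Natural (I);
      end loop;
   end Ranges_Demo;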