From: dewar@gnat.com (Robert Dewar)
Newsgroups: comp.lang.ada
Subject: Re: 64-bit integers in Ada
Date: 4 Aug 2002 07:07:03 -0700
Message-ID: <5ee5b646.0208040607.ebb6909@posting.google.com>

Richard Riehle wrote in message news:<3D4C2805.62563584@adaworks.com>...

> I assume you are talking about Standard.Integer. If so, this
> would not correspond to the way software is so frequently
> written for these machines. In particular, for I-8051
> platforms, it would introduce a potential inefficiency and
> force the programmer to explicitly declare a shorter integer
> (absent the availability of Standard.Short_Integer).

First, I trust we all agree that new Ada code should almost NEVER
EVER use type Integer, except as the index of the String type. To
talk of the programmer being "forced to declare a shorter integer"
is very peculiar, since this is nothing more than good Ada style.
If this approach helps persuade incompetent programmers to adopt
better style -- GOOD!

Second, legacy code does in fact tend to overuse Integer, so this
may be an issue in practice when acquiring or porting legacy code.
But that is *precisely* the case where making Integer 32 bits can
be appropriate, because most of that badly written code that uses
type Integer will have assumed that Integer is 32 bits. The
unwanted overhead on the String type, however, is a real issue
(too bad these got intertwined in the original design).

> Since we often counsel designers to specify their own
           ^^^^^
I trust this is a typo for *always*.

> numeric types anyway, this is probably not a hardship,
> but it could be troublesome for an experienced I-8051
> programmer who expects 16 bit integers.

I don't understand: is this "experienced I-8051" programmer an
experienced Ada programmer? If so, he has no business using
Standard.Integer. If not, and he is writing Ada in C style, then
perhaps the choice of 32-bit integers will help cure this bad
practice.

> Consider, for example, that simply pushing a 16 bit entity on
> the stack requires storing two eight-bit stack entries. To
> store a 32-bit integer would take four stack entries. The
> corresponding inefficiency would be intolerable for most
> I-8051 applications.

Yes, but it is also intolerable for this experienced I-8051
programmer to be using Standard.Integer explicitly.

> One reason I like Ada is because we can define our own
> numeric types.

Exactly, so what's the issue?
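To make the point concrete, here is a minimal sketch of the style
being advocated, assuming a hypothetical sensor-reading program
(all names are purely illustrative):

   --  Declare the ranges the application actually needs and let
   --  the compiler pick the representation; nothing here depends
   --  on the size of Standard.Integer.
   procedure Sensor_Demo is

      --  Fits in 16 bits, so an 8051 compiler can use two bytes
      --  rather than the four a 32-bit Integer would need.
      type Sample_Count is range 0 .. 10_000;

      --  A modular type when exactly 16 unsigned bits are wanted.
      type Register_Value is mod 2**16;

      Count : Sample_Count   := 0;
      Reg   : Register_Value := 0;

   begin
      Count := Count + 1;
      Reg   := Reg + 1;  --  wraps around modulo 2**16
   end Sensor_Demo;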
> Though there are few machines still extant that use storage
> multiples of other than eight bits, they do still exist. I
> think the compiler for the Unisys 11xx series has a word size
> of 36 bits. Randy can correct me on that if I am wrong.

Yes, of course it's 36 bits (that was a port of Alsys technology
with which I am familiar).

> Don Reifer recently told me that one reason Ada was becoming
> irrelevant, and his reason for recommending against its use for
> new projects, was that it is not sufficiently flexible to
> support the new kinds of architectures in the pipeline.

This is complete and utter nonsense. Where on earth does Reifer
get these crazy ideas?

> Though I disagree with him on this assessment, forcing the
> language to correspond to a single word-size architecture
> (read 32 bits) would play into his flawed view of Ada's value
> for new software.

I find this completely puzzling. Given that in Ada code we always
define the integer types we want, the standard forces nothing.
Well, we still have the tie-in between Integer and String, and
that is worthy of discussion, but this business of claiming that
Ada is flawed because incompetent programmers misusing
Standard.Integer might get an integer size they do not expect is
really not a sustainable argument.

Indeed, a good argument can be made that type Integer should never
have been introduced in the first place. It is an unnecessary
concession to C and Fortran programmers.
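As a footnote on that tie-in: package Standard declares
(paraphrasing the RM) subtype Positive as Integer range
1 .. Integer'Last, and type String as an array indexed by
Positive, so the size of Integer leaks into every String index.
A sketch of the obvious project-level workaround, with purely
illustrative names, for code that wants a small, fixed index
range:

   package Small_Text is
      --  Index type whose size does not depend on Standard.Integer.
      type Text_Index is range 1 .. 255;
      type Text is array (Text_Index range <>) of Character;
   end Small_Text;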