From: dewar@gnat.com (Robert Dewar)
Newsgroups: comp.lang.ada
Subject: Re: 64-bit integers in Ada
Date: 3 Aug 2002 05:24:22 -0700

"Marin David Condic" wrote:

> so having the requirement is not a bad thing.

Here is why I think it *is* a bad requirement.

No implementation would possibly make Integer less than 16 bits. None ever has, and none ever would. Remember that the only really critical role of Integer is as the index of the standard type String (it was a mistake to tie String to Integer in this way, but it is too late to fix that). No one would make Integer 8 bits and thereby limit strings to 127 characters. If you are worried about implementors producing deliberately useless compilers, that is a silly worry, since there are plenty of other ways to do that (e.g. declaring that any expression with more than one operator exceeds the capacity of the compiler).

So why is the requirement harmful? Because it implies that it is reasonable to limit Integer to 16 bits, when in fact any implementation on a modern architecture that chose 16 bits for Integer would be badly broken.

As for implementations for 8-bit micros, I would still make Integer 32 bits on such a target. A choice of 16-bit Integer (as, for example, Alsys made early on) would cause giant problems in porting code. On the other hand, a choice of 32-bit Integer would make strings take a bit more room. I would still choose 32 bits.

Anyway, I see no reason for the standard to essentially encourage inappropriate choices for integer types by adding a requirement that has no practical significance whatever.

This is an old, old discussion. As is so often the case on CLA, newcomers like to repeat old discussions :-) This particular one can probably be tracked down in the design discussions for Ada 9X.
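
P.S. For anyone new to the String/Integer coupling mentioned above, here is a minimal sketch. The declarations in the comment are paraphrased from package Standard (ARM A.1); the procedure name Show_Limit is made up for illustration.

   --  String is indexed by Positive, a subtype of Integer,
   --  roughly as declared in package Standard:
   --
   --     subtype Positive is Integer range 1 .. Integer'Last;
   --     type String is array (Positive range <>) of Character;
   --
   --  So Integer'Last bounds the length of any String: a
   --  hypothetical 8-bit-Integer compiler could not represent
   --  a string longer than 127 characters.

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Show_Limit is
   begin
      --  On a 32-bit-Integer compiler this prints 2147483647.
      Put_Line ("Max String length:" & Integer'Image (Integer'Last));
   end Show_Limit;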