From: Robert A Duff
Newsgroups: comp.lang.ada
Subject: Re: 64-bit integers in Ada
Date: Wed, 31 Jul 2002 13:20:55 GMT
Organization: The World Public Access UNIX, Brookline, MA
References: <3CE3978F.6070704@gmx.spam.egg.sausage.and.spam.net> <3D46DC69.7C291297@adaworks.com> <5ee5b646.0207301613.5b59616c@posting.google.com>

Keith Thompson writes:

> dewar@gnat.com (Robert Dewar) writes:
> > Richard Riehle wrote in message
> > news:<3D46DC69.7C291297@adaworks.com>...
> > > Robert,
> > >
> > > We still have quite a few embedded platforms for which 64-bit
> > > integers are not supported.
> >
> > There is no reason for hardware support here -- even the ia32
> > has no hardware support -- but 64-bit integers are very useful
> > and must be supported, just as floating point MUST be supported
> > even on processors with no floating-point hardware.
>
> For certain values of "must".  I'm fairly sure that the Ada standard
> does not require support for 64-bit integers, ...

Yes, and I'm pretty sure Robert is well aware of that.

Actually, Ada requires 16-bit integers at minimum.  Robert has argued
in the past that this requirement is silly -- too small to be of any
use -- and that it would be better to let the market decide.  Probably
true.

> ... and I've worked with Ada implementations that didn't support
> anything bigger than 32 bits (System.Max_Int = 2**31-1).

I don't know of any Ada implementation that supports only 16 bits, and
I know of only one that doesn't support at least 32 (it supports 24
bits).

> If you want to argue that such an implementation is broken (even
> though it's conforming), I won't disagree.

But why 64?  Why shouldn't we say 128?  Or 1000?  After all, Lisp
implementations have been supporting more than that for decades.

- Bob
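
P.S.  To make the conformance point concrete: a signed integer type
declaration whose range the implementation cannot support is illegal,
so merely compiling a declaration like the one below demonstrates
64-bit support, and System.Max_Int tells you the widest range the
implementation guarantees.  A minimal sketch -- the names Check_64,
Int_64, and Widest are mine, purely for illustration:

   with Ada.Text_IO; use Ada.Text_IO;
   with System;

   procedure Check_64 is
      --  A conforming compiler must reject this declaration at
      --  compile time if it cannot support the requested range,
      --  so this program compiles only on implementations with
      --  64-bit integer support.
      type Int_64 is range -2**63 .. 2**63 - 1;

      --  The range Min_Int .. Max_Int is the widest signed range
      --  every implementation is required to support.
      type Widest is range System.Min_Int .. System.Max_Int;
   begin
      Put_Line ("Int_64'Last =" & Int_64'Image (Int_64'Last));
      Put_Line ("Max_Int     =" & Widest'Image (Widest'Last));
   end Check_64;

On the 32-bit-only implementations Keith describes, the declaration
of Int_64 would simply fail to compile, and Max_Int would print as
2147483647.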