Newsgroups: comp.lang.ada
From: Robert A Duff
Subject: Re: 64-bit integers in Ada
Date: Mon, 29 Jul 2002 20:15:45 GMT
Organization: The World Public Access UNIX, Brookline, MA
References: <3CE3978F.6070704@gmx.spam.egg.sausage.and.spam.net>

Victor Giddings writes:

> David Rasmussen wrote
> in news:3CE3978F.6070704@gmx.spam.egg.sausage.and.spam.net:
>
> > I understand that I can easily use an integer in Ada that has exactly
> > 64 bits. But are there any guarantees that such a type would be mapped
> > to native integers on 64-bit machines or to a reasonable double 32-bit
> > implementation on 32-bit machines?

In the language standard, there is no "guarantee" of anything related to
efficiency.  I think that's true of pretty much all languages, and I don't
see how one could do better.

> >... At least, are compilers ok at this in
> > real life?

Yes.  Compiler writers don't deliberately go out of their way to harm
efficiency.

Note that the RM does not require support for 64-bit integers
(unfortunately, IMHO), and there are compilers that do not support
64-bit integers.  But if the compiler *does* support 64-bit integers,
I see no reason to suspect that it wouldn't do so in the obviously
efficient way (native 64-bit ints on a 64-bit machine, or a pair of
32-bit ints on a 32-bit machine).

> > /David
>
> Try using (or deriving from) Interfaces.Integer_64 or
> Interfaces.Unsigned_64.  Admittedly, this requires two steps on the
> part of the compiler developer: 1) actually support the 64-bit integer
> type; 2) put it in Interfaces (as required by B.2(7)).  However, we
> rely on this in our CORBA product implementation and have been making
> sure that the compiler vendors are adding these types when they are
> supported.

I don't see the point of this advice.  If you say
"type T is range -2**63 .. 2**63-1;", I don't see any reason why T would
be more or less efficient than Interfaces.Integer_64.  In fact, I would
think they would have identical representation, and arithmetic ops would
use identical machine code.

> As of now, I know of only one compiler that supports 64-bit integers
> and doesn't define Interfaces.Integer_64.  That is to be remedied very
> soon.

- Bob
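
For concreteness, here is a minimal compilable sketch of the two routes
discussed above.  It assumes a compiler that supports 64-bit integers;
the procedure name Demo_64 and the type name Counter are made up for
illustration:

   with Ada.Text_IO;
   with Interfaces;

   procedure Demo_64 is
      --  Route 1: a user-declared signed type covering the full 64-bit
      --  range.  A compiler that cannot support it must reject this.
      type T is range -2**63 .. 2**63 - 1;

      --  Route 2: derive from the type that B.2 requires in Interfaces
      --  whenever the implementation supports a 64-bit integer type.
      type Counter is new Interfaces.Integer_64;

      A : T := T'Last;
      B : Counter := Counter'Last;
   begin
      --  Both should print 9223372036854775807 on a supporting compiler.
      Ada.Text_IO.Put_Line (T'Image (A));
      Ada.Text_IO.Put_Line (Counter'Image (B));
   end Demo_64;

On a typical compiler the two types end up with the same 64-bit
representation and the same machine arithmetic, which is the point of
Bob's remark; on a 32-bit machine both would be implemented with pairs
of 32-bit operations.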