From: kst@thomsoft.com (Keith Thompson)
Subject: Re: HELP:Declaration in ADA?!!
Date: 1996/04/09
Newsgroups: comp.lang.ada
References: <4j65rr$fqg@soleil.uvsq.fr> <4kc5ti$1ir@comet2.magicnet.net> <4ulok5fn1m.fsf@zippy.frc.ri.cmu.edu>
Organization: Thomson Software Products, San Diego, CA, USA

Tuyet-Tram DANG-NGOC wrote:
>Hi there,
>I have a stupid question to ask.
>DOES SOMEONE KNOW HOW TO DECLARE LONG_INTEGER IN ADA?
>All the books I've read here does not answer this metaphysical question, they
>just take it as if everyone should know how to declare it. In a wonderful
>book, I've seen how to declare LONG_REAL by typing:
> type LONG_REAL is DIGIT 14;
>But how to do it for LONG_INTEGER?
>Can someone answers me pleeeeaaaase, i've a big and horrible project to finish
>for very soon.

Probably the "big and horrible project" is either finished or overdue by
now, but I'll jump into the fray anyway.

The type Long_Integer, if it exists, is predefined in package Standard.
Some compilers support a type of this name, some don't.  Of those that
do, not all declare it with the same bounds (though -2**31 .. 2**31-1 is
most common).

If the compiler you're using doesn't predeclare Long_Integer, you can
declare your own integer type of that name, but it's almost certainly a
bad idea.  Anyone else reading your code will naturally assume that the
name Long_Integer refers to the predefined type, not to a user-defined
type.  Please don't introduce this kind of confusion if you don't have
to.

If your compiler does predeclare Long_Integer, DON'T USE IT!  (Sorry
about the shouting.)  Ada provides excellent facilities for declaring
your own integer types of whatever size and/or range you want, subject
to the limitations of the implementation.  Using Long_Integer directly
is inherently non-portable.  (The Ada 95 standard does recommend
providing a type Long_Integer if the implementation supports a type of
at least 32 bits, but this is only a recommendation.)

If you want to declare the longest possible integer type, you can do it
like this:

    type Longest_Integer is range System.Min_Int .. System.Max_Int;

If you want a 32-bit type, assuming 2's-complement hardware, you can do
it like this:

    type Integer_32 is range -2**31 .. 2**31-1;

(or use Interfaces.Integer_32 if it's provided).  (There are ways to
declare this without assuming 2's-complement, but that's beyond the
scope of what I feel like thinking about at the moment.)
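In case it helps, here are those two declarations as a complete
compilation unit (the package name Portable_Ints is just something I
made up; pick whatever name fits your project):

    with System;

    package Portable_Ints is

       --  The widest integer type this implementation supports:
       type Longest_Integer is range System.Min_Int .. System.Max_Int;

       --  Exactly the range of a 32-bit 2's-complement integer; if the
       --  implementation can't support this range, the compiler rejects
       --  the declaration rather than quietly giving you something else:
       type Integer_32 is range -2**31 .. 2**31-1;

    end Portable_Ints;

Note that this Integer_32 is a distinct type.  If you need something
compatible with the (optional) package Interfaces, declare

    subtype Integer_32 is Interfaces.Integer_32;

instead (and add "with Interfaces;").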
The only valid reason I can think of for using Long_Integer directly in
user-level code is if you're trying to port a program from a compiler
that supports Long_Integer to one that doesn't, and the program makes
extensive use of Long_Integer.  In this case, it might be necessary to
declare your own type Long_Integer whose range is compatible with the
range of the predefined Long_Integer on the original compiler.  Do this
only if you don't have enough time or other resources to remove the
references to Long_Integer.  Add lots of apologetic comments to the type
declaration, and wash your hands afterward.  8-)}  Then track down the
person who wrote the original code and make them read this article.

(Note that I specifically said user-level code.  I think the GNAT
runtime, for example, uses Long_Integer directly.  In that context, it's
not unreasonable to depend on the characteristics of the compiler.)

-- 
Keith Thompson (The_Other_Keith)  kst@thomsoft.com  <*>
TeleSoft^H^H^H^H^H^H^H^H Alsys^H^H^H^H^H Thomson Software Products
10251 Vista Sorrento Parkway, Suite 300, San Diego, CA, USA, 92121-2718
This sig uses the word "Exon" in violation of the Communications Decency Act.