From: 18k11tm001@sneakemail.com (Russ)
Newsgroups: comp.lang.ada
Subject: type declaration and storage requirements
Date: 2 Jun 2002 14:21:21 -0700

Ada allows me to specify the number of digits and the range of a floating-point type, or the delta and range of a fixed-point type. This gives more control than, say, C/C++, which let me choose only single or double precision.

What bothers me, however, is that I apparently need to do some homework of my own to determine whether my declaration will require one word or two for storage (for a particular word size). Suppose I have a particular range in mind, and I just want the maximum number of digits I can get in, say, a 32-bit word. Is there a simple way to determine how many digits to specify? If I specify one digit too many, can that bump the storage requirement from one word to two? I'm no expert, but it seems to me that what you really want to be able to specify is not the number of digits but rather the total number of bytes to be used.

Also, suppose I specify one digit too many for single-word storage.
I presume I will get double-word storage and nearly double the precision I asked for. If I then port the program to another machine with a different word size, I may get less precision, which means that the program may produce different results. Am I missing something here? I guess Ada takes the view that storage requirements are secondary to the application's actual precision requirements. That's certainly valid to a point, I guess. But sometimes precision requirements are not so precise themselves, and jumping from one to two words might be undesirable, particularly if the type is to be used extensively.
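For reference, here's the sort of experiment I've been doing (a sketch only — the type names Real_32 and Volt are just illustrative, and I'm assuming a machine with 32-bit IEEE floats, where 6 decimal digits fit in one word and 7 would typically force a 64-bit representation):

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Size_Check is
   --  6 decimal digits should fit a 32-bit IEEE float; asking for
   --  "digits 7" would usually push the compiler to 64 bits.
   type Real_32 is digits 6 range -1.0E6 .. 1.0E6;

   --  Pinning the size with a representation clause: if the digits
   --  request cannot be honored in 32 bits, this is rejected at
   --  compile time rather than silently doubling the storage.
   for Real_32'Size use 32;

   --  For fixed point, the delta and range together determine how
   --  many bits are needed.
   type Volt is delta 0.001 range -32.0 .. 32.0;
begin
   Put_Line ("Real_32'Size   =" & Integer'Image (Real_32'Size));
   Put_Line ("Real_32'Digits =" & Integer'Image (Real_32'Digits));
   Put_Line ("Volt'Size      =" & Integer'Image (Volt'Size));
end Size_Check;
```

So at least the 'Size clause turns the "one digit too many" case into a compile-time error instead of a silent jump to two words — but it still leaves me guessing at the digits value by trial and error, which is really my question.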