Date: 13 Dec 91 17:24:40 GMT
From: elroy.jpl.nasa.gov!swrinde!zaphod.mps.ohio-state.edu!unix.cis.pitt.edu!dsinc!gvlf3.gvl.unisys.com!email!parkhill@ames.arc.nasa.gov (parkhill)
Subject: Re: 'SIZE attribute of a type
Message-ID: <5907@email.sp.unisys.com>

warack@dip.eecs.umich.edu (Christopher Warack) writes:

> In article <5898@email.sp.unisys.com> parkhill@email.sp.unisys.com (parkhill) writes:
> >>> Imagine a compiler was smart enough to understand biased number ranges.
> >>
> >>>    type b is range 9 .. 10;
> >>> or
> >>>    type b is range 2**31 - 2 .. 2**31 - 1;
> >>
> >>> If the compiler can generate code that only uses 1 bit then b'Size
> >>> should return 1.
> >>
> >> Oh, that compilers were so cleverly written!
> >>
> >> Deleted text.
> >
> > My point on the compiler having the capability to use one bit is that I
> > find the utility of type_name'size is now questionable. Let's say the
> > compiler is capable of using 1 bit but won't do it in most situations.
> > How can any programmer use type_name'size to any practical purpose? It
> > seems less useful than Entry_Name'Count. However, I am sure that the
> > justification for the change to type_name'size is rock solid.
>
> Seems to me that the only reason you'd want the minimum number of bits
> used to represent a type is if you wanted to implement that type with the
> minimum number of bits allowed, e.g., in a maximally packed record using
> a "somewhat" portable rep clause. (Not the most beautiful piece of code I
> can imagine.)
>
> If you wanted to know the number of bits in a certain type of object (e.g.,
> a variable of a type) use object'size -- that's what it's for.
>
> -- Chris
>
> --
> Christopher A. Warack                     warack@eecs.umich.edu
> Graduate Dept, EECS                       (313) 665-4789
> University of Michigan

You are quite correct: about the only reason you would want the minimum
number of bits used to represent a type is when that minimum is actually
required.

I am often required to write device drivers in Ada, and the device being
communicated with sets the protocol. Some of the military devices I
communicate with use biased representation for input and output values.
If the compiler could generate code for this, I wouldn't have to do it
myself. I think portability goes out the window anyway when you are doing
memory-mapped I/O to external devices.

Yes, I know Object_Name'Size tells you the number of bits used by an
object. What is Type_Name'Size for??

Robert Parkhill
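
P.S. To make the Type_Name'Size versus Object_Name'Size question concrete,
here is a small, untested sketch (the names are invented; what it prints is
entirely compiler dependent, which is rather my point):

    with Text_IO;
    procedure Size_Demo is
       type B is range 9 .. 10;                   -- only two values; 1 bit if biased
       type C is range 2**31 - 2 .. 2**31 - 1;    -- two values near Integer'Last
       V : B := 9;
    begin
       -- Type_Name'Size: the bits the compiler claims the type needs.
       -- Object_Name'Size: the bits this particular object actually occupies.
       Text_IO.Put_Line ("B'Size =" & Integer'Image (B'Size));
       Text_IO.Put_Line ("C'Size =" & Integer'Image (C'Size));
       Text_IO.Put_Line ("V'Size =" & Integer'Image (V'Size));
    end Size_Demo;

On most compilers B'Size and C'Size come back as 8, 16, or 32 rather than 1,
because the generated code never uses a biased encoding; V'Size reports
whatever storage that particular object was actually given.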
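
And a sketch of the device-driver situation I mean, assuming a compiler
willing to accept a biased, one-bit encoding for the 9 .. 10 range. Every
name, the layout, and the register width here are invented for illustration,
and many compilers will simply reject the one-bit size clause:

    package Device_Registers is
       type Gain_Range  is range 9 .. 10;     -- device stores this biased, as value - 9
       type Count_Range is range 0 .. 63;

       for Gain_Range'Size use 1;             -- only honored by a "clever" compiler

       type Status_Register is record
          Ready : Boolean;
          Gain  : Gain_Range;
          Count : Count_Range;
       end record;

       -- Pack the whole register into one byte; bit numbering is
       -- implementation dependent.
       for Status_Register use record
          Ready at 0 range 0 .. 0;
          Gain  at 0 range 1 .. 1;
          Count at 0 range 2 .. 7;
       end record;
       for Status_Register'Size use 8;

       Status : Status_Register;
       -- An address clause would tie Status to the memory-mapped device;
       -- its exact form (for Status use at ...;) is implementation dependent.
    end Device_Registers;

If the compiler refuses the size clause on Gain_Range, I end up reading the
field as a plain unsigned number and adding the bias of 9 myself, which is
exactly the hand work I would like the compiler to do for me.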