Date: 16 Sep 93 17:39:19 GMT
From: olivea!news.bu.edu!inmet!spock!stt@uunet.uu.net (Tucker Taft)
Subject: Re: Don't we already have a 'Valid? (was Re: Unchecked_Conversion...)

In article ncohen@watson.ibm.com writes:

>Some clarifications:
>
>3. Like Robert Eachus (if I understand his post correctly), I believe
>   that the best solution would have been an attribute like
>
>      target_subtype'Would_Be_Valid(source_object)
>
>   indicating without performing the unchecked conversion of
>   source_object to target_subtype whether the bits of source_object
>   are a valid representation of a value of target_subtype.  This
>   provides a convenient way for the programmer to validate
>   untrustworthy data without ever constructing invalid values.  (I'm
>   sure there is a better name, but Would_Be_Valid conveys my intent.)

There are a few reasons why we chose the object'Valid approach rather
than the "attribute-function" approach above (all nicely documented by
Ben Brosgol in the Ada 9X Rationale document coming out imminently):

1) The proposal above implies a function call, which takes a value,
   not an object, as a parameter.  That means the source object must
   already have a valid value, or else the implementation is allowed
   to raise Program_Error.  Furthermore, what would the formal type of
   the parameter to the "Would_Be_Valid" function be?

2) Unchecked_Conversion takes a source *sub*type and a target subtype
   as generic parameters.  That is critical, because the particular
   subtypes specified can affect the actual number of bits involved in
   the unchecked conversion.
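To make point 2 concrete, here is a minimal sketch (my own illustrative names and sizes, in the Ada 9X syntax as it stood; the post itself contains none of this) of instantiating Unchecked_Conversion with particular subtypes and then testing the result with object'Valid:

```ada
with Ada.Unchecked_Conversion;
with Ada.Text_IO; use Ada.Text_IO;

procedure Valid_Demo is
   --  Illustrative types: a 3-valued enumeration held in a full byte,
   --  so five of the eight bit patterns are not valid Colors.
   type Color is (Red, Green, Blue);
   for Color'Size use 8;

   type Byte is mod 2**8;
   for Byte'Size use 8;

   --  The generic is instantiated with *subtypes*; it is the subtypes,
   --  not any particular value, that determine how many bits take part
   --  in the conversion.
   function To_Color is new Ada.Unchecked_Conversion (Byte, Color);

   Raw : constant Byte  := 7;            --  not a valid Color encoding
   C   : constant Color := To_Color (Raw);
begin
   --  object'Valid lets us inspect the bits before trusting the value.
   if C'Valid then
      Put_Line ("valid: " & Color'Image (C));
   else
      Put_Line ("invalid representation");
   end if;
end Valid_Demo;
```

With the attribute-function approach, by contrast, the compiler would see only a Byte value at the call site, with no subtype in hand to fix the bit count.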
By contrast, the attribute-function approach is given a value, which
does not indicate unambiguously how many bits will be involved in the
conversion (which matters when converting between bit arrays and
scalars on a big-endian machine, for example).

3) A common source of potentially invalid data is input, or unchecked
   conversion of a whole record.  It is much more convenient to check
   the fields of interest of a record with the object-attribute
   approach than with the attribute-function approach applied to
   slices of the source of the record.  E.g.:

    type Enum is (ABC, XYZ, ...);
    type Enum_Array is array (Positive range <>) of Enum;

    type Enum_Stack is record
        Length : Natural range 0 .. Max := 0;
        Data   : Enum_Array (1 .. Max);
    end record;

    Stk : Enum_Stack;
    ...
    Read_Stack (Into => Stk);

    if not Stk.Length'Valid then
        Put_Line ("Bogus Length");
        raise Data_Error;
    else
        for I in 1 .. Stk.Length loop
            if not Stk.Data (I)'Valid then
                Put_Line ("Trouble in River City");
                raise Data_Error;
            end if;
        end loop;
    end if;
    -- Stk now checked out, relevant data is valid

An alternative to the attribute-function approach that would almost
work would be a generic "Checked_Conversion" function, which would
combine the conversion with a validity check.  However, since validity
is not unambiguously defined for composite types (because of examples
like the Stack above), Checked_Conversion would be limited to scalar
types, which would make it pretty inconvenient.

By the way, we believe we have fixed the wording problems in the
manual in the latest version, so that the Unchecked_Conversion is not
erroneous even if the bits being converted do not correspond to a
valid value of the target subtype.

>Norman H. Cohen          ncohen@watson.ibm.com

S. Tucker Taft    stt@inmet.com
Ada 9X Mapping/Revision Team
Intermetrics, Inc.
Cambridge, MA 02138