From: Sandro Magi
Newsgroups: comp.lang.ada
Subject: Discriminant computation problem
Date: Sun, 14 Nov 2004 20:11:10 -0500

I'm just learning Ada, and I've come across a little issue: I can't
perform a computation on a discriminant.

Suppose I want to create a bit vector type that uses an array of
Integers as its underlying storage (just an example). I want the user
to provide the length of the bit vector in bits, but I need to express
the upper bound of the Integer array in units of Integer'Size:

   type Int_Array is array (Positive range <>) of Integer;

   type Bit_Vector (Max_Bit_Size : Positive) is record
      Series : Int_Array (Positive'First .. Max_Bit_Size / Integer'Size);
   end record;

The compiler error GNAT produces is: "discriminant in constraint must
appear alone". Is there something I'm doing wrong, or is the above
simply not possible?

I can work around this either by breaking the encapsulation of my type,
so that the user must perform the computation ahead of time (not
preferable), or by writing a special function that computes the real
upper bound (sketched at the end of this post). The user would then
have to instantiate

   bv : Bit_Vector (Real_Upper_Bound (1024));  -- 1024 bits

instead of

   bv : Bit_Vector (1024);  -- 1024 bits

Does anyone have a better answer?
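For reference, here's a minimal sketch of the function-based workaround
I mentioned. Real_Upper_Bound is the helper named above; the revised
record, whose discriminant now counts Integers rather than bits so that
it appears alone in the index constraint, is just my guess at how it
would be wired up:

   --  Hypothetical helper: converts a bit count to the number of
   --  Integers needed to hold it, rounding up so that e.g. 33 bits
   --  with a 32-bit Integer still yield 2 array elements.
   function Real_Upper_Bound (Bits : Positive) return Positive is
   begin
      return (Bits + Integer'Size - 1) / Integer'Size;
   end Real_Upper_Bound;

   --  Assumed revision: the discriminant is now a word count, so it
   --  appears alone in the constraint and the declaration is legal.
   type Bit_Vector (Max_Words : Positive) is record
      Series : Int_Array (Positive'First .. Max_Words);
   end record;

   bv : Bit_Vector (Real_Upper_Bound (1024));  -- 1024 bits

This compiles, but it leaks the bits-to-words computation into every
declaration, which is exactly the encapsulation I was hoping to avoid.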