From: "Mark Lundquist"
Newsgroups: comp.lang.ada
Subject: Re: Built-in types (was Re: How can I avoid Using a Semaphore?
Date: Mon, 22 Jan 2001 15:17:16 -0800
Organization: Rational Software
Message-ID: <94ifq8$uu$1@usenet.rational.com>

Robert Dewar wrote in message news:94hq56$rlv$1@nnrp1.deja.com...
> In article <94hmgo$o2k$1@nnrp1.deja.com>,
> mark_lundquist@my-deja.com wrote:
> > If that's my only
> > requirement, then I'm being explicit about it when I write
> > "Integer", because that's what Integer is supposed to be --
> > the "whatever integer".
> > In this very unlikely scenario, I would suggest writing
>
>    type My_Int is new Integer;
>    -- Use standard default (and presumably efficient) integer
>
> so that if there are porting problems, or if you do need to
> specify additional attributes, then it is easily done without
> major rewriting.

You've almost convinced me :-)

First I would have to figure out how to think up type names that
are not so cheesy as "My_Int" :-)  And the names shouldn't
"encode" specifics about their implementation properties; e.g.
"Int16" is going to be a bad choice if I later decide it has to
be 32 bits!

Also -- in this example, you've defined your own integer type so
that you have the flexibility to change it later.  Isn't there
still a problem, in that you assume you'll always be able to use
the same type for all the same things?  Let's say you go and use
My_Int everywhere instead of Integer.  Now you find that for
certain of those uses, the current definition of My_Int isn't
going to cut it.  The ability to change the definition of My_Int
only helps you if the new definition is also what you want for
all the other uses of My_Int.  Maybe you can't change My_Int in
the way you require for a particular use without screwing it up
in its other uses.  So My_Int is not a great Integer substitute.
It's certainly no worse than Integer, but I'm not convinced that
it's all *that* much better from the standpoint of portability,
resilience to change, etc.  (Remember, this is the case where I
truly *don't* *care* about the unspecified properties.)

> > (Let's see, just for integer types, what can the programmer
> > specify?  There's base range, size, alignment, stream I/O
> > attributes... anything else?)
>
> Yes, there are other things,

What are they?  I'm trying to see what it would take to define
an integer type that is fully specified, leaving nothing to the
implementation-specific defaults.  (I think that leaves out
anything specified by a pragma, right?)
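(Taking a stab at it myself, here's a sketch of roughly what I
have in mind -- all the names are invented, and the list of
attributes is no doubt incomplete:)

```ada
with Ada.Streams;

package Fully_Specified is

   --  Sketch only: an integer type with range, size, alignment,
   --  and stream attributes all nailed down explicitly.  The
   --  names (Spec_Int, Spec_Read, Spec_Write) are made up.

   type Spec_Int is range -2**15 .. 2**15 - 1;
   for Spec_Int'Size use 16;
   for Spec_Int'Alignment use 2;

   procedure Spec_Read
     (Stream : access Ada.Streams.Root_Stream_Type'Class;
      Item   : out Spec_Int);
   procedure Spec_Write
     (Stream : access Ada.Streams.Root_Stream_Type'Class;
      Item   : in Spec_Int);

   for Spec_Int'Read use Spec_Read;
   for Spec_Int'Write use Spec_Write;

end Fully_Specified;
```

(...plus bodies for Spec_Read/Spec_Write that pin down the byte
order and representation on the stream, presumably.)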
> but the point is that we are
> specifically talking here about a complaint that you can NOT
> specify them for the standard type Integer.  That's what this
> thread is about.

*Was* about, and then only briefly (_I_ was the one who changed
the subject line...) :-)  My original response was to your
statement that carefully written Ada programs don't (within
reason, of course) use predefined types.  But you use Deja, and
it doesn't do a good job with changed subject lines :-)

> > It is curious logic to be in a position of saying
>
> 1. I want to use Integer when I don't want to specify any
> additional stuff.
>
> 2. It is annoying that for type Integer, I cannot specify
> additional stuff
>
> that makes little sense to me!

I said no such thing, nor implied it!  What you say there is
exactly the point I made to "DuckE" however many posts ago, when
he said it was a weak point of Ada that you can't write 'Input
etc. for the predefined types :-)

> > Well that's just it... it seems like if the programmer is
> > sharp enough to specify all this stuff, then he also ought to
> > be sharp enough to know when he means the "whatever" types
> > and to use them when that's what he means.
>
> Please give a VERY clear example of why it is good EVER to
> use type Integer (other than when constrained by a library)?

OK:

   procedure Append
     (This      : in Element;
      Instances : in Natural;   -- subtype of Integer anyway
      To        : in out Collection);

> Even if efficiency is a concern, the proper approach is to
> write something like
>
>    type Temp_Int is range min-required .. max-required;
>    type My_Int is new Temp_Int'Base;
>
> Now use My_Int.  That's really MUCH better than using Integer
> directly.

[OK, I'm not debating you right now :-)  Just a question, because
I really don't know...]

Why did you derive My_Int from Temp_Int, instead of defining it
as you did Temp_Int?

[OK, I'm debating you again :-)]

If I really have no requirements on the range, won't I just say

   type Temp_Int is range Integer'First .. Integer'Last;

???

(I suppose there *might* be occasions where I have no
requirements on the range, but where that might someday
change...)

OTOH, if I nail it down to something else, am I not saying that
I want constraint checks if this is ported to an architecture
where overflow won't take care of it?

> > There was a thread here last year, in which someone was
> > lamenting that Ada's integer types were not as "portable" as
> > Java's, because the language doesn't standardize the
> > sizes/ranges.
>
> Well this of course makes no sense.

Of course...

> In Java you only have the
> standard types, so it is important to standardize them.

True, if you can't define your own elementary types, then the
standard ones ought to be standardized.  Also, Java is meant to
run on one machine (the JVM), so in a sense the standard types
are defined the way they are because they're natural for the
target machine.
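P.S.  Answering my own question about the 'Base derivation: here
is my reading of why the idiom is written that way (a sketch
only, with invented bounds, so take it with salt):

```ada
--  Sketch of the Temp_Int / 'Base idiom as I understand it.
--  Temp_Int is scaffolding: its only job is to make the
--  implementation choose a base type that covers at least
--  -10_000 .. 10_000 (bounds invented for illustration).
type Temp_Int is range -10_000 .. 10_000;

--  Deriving from Temp_Int'Base, rather than from Temp_Int,
--  gives My_Int the *unconstrained* base type: full machine
--  arithmetic with no range checks on assignment, while the
--  minimum-range requirement is still stated portably in one
--  place.  Deriving from Temp_Int itself would drag the range
--  constraint (and its checks) along.
type My_Int is new Temp_Int'Base;
```

So if I've got that right, you get Integer-like efficiency plus a
documented, portable statement of the range you actually need.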