From: Dmitry A. Kazakov
Newsgroups: comp.lang.ada
Subject: Re: conversion
Date: Thu, 10 Jul 2003 12:19:19 +0200
References: <3EFCC18B.4040904@attbi.com> <3F03D54C.7010008@attbi.com> <3F0C4B18.6080204@attbi.com>

On Wed, 09 Jul 2003 17:04:53 GMT, "Robert I. Eachus" wrote:

>Dmitry A. Kazakov wrote:
>
>> This is what I meant. You can consider string literals as being of
>> some universal string type. Then you can have some parent string type
>> for all string types. Call it root string if you want. The hierarchy
>> might look like...
>
>This is where your confusion starts. String is the root type for all
>strings of Latin1 characters. If you want a universal type, what it
>needs to cover are things like Wide_String, Wide_Wide_String, and any
>user-defined string types with either a different set of characters or
>different index types. Strings in Ada are much richer than your
>imagination realizes:

Your example proves exactly the opposite. How would you get an
Unbounded_String of Colors?

>type Color is (Red, Orange, Yellow, Green, Blue, Violet);
>
>type Digit is ('0','1','2','3','4','5','6','7','8','9');
>
>type Digit_String is array (Color range <>) of Digit; -- new string type
>
>Foo: Digit_String := "12345";

Yes, and why is that? Because Ada 83 got arrays right, though in a very
limited way. Ada 95, on the contrary, failed to extend its ADT model in
a reasonable way, so we ended up with a private Unbounded_String that
falls outside the concept.

>>>The Integer type that corresponds most closely to String is Integer,
>>>and they are both declared in Standard.
>
>> It does not. Because Integer is constrained.
>
>And String is constrained in exactly the same way. Most Ada
>implementations now have Integer'Last = 2**31-1, which not
>coincidentally is the maximum number of Characters in a String.

Compare:

   X : Integer := 5;
   Y : String  := "12345";
begin
   X := 7;          -- fine
   Y := "1234567";  -- fails: Y holds exactly five characters

The declaration of the string variable, in effect, also declares an
anonymous constrained subtype, which makes the last assignment fail
with Constraint_Error. Nothing of the kind happens for integer types.

>> You missed the point. String as a concept has a semantics which
>> allows *any* values. That String as a type implementing the concept
>> has values constrained in some special way does not change that
>> semantics.
>
>Hoo boy, you still are wearing blinders. In concept a string could be
>any ordered set of, say, Japanese Kanji, or if you prefer, and the
>Japanese do, a mixture of Katakana, Hiragana, and Kanji.

Absolutely!

>In Ada this doesn't fit in a String, but Japanese is a subset of
>Wide_Character, and so Wide_String is fine for Japanese. On the other
>hand, Wide_Character cannot represent all of the Chinese characters;
>there are already several additional planes of Chinese characters
>defined in ISO 10646. In Ada the corresponding string type, if
>supported, would be Wide_Wide_String.

It is not a problem if you have a full-blown ADT. If we could say that
one array type is a subtype of another, all these problems would
disappear: the element type would become a representation issue. Then
even a Wide_Wide_Very_Wide_Unbounded_Remote_Oracle_String could be made
both a sub- and a supertype of String, i.e. substitutable.
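As things stand, nothing of the sort is possible: String and Wide_String
are two unrelated array types, and a subprogram written for one cannot
take the other without an explicit, copying conversion. A minimal sketch
of the status quo (the names Unrelated_Strings and Show are mine, purely
for illustration; only Ada.Characters.Handling is assumed):

   with Ada.Text_IO;
   with Ada.Characters.Handling;

   procedure Unrelated_Strings is
      use Ada.Characters.Handling;

      --  A subprogram deliberately written against Wide_String only
      procedure Show (Item : Wide_String) is
      begin
         Ada.Text_IO.Put_Line (To_String (Item)); -- narrow again to print
      end Show;

      S : constant String := "Hello";
   begin
      --  Show (S);              -- illegal: String is not a Wide_String
      Show (To_Wide_String (S)); -- an explicit conversion (a copy) is needed
   end Unrelated_Strings;

Were there a proper sub-/supertype relation between array types, Show
could take the root string type and the conversion would be the
compiler's business, not the programmer's.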
>> The problem is: should String and Unbounded_String be siblings
>> (descendants of the same base), or better, as many would rightfully
>> expect, should String be *both* a subtype and a supertype of
>> Unbounded_String? Then we will have a lot of problems to solve,
>> because ADT in Ada is presently unable to deal with that.
>
>But Unbounded_String is really a (very useful) container type for
>Standard.String, no more, no less.

Yes, and it does not solve the problem, which is why people are unhappy
with it. What I am trying to say is that getting this right would
require too many changes to Ada, which was impossible then and, I am
afraid, is still impossible now.

>I can easily imagine--because I have had to do it in Ada 83, which was
>not as friendly in this area--writing a package which had to display
>messages in Latin (English), Cyrillic, and Arabic. Three separate
>(7-bit) character sets, and the corresponding string types. If I were
>to rewrite that code in Ada 95, I would probably use the corresponding
>ISO 8859 8-bit character sets. But I would need an instance of
>Ada.Strings.Bounded_String for each. (Well, actually only the Latin1
>version could be an instance of Ada.Strings.Bounded_String, but that
>is a detail.)
>
>Why am I spending so much time on this? Simple. A lot of effort over
>the years has gone into the support for additional character sets (and
>string types) in Ada. The most recent discussion was whether or not to
>"officially" change the default character set to one of the new 8859
>variants with the Euro symbol. (Verdict: no.) People who look at Ada
>through a mono-linguistic filter tend to miss this. But what really
>surprises me is the fact that many people whose first language is not
>English still tend to think of Ada as having an English bias. (:-))

I would recommend that you take a look at the Russian edition of the
Revised Report on ALGOL 68, with Cyrillic identifiers and keywords.
Even a native Russian is unable to understand such garbage.

>(Technically, Ada does have a slight Western European bias, but very
>slight, see 3.5.2(4). And as I indicated above, Ada does not define
>the Wide_Wide_Character type needed for full Chinese language
>support.)

And it should not. An ADT can deal with all of that on the basis of
user-defined types. You cannot support all possible languages; that is
utopian. Russian alone has dozens of different code tables. The Germans
are going to drop their sharp s (ß). Should the language standard
change each time one of 200 crazy governments around the world modifies
an alphabet?

--
Regards,
Dmitry Kazakov
www.dmitry-kazakov.de