From mboxrd@z Thu Jan 1 00:00:00 1970
X-Spam-Checker-Version: SpamAssassin 3.4.4 (2020-01-24) on polar.synack.me
X-Spam-Level:
X-Spam-Status: No, score=-1.3 required=5.0 tests=BAYES_00,INVALID_MSGID autolearn=no autolearn_force=no version=3.4.4
X-Google-Language: ENGLISH,ASCII-7-bit
X-Google-Thread: 103376,6bf9d4ba0cfd8cb6
X-Google-Attributes: gid103376,public
From: Hyman Rosen
Subject: Re: Announce: OpenToken 2.0 released
Date: 2000/02/17
Message-ID: #1/1
X-Deja-AN: 587007236
Sender: hymie@calumny.jyacc.com
References: <3890C62B.18309585@telepath.com> <876unj$jcs$1@nnrp1.deja.com> <87d7qck6pm.fsf@deneb.cygnus.argh.org>
X-Complaints-To: abuse@panix.com
X-Trace: news.panix.com 950809585 15788 209.49.126.226 (17 Feb 2000 17:46:25 GMT)
Organization: PANIX Public Access Internet and UNIX, NYC
NNTP-Posting-Date: 17 Feb 2000 17:46:25 GMT
Newsgroups: comp.lang.ada
Date: 2000-02-17T17:46:25+00:00
List-Id:

Robert A Duff writes:
> My point was simply that the C programming language has a design flaw
> (namely, confusion between characters and integers) that contributes to
> the poor design we were talking about (namely, assuming that int can
> represent every char value, plus at least one more value, which is not
> always the case).

If you look at the subject of this thread, you will be reminded that it
started because the author of OpenToken used exactly this approach in
his Ada code, in a way even worse than C's: he made a potentially legal
character the end-of-file sentinel. It's not at all unnatural to want
to use this kind of approach.

The only "design flaw" in the C approach is that an implementation may
give int and char the same size, and may then provide streams that
allow the full range of (at least 32-bit) values to appear on input.
I would guess that the number of such adversarial platforms is small,
and I would not be surprised at all if it were zero. (I know that the
number of platforms where sizeof(int) == sizeof(char) is non-zero, but
do those platforms take 32-bit input from external sources?)

C does not "confuse" characters and integers. It allows arithmetic on
chars, and automatic conversions among arithmetic types. It is
certainly the case that this can lead to truncation errors and similar
surprises that would not occur in Ada. Many modern C compilers attempt
to compensate by warning when a conversion would truncate, but that's
something of a band-aid.
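
To make the C approach concrete, the canonical idiom reads the stream
through an int precisely so that EOF has a distinct value to occupy.
A minimal sketch, with the classic mistake noted in the comment:

    #include <stdio.h>

    int main(void)
    {
        int c;  /* int, not char: must hold every char value plus EOF */

        while ((c = getchar()) != EOF)
            putchar(c);

        /* Had c been declared char, a signed-char implementation would
           let a legitimate input byte (0xFF, say) compare equal to EOF
           and end the loop early, while an unsigned-char implementation
           would make the comparison always false and never terminate. */
        return 0;
    }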
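
On a platform where sizeof(int) == sizeof(char) and full-width values
really can arrive on input, that EOF test becomes ambiguous, since EOF
is then also a representable character. The portable escape hatch is
to confirm end-of-file with feof() and ferror(); again only a sketch:

    #include <stdio.h>

    int main(void)
    {
        int c;

        /* If a genuine character happens to equal EOF, neither feof()
           nor ferror() will be set, so the loop keeps going. */
        while ((c = getchar()) != EOF || (!feof(stdin) && !ferror(stdin)))
            putchar(c);

        return 0;
    }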
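
And the char/int mixing that draws those truncation warnings is the
ordinary sort of thing below (assuming ASCII; the exact diagnostics
vary by compiler):

    #include <stdio.h>

    int main(void)
    {
        char c = 1000;     /* does not fit: quietly truncated, and most
                              compilers warn about the constant overflow */
        char d = 'A' + 1;  /* arithmetic on char is allowed: d == 'B' */

        printf("%d %c\n", c, d);
        return 0;
    }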