From: "Marin David Condic"
Newsgroups: comp.lang.ada
Subject: Re: Endianness independance
Date: Mon, 3 Mar 2003 08:29:48 -0500
Organization: MindSpring Enterprises
References: <5115eb96.0303010248.1b2b8d37@posting.google.com> <5115eb96.0303020149.4d438e40@posting.google.com>

I agree - solve it "by hand". That is to say, one establishes the communications protocol and tests to make sure the programs involved work with that protocol.

But remember that the original notion was that somehow Ada was invented to make sure that when a new device was plugged into some big military system, you wouldn't have a case where it would misinterpret the data and cause a failure. My contention was that A) it wasn't invented for that purpose and B) the language can't really solve that problem. The language can't check the representations of things that exist outside of the language. If you have thousands of messages flying around through a system, you are always going to have some unusual cases generated by unusual needs, and it is not possible to anticipate all those circumstances from within a language standard. You have to manually determine what the messages are supposed to look like and test to be sure the software understands them. Often there are corner cases, or things that are difficult to test, or things unusual enough that they get missed. Hence the possibility of errors.

A language might *help* minimize representation problems, because if everyone is using the same thing they are at least starting from a common base. But the representation of data outside the language isn't defined by the language, and attempting to define it would only hamstring the language and keep it from being usable in a large variety of circumstances. For example, Ada's selection of ASCII for character representation pretty much ensured that it would not become a popular language on IBM mainframes, where EBCDIC ruled the roost. Had the standard gone further and tried to define external representations for more things, it would just have kept ruling out more and more architectures on which it would be difficult or impossible to meet the standard. And none of this would result in more interoperability, because the problems tend to come from inadequate understanding on the part of the programmers about the data being dealt with, rather than from some intrinsic data representation that the language can't check anyway.
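To make the "by hand" part concrete, here is roughly the sort of thing I mean: the protocol document fixes the wire order once, and the code assembles the value octet by octet, so the host's own byte order never enters into it. This is only a sketch - the package, the message field and the names (Wire_Octets, To_Host) are invented for illustration:

   with Interfaces; use Interfaces;

   package Wire_Format is

      --  Four octets exactly as they arrive on the wire, in big-endian
      --  (network) order.  The layout is fixed by the protocol
      --  document, not by the language.
      type Octet_Index is range 0 .. 3;
      type Wire_Octets is array (Octet_Index) of Unsigned_8;

      --  Reassemble a host-order value from the wire octets.  Works the
      --  same on a big-endian or a little-endian host because it never
      --  depends on the host's own byte order.
      function To_Host (Raw : Wire_Octets) return Unsigned_32;

   end Wire_Format;

   package body Wire_Format is

      function To_Host (Raw : Wire_Octets) return Unsigned_32 is
      begin
         return Shift_Left (Unsigned_32 (Raw (0)), 24) or
                Shift_Left (Unsigned_32 (Raw (1)), 16) or
                Shift_Left (Unsigned_32 (Raw (2)),  8) or
                            Unsigned_32 (Raw (3));
      end To_Host;

   end Wire_Format;

The correctness of something like To_Host comes from the protocol document and from testing against it, not from anything the language standard could have guaranteed.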
MDC
--
======================================================================
Marin David Condic
I work for: http://www.belcan.com/
My project is: http://www.jsf.mil/
Send Replies To: m c o n d i c @ a c m . o r g

    "Going cold turkey isn't as delicious as it sounds."
        --  H. Simpson
======================================================================

Amir Yantimirov wrote in message
news:5115eb96.0303020149.4d438e40@posting.google.com...
>
> We solve that by hand without much brain effort, so the task of
> automating it is not difficult by itself. Ideally we should only mark
> certain types and interfaces that deal with communications as having
> a particular endianness and that's all. But including a notion of
> endianness (as a kind of storage specifier) in the existing Ada type
> system seems impossible to me.
>
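On that last point, the closest thing the existing language offers is the Bit_Order attribute, and it only pins down how bits are numbered in a record representation clause; it says nothing about the byte order of multi-octet scalars, which is exactly the gap being described. A sketch of where it does and does not help - the Header record and its fields are invented for illustration:

   with System;

   package Message_Types is

      type Version_Number is mod 2 ** 4;
      type Flag_Set       is mod 2 ** 4;

      --  A one-octet header whose bit layout is pinned down by the
      --  protocol document.
      type Header is record
         Version : Version_Number;
         Flags   : Flag_Set;
      end record;

      --  Within a single octet, fixing the bit numbering is enough to
      --  make the layout portable.  For multi-octet scalars it is not,
      --  and the byte order still has to be handled by hand.
      for Header'Bit_Order use System.High_Order_First;

      for Header use record
         Version at 0 range 0 .. 3;
         Flags   at 0 range 4 .. 7;
      end record;
      for Header'Size use 8;

   end Message_Types;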