From: amir@iae.nsk.su (Amir Yantimirov)
Newsgroups: comp.lang.ada
Subject: Re: Endianness independence
Date: 2 Mar 2003 01:49:37 -0800
Message-ID: <5115eb96.0303020149.4d438e40@posting.google.com>
References: <5115eb96.0303010248.1b2b8d37@posting.google.com>

"Marin David Condic" wrote in message news:...
> Well, but if hardware is not standardized, then why would you expect
> languages to be standardized? (With respect to data representations.)
> Language X can say "All integers of any kind will be 32-bit,
> twos-compliment, network byte order....." That's nice for platforms where
> that is always available and the only integer type available. That means you
> can't use Language X on a lot of hardware platforms that don't support that
> type. Or if the hardware has other types available, you can't use those.
> Where does that get you?

First, hardware does to a large degree follow de facto standards, apart from a few marginal examples. There is no gain in being different and incompatible, and I wonder what the reasons were for hardware being so diverse in the past. I think two's-complement arithmetic, 8-bit bytes, and the little- and big-endian camps together cover 99.9% of all systems.

Second, a data representation is only a model. A processor with a 36-bit word is perfectly capable of operating on 8-, 16-, 32- and 64-bit integers; the only difference is performance, and in most cases interoperability is far, far more important. The same goes for hypothetical future processors with 5-state elements.

> And even if you went so far as to thoroughly dictate the exact precise
> representation of all data within the program - right down to the number of
> electrons it takes to make a bit equal to one :-) - how is that going to
> guarantee interoperability? Program X built by manufacturer X running on box
> X thinks the data going down the wire looks like this.... Program Y built by
> manufacturer Y running on box Y thinks the data looks like that..... Both
> have the same kinds of integers and floats and strings, but they're all in a
> different order. How does the language/compiler solve that?

We solve that by hand without much mental effort, so the task of automating it is not difficult in itself (a sketch of the usual by-hand code is at the end of this post). Ideally we would only mark certain types and interfaces that deal with communication as having a particular endianness, and that would be all. But including a notion of endianness (as a kind of storage specifier) in the existing Ada type system seems impossible to me.

By the way, chips from different manufacturers running different programs already swarm on any PC. One of my colleagues feeds an NVidia GeForce chip with data; another struggles to share an algorithm between the central processor and a custom multimedia chip. Guess what language they use. ;)
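To illustrate what "by hand" means in practice, here is a rough sketch (the package and the names in it are made up just for this example, nothing standard) of a conversion package that always puts a 32-bit value on the wire most significant byte first, using the shift operations from Interfaces, so it behaves identically on little- and big-endian hosts:

with Interfaces; use Interfaces;

package Wire_Format is

   type Byte_Array is array (1 .. 4) of Unsigned_8;

   function To_Wire   (Value : Unsigned_32) return Byte_Array;
   function From_Wire (Bytes : Byte_Array)  return Unsigned_32;

end Wire_Format;

package body Wire_Format is

   function To_Wire (Value : Unsigned_32) return Byte_Array is
   begin
      --  Most significant byte first, whatever the host order is.
      return (1 => Unsigned_8 (Shift_Right (Value, 24) and 16#FF#),
              2 => Unsigned_8 (Shift_Right (Value, 16) and 16#FF#),
              3 => Unsigned_8 (Shift_Right (Value,  8) and 16#FF#),
              4 => Unsigned_8 (Value and 16#FF#));
   end To_Wire;

   function From_Wire (Bytes : Byte_Array) return Unsigned_32 is
   begin
      --  Reassemble the value from the wire bytes, again without
      --  caring what the host endianness is.
      return Shift_Left (Unsigned_32 (Bytes (1)), 24) or
             Shift_Left (Unsigned_32 (Bytes (2)), 16) or
             Shift_Left (Unsigned_32 (Bytes (3)),  8) or
                         Unsigned_32 (Bytes (4));
   end From_Wire;

end Wire_Format;

The caller just applies To_Wire before output and From_Wire after input and never thinks about the host order again; the wish is that a single marking on the type would make the compiler generate exactly this.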
Amir Yantimirov
http://www174.pair.com/yamir/programming/