From: Simon Clubley
Newsgroups: comp.lang.ada
Subject: Re: Oberon and Wirthian languages
Date: Sat, 19 Apr 2014 20:08:57 +0000 (UTC)
Organization: A noiseless patient Spider
References: <1ljwj8f.1wqbhvuabsdw1N%csampson@inetworld.net>
 <51c7d6d4-e3be-44d5-a4ce-f7e875345588@googlegroups.com>
 <%J32v.70539$kp1.45343@fx14.iad> <8761m535e4.fsf_-_@ludovic-brenta.org>
User-Agent: slrn/0.9.8.1 (VMS/Multinet)

On 2014-04-19, Shark8 wrote:
> On 19-Apr-14 11:35, Jeffrey Carter wrote:
>> On 04/19/2014 04:50 AM, Ludovic Brenta wrote:
>>>
>>> So, Oberon-14 or whatever its name is should not only reinstate
>>> subranges but also allow the definition of incompatible scalar types.
>>> If it did support all of the desirable features above then it would
>>> effectively almost become Ada :)
>>
>> That would make the language unsuitable for its intended purpose of
>> replacing C for S/W like OpenSSL.
>>
>> My experience (unfortunately on more than one example) of Ada designed
>> and written by coders is that they eschew user-defined numeric types and
>> numeric subtypes. They pick a few numeric types, predefined if at all
>> possible, and use them for everything, just like C.
>

Fine. Let them keep writing their code using uint16 and friends (for
now). There are still other things we can do to make their code more
robust in a new language, and if that helps ease the transition then I
don't really have a problem with it. We can still add support for
user-defined data types so it's there waiting for when they are ready
to use them.

Note that this is very much a pragmatic view and certainly not what I
would prefer, but if you make the learning barrier for a new language
too high then they are not going to try it out.

> One of the problems is that specs have become dependent on C and C-ish
> terminology. As an example, let us examine the basic definitions in TLS:
>
>> 6.2.1. Fragmentation
>>
>>    The record layer fragments information blocks into TLSPlaintext
>>    records carrying data in chunks of 2^14 bytes or less.  Client
>>    message boundaries are not preserved in the record layer (i.e.,
>>    multiple client messages of the same ContentType MAY be coalesced
>>    into a single TLSPlaintext record, or a single message MAY be
>>    fragmented across several records).
>>
>>       struct {
>>           uint8 major;
>>           uint8 minor;
>>       } ProtocolVersion;
>>
>>       enum {
>>           change_cipher_spec(20), alert(21), handshake(22),
>>           application_data(23), (255)
>>       } ContentType;
>>
>>       struct {
>>           ContentType type;
>>           ProtocolVersion version;
>>           uint16 length;
>>           opaque fragment[TLSPlaintext.length];
>>       } TLSPlaintext;

This is probably seen as pseudocode, so the layout is as clear as
possible to the greatest number of people. I suspect that every Ada
user here would be able to read the above and know exactly what would
be required of an Ada version of this.

>
> As we can see here the 'length' should not be "uint16", but instead:
> Range 0..2**14 {with a Size specification of 16 bits.}

The problem is that the above lays out the on-the-wire structure, but
says nothing about the valid length values other than in the
surrounding text.

> Another illustration comes from the enumerations; for "ContentType"
> there are 4 possibilities, which means we could fit them into a 2-bit
> field -- but the size is unintuitively specified by the null-enumeration
> with a value of 255.
>

I suspect they are probably worried about bit alignment issues if they
do that. However, have another look at the above; it actually says
nothing explicit about the size of the enum. It just appears to assume
it's a single byte based on the upper limit.

BTW, in my opinion, one of the things which would be absolutely
non-negotiable in a new language would be a requirement that each enum
declaration be treated as its own data type instead of as an integer,
as C currently does. That would not reduce the hackability of any
half-decent code, but it would help to reduce errors.

Simon.

-- 
Simon Clubley, clubley@remove_me.eisner.decus.org-Earth.UFP
Microsoft: Bringing you 1980s technology to a 21st century world
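For concreteness, here is one way the quoted TLS definitions might look
in Ada. This is only a rough sketch to go with the argument above, not
something from the thread: the package and type names (TLS_Record,
Fragment_Length, Octet_Array and so on) are invented, and a real
wire-format mapping would also need a record representation clause. It
shows the length constrained to 0 .. 2**14 with a 16-bit Size clause,
and ContentType as its own enumeration type with its encoding given by
a representation clause.

   --  Sketch only: names and layout invented for illustration.
   with Interfaces; use Interfaces;

   package TLS_Record is

      --  "uint8 major; uint8 minor;"
      type Protocol_Version is record
         Major : Unsigned_8;
         Minor : Unsigned_8;
      end record;

      --  ContentType as its own type; the wire codes come from an
      --  enumeration representation clause, and the single-byte size,
      --  only implied by the (255) bound in the spec, is stated here.
      type Content_Type is
        (Change_Cipher_Spec, Alert, Handshake, Application_Data);
      for Content_Type use
        (Change_Cipher_Spec => 20,
         Alert              => 21,
         Handshake          => 22,
         Application_Data   => 23);
      for Content_Type'Size use 8;

      --  16 bits on the wire, but only 0 .. 2**14 is ever valid.
      type Fragment_Length is range 0 .. 2**14;
      for Fragment_Length'Size use 16;

      type Octet_Array is array (Fragment_Length range <>) of Unsigned_8;

      --  "type" is a reserved word in Ada, hence the component name Kind.
      type TLS_Plaintext (Length : Fragment_Length := 0) is record
         Kind     : Content_Type;
         Version  : Protocol_Version;
         Fragment : Octet_Array (1 .. Length);
      end record;

   end TLS_Record;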
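On the enum point specifically, a tiny invented example of what
treating each enum declaration as its own type buys you: the
commented-out assignment is the silent C-style integer/enum mixing,
and it simply does not compile.

   with Ada.Text_IO;

   procedure Enum_Check is
      type Content_Type is
        (Change_Cipher_Spec, Alert, Handshake, Application_Data);

      Code : constant Integer := 22;   --  say, a value read off the wire
      Kind : Content_Type;
   begin
      --  Kind := Code;   --  rejected at compile time: Integer and
      --                      Content_Type are different types
      Kind := Content_Type'Val (Code - 20);   --  explicit, range-checked mapping
      Ada.Text_IO.Put_Line (Content_Type'Image (Kind));   --  prints HANDSHAKE
   end Enum_Check;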