Path: utzoo!attcan!uunet!samsung!shadooby!mailrus!ncar!ico!vail!rcd
From: rcd@ico.isc.com (Dick Dunn)
Newsgroups: comp.lang.ada
Subject: Forward into the past
Summary: glass houses
Message-ID: <1989Nov19.055253.14320@ico.isc.com>
Date: 19 Nov 89 05:52:53 GMT
References: <14033@grebyn.com> <7053@hubcap.clemson.edu>
Organization: Interactive Systems Corporation

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe, 2847)
writes about UNIX and the Ada<->UNIX conflict that Ted Holden had commented on:

> A prime example is Unix; the current POSIX effort aims to
> standardize 1960's technology, thus resulting in a "lowest
> common denominator" which locks users into obsolescence.

I picked this out of the latter half of an article where Bill was objecting
to Ted saying things that were "blatantly false" (Bill's words).  Bill ought
to be a little more careful about his own blatant falsehoods.

The POSIX work does, in fact, standardize some things which existed in the
'60's.  But it also covers mechanisms introduced in the '70's (e.g., job
control) and even into the latter part of the '80's (such as the "session"
mechanism).  Wouldn't it be rather disturbing to have a standard which
*excluded* technology older than 1970?!?

Standards are developed in two ways--either they codify existing practice,
or they attempt to constrain future practice.  Most standards in computing
are of the former sort.  The danger of a "codification" standard is, as Bill
obliquely points out, that it doesn't provide any guidance for the future.
However, it does NOT lock people in; they are free to experiment with
extensions to this type of standard.  As new practice emerges beyond the old
standard, a basis for a successor standard is built.

Defining a standard _a_priori_ without the benefit of existing practice is
much harder to get right.  You have to use the best research available to
you, and hope it's right.  It's sometimes necessary; sometimes an industry
can't hope to move forward without a standard.  (Consider TV, CDs, etc.)
Ada is in this second class of standardization (unlike almost all other
programming languages).

It would help a lot if people who advocate either approach to
standardization would be less parochial and realize the tradeoffs.

> Ada's problem with Unix is that Unix, being 1960's technology,...

Careful there!  People who live in glass houses...  To the extent that UNIX
is 1960's technology, Ada is 1950's technology and UNIX is the more modern
of the two by a decade!  I'm serious...the basic control structures, not to
mention the "look and feel" of algorithmic languages, were pinned down by
the end of the '50's; it took another two years to get them canonized in
Algol 60 (with the "real" report out in '62).  Sure, if I'm going to
characterize Ada as a "'50's language" I have to ignore a lot of important
characteristics--but no more than are ignored by characterizing UNIX as
"1960's technology".

> [UNIX] does not properly support lightweight processes...Modernized
> versions of Unix (e.g., MACH) which are designed to provide
> such support remove the difficulty...
Bill is one of the people who have been arguing (over in comp.software-eng)
against the value of having a heavy operating systems course in a CS
curriculum...yet here we find that he thinks Mach is a "modernized version
of UNIX"???  I've seen trade rags muddle it that way, but it's wrong...
UNIX and Mach are not even the same type of animal.

But the more significant misunderstanding is the idea that somehow
lightweight processes should have been in UNIX long ago.  The reason we have
only recently seen a more widespread implementation of lightweight processes
is that it's only recently (meaning within the last few years) become
reasonably well understood just what the abstraction of a "lightweight
process" should be.  The understanding has been evolving along with
experiments in hardware parallelism of various granularities.  The problem
was not a matter of finding a workable abstraction, but of choosing the
right one(s).

The approach, in the UNIX community in particular, is to try things first in
research work, then let them gain some wider experience in the more
avant-garde places in industry, bring them into the mainstream, and finally
standardize them.  This cannot help but look strange in the Ada world, where
standardization comes first and is followed by trial.
-- 
Dick Dunn	rcd@ico.isc.com    uucp: {ncar,nbires}!ico!rcd     (303)449-2870
   ...`Just say no' to mindless dogma.