From: bobduff@world.std.com (Robert A Duff)
Subject: Re: Depending on passing mechanism
Date: 1997/10/21
Organization: The World Public Access UNIX, Brookline, MA
Newsgroups: comp.lang.ada

(e-mailed and posted)

In article , Henry Baker wrote:
>It should be comforting to you when your Boeing 777 crashes that its Ada code
>met the standard....
>
>This is another one of those cases that give standards bodies such a
>bad name -- the behavior is well-defined as 'non-deterministic', but
>non-functional.

Henry, I'm surprised you still lurk here. ;-)

I'm inclined to agree with you that this (by-copy vs. by-ref) is a bad thing to leave "nondeterministic". On the other hand, what the heck do you want? Lots of languages leave lots of things ill-defined/non-deterministic. Do you want Boeing to use C instead, to make 777s fly?! C has far more nondeterministic stuff than Ada (although this particular thing is nailed down better in C than in Ada -- with some efficiency cost).

The state of the art is for the average programmer (in both C and Ada) to assume that "what my current compiler does today is law", and for the better-than-average programmer to accidentally trip over nondeterminism. The advantage of Ada here is that there's less of it.

Then there's Java, which is about as deterministic as you can get -- namely, the core language allows nondeterminism only for concurrent threads (btw Ada does better here). But the libraries do whatever who-knows-who's windowing library likes, and of course some sort of "nondeterminism" is introduced by the fact that various Java compilers don't obey the Java standard (often deliberately). At least the semantics of plain old integer arithmetic and similar mundane operations are nailed down. But Java pays an efficiency price for avoiding non-determinism.

If I were designing a language from scratch, I think I would try to get the best of both worlds: nail things down, but give the compiler enough information to optimize. In the case we're talking about here (by-copy vs by-ref), define the semantics as by-copy, but make sure the compiler knows about side effects and exceptions well enough to do by-ref in the vast majority of cases when it's both safe and more efficient. Perhaps that's a pipe dream. If so, IMHO the Pascal semantics is better, where the programmer chooses by-copy vs by-ref (and loses when that choice is bad for a particular machine).

The point is: the vast majority of calls don't do any troublesome aliasing. So do you go the Ada(*)/Fortran way, which says that the compiler can *assume* that, even if it's not true, or do you go the Java/C way, which is deterministic, but inefficient? My wish is: neither -- I want determinism *and* efficiency. (And by the way, I wouldn't go the "pure functional" route.)

(*) Ada doesn't actually assume no aliasing. It's a bit more conservative than Fortran. Not much.
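To make the aliasing point concrete, here's a contrived little sketch (the names are made up by me, not taken from any real program). The formal is an untagged record passed as "in out", and the actual is aliased with a global that the procedure also touches; since the mechanism for such types is unspecified, the result depends on which one your compiler happens to pick:

   procedure Mechanism_Demo is
      type Counter is record      -- untagged record: the compiler may
         Value : Integer := 0;    -- pass it by copy or by reference
      end record;

      Global : Counter;

      procedure Bump (X : in out Counter) is
      begin
         Global.Value := Global.Value + 1;  -- side effect on the global
         X.Value := X.Value + 10;           -- update through the formal
      end Bump;
   begin
      Bump (Global);  -- the actual is aliased with Global
      -- By copy:      X comes in as 0, goes out as 10, and the copy-back
      --               wipes out the increment, so Global.Value ends up 10.
      -- By reference: X and Global are the same object, so it ends up 11.
   end Mechanism_Demo;

Two conforming compilers can give you 10 and 11 for the same program, which is exactly the sort of nondeterminism we're talking about.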
So anyway, I agree with your criticism of Ada in this particular respect, but I'm not quite sure why you criticize Ada in particular (as opposed to lots of other mainstream languages that choose non-determinism for efficiency reasons).

- Bob