Date: 27 May 93 19:33:35 GMT
From: eachus@mitre-bedford.arpa (Robert I. Eachus)
Subject: Re: good software engineering (was: mixing integer and logical ops)

In article groleau@e7sa.crd.ge.com (Wes Groleau X7574) writes:

 > I think the reason is not that Ada makes it easier to spot "bad code"
 > Rather, IMHO, it's that the "Ada community" has from the beginning
 > had an attitude in favor of "pretty" and readable source code. The
 > C community on the other hand is unsuccessfully trying to change a
 > culture that EXPECTS code to be unreadable.

I think it is a little more complex than that. Why did the culture evolve that way, and why does it evolve similarly when new communities adopt Ada?

First of all, declarations in Ada come first, and it is hard to write unreadable declarations. So if the private part or the sequence of statements is obtuse, the reader--even if he or she is also the author--notices the increase in the fog level. (Yes, an expert can write a declarative part with a high fog index. But by the time he has that capability, he has also been conditioned not to.)

Second, Ada tends to magnify little flaws out of proportion. When you write

    Int(Integer(Int'First) + N - 1)

the need to rethink the type of N is immediately apparent. In this case, which happened to me earlier this week, the reason for the expression was that N could be out of range of the generic formal type Int. Replacing the variable N by a new variable, linearly related to N, but always in range, got rid of the need to use Integer for anything.
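That refactoring might look something like the sketch below. The original code is not shown in the post, so the package name, the function, and the exact linear relation between the old N and its replacement (here called Count) are my assumptions; only the generic formal type Int and the before/after expressions come from the post.

    generic
       type Int is range <>;          --  generic formal type, as in the post
    package Indexing is
       function Last_Index (Count : Int) return Int;
       --  Count plays the role of the new variable: linearly related to
       --  the old N, but declared of type Int so it is always in range.
    end Indexing;

    package body Indexing is
       function Last_Index (Count : Int) return Int is
       begin
          --  Before the refactoring, with N : Integer possibly outside
          --  Int's range, the computation had to go through Integer:
          --
          --     Int (Integer (Int'First) + N - 1)
          --
          --  With Count of type Int, no conversion is needed at all:
          return Int'First + Count - 1;
       end Last_Index;
    end Indexing;
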
So Ada code is held to a high standard of beauty not only because elegance in Ada is usually easy, but also because the occasional wart is so ugly that anyone with the least bit of pride will try to clean it up. Yes, the culture is important, but the language keeps the culture the way it is.

A similar case is short variable names in C or APL. This is not "just" a cultural thing; those languages tend to make long names look silly. (C also had an early restriction on the number of significant characters (8) and on external names--but that was an implementation restriction. C has always allowed arbitrarily long identifiers.)

--
					Robert I. Eachus

with Standard_Disclaimer;
use  Standard_Disclaimer;
function Message (Text: in Clever_Ideas) return Better_Ideas is...