Date: 22 Oct 92 08:07:35 GMT
From: cis.ohio-state.edu!zaphod.mps.ohio-state.edu!rpi!scott.skidmore.edu!psinntp!psinntp!intrepid!gary@ucbvax.Berkeley.EDU (Gary Funck)
Subject: The Obfuscated Ada Contest (was Re: An admittedly biased ...)
Message-ID: <1992Oct22.080735.19815@intrepid.com>

In article <1992Oct19.165603.16988@nntpd2.cxo.dec.com> wallace@bonmot (Richard Wallace) writes:
>
>Gregory Aharonian (srctran@world.std.com) has put his finger on the real
>issue behind the C++/Ada religious wars -- economics.
>
>I cannot agree with the slams that are put forward by Gregory. The issue
>behind the economic argument is one that can be brought down to
>implementation and education issues. I list them here:

... interesting info on the economics of PC and GNU software left out ...
... info on lint as an unacceptable tool left out ...

>This is the issue that the U.S. Government wants.
>Not to grow the individual (who is around for 20 years on a single program?)
>but to grow the technology so that second-sourcing -- not a well understood
>concept in the software industry -- can be done on the software of the
>Government computer software.
>

To change this thread a little: let's talk about the Ada programs that I've seen in the "real world" of Ada programming on large projects.

First, my disclaimer: I haven't worked on a defense contract, so I am not aware of all the constraints that defense contracts impose on programming practice. However, I'd like to learn more, because I have had the opportunity to observe the output of several large Ada (commercial and defense) projects.
Most, if not all, of the code I've read has been almost completely opaque, reaching standards of opaqueness seldom achieved in the C source code (and even assembly language) that I've read. Here are some of the practices I've seen, which just leave me wondering:

1. Naming conventions that only computers can understand. Something like SCU_1_1_2_FCU_VAR (the 2167 disease), or ACU_CONTROLLER_X_AXIS (the subsystem disease). It seems that the readability of the program has been sacrificed to the god of traceability. Tell me that you get used to reading 50-character variable names where only the last 5 characters are significant. Tell me that it's not a monumental effort to change program structure. Tell me it wouldn't be better to implement a traceability/CASE tool that keeps the unit hierarchy and dependency numbering scheme in parallel with the program components.

2. The fully-qualified, never-use-the-use-statement syndrome, which is a close cousin to the "local renames are clearer ... really" syndrome. First, I'm aware of some/most of the gotchas related to the rampant use of the "use" clause. But hey, I give the programmer the benefit of the doubt, and also believe in code review. If a "use" makes things clearer, then so be it. If adding a new item to another package causes a clash in one of its "users", then fix the clash and get on with it. I've never seen these clashes come up often (though my experience working on large Ada projects is limited). It seems to me that the maintenance cost of understanding and reworking a set of packages which never use "USE" is much higher than the cost of fixing an occasional name clash. Isn't this a tools issue? Couldn't a pretty printer, cross-reference program, or browser provide the necessary package resolution information on an as-needed basis?

3. The never-use-a-package-variable syndrome, whose close cousin is the pass-the-world-via-procedure-parameters syndrome. On this one, I have only second-hand experience.
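To make the contrast concrete, here's a minimal sketch of the two styles -- all the names here are invented for illustration, not taken from any real project:

```ada
-- Style A: "pass the world" -- every piece of state threaded
-- through the parameter list, down the whole call hierarchy.
procedure Update_Position
  (X, Y, Z          : in out Float;
   Roll, Pitch, Yaw : in out Float;
   Airspeed         : in     Float;
   --  ... a dozen more parameters in the real thing ...
   Elapsed          : in     Duration);

-- Style B: the state lives in a library-level package, and the
-- packages that need it reference it directly.
package Vehicle_State is
   type State_Record is record
      X, Y, Z          : Float := 0.0;
      Roll, Pitch, Yaw : Float := 0.0;
      Airspeed         : Float := 0.0;
   end record;
   Current : State_Record;  -- the "global" such mandates forbid
end Vehicle_State;

procedure Update_Position (Elapsed : in Duration);
-- reads and writes Vehicle_State.Current directly
```

Style A costs a copy (or at least an address) per parameter per call level; Style B touches the data in place, which is consistent with the speed/size difference described below.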
I've heard of a project which mandated that global (library-level package) variables could *not* be referenced by packages other than the packages that declared them. This mandate apparently led to the development of procedures with up to 20 parameters, whose values were passed up and down the call hierarchy. When the delivered system couldn't meet its time constraints, the code was reworked so that it judiciously used global variables. Voila! It was twice as fast and half the size.

4. The Booch component syndrome. I like Booch's books and OOP, don't get me wrong. But when you run into a program that's running at PC/AT speed on an R4000 box, give me a call :-). When you find out that the program is beating its brains out because it creates 1000 iterators, each implemented as a task, we definitely have something to talk about. My point here is: if some degree of abstraction is good, a very high degree of abstraction is not always better.

I'll close with a couple of questions:

- Are some/all of the above practices common to the Ada projects you've worked on? If yes, what's your take? Did such coding practices really improve understanding and maintaining the program?

- Are the programming practices described above unique to Ada? I ask because I've seen a pattern in the "production" Ada code that I've read; each program runs along the lines outlined above. The C code that I've worked on had other problems, but long names, rigid structure, and unnecessary levels of abstraction weren't among them :-).

And a concern:

- If the only Ada programs that a new graduate/new hire/new Ada programmer saw were obscured by "encrypted" naming conventions, rigid structure, and poor performance, I'd be seriously concerned that Ada would take the rap, when it is only the messenger.

Dissenting/confirming opinions welcome,
- Gary
--
| Gary Funck  gary@intrepid.com  [uunet!uupsi!intrepid!gary]
| Intrepid Technology Inc., Mountain View CA  (415) 964-8135
--