From: "Marc A. Criley"
Subject: Re: What ada 83 compiler is *best*
Date: 1998/12/09
Message-ID: <366E7EFC.841A2418@lmco.com>
Organization: Lockheed Martin M&DS
Newsgroups: comp.lang.ada
References: <3666F5A4.2CCF6592@maths.unine.ch> <74hk55$6t5$1@remarQ.com> <74jhct$e2m$1@remarQ.com> <74jpk8$p8j$1@remarQ.com>

Rick Thorne wrote:
>
> 1) You haven't provided concrete facts. You identified studies on an
> obscure web page as "hard core" evidence of Ada's greater productivity.
> The Institute for Creation Research offers "hard core evidence" of
> Scientific Creationism too. Ever wonder why 95% of PhDs in the biological
> sciences poo-poo these findings? Simple: these people aren't scientists;
> they're advocates POSING as scientists. When AFA produces a study showing
> that Ada's a better language, it's like Dow-Corning doing research on the
> safety of silicone breast implants. You'd be out of your mind not to take
> the data *cum grano salis*.
>
> 2) I've read many studies touting the productivity of Ada, and I have to
> say I think they're all a joke. Why? Think on your training as an
> engineer. To properly conduct a comparative study like this, you need to
> set up the experiment under tightly controlled conditions. To conduct
> an Ada vs. C++ study, you need identical development environments,
> identical tools, and identical staff in order to proceed. Additionally,
> you need people who AREN'T advocates of one side over the other analyzing
> the data. Finally, you need to have ALL elements of the lifecycle
> identical. My belief: if Ada developers ACTUALLY DO beat the C++ers in
> all areas (quality, development time, etc.), it's less because of the
> source code and much more because of the systems engineering involved.
> Most Ada organizations (at least in the US) are government controlled in
> some way (go to a Lockheed-Martin CDR/PDR if you don't believe this), and
> the systems engineering is very tight. The Ada people tend to get better
> requirements than the C++ people by virtue of their organizational
> domains, and the design is usually less brittle for the same reasons.
>
> My bottom line here: don't quote productivity studies and expect me to
> believe them. For all the reasons I've stated above, I think they're
> uncontrolled and uncontrollable, AND I think the studies are aggressively
> skewed by advocates on whatever side. I've actually read an AFA study
> that stated up front that the study itself needs to be taken with a
> grain of salt!

Let me suggest one brief productivity study summary made by a highly-respected programming language non-partisan, Software Productivity Research's Capers Jones. In a letter to the editor published in the October 1998 issue of Crosstalk, Jones was responding to an earlier article regarding SLOCs and metrics. While advocating function points over SLOCs as a metrics basis was the motivation for the letter, he did briefly summarize the results of a comparative language study.
"Elizabeth Starrett's article, "Measurement 101," Crosstalk, August 1998, was interesting and well written, but it left out a critical point. Metrics based on "source lines of code" move backward when comparing software applications written in different programming languages. The version in the low-level language will look better than the version in the high-level language. "In an article aimed at metrics novices, it is very important to point out some of the known hazards of software metrics. The fact that lines of code can't be used to measure economic productivity is definitely a known hazard that should be stressed. "In a comparative study of 10 versions of the same period using 10 different programming languages (Ada 83, Ada95, C, C++, Objective C, PL/I, Assembler, CHILL, Pascal, and Smalltalk), the lines of code metric failed to show either the highest productivity or best quality. Overall, the lowest cost and fewest defects were found in Smalltalk and Ada95, but the lines of code metric favored assembler. Function points correctly identified Smalltalk and Ada95 as being superior, but lines of code failed to do this. " (http://www.stsc.hill.af.mil/CrossTalk/1998/oct/letters.html) While I do not know the details of the referenced study, given the reputation of Capers Jones and the nature of his business, I believe one can trust the validity of his conclusions. > > Respectfully, > > -- > ? Rick Thorne ? "I'm quite illiterate, ? > ? software engineer by day ? but I read a lot" ? > ? harried father of two by night ? J. D. Salinger ? > ? rick.thorne@lmco.com ? ? > ? http://www.geocities.com/Athens/Oracle/6816/ ? -- Marc A. Criley Chief Software Architect Lockheed Martin M&DS marc.a.criley@lmco.com Phone: (610) 354-7861 Fax : (610) 354-7308