From: Ken Garlington
Subject: Re: C/C++ knocks the crap out of Ada
Date: 1996/02/20
Message-ID: <312992F6.588D@lfwc.lockheed.com>
Organization: Lockheed Martin Tactical Aircraft Systems
Newsgroups: comp.lang.ada,comp.lang.c,comp.lang.c++
References: <4etcmm$lpd@nova.dimensional.com>

Jon S Anthony wrote:
> 
> So, does this mean that there are _no_ confirmed cases of probes lost due
> software? If so, I'm impressed as software has just plain _got_ to be
> the weakest link in the chain. 1/2 :-)

Actually, I would say that system requirements are the weak link in the chain, although the errors often tend to occur in software these days, since more requirements (and particularly the harder, more volatile requirements) tend to be put in software. Three cases near and dear to my heart:

For years, I have heard the story about how a "bug" in the F-16 flight control computer caused it to roll to an inverted position when crossing the equator. I have never found anything authoritative that exactly described the circumstances (if anyone has this information, let me know); but there are two points to be made about this:

1. Until relatively recently, the F-16 flight control computer didn't have any software in it. It was an analog computer.

2.
Some people believe they heard this story in terms of the behavior of a handling-qualities _simulation_ of the flight control system, in which the environment model only contained a part of the northern hemisphere. Someone decided to see what happened when you "flew off the edge of the earth."

The other two cases are more recent, and involve pilot-induced oscillations leading to an aircraft crash. In both cases, the press widely reported (in at least one case, quoting a senior executive at one company) that "the software got confused." However, the error in both cases was due to an interaction between the pilot and the control law model, which can be implemented in either hardware or software. (The pilot will probably say the control laws were wrong; the control law people will probably say that the pilot operated the system outside its limits. Both are probably right. :) Nonetheless, because the behavior occurred in software, that's what gets the blame.

Dr. Leveson's "Safeware" defines this issue much better than I just did, BTW.
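Whatever its real origin, the equator story is usually told as a requirements failure rather than a coding failure: a model that is only valid inside its assumed region behaves arbitrarily outside it. A purely hypothetical sketch of that pattern (the function and numbers are invented for illustration; as noted above, the real F-16 flight control computer was analog):

```python
def roll_command(latitude_deg: float) -> float:
    """Toy 'control law' that infers which way is up from the sign
    of latitude -- an unstated assumption that the vehicle stays in
    the modeled (northern) hemisphere."""
    if latitude_deg >= 0.0:
        return 0.0    # wings level, as intended
    else:
        return 180.0  # assumption violated: commands an inverted attitude

print(roll_command(45.0))   # inside the assumed region: 0.0
print(roll_command(-45.0))  # "off the edge of the earth": 180.0
```

The code does exactly what it was written to do; the defect is in the requirement it encodes, which is the point of the anecdote.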