From: benari@zeus.datasrv.co.il (Moti Ben-Ari)
Newsgroups: comp.lang.ada
Subject: Motivating inheritance and dyn. poly.
Date: 18 Nov 1994 13:16:56 GMT
Organization: DataServe LTD. (An Internet Access Provider), Israel.
Message-ID: <3ai9g8$5e6@israel-info.datasrv.co.il>

I am writing a textbook which will have a chapter on object-oriented
programming (I suppose all textbooks these days have to have a chapter on
OOP...). My problem is that I have yet to see a truly convincing example of
the need for inheritance and dynamic polymorphism in a real application.
Can anyone contribute a good example of an application that is
significantly better with inheritance?

To analyze in more detail: the 9X Rationale talks about variant programming
and class-wide programming. As for class-wide programming, most of the
examples seem to deal with heterogeneous data structures, whose problems
are caused by strong typing (I sketch below what I mean). If that is the
justification for inheritance, then it becomes a technical matter of little
interest to the average programmer.

I would like to go more deeply into variant programming, since type
extension seems to be the real justification for inheritance. One
phenomenon that I repeatedly see is "variant overkill", i.e. using variants
for no essential purpose. I believe that the normative use of variants is
to represent true alternatives, e.g. a message which can take dozens of
forms. On the other hand, many try to use them to save a couple of words of
memory, as if we were still programming for the dear, departed 16K PDP-11.

The example in the Highlights section of the Rationale is typical. There is
no reason why a simple record could not be used:

   type Alert is
      record
         P : Priority;
         Time_of...
         Message...
         Action...
         Ring...
      end record;

with null values or pointers used for non-relevant fields. Processing can
be done using __non-nested__ if's or case's:

   if (A.P = Medium) or (A.P = High) then
      ...
   end if;

   if (A.P = High) then
      Display...
      Set_Alarm...
   end if;

This is trivial to understand and maintain, and the time and space overhead
is minimal. It is _true_ that adding an alert of a new priority will
require significant modification of the package specification. But I
believe that a more likely scenario is that the day after you deliver the
system you will get a call: "My operators are falling asleep at the
controls of the reactor; I want an alarm raised at all priorities." In a
non-variant, non-tagged implementation, the change is a trivial
modification __within__ the package body:

   if (A.P = High) then
      Display...
   end if;
   Set_Alarm...

With a tagged implementation (sketched below), this trivial request breaks
the entire type hierarchy and requires non-trivial modification of the type
definitions.
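For concreteness, here is roughly the shape of the tagged version. I am
reconstructing it from memory of the Rationale rather than quoting it, so
the identifiers (Handle, Action_Officer, Ring_Alarm_At, and the stand-in
Text and Person types) are mine, and the bodies are omitted:

   with Ada.Calendar;

   package Alerts is

      subtype Text   is String (1 .. 80);   -- stand-ins for whatever
      subtype Person is String (1 .. 40);   -- the real system would use

      type Alert is tagged
         record
            Time_Of_Arrival : Ada.Calendar.Time;
            Message         : Text;
         end record;
      procedure Handle (A : in out Alert);
      -- base behaviour: display the message

      type Low_Alert is new Alert with null record;
      -- inherits Handle unchanged

      type Medium_Alert is new Alert with
         record
            Action_Officer : Person;
         end record;
      procedure Handle (MA : in out Medium_Alert);
      -- overrides: also informs the action officer

      type High_Alert is new Medium_Alert with
         record
            Ring_Alarm_At : Ada.Calendar.Time;
         end record;
      procedure Handle (HA : in out High_Alert);
      -- overrides: also rings the alarm

   end Alerts;

The customer's request to ring the alarm at every priority now means that
Ring_Alarm_At and the alarm-ringing behaviour have to migrate from
High_Alert up to Alert itself, so every visible type declaration and every
overriding Handle is touched; in the flat record it is one statement moved
outside an if, hidden in the package body.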
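And for the class-wide point above, the kind of heterogeneous structure the
examples keep showing is something like the following client of the package
just sketched (again, the names are my own, not the Rationale's):

   with Alerts; use Alerts;

   package Alert_Queues is

      type Alert_Ref  is access all Alert'Class;
      type Alert_List is array (Positive range <>) of Alert_Ref;

      -- The list may mix low, medium and high alerts freely.
      procedure Handle_All (L : in Alert_List);

   end Alert_Queues;

   package body Alert_Queues is

      procedure Handle_All (L : in Alert_List) is
      begin
         for I in L'Range loop
            Handle (L (I).all);   -- dispatches on the actual type of the element
         end loop;
      end Handle_All;

   end Alert_Queues;

The dispatching loop is tidy, and strong typing does not get in the way,
but this is just what I meant by a technical matter: it is hard to see why
the average programmer would call it a compelling need for inheritance.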
I don't want to set myself against the whole world and object to
object-oriented programming (no pun intended!). I would like to ask for
help in motivating the use of inheritance in the design and programming of
a real application: not insects or rectangles, but air-traffic control,
financial market modelling, cellular phone switching -- the sort of thing
we claim OOP in general, and Ada in particular, is good for.

Thanks,
Moti

---------------------------------------
Dr. Moti Ben-Ari, Mavo Software Ltd.
benari@datasrv.co.il
POB 1603, Rehovot 76115, Israel
Tel: 972-8-470-793, Fax: 972-8-466-831
---------------------------------------