Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.1 6/24/83; site mit-eddie.UUCP
Path: utzoo!linus!vaxine!wjh12!genrad!mit-eddie!nessus
From: nessus@mit-eddie.UUCP (Doug Alan)
Newsgroups: net.lang,net.lang.ada
Subject: Re: Abstraction In Ada
Message-ID: <2226@mit-eddie.UUCP>
Date: Fri, 22-Jun-84 03:47:25 EDT
Article-I.D.: mit-eddi.2226
Posted: Fri Jun 22 03:47:25 1984
Date-Received: Sat, 23-Jun-84 06:37:42 EDT
References: <1979@mit-eddi.UUCP> <5400007@ea.UUCP> <7506@umcp-cs.UUCP>, <2144@mit-eddie.UUCP> <2620@ncsu.UUCP>
Organization: MIT, Cambridge, MA

> From: mauney@ncsu.UUCP (Jon Mauney)

>> To do a good job with data abstraction, you really need
>> heap-based allocation with automatic garbage collection.
>> Ada doesn't support this.

> I don't see how this follows, except that lack of a
> garbage-collected heap restricts your ability to implement an
> ADT using a garbage-collected heap.  There are advantages and
> disadvantages to heap allocation of data objects; I don't see
> how they relate to abstraction.

If data abstraction is done right, data types that are added to the
language should look just like data types that are already built into
the language.  Stack-based allocation doesn't work right because you
have to know how much space you will use before you use it, and you
might not know how much space you need.  That is not very abstract.
Heap-based allocation where explicit deallocation is required doesn't
work right because you can have dangling references.  An object isn't
very abstract if you try to reference it and find out it's been turned
to garbage.  Explicit deallocation also violates modularity, because
one part of the program has to take responsibility for deallocating an
object, and to do that it has to know when everyone else is no longer
using it.

I will demonstrate by example.  Let's say that you want to implement a
bignum (integer of arbitrary size) abstraction.  In order to be
abstract, the bignum data type should be just as first-class as any
other number type.  If you use stack-based allocation, you will have
to worry about reserving the right amount of space in advance.  But
gee, you don't have to do this with number types that are built in.
If you use heap-based allocation with explicit deallocation, you will
have to worry about deallocating a bignum when you are finished with
it.  But gee, you don't have to do this with number types that are
built in either.

> (C) a type can be declared "limited private."  In this case the
> only operations supplied by the system are declaration of
> variables (an essential ability) and passing as parameter (also
> essential).  This is useful in many cases, because the
> system-supplied operations are not appropriate to the particular
> abstraction or implementation.  The '=' operator may be
> overloaded, and definition of '=' automatically implies
> definition of '/='.  Sad to say, ':=' is not an operation that
> can be overloaded, and assignment of limited private types must
> be done using a different syntax.  This system is not without
> its defects, but is it really so horrible?

Yes, it's gross!
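To make the complaint concrete, here is a rough sketch of what a
bignum as an Ada limited private type might look like.  The package
name Bignums, the Copy procedure, and the Max_Digits bound are all
made up for illustration, and the arithmetic operations are left out;
the point is only the shape of the interface: "=" can be overloaded
(and gives "/=" for free), ":=" cannot, and without a garbage-collected
heap the representation has to reserve a fixed amount of space up
front.

    package Bignums is

       type Bignum is limited private;

       -- Clients can declare Bignum variables and pass them as
       -- parameters, but ":=" cannot be overloaded, so assignment
       -- has to go through an explicit procedure.
       procedure Copy (From : in Bignum; To : in out Bignum);

       -- "=" may be overloaded for a limited private type; defining
       -- "=" implicitly defines "/=" as well.
       function "=" (Left, Right : Bignum) return Boolean;

    private

       -- Without a garbage-collected heap, the representation must
       -- reserve its space in advance: the problem with stack-based
       -- allocation described above.
       Max_Digits : constant := 200;

       type Digit_Array is array (1 .. Max_Digits) of Integer range 0 .. 9;

       type Bignum is
          record
             Length : Integer range 0 .. Max_Digits := 0;
             Value  : Digit_Array;
          end record;

    end Bignums;

    package body Bignums is

       procedure Copy (From : in Bignum; To : in out Bignum) is
       begin
          -- Inside the package the full (non-limited) type is
          -- visible, so ordinary assignment is allowed here.
          To := From;
       end Copy;

       function "=" (Left, Right : Bignum) return Boolean is
       begin
          if Left.Length /= Right.Length then
             return False;
          end if;
          for I in 1 .. Left.Length loop
             if Left.Value (I) /= Right.Value (I) then
                return False;
             end if;
          end loop;
          return True;
       end "=";

    end Bignums;

Where a built-in numeric type lets a client just write X := Y, a
client of this package has to write Copy (From => Y, To => X), and if
a value ever needs more than Max_Digits digits the abstraction simply
breaks.  That is exactly the sense in which the added type does not
look like a built-in one.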
You also forgot to mention that if a composite type has components of
a limited private type, "=" is not available for objects of the
composite type.

> Can you name a language that will not elicit a "Bleah" from
> someone on the net?  Languages so obscure that no one on the net
> has heard of them

CLU is my choice.  It is small, simple, clean, powerful, and general.
It is also quite efficient.  It sometimes sacrifices power for the
sake of simplicity.  It doesn't do type inheritance or run-time type
generics, so it's not suitable for everything.  But for what it tries
to do, it does remarkably well -- it has the best trade-off of
power/ease-of-use/efficiency I've ever seen.

Death to Ada!  Long live CLU.
-- 
		-Doug Alan
		 mit-eddie!nessus
		 Nessus@MIT-MC

		"What does 'I' mean"?