From mboxrd@z Thu Jan 1 00:00:00 1970
From: dewar@merv.cs.nyu.edu (Robert Dewar)
Subject: Re: Garbage Collection in Ada
Date: 1996/10/17
Organization: New York University
Newsgroups: comp.lang.ada
References: <01bbb910$f1e73f60$829d6482@joy.ericsson.se> <199610132138291604607@dialup101-6-14.swipnet.se> <19961014115513529729@dialup105-2-16.swipnet.se> <32652A7C.41C6@mti.sgi.com> <3265C383.167E@mti.sgi.com> <3266C15E.2781@mti.sgi.com>

"I believe the version with full coalescing has worst-case allocation
time log N, where N is the heap size. Consider a heap with only one
large empty block and repeated requests to alternately allocate and
deallocate a 1-byte block. (I'm referring to the algorithm in Knuth,
vol. 1, 2nd edition, starting at page 442. Knuth credits it to
Markowitz and Knowlton.)"

Yes, sure, if the largest block size is the size of the heap, but that
is not the normal scenario for the binary storage allocator.

You are quite right about the fragmentation, though. I was thinking of
something else when I answered, and indeed the fragmentation can be
much worse than a factor of 2. The only way to keep fragmentation from
becoming arbitrarily bad is to implement a proper compacting garbage
collector.

My feeling is that if you are going to go beyond the simple
conservative collectors, you really want to go all the way to a fully
compacting collector. This is more practical with Ada than with C++.
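
To make the worst case in the quoted paragraph concrete, here is a
minimal Ada sketch (not from the original post) that simply counts the
split steps a binary buddy allocator must perform when a 1-unit block
is requested from a heap consisting of a single free block; the heap
size of 2**20 units and the name Buddy_Worst_Case are illustrative
assumptions, not anything from Knuth's text or the original thread.

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Buddy_Worst_Case is
      --  Assumed heap size, purely illustrative.
      Heap_Size  : constant := 2 ** 20;
      Block_Size : Natural  := Heap_Size;  --  one large free block
      Splits     : Natural  := 0;
   begin
      --  Allocating 1 unit forces the allocator to halve the free
      --  block repeatedly until the smallest block size is reached.
      while Block_Size > 1 loop
         Block_Size := Block_Size / 2;
         Splits     := Splits + 1;
      end loop;
      --  Freeing that block coalesces the buddies back, mirroring
      --  the splits, so each allocate/free pair does ~2 * log2(N) work.
      Put_Line ("Splits per 1-unit allocation:" &
                Natural'Image (Splits));
      Put_Line ("Steps per allocate/free pair:" &
                Natural'Image (2 * Splits));
   end Buddy_Worst_Case;

With a 2**20-unit heap this reports 20 splits per allocation, i.e.
work proportional to log N for every request in the alternating
allocate/deallocate pattern the quoted paragraph describes.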