From mboxrd@z Thu Jan 1 00:00:00 1970
X-Spam-Checker-Version: SpamAssassin 3.4.4 (2020-01-24) on polar.synack.me
X-Spam-Level:
X-Spam-Status: No, score=-1.3 required=5.0 tests=BAYES_00,INVALID_MSGID autolearn=no autolearn_force=no version=3.4.4
X-Google-Language: ENGLISH,ASCII-7-bit
X-Google-Thread: 103376,df05246395602224
X-Google-Attributes: gid103376,public
From: dewar@merv.cs.nyu.edu (Robert Dewar)
Subject: Re: Variable-block memory managers bad in realtime
Date: 1997/12/23
Message-ID: #1/1
X-Deja-AN: 309626463
References:
X-Complaints-To: usenet@news.nyu.edu
X-Trace: news.nyu.edu 882886541 28019 (None) 128.122.140.58
Organization: New York University
Newsgroups: comp.lang.ada
Date: 1997-12-23T00:00:00+00:00
List-Id:

Joe says

<>

There are most certainly algorithms that have guaranteed worst-case (I assume that is what you mean by absolute) upper bounds on response time. Obviously absolute protection against fragmentation is impossible for an arbitrary allocation pattern unless you do compacting (and there are algorithms for doing compacting that meet the other requirements; talk to Henry Baker, for example!) As to requisite efficiency, well, that of course is undefined.

It is of course not true that fixed-block allocators waste no space. For an arbitrary allocation pattern, they waste lots of space by allocating more space than is required. If on the other hand you only need fixed blocks of a given size in a particular application, it is trivial to write a general purpose algorithm that behaves as you demand for this particular allocation pattern.
<>

There is a substantial literature that specifically addresses the issues of dynamic allocation for non-critical real time systems (critical real time systems usually don't do dynamic allocation at all, which is why I make the distinction -- except in very trivial cases, certification of safety-critical requirements is very difficult if there is dynamic allocation of any kind, which is why most protocols for SC programming forbid dynamic allocation completely).

<>

You may doubt, but my experience (from doing a great deal of real time programming in very small [4K to 16K byte memories] real time systems) is that sophisticated algorithmic techniques, and particularly the use of clever data compression techniques, are highly relevant to that domain (and need people who know the current algorithms literature reasonably well).

<>

No, of course not, and that is why Ada provides the storage_pool capability, so that you can write specialized allocators that are tailored to your application. However, one can do a lot better than you think in providing a general algorithm. Despite your skepticism, it is quite possible to create general purpose algorithms that give guaranteed worst-case response times and that are highly efficient for simple allocation patterns of the type you mention here.

Robert