Newsgroups: comp.lang.ada
Subject: in defense of GC (was Re: How come Ada isn't more popular?)
From: Ray Blaak
Organization: The Transcend
Date: Fri, 02 Feb 2007 01:37:26 GMT

"Randy Brukardt" writes:

> I don't agree. Ada provides those hooks to the user code in the form of
> controlled types and finalization. It's always possible for an object to
> know that it is about to be destroyed. Combined with using local objects as
> much as possible (so that they can be destroyed automatically) and avoiding
> references as much as possible, there is no particular problem, and no GC is
> necessary.

Doing GC by marking scope exits is a fundamentally inefficient way of
collecting. It is the classic performance hit that reference-counting
schemes suffer from, and it can result in "waves" of cascading cleanups.

Modern GC algorithms instead "find" unused objects at collection time.
In between collections the GC is not running at all, which leads to
faster and more predictable execution times. Nor does collection time
necessarily imply a huge performance spike, since the work can be
amortized across future collections.

Furthermore, finalization-based GC only gives the programmer control
over their own types. If GC is part of the runtime, one has the
confidence that all memory in the system is being properly managed
(outside of unsafe areas, interfaces to foreign functions, etc.).
Modern GC systems simply do a better job at it than people do.
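To make the scope-exit mechanism concrete, here is a minimal sketch of
the controlled-type hook Randy describes, assuming an Ada 2005 compiler
(the names Demo and Tracer are my own, invented for illustration):

with Ada.Text_IO;
with Ada.Finalization;

procedure Demo is
   use Ada.Text_IO;

   type Tracer is new Ada.Finalization.Controlled with record
      Id : Natural := 0;
   end record;

   --  The hook Randy describes: the object "knows" it is about to
   --  be destroyed, because Finalize runs automatically at scope exit.
   overriding procedure Finalize (Object : in out Tracer) is
   begin
      Put_Line ("finalizing" & Natural'Image (Object.Id));
   end Finalize;

begin
   declare
      A : Tracer := (Ada.Finalization.Controlled with Id => 1);
      B : Tracer := (Ada.Finalization.Controlled with Id => 2);
   begin
      Put_Line ("leaving scope");
   end;  --  Finalize runs here for B then A, eagerly, every time
   Put_Line ("after scope");
end Demo;

Each Tracer is finalized exactly when its scope ends, so a structure
holding thousands of them triggers the whole cascade at a single scope
exit. A tracing collector, by contrast, pays nothing until it actually
collects.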
> The only time you need GC is when the owner of an object loses any
> connection to it without destroying it. I think that represents a program
> bug - a manifestation of sloppy programming.

It is only sloppy programming in the context of manual cleanup. With a
real GC you do not have to clean up at all. One simply stops using an
object when it is no longer needed, just "dropping it on the floor" and
leaving it to the GC to collect eventually. This considerably simplifies
many algorithms and much application code, and the "sloppiness" in fact
becomes an advantage: cleaner, easier-to-maintain code.

> In any case, I think GC is just a stopover on the road to general
> persistence. At some point in the future, we'll have enough space that we
> won't ever destroy objects.

No way. As long as you have finite memory and long-running programs, you
will need to clean up. Just leave that cleanup to an automatic system.

Anyway, even "destroy nothing" is a form of GC, assuming that a
process's memory gets reclaimed by the OS when it terminates. The point
is that the programmer is freed from the error-prone tedium of
explicitly managing memory.

> GC is unnecessary in this system (and quite possibly interferes with it),
> and forcing support for it isn't helpful. It's like encoding other forms of
> obsolete technology into your programming languages; because of
> compatibility you can never get rid of them.

GC is one of the modern advances of computing science, period, akin to
high-level languages vs. assembly, lexical scoping vs. dynamic scoping,
strong typing vs. no typing, etc. It should be used by default and
turned off only in unusual situations.

Of course, in this group, those situations are probably the usual case,
what with the use of Ada for realtime and embedded programming. Still,
given recent realtime GC algorithms, I would consider GC for any system
I was responsible for, and would give it up only reluctantly, if timing
and space constraints were too tight for a given application.

--
Cheers,                           The Rhythm is around me,
                                  The Rhythm has control.
Ray Blaak                         The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net       The Rhythm has my soul.