Newsgroups: comp.lang.ada
Subject: Re: How come Ada isn't more popular?
References: <1169636785.504223.139630@j27g2000cwj.googlegroups.com> <45b8361a_5@news.bluewin.ch> <3pejpgfbki.fsf@hod.lan.m-e-leypold.de>
From: Markus E Leypold
Organization: N/A
Date: Wed, 07 Feb 2007 11:56:01 +0100
Message-ID: <66tzxy44ou.fsf@hod.lan.m-e-leypold.de>
User-Agent: Some cool user agent (SCUG)

Maciej Sobczak writes:

> Markus E Leypold wrote:
>
>>> Sure. But for the sake of discussion completeness, you might wish to
>>> throw an example of a situation where scoped lifetime will not make it.
>
>> Model-View-Controller in GUIs. Especially trying to adapt that to
>> GTKAda.
>
> I will not repeat Dmitry's arguments here.

Which were 1) "I am not sure what you mean", and then 2) "I'm now
talking about reference counting". The latter, you might notice, is a
specialized garbage collection scheme and not objects with scoped
lifetime. Seems to support my view.

>>> Java programmer wrote a loop where he opened database cursors,
> [...]
>
>> I'm not surprised. GC'ing resources that are bounded doesn't spare
>> you knowing about the way GC works.
>
> Exactly. That's why I say that the solution is incomplete. If you have
> to think about the mechanics of some solution, then that solution is
> not entirely/properly automated.

I'd call that nonsense. Let me explain by analogy and then bury that
argument, since I obviously have another approach to this, and we've
rehashed the respective arguments enough and seem not to be able to
agree.

Analogy: I don't think all the time about the way the compiler
generates code, but if I want to optimize, e.g., a loop, I have to
think about the mechanics of compilation. Certainly: a super-duper
compiler with artificial intelligence would have seen right away how
I optimize my loop now and would have done it without my interference
(actually many compilers today do quite a good job in this respect),
but that they didn't in the past didn't keep people from using
compilers, and I still experience using compilers AND using GC as
tools as a vast simplification of my work w/o many downsides. You
can't deny _my_ experience, but YMMV. And since you insist you cannot
see my point or do not want to use GC, we're stuck :-).

>> Your approach "I want it all, and if
>> I can't have both (memory management AND general resource collection)
>> I want neither" is somewhat counterproductive.
>
> I find it counterproductive to apply different management strategies
> with regard to *implementation details* of different types.
Unfortunately external resources are not types in the type-theoretic
sense, since they are not implemented in memory only. And whereas I
produce and drop data with every function call, I find myself opening
and closing files or creating temporary files much less often. So I
profit from GC as it is, w/o feeling much of the mental pain you seem
to experience from the concept.

> I prefer solutions which enable me to hide implementation details from
> clients (software engineering?). If clients have to treat different

I fail to see how using no GC helps you in that, whereas using GC as
it is today hinders you.

> types differently just because their implementation details differ,
> then it means that these implementation details leak out in the form
> of distinct handling methods. I want to treat String and
> DatabaseCursor in the same way - that's the prerequisite for being
> productive for me.

If purity in any sense is your prerequisite for being productive, you
should take up Haskell.

>> But you might well continue to believe in your policy here.
>
> Thanks. :-)
>
>> I,
>> personally, find that it brings me a big step nearer to salvation if I
>> can have GC, even if I do only manual MM with it. After all: I don't
>> have that many other (external) resources to care about, and if I do,
>> it pays to have a careful look at their structure and then wrap some
>> abstraction around them.
>
> OK, I understand it. We just agree that GC is a valid solution for
> *some* class of computing problems.

For some really large class. I know, e.g., it wouldn't solve the
halting problem.

> So why do people claim that GC-oriented languages are general-purpose?

Because they can write all kinds of programs with them? I'm not trying
to sell a GC-oriented language to you. I'm convinced it fits many of
my problems, as it does for other people, and I'm content to leave the
rest to evolution of sorts: if I'm wrong, I'll sooner or later meet
"the unsolvable problem", whereas if you're wrong, you'll never write
programs at the level of beauty and simplicity that I do.

Note that I'm a multi-tool person of sorts: I've been developing
seriously in quite a number of languages, so I expect to be able to
fall back on the tool I think is best for the job at hand. You, on the
other side, insist on excluding a tool because it does not SEEM
perfect to you (you never worked with a GC'ed language, I gather?).
Forgive me this word, but it _seems_ narrow-minded to me. But of
course it's your decision.

>> But your approach is, since somebody had problems
>> misusing GC in a specific case
>
> No, that's not the point. The point is that languages which are built
> around GC tend to drop the proper support for other types of resources
> altogether.

As if manual management could be called proper support in this
respect ...

> It's not the particular programmer who misused GC in a specific case
> - it's language designers who closed themselves in the GC cage and
> cranked a language that fails to provide good support for a wider
> class of problems.

Big words, but given the choice between

  1) manual resource management,

  2) GC with (according to you) imperfect support for other kinds of
     resources,

  3) the hypothetical system you propose should exist,

I'll always prefer to go with (2) and bear the intellectual friction.

> As I've already said, the ideal would be to have both GC and scoped
> lifetime. The problem is that there is no reliable industry experience
> with such a mix, unless we treat Boehm's GC as one.

You know, the (functional) with_resource wrappers have been around for
some time -- dating even back to Lisp -- and since Lisp has been used
quite extensively "in the industry" for some time, I'd say there is
reliable industry experience. You're just inventing problems where
none exist.
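Just so we are talking about the same thing, here is a minimal sketch
of such a wrapper in OCaml (with_file is my name for it, invented for
this post; open_in, close_in, close_in_noerr and input_line are plain
stdlib):

   (* A with_resource style wrapper: the handle lives exactly as long
      as the call to f, and is closed even if f raises. *)
   let with_file (path : string) (f : in_channel -> 'a) : 'a =
     let ic = open_in path in           (* acquire *)
     try
       let result = f ic in
       close_in ic;                     (* release, normal path *)
       result
     with e ->
       close_in_noerr ic;               (* release on exception *)
       raise e

   (* Usage: the channel never escapes, the string is left to the GC. *)
   let first_line path = with_file path input_line

The memory allocated inside f is the collector's business; the
descriptor is the wrapper's. That is the whole "mix".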
>> As far as the usability of GC goes, it even helps with controlled
>> objects
> [...]
>> I just make sure to close (e.g.) the filehandle and leave the
>> rest to the GC.
>
> Of course. But then it's up to the designer of the type to decide how
> to treat each component of that type - it should be an implementation
> detail. This decision should not be put on the shoulders of the final
> user, which is now the case in mainstream GC-oriented languages. This

"Treat each component of the type"? We're not talking about
components, but about closing file handles, that is, about switching
over to scoped resource management in cases where you know the
automatic collection won't be up to it, because your consumption will
hit a limit before the next collection is triggered and there is no
trigger tied to this particular resource limit.

> is what is broken.

And it will stay broken, since there are enough cases where we don't
want to trigger a major GC cycle every 64 file open()s only because
the programmer is dropping file descriptors on the floor in a loop.
Awareness of resource restrictions cannot be given up altogether,
since the naive approach of first opening 500 files and storing their
handles in an array is also possible for a programmer who isn't aware
of the restriction -- and GC won't be able to fix that problem anyway.
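To spell the failure mode out (a sketch, not the Java code in
question; with_file is the wrapper sketched above, the rest is stdlib
OCaml):

   (* Bad: descriptors are dropped on the floor and left to the
      collector.  The process runs into the OS descriptor limit long
      before enough garbage has piled up to trigger a collection. *)
   let first_lines_bad paths =
     List.map (fun p -> input_line (open_in p)) paths

   (* Good: the handle is closed before the next iteration, here via
      the wrapper; the GC still takes care of the strings. *)
   let first_lines_good paths =
     List.map (fun p -> with_file p input_line) paths

The point being: nothing ties the collection to the descriptor limit,
so the first version falls over exactly as described.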
>>> There is nothing particular in scoped lifetime that would prohibit
>>> compacting heaps and there is nothing particular in GC that guarantees

>> No. But without having 3/4ths of a GC anyway, compaction is pretty
>> pointless.

> Why? If the goal of compaction is to avoid fragmentation, then what is
> pointless in having compacted heaps managed by scoped lifetime?

Heap(s) -- I even resent the plural here -- are not about scoped but
about indeterminate lifetime. And a "compacted" heap has all the
(timing) unpredictability of a garbage collector and would provide
garbage collection almost for free. So you want the downside --
perhaps losing real-time capability (with some algorithms) -- you pay
for it by moving memory objects around (which makes the FFI more
complicated), and then you don't want the advantage of freeing unused
memory? Strange ...

>>> it. It's just the statistics based on popular implementations, not a
>>> rule.

>> Sorry, that is nonsense. There are garbage collectors that are
>> designed to be compacting.

> So what? This is exactly the statistics I'm talking about, that does

No no no no no.

> not prove that GC guarantees compacting or that the lack of GC
> prevents it.

2-space garbage collectors are compacting. Full stop. Forgive me, I
know you're only uninformed, but what you're spouting is exactly the
kind of FUD that has hindered the widespread adoption of GC into the
mainstream for years. People don't know how GC works, don't know what
it does, but they are somehow convinced that (a) it disenfranchises
them of control over the computer, (b) it cannot cope with real-time
requirements (as if they had any), and (c) it is statistical and
unpredictable. I'm sure I've forgotten some points on the list, but
you get my drift. FUD. And it's not even that you're selling a system
w/o GC for money.

>> They are moving objects around. This is
>> absolutely deterministic and not statistical.
>
> By statistics I mean the number of language implementations on the
> market that choose to use compacting GC vs. the number of languages
> that use non-compacting heaps. :-)

And that proves what?

>> Whereas manual
>> allocation and deallocation as in Ada or C will fragment the heap and
>> you have NO guarantee (only statistics) about the ratio of allocated
>> (used) memory and presently unusable holes.
>
> If that bothers you, then use non-fragmenting allocators.

There are, as far as I can see, no non-fragmenting (heap) allocators
for unpredictable allocation patterns.

>> How's that for
>> reliability if you can't give space guarantees even if you know about
>> the memory your algorithms need, since unfortunately you cannot
>> predict the exact sequence of allocations?
>
> I use non-fragmenting allocator and I get my guarantees.

See above. Apart from the fact that there are no non-fragmenting
allocators being shipped with, e.g., Ada. So do it yourself. Wow:
avoid compacting GC, get more work, do it by hand. You can see why
that doesn't attract me.

>>> I can perfectly imagine compacting heaps managed by scoped lifetime.

>> Yes, you can do that. Since you're following pointers then and
>> rewriting them, you might as well go the whole way and deallocate
>> unusable memory while you're at it.

> Yes. Note that scoped lifetime does not preclude GC on some lower level.

You admit it, finally?

> Scoped lifetime provides a hook for deterministic "good bye" action -
> there is nothing more to it. Even if that "good bye" action calls
> free/delete/whatever on some memory block, there is nothing that
> forces the runtime to return the given block of memory right back to
> the operating system. Actually, none of the self-respecting allocators
> do this systematically - instead they keep the memory around for a
> while in anticipation of future allocations. I have nothing against GC
> at this level, really (and I've seen such implementations - in fact, a
> fully standard-compliant implementation of the C language could
> provide an *empty* free function and GC underneath; and a fully
> conformant C++ implementation could just call destructors as a result
> of delete and leave the raw memory to GC).
>
> What I'm against is a GC "paradigm" that prevents me from having
> deterministic "good bye" hooks for scoped lifetime. The problem is

There is no such GC paradigm. I wonder what we were talking about the
whole time.

> that most GC-oriented languages I'm aware of do have this "issue".
>
> In other words, for me GC is acceptable as an implementation detail of
> the dynamic memory allocator. I don't care *how* the allocator deals

Unfortunately GC is not an implementation detail, since you can see
whether there are free() or dispose() calls in the source.

> with memory that I free in the same sense that I don't care *how* the
> operating system deals with files that I remove from the
> filesystem. What I care about are hooks, and scoped lifetime is an
> obvious answer for this.

>>>> (3) What is often needed are upper limits, not determinism, and
>>>> those upper limits can be guaranteed with GC or with an appropriate
>>>> collector.
>
>>> This refers to memory consumption only, whereas I clearly stated
>>> deterministic *time* as a second (first, actually) goal.

>> This refers to both, there are real-time-compatible GC
>> algorithms.

> I'm interested in what is their target audience. I would expect any
> decent RT system to *refrain* from using dynamic memory except in the
> initialization phase (so that the "mission phase" is performed with a
> constant set of objects), in which case RT GC would be just an answer
> to the question that nobody asked.

Yes, that is the old answer.

> Experts might wish to correct me and elaborate on this.

Ask Ray Blaake.

>>> OK. What about refcounting with smart pointers?

>> (1) It ties lifetime to multiple scopes (instead of one)

> With GC tracing pointers you have the same, just the tracing is hidden.

Yeah, but -- you had problems with that, you wanted true and pure
scoped lifetime; I don't. And if I don't want that, I don't use smart
pointers as a hidden GC scheme: I just use GC and do away with
reference counting.

>> (2) it's not
>> efficient
>
> Why?

See other posts in this thread.

>> (3) It still doesn't work for the general case
>
> Neither does GC, as seen in examples. :-)

Yes, but smart pointers were YOUR better answer to GC. You were
dissatisfied with GC. GC doesn't work for you, since it's not general
enough, etc. etc. Then you come up with smart pointers and ref
counting as an alternative -- which doesn't work either. Oops. Why
bother at all, then?

>>> I acknowledge that there might be some applications which are strictly
>>> memory-oriented. They are just not the ones I usually write.

>> It also works for apps that are not "memory-oriented". I think you're
>> missing that e.g. filehandles are a much simpler and differently
>> structured resource than memory. A filehandle does not contain
>> references to memory or to other filehandles. Memory does. That
>> vastly simplifies the problem of managing file handles, indeed so
>> much that I'm convinced you don't need built-in support for this.

> Somehow this idea didn't work for database cursors, as already described.

Somehow ... you missed my point. It did work. File handles are in my
view not supposed to be handled by GC, since they are structurally and
semantically different from memory. They should be closed explicitly.

>>> Sure. In other words, be prepared that with GC you have to
>>> handle/understand some parts of the system better.

>> So?

> So the implementation details of *some* types leak out in the sense
> that they force me to understand their internal mechanics. I don't
> want to.

No. You just need to stick to the rules. Close your ***** filehandles
manually. All the time. If you want to be smart, though, it pays to
think about the interaction with the underlying system (the
implementation). Same as with the loop optimization.

> And I want to say this:
>
>    declare
>       Sql : Session := Open_Session("some parameters");
>       Str : String := "Hello";
>    begin
>       -- ...
>    end;
>
> instead of this:
>
>    declare
>       Sql : Session := Open_Session("some parameters");
>       Str : String := "Hello";
>    begin
>       -- ...
>       -- damn, I have to do *something* with *some* stuff here
>    end;

[about FP]

>>> The difference is that in languages with scoped lifetime the lifetime
>>> management is a property of the type (and so applies to all
>>> instances), whereas the "FP-trick" above is a property of the
>>> use-side. Which one is more robust and less prone to bugs?

>> This is, forgive me, nonsense. I might want to use a file handle in a
>> scoped way here and in a free floating way there.

> What about readability and maintainability of such code?

Nothing. It's OK.

   with_file "foo" ( fun fd -> ... );;

or

   let blaba = ...
   and fd    = file_open "..."
   in
     yadda ();
     blubb ();
     let oops = ... in
     ...
     fileclose fd;
     oops    (* that's the return value for non ML programmers *)

So?
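Spelled out so that it actually compiles (a sketch; with_file is the
wrapper from earlier, and the "sum of the first two lines" task is
invented for the example):

   (* Scoped use: the channel cannot escape the call. *)
   let header path = with_file path input_line

   (* Free-floating use: opened here, threaded through several steps,
      closed explicitly before the result is returned. *)
   let sum_of_first_two_lines path =
     let ic = open_in path in
     let a  = int_of_string (input_line ic) in
     let b  = int_of_string (input_line ic) in
     close_in ic;
     a + b

Same handle, two lifetimes -- which one I pick depends on the
surrounding code, not on the type.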
>> And no -- the FP way is not "more prone to bugs"
>
> Unless you use a handle in a free floating way and find later that in
> production your code was called in a loop causing handles to pile up?

I didn't do that, so I can't defend against it. Perhaps your friend
simply is not the Java software engineer he wants to be. I never had
that problem ... -- so can I stop now defending my approach to
programming against errors other people have committed in another
language because they didn't read the JDBC reference manual?

> I have the practical example (already described) that this way of
> thinking can lead to failures.

If you think you'll never need to use a file handle in a free-floating
way, you _can_ wrap it either in a with_file_do wrapper or in a scoped
type. But given, e.g., the question of how to build other primitives
like open_server_connection() from file handles, I doubt it's a
winning proposition to do that from the beginning.

And BTW: excluding human error is only possible to a certain extent. I
see Controlled and scoped lifetimes only as a tool to structure
programs in an understandable way, but by no account as a way to
enforce programming _style_. Quality is better served by (a) reviews
and (b) coaching structures within larger teams. Both probably would
have caught your friend's mistake.

That said, it probably would be a good idea to flag objects as
containing external resources with additional resource limits and to
generate a compiler warning if the user doesn't deinitialize them
within the scope or doesn't return them.

> The programmer wanted to use a database
> cursor in a free floating way. That was fine. Later his code was used
> in a loop.

Reusing code in another context without reviewing it was what cost the
ESA that famous Ariane 5 launch.

> Ah, yes - his code was used in a loop written by another
> programmer, so his judgement about whether it's OK to use anything in
> a free floating way was misguided from the very beginning.

Reviews, reviews, reviews. Understand what you use!

>> and as with
>> George Bauhaus I simply refuse this kind of discussion (FUD and
>> ContraFUD).
>
> OK. We will just stay unconvinced. :-)

:-) I think so. We just don't have a common basis to slug it out and
come to a rational decision. Not surprising, given that Ada and C++
are still around AND the GC'ed languages are alive as well.

>>> BTW - please show me an example involving 10 objects of different kinds. :-)

>> All at the same time?
>
> Yes.
>
>> Well -- bad programming.
>
> I knew you would answer this. :-)

   let foozle a b c = with_thingy1 a (frobnicate b c);;

   let foobar x y   = with_thingy2 y x (defrobnicate (foozle x y));;

   ...

   let do_it_now =
     with_thingy10 "/etc/passwd"
       (bla "thing1" 12 13)
       (baz "thing2" (troon "thing3" 123 123) unfroth)
   ;;

I think you get the drift: it depends on the structure of the
problem ...

Regards -- Markus