Newsgroups: comp.lang.ada
Subject: Re: FW: How come Ada isn't more popular?
From: Markus E Leypold
Date: Sun, 11 Feb 2007 15:52:03 +0100

Hi Randy,

"Randy Brukardt" writes:

> Markus E Leypold writes:
>> "Randy Brukardt" writes:
>> > Markus E Leypold writes:
> ...
>> >> "it's usually the entire structure" => In a world without
>> >> representation sharing. In one with representation sharing (and that
>> >> makes the production of new trees from existing ones really
>> >> efficient), one would have to traverse the whole subtree any time a
>> >> tree referencing this subtree is deleted - if only to adjust the
>> >> reference counters. In a world with GC you just overwrite one
>> >> reference to some subtree. When the GC runs it traverses the tree
>> >> once (and then perhaps decides which parts to deallocate). Depending
>> >> on the application domain (e.g. if you produce and drop many shared
>> >> representations between GC cycles) that might be much more efficient.
>> >
>> > In an imperative language like Ada (and this *is* the Ada newsgroup!),
>>
>> Yes, yes. But I *had* the impression that we're now talking about
>> languages in general and general-purpose languages.
>
> Perhaps others are, but not me.

Yes, forgive me. It was probably me who introduced the topic of other
languages, since my hypothesis is that the role Ada currently has can
hardly be understood without asking what advantages users find in
competing languages.

> I'm not much interested in most other
> languages because they don't meet my personal requirements.
>
> [In case you're wondering, here are some of them:

This is always interesting to hear. And since you're labeling them as
your personal requirements, you'll not find me contradicting you.

> (1) Pascal-like syntax;
> (2) Good support for encapsulation/information hiding, including good
>     support for initialization/finalization;
> (3) Great compile-time and run-time checking, with as much as possible
>     at compile-time;
> (4) The possibility of getting close-to-the-metal performance.]

Yes. I understand how Ada probably comes out as the only language that
could meet those requirements. And since you have your own compiler,
you're probably not as affected by "the vendor situation" as other
people (me) are.

BTW, since I'm just talking to you: is there a Janus Ada for Linux or
BSD? "For 80386 Unix" sounds to me like SCO Unix.

>> I _do_ think that this concern for overhead in most cases is a case of
>> premature optimization. As I already noted in this thread:
>>
>> For all the increase in processing power we have seen in the last
>> decades I would for once like to buy myself the convenient luxury of
>> using GC because it's easier on the developer (candy for developers).
>> For once I don't want to see that processing power go into useless
>> blinking and flickering of smooth GUIs (candy for users). After all,
>> the users (indirectly) pay for the development time, and it's their
>> money that is saved if the developers spend less time getting the
>> memory management right.
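To make the representation-sharing point quoted above a bit more
concrete before going on, here is a rough sketch in Ada (the package and
all names are made up for illustration, not taken from any real
library): Insert rebuilds only the path from the root down to the new
node and reuses the untouched subtrees as they are.

   --  Sketch only: a persistent binary search tree in which Insert
   --  shares the subtrees it does not touch instead of copying them.
   package Persistent_Trees is
      type Tree is private;
      Empty : constant Tree;
      function Insert (T : Tree; Key : Integer) return Tree;
   private
      type Node;
      type Tree is access Node;
      type Node is record
         Left, Right : Tree;
         Key         : Integer;
      end record;
      Empty : constant Tree := null;
   end Persistent_Trees;

   package body Persistent_Trees is
      function Insert (T : Tree; Key : Integer) return Tree is
      begin
         if T = null then
            return new Node'(Left => null, Right => null, Key => Key);
         elsif Key < T.Key then
            --  New node goes to the left; the whole right subtree is
            --  shared with the old tree, not copied.
            return new Node'(Left  => Insert (T.Left, Key),
                             Right => T.Right,
                             Key   => T.Key);
         else
            --  Symmetric case: the left subtree is shared.
            return new Node'(Left  => T.Left,
                             Right => Insert (T.Right, Key),
                             Key   => T.Key);
         end if;
      end Insert;
   end Persistent_Trees;

So after, say, T2 := Insert (T1, 42), both T1 and T2 stay valid and most
nodes belong to both trees. Knowing when a shared node may finally be
reclaimed is exactly the bookkeeping that either reference counts or a
collector has to do for you; with GC, dropping T1 is just overwriting
one access value.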
> That's fair, but I disagree. (Perhaps it's just my advancing years --
> although it is weird to think of oneself as old at 48, considering I
> work with much older people on the ARG -- it's still old in computer
> years. I started out on a 4 MHz Z80 with 48K (not M!) of memory, and
> that will always color my impressions.)

:-) I'm not so much younger. A bit, perhaps. And where I disagree with
you or other people here, it stems from some experience, not from being
a newcomer who now tries to reform everything that has been good until
now (like those people who always try to remove the parentheses from
Lisp :-).

Indeed, to make that clear: I think Ada is good as it is, actually the
high point in the development of the Ada family of languages. I don't
think it can be changed much without introducing major design faults or
friction between design goals. My point in this discussion was that
other approaches to language design (FP, GC, type inference) found a
receptive audience who got much use out of those features. And since the
number of developers is finite, one has to look at what other languages
offer their adherents to understand why Ada doesn't have more followers
at present. That is not to say that Ada is bad (I find judgements like
that difficult to pronounce anyway, because I usually find that
languages are well adapted to solving specific problems in a specific
historical situation: it might not be your or my problem these days, but
still, it's hard to blame a language for being adapted to a situation
that has since changed).

BTW, my other point was about the economics behind the decision for or
against a programming language, specifically the threshold costs of
entering the market. So that, too, was not directed against Ada as a
language.

> Anyway, I think that it is very important to be able to get high
> performance out of code without having to rewrite it into a different
> language or (in the case of GC) implementation style. When you find
> out that the memory management overhead is too high in your code,
> what can someone who depends on GC do? Very little, without
> completely breaking encapsulation.

I'm not sure you can in all cases of manual memory management either
(e.g. unbounded string packages have to somehow anticipate future
allocation patterns in every language). Just yesterday I read about
special heap allocators you can link into .NET, Visual C++ and Borland
C++ programs to reduce run time and allocation costs. They are rumored
to bring around 20-50% speedup for certain applications. I can imagine
similar specialized allocators with different strategies being
substituted after profiling the allocation patterns.
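In Ada terms, that is roughly what user-defined storage pools
(System.Storage_Pools) already let you do. Here is a sketch of such a
pool, with made-up names and deliberately simplified alignment and error
handling: a fixed arena in which Allocate just bumps an index and
Deallocate does nothing, so the memory is only handed back to the
operating system when the program terminates.

   with System;
   with System.Storage_Pools;     use System.Storage_Pools;
   with System.Storage_Elements;  use System.Storage_Elements;

   package Arena_Pools is

      --  A trivial "bump" pool over a fixed arena.
      type Arena (Size : Storage_Count) is new Root_Storage_Pool with record
         Data : Storage_Array (0 .. Size - 1);
         Next : Storage_Offset := 0;
      end record;

      procedure Allocate
        (Pool                     : in out Arena;
         Storage_Address          : out System.Address;
         Size_In_Storage_Elements : in Storage_Count;
         Alignment                : in Storage_Count);

      procedure Deallocate
        (Pool                     : in out Arena;
         Storage_Address          : in System.Address;
         Size_In_Storage_Elements : in Storage_Count;
         Alignment                : in Storage_Count);

      function Storage_Size (Pool : Arena) return Storage_Count;

   end Arena_Pools;

   package body Arena_Pools is

      procedure Allocate
        (Pool                     : in out Arena;
         Storage_Address          : out System.Address;
         Size_In_Storage_Elements : in Storage_Count;
         Alignment                : in Storage_Count)
      is
         Next : Storage_Offset := Pool.Next;
      begin
         --  Round up to the requested alignment (assumes the arena itself
         --  is sufficiently aligned; zero-sized requests are not handled).
         if Alignment > 0 and then Next mod Alignment /= 0 then
            Next := Next + Alignment - Next mod Alignment;
         end if;
         if Next + Size_In_Storage_Elements > Pool.Size then
            raise Storage_Error;
         end if;
         Storage_Address := Pool.Data (Next)'Address;
         Pool.Next := Next + Size_In_Storage_Elements;
      end Allocate;

      procedure Deallocate
        (Pool                     : in out Arena;
         Storage_Address          : in System.Address;
         Size_In_Storage_Elements : in Storage_Count;
         Alignment                : in Storage_Count) is
      begin
         null;  --  Nothing to do: the arena lives until the program ends.
      end Deallocate;

      function Storage_Size (Pool : Arena) return Storage_Count is
      begin
         return Pool.Size;
      end Storage_Size;

   end Arena_Pools;

An access type gets tied to a particular pool with a clause like
"for Node_Ptr'Storage_Pool use My_Arena;", so after profiling you can
substitute a different allocation strategy without touching the code
that actually does the allocations.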
> And just turning GC off of the offending types is likely to produce a
> program that leaks memory like a sieve, as objects are dropped
> everywhere.

:-]. No. Of course a certain amount of rewriting for those types would
have to take place, but I had one case in mind that you yourself brought
into the discussion: memory that is allocated once and left alone until
program termination, where it is implicitly "collected" by the operating
system.

> The issue for me is not so much that everything has to have high
> performance (that surely *is* premature optimization), but rather that
> I can get to the

Ok :-).

> highest performance without replacing large parts of my code.

And there I disagree. I only want to be able to achieve reasonably high
performance by optimizing hot spots. I think I can do that using a
language with GC: where necessary rewrite a bit, and if everything else
fails, link in foreign functions in C (or whatever). I know that this
doesn't give me the highest performance, but I'm not willing to pay the
price in developer time and effort (which I consider to be high) for the
mere _option_ of getting the highest performance. I prefer to pay the
price for performance later, when I really need it.

And BTW: since I'm usually not deploying to thousands of customers or
users, it is usually cheaper to make the user(s) buy a newer and faster
machine if performance should really become a problem. The price would
be higher if they all had to pay for the optimizations, or for the
permanently recurring costs of using the wrong tool for the job in
question.

> The worst
> example of that is writing the program in some poorly performing
> language

Now, now. That hardly ever happens. (If we forget for a moment the
enthusiasts who built office applications and databases (e.g. trouble
ticket systems) in Java in the late 1990s ... that really sucked :-).

> which is not quite good enough (I won't name names here, as I don't
> have enough experience with other languages to say anything useful).
> In that case, you might be forced to rewrite the entire system into
> Ada or C.

I do not think that happens with a good foreign function interface.
After all, I'd only have to optimize the hot spots.

> For me, that would be a disaster (I hardly have enough time to do
> something once!).

:-). See -- my experience is that things take longer in Ada than in XYZ
(let the language stay unnamed to avoid discussions about its merits).
This might be due not only to the language itself but also to the
accompanying tools and libraries. But if I acted like you do, I'd pay
those costs all the time. If I'm doing enough stuff in XYZ, though, I
have a bit of time left to optimize the one project that really needs
it.

> But, of course, your mileage may vary.

Specifically, our approaches vary, and so does how we see the factors
that influence our work. Since you see GC as something that hardly buys
you anything now, but will often incur costs in the future (when you
would be forced to rewrite), I'm not surprised about your choices in
this matter.

> And I don't think much else remains to be said: our perspectives are
> too different to come to any sort of agreement.

Yes. But it was nice to talk to you anyway, and this last exchange in
particular was enlightening (to me, anyway).

Regards -- Markus