From mboxrd@z Thu Jan 1 00:00:00 1970
Newsgroups: comp.lang.ada
Subject: Re: How come Ada isn't more popular?
References: <1169636785.504223.139630@j27g2000cwj.googlegroups.com> <45b8361a_5@news.bluewin.ch>
From: Markus E Leypold
Organization: N/A
Date: Sat, 27 Jan 2007 21:40:13 +0100
Message-ID: <3pejpgfbki.fsf@hod.lan.m-e-leypold.de>

Stephen Leake writes:

> Markus E Leypold writes:
>
>> Given that I recently found in 3.15p (which is not the newest one, I
>> know):
>>
>> - Really bad bugs handling Read and Write of discriminated records,
>>
>> - Race conditions in the runtime system when trying to catch
>>   interrupts,
>>
>> - No way to catch the console break under Windows with the interrupt
>>   mechanism (i.e. a design error in my opinion),
>>
>> I wonder whether GNAT was good even in, say, 2002 (or whatever
>> 3.15p's release date was).
>
> It was good, in my opinion; at that time, I was far more productive
> writing real applications using GNAT 3.15p than Borland C++.
>> And all those problems are really expensive to circumvent (partly
>> because the runtime system insists on fiddling with the signal
>> handlers).
>
> But to be fair, you have to say how easy the solution was in Ada, vs
> the solution to the same problem in some other language.

Unfortunately, it has not been the case that Ada was cheaper here:

- Read/Write bug (see below).

- Races with interrupts / catching break -- catching INT or TERM in C,
  or console breaks, is really dead easy. With GNAT you first have to
  remove all interrupt handling, then re-attach the interrupts, which
  introduces a race at the beginning of the program (and that makes
  the program freeze -- aborting properly at that point would be OK,
  but a freeze is not). And catching the Windows console break in C
  takes just one C procedure call. Since the GNAT RTS does not map it
  to an interrupt (a design error, IMHO), you have to write a C stub
  and do some magic there (so: C again, only with more effort).

> What is the equivalent of Discriminated_Record'Write in C? The
> concept doesn't even exist!

Part of the workaround was not to use discriminated records together
with controlled types: instead of keeping a string in a discriminated
part, the string was set to empty whenever it would have been hidden
by the discriminant. I could have done that in C. The other part was
to write 'Write myself and do the right thing there. I could also have
done the same in C.

And C has union types. Of course they are not type safe (as C never
is), but the same effect can be achieved by writing a proper handling
procedure that dispatches on some discriminating field.

Look, I do not want to promote C (far from it). And I'm not talking
about embedded programming or aerospace, but about end-user PC
applications.
So in this not uncommon case / market, my hypothesis is (and has been
for some time now) that the ecological niche for Ada has been closed
by the development of languages and tools during the last 20 years:

- Some low-level stuff is still better done in C, since the operating
  systems and their interface libraries are written in C and often fit
  that language better (I have the impression that there are some
  problems with the GNAT tasking runtime, since it introduces a new
  layer on top of the host operating system's philosophy).

- Other languages are also type safe and have become really fast. They
  have a more "expressive" and powerful type system (I'm basically
  talking about the Hindley-Milner type system here, and about OCaml
  or SML, not about Java, mind you). If I look at the new Java
  generics (there is a paper by Wadler about them) and compare the
  subtyping polymorphism there with the Hindley-Milner type system, I
  just begin to see how impossible it is to write really useful
  container libraries without polymorphism of that kind. No, Ada
  doesn't have it and C doesn't have it. And the absence of useful
  generics in C, and the absolute impossibility of getting them with
  preprocessing, is in my eyes a much more important argument against
  C as an application programming language than the buffer overflow
  problems. (And yes, that applies to Java up to 1.4 as well.)

- Other languages have garbage collection.

- Even C has acquired a number of new tools (splint, valgrind, CCured)
  that make reasonably reliable programming in C much more feasible
  (we are not talking about autopilots or rockets here -- at least I'm
  not!).

- The vendor situation ...

Overall I'm haunted by the impression that C plus some high-level
language of your choice with a proper FFI makes more sense for
day-to-day development of (a) complete PC applications, (b) web
software and (c) small tools (say: "scripting"). I'm not alone in
that, AFAIK. But note: I don't say this to disparage Ada.
Ada, to me, is some kind of super-Pascal. But for historical reasons
that has not played out (between Turbo Pascal being the tradition on
micros, with a community Ada didn't have at the time, and the Ada
compilers coming just a bit too late), and now the situation has
changed. The time of the Pascals is over. Their "mind share" has been
swallowed by Python, OCaml and Java, depending on which paradigm of
programming those people adhere to.

If you look at the number of books sold by O'Reilly, the big
contenders are not C (and not C++ anymore). They are Java (basically
for people with a C++ mind who have seen the light and don't want to
do manual memory management any more), Python, Perl and Ruby (no
pointers, no static types, but GC) and of course C#. The trend I see
is that GC is a must, clumsy pointer handling is out, and types are an
option for those who can understand them.

I admit I'm not completely sure how all that fits together. But the
perception that it is C vs. Ada is IMHO wrong. That particular fight
was -- in a sense -- already lost when Turbo Pascal and Modula lost
out against C. (Examining that part of history should answer the OP's
question.)

Perhaps the answer in general is that unreliability doesn't matter so
much in most software (MS Office, e.g., has become pretty stable
despite being written in C, AFAIK), since it is not embedded and not
in aerospace.

Regards -- Markus