From: Maciej Sobczak
Newsgroups: comp.lang.ada
Subject: Re: How come Ada isn't more popular?
Date: Fri, 02 Feb 2007 09:42:23 +0100
Organization: CERN News

Markus E Leypold wrote:

>>> But from what I remember in
>>> the 1997s to 1998s

>> Most programming languages were terrible at that time, that's true.

> Not Ada 95 ... :-).

Ada 95 *is* terrible. It doesn't have containers or unbounded strings
and it cannot even return limited types from functions. Yuck! ;-)

> Oh yes, misconceptions perhaps. But I've only been talking about
> people's motivations (which say a lot about their perceived problems).

Everybody has every right to use misconceptions as a basis for
perception and motivation. That's a natural limitation of the human brain.

>> I've even heard that Java is better, because it has a String class and
>> there is no need to use char* as in C++ (!). FUD can buy a lot.

> As far as that goes I have seen people getting tripped up really bad
> by string::c_str(). And I think you need it, if you don't program pure
> C++, which at that time nobody did.

Good point. It is true that people get tripped up when interfacing with
old C code. But what about interfacing with old C code *from Java*? Are
there fewer opportunities for getting tripped up, or what? Another
misconception. Interfacing to old C is tricky from Ada as well.

(Strangely, in "pure" C++ you have to use .c_str() even when opening a
file stream, because the fstream constructor does not understand string -
that's a real "oops", but actually I've never seen anybody get tripped
up here.)

> s.c_str() returned a pointer
> into some internal data structure of s which promptly changed when s
> was modified.

Yes.

> The only "safe" way to use it was strdup(s.c_str())

No, the only "safe" way to use it is to make an immutable copy of the
string:

string someString = "Hello";
const string safeCopy(someString);
some_old_C_function(safeCopy.c_str());
// modify someString here without influencing anything
// ...

> and
> that is not threadsafe as anybody can see.

Why? There is nothing about threads here.

> I see "the need to use
> char* in C++" rumour as the result of people having been burned
> by similar quirks at that time.

Yes, I understand that. Still, most of the rumour comes from
misconceptions.
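(For the record, the trap itself looks roughly like this - a minimal
sketch, where old_c_api() is a made-up stand-in for some legacy C
function:)

#include <cstdio>
#include <string>

// Made-up legacy C-style function that only borrows the pointer.
void old_c_api(const char *p) { std::printf("%s\n", p); }

int main()
{
    std::string s = "Hello";

    const char *p = s.c_str();   // p points into s's internal buffer
    s += ", world";              // may reallocate - p is now dangling

    // old_c_api(p);             // undefined behaviour: this is the trap

    const std::string safeCopy(s);   // immutable copy, as above
    old_c_api(safeCopy.c_str());     // safe: safeCopy is never modified
}

The pointer is only valid as long as the string it came from is left
untouched - hence the immutable copy.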
>> I think that at the end of the day the embedded C++ will disappear
>> from the market as the "full" C++ gets wider compiler support on
>> embedded platforms.

> That is no question of compiler support, as I understand it, but of
> verifiability and safety. A bit like Ravenscar, but -- of course --
> not as highly integer (ahem ... :-).

I agree with that, but the restrictions in Embedded C++ do not cater for
the same objectives as those of Ravenscar. For example: EC++ has no
templates. Not even namespaces (:-|). Does that have anything to do with
schedulability or necessary runtime support? No. The result is that some
people got frustrated and invented this:

http://www.iar.com/p7371/p7371_eng.php

Funny? That's why I believe that Embedded C++ will die.

>> Subsetting C++ would be beneficial in a sense similar to Ravenscar,
>> or by extracting some core and using it with formal methods (sort of
>> "SPARK++"), but I doubt it will ever happen.

> It already did (and perhaps died)
>
> http://en.wikipedia.org/wiki/Embedded_C++

Exactly. It will die, because it's just a subset. If it were a subset
extended with annotations ("SPARK++") or with anything else, the
situation would be different, because it would provide new possibilities
instead of only limiting them.

>> The interesting thing is that memory management is *said* to be
>> painful.

> I disagree. The only-downward-closures style of C++ and Ada, which
> allows only to mimic "upward closures" by using classes, heavily
> influences the way the programmer thinks. Higher level abstractions
> (as in functional languages) would require full closures -- and since
> this means that memory lifetime cannot be bound to scope any more,
> this would be the point where manual memory management becomes painful.

You can have full closures by refcounting function frames (and
preserving some determinism of destructors) - see the sketch further
down. GC is not needed for full closures, as far as I perceive it (with
all my misconceptions behind ;-) ). On the other hand, GC is convincing
with some lock-free algorithms. Now, *this* is a tough subject for the
Ada community, right? ;-)

> Furthermore I've been convinced that manual memory management hinders
> modularity.

Whereas I say that I don't care about manual memory management in my
programs. You can have modularity without GC. (And if I think about all
these funny GC-related effects, like resurrection of objects in Java,
then I'm not sure what kind of modularity you are referring to. ;-) )

>> Reference-oriented languages have a completely different ratio of
>> "new per kLOC", so GC is not a feature there, it's a must.

> I wonder if it is really possible to do OO without being
> reference-oriented. I somewhat doubt it.

Why? OO is about encapsulation and polymorphism, and these don't need
references everywhere.

>> But then the question is not whether GC is better, but whether
>> reference-oriented languages are better than value-oriented ones.
>> Many people get seduced by GC before they even start asking such
>> questions.

> Value-oriented in my world would be functional -- languages which all
> heavily rely on GC.

What about maintainability and reasoning?

> I also admit being one of the seduced, but that is not surprising
> since my main focus is not in embedded programming and in everything
> else it's sheer folly not to have GC.

I disagree. I'm not seduced.
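(The refcounted-frames sketch promised above - minimal, and admittedly
anachronistic, since it assumes C++11 lambdas and std::shared_ptr, which
postdate this thread; make_counter() is a made-up example, not anybody's
actual API:)

#include <functional>
#include <iostream>
#include <memory>

// Returns a closure whose captured "frame" (here a single int) outlives
// the scope that created it. The frame is kept alive by reference
// counting and destroyed deterministically when the last copy of the
// closure goes away - no garbage collector involved.
std::function<int()> make_counter()
{
    auto frame = std::make_shared<int>(0);   // refcounted function frame
    return [frame]() { return ++*frame; };   // upward closure over frame
}

int main()
{
    auto next = make_counter();
    std::cout << next() << ' ' << next() << ' ' << next() << '\n';  // 1 2 3
}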
> The arguments against GC often
> read like arguments against virtual memory, against high level
> languages as opposed to assembler, against filesystems (yes there was
> a time when some people thought that the application would best do
> allocation of disc cylinders itself since it knows its access patterns
> better than the FS).

Valid points. Still, Blue Gene/L uses a real addressing scheme in each
node, and the more advanced database servers use raw disk access,
bypassing the features provided by the file system. Guess why.

--
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/