comp.lang.ada
 help / color / mirror / Atom feed
* How come Ada isn't more popular?
@ 2007-01-23  5:53 artifact.one
  2007-01-23  6:37 ` adaworks
                   ` (8 more replies)
  0 siblings, 9 replies; 397+ messages in thread
From: artifact.one @ 2007-01-23  5:53 UTC (permalink / raw)


Hello.

I am a long time C programmer (10 years plus), having a look
at Ada for the first time. From my (inadequate) testing, it seems
that performance of average Ada code is on par with average
C code, and there's a clear advantage in runtime safety. The
GNU ada compiler makes pretty sure that there are very few
platforms without easy access to Ada, so portability should be
on at least an equal footing too.

My question is: how come Ada isn't more popular?

This isn't intended to start a flame war, I'm genuinely interested.

thanks,
MC




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  5:53 How come Ada isn't more popular? artifact.one
@ 2007-01-23  6:37 ` adaworks
  2007-01-23  6:50   ` artifact.one
                     ` (4 more replies)
  2007-01-23  6:58 ` AW: " Grein, Christoph (Fa. ESG)
                   ` (7 subsequent siblings)
  8 siblings, 5 replies; 397+ messages in thread
From: adaworks @ 2007-01-23  6:37 UTC (permalink / raw)



<artifact.one@googlemail.com> wrote in message 
news:1169531612.200010.153120@38g2000cwa.googlegroups.com...
>
> My question is: how come Ada isn't more popular?
>
Ada suffered, in its early days, from a convergence of several
things.  One is that the designers of the language did not anticipate
the impact of the personal computer and the democratization of
computing.   There were other factors, as well.

1)  The DoD mandated Ada before there were any good compilers
     or development environments in place.   That began a routine
     practice of rubber-stamping waivers to use other languages.

2) The compiler publishers, having a captive audience, inflated the
     price of compilers and tools, making Ada unattractive for anyone
     in the non-DoD world.  For example, Alsys sold a compiler for
     the personal computer at $4000 per copy, thereby putting it out
     of the reach of most companies and every hobbyist.

     Turbo Pascal and other alternatives were already in place and
     much cheaper than Ada.   A few brave souls tried to compete
     with products such as RR Software's Janus Ada and Meridian's
     AdaVantage, but the full environment (e.g., integrated editors,
     debuggers, etc.) was not in place as it was for Turbo Pascal.

3)  Inept DoD management of the Ada initiative.  Sometimes it seemed
     that the DoD was trying to make Ada a bad choice for businesses.
     The public line was that they wanted commercial users, but the
     practices often put barriers in the way.

4)  Other languages were cheaper to acquire, cheaper to use, and had
     no copyright associated with them.   The copyright was eventually
     removed, but late.

5) The earliest compilers were not uniformly good.  I recall that the mainframe
    compiler from Telesoft was, when compared to other language choices,
    simply terrible.   It was slow, had an awkward development environment,
    and did not support the central features of the mainframe very well.

    Many of those early compilers were "checkbox" compilers.   On the form
    where one had to check off "Validated Ada Compiler," the fact that a
    validated compiler was available was considered enough.   One compiler
    I recall quite vividly was for the Tandem.   Although the compiler was
    validated, it was not integrated into the rest of the system tools and
    was barely supported by the operating system.   The word in Tandem
    management was that no one was expected to take Ada seriously, but the
    checkbox had to be supported.   This was quite widespread in the industry.

6) Really good compilers began to appear around 1989.   By then Ada's reputation
    for being slow, cumbersome, and hard to use had already been firmly set.

7) Instruction in the language was bad.   I recall a U.S. Navy Admiral
    complaining about how hard it was to teach anyone Ada.   He described
    the efforts he put in place to make this happen.    I told him he had
    hired people to do the teaching who were incompetent.   That was true,
    but they had PhDs and he thought that should have ensured success.
    The fact was that those teachers had not yet come to a full
    understanding of Ada, and their own confusion was more a source of
    obfuscation than enlightenment for the students.

8) Grabbing defeat from the jaws of victory.   In the mid-90s, when Ada
    became a powerful alternative to other languages, when tools were in
    place, the language had been modernized, and the availability of
    low-cost (or free) compilers could have made it attractive, the DoD
    lost its nerve and gave the impression that Ada was no longer part of
    the DoD language requirement.  A lot of people misinterpreted this
    and thought the DoD had decided to abandon Ada entirely.

9) Traitors.   Some people who were previously charged with promoting Ada,
     in particular certain former AJPO officials, once having left
     government, exploited their former role and joined the forces against
     Ada.   They were able to use their title as former ... to gain
     credibility and lobby against the use of Ada in exactly the places
     where it was appropriate to lobby for it.

Ada is not now, nor has it ever been, the perfect language.  There is no
perfect language.  However, anyone who understands Ada and has a good
understanding of the competing technologies realizes that Ada continues to
be the most appropriate choice when the requirement is for a language that
will improve the probability of an error-free software product at a
reasonable cost.  The alternatives, mostly C and C++, are generally less
dependable.  In fact, I often wonder why anyone would pick a language that
is inherently error-prone (e.g., C++) and expect a result that is
error-free.

If one does an objective comparison of Ada to its alternatives, in the
design and construction of dependable software, Ada will come in with high
marks -- higher than most alternatives.   If one is looking for a language
that is well-suited to supporting a long-lived software system, Ada is
certainly better than most of the alternatives.

More could be said in favor of Ada.   I will leave that to others.

Richard Riehle








^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  6:37 ` adaworks
@ 2007-01-23  6:50   ` artifact.one
  2007-01-23 14:24   ` Arthur Evans Jr
                     ` (3 subsequent siblings)
  4 siblings, 0 replies; 397+ messages in thread
From: artifact.one @ 2007-01-23  6:50 UTC (permalink / raw)


Ah, politics and dramatics! It's had quite a history then - always
a killer.

Thanks for the impromptu documentary.
MC




^ permalink raw reply	[flat|nested] 397+ messages in thread

* AW: How come Ada isn't more popular?
  2007-01-23  5:53 How come Ada isn't more popular? artifact.one
  2007-01-23  6:37 ` adaworks
@ 2007-01-23  6:58 ` Grein, Christoph (Fa. ESG)
  2007-01-23 10:31   ` Talulah
  2007-01-23 10:02 ` Stephen Leake
                   ` (6 subsequent siblings)
  8 siblings, 1 reply; 397+ messages in thread
From: Grein, Christoph (Fa. ESG) @ 2007-01-23  6:58 UTC (permalink / raw)
  To: comp.lang.ada


This has been asked several times, and I think you'll get many opinions.
There are historical reasons, but by now, they are no longer relevant.

In effect, there is no real reason why she isn't more popular - I
personally cannot understand this. Most of the arguments you hear
against Ada are in fact made up - people just don't want to use her,
they're happy with whatever they're using.

Mind you, I do not argue that there are strong reasons why some project
does not use Ada - we're talking popularity here.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  5:53 How come Ada isn't more popular? artifact.one
  2007-01-23  6:37 ` adaworks
  2007-01-23  6:58 ` AW: " Grein, Christoph (Fa. ESG)
@ 2007-01-23 10:02 ` Stephen Leake
  2007-01-23 16:49   ` adaworks
                     ` (2 more replies)
  2007-01-23 10:38 ` Alex R. Mosteo
                   ` (5 subsequent siblings)
  8 siblings, 3 replies; 397+ messages in thread
From: Stephen Leake @ 2007-01-23 10:02 UTC (permalink / raw)


artifact.one@googlemail.com writes:

> I am a long time C programmer (10 years plus), having a look
> at Ada for the first time. From my (inadequate) testing, it seems
> that performance of average Ada code is on par with average
> C code, and there's a clear advantage in runtime safety. The
> GNU ada compiler makes pretty sure that there are very few
> platforms without easy access to Ada, so portability should be
> on at least an equal footing too.
>
> My question is: how come Ada isn't more popular?

Because most people don't have the same attitude towards language
evaluation that you do.

Most that I've actually asked have the attitude "C was what I learned
in college, and it's good enough for me". 

Or, in managers, "everyone else is using C, so it must be the best
language". When I point out that far more programs are written in
Visual Basic, or Excel, they look very puzzled :).

Welcome to enlightenment :).

-- 
-- Stephe



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: AW: How come Ada isn't more popular?
  2007-01-23  6:58 ` AW: " Grein, Christoph (Fa. ESG)
@ 2007-01-23 10:31   ` Talulah
  2007-01-23 13:48     ` Anders Wirzenius
  2007-01-23 20:17     ` Jeffrey R. Carter
  0 siblings, 2 replies; 397+ messages in thread
From: Talulah @ 2007-01-23 10:31 UTC (permalink / raw)



Grein, Christoph (Fa. ESG) wrote:

> Mind you, I do not argue that there are strong reasons why some project
> does not use Ada - we're talking popularity here.

Popularity is the key thing surely - the chicken and egg. As a software
manager in a commercial business, I employ C programmers because there
are so few Ada programmers around in the UK. There are so few Ada
programmers in the UK because I employ C programmers! How do you break
that chain?

There are many examples in marketing history of inferior products
becoming the more widespread, e.g. Betamax v VHS video recorders, MS-DOS
v Concurrent CP/M-86. I guess this is just another one of them.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  5:53 How come Ada isn't more popular? artifact.one
                   ` (2 preceding siblings ...)
  2007-01-23 10:02 ` Stephen Leake
@ 2007-01-23 10:38 ` Alex R. Mosteo
  2007-01-23 12:58   ` gautier_niouzes
  2007-01-23 21:56   ` Dr. Adrian Wrigley
  2007-01-23 19:16 ` Tero Koskinen
                   ` (4 subsequent siblings)
  8 siblings, 2 replies; 397+ messages in thread
From: Alex R. Mosteo @ 2007-01-23 10:38 UTC (permalink / raw)


artifact.one@googlemail.com wrote:

> Hello.
> 
> I am a long time C programmer (10 years plus), having a look
> at Ada for the first time. From my (inadequate) testing, it seems
> that performance of average Ada code is on par with average
> C code, and there's a clear advantage in runtime safety. The
> GNU ada compiler makes pretty sure that there are very few
> platforms without easy access to Ada, so portability should be
> on at least an equal footing too.
> 
> My question is: how come Ada isn't more popular?

Others have given broader responses, so I will concentrate on the
hobbyist POV (I have felt like an Ada hobbyist for a long time now):
there is a catch-22 problem with Ada, and it is the lack of libraries.
This is a relative problem; consider these points.

1) The standard library is really standard, so this is an advantage if it
does all you need. Also, some features (e.g. fixed point, bounds checking,
tasking!) are in the language so you don't need extra wrappers around the
basic language or OS.

2) There's no good, easy, almost automatic C binding generator, although
the language has well-defined mechanisms for C interfacing. Yes, there was
a generator once. No, in my experience it is not trivial to get it running
at present. There's some effort to have Ada integrated into SWIG; this is
promising and IMHO an important selling point for newcomers.
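
[As an illustrative aside, not from the original post: the "well-defined
mechanisms" are the standard Interfaces.C package and pragma Import. A
hand-written thin binding to the C library's puts(3) -- i.e. the kind of
declaration a binding generator would automate -- looks roughly like this:]

```ada
with Interfaces.C; use Interfaces.C;

procedure Hello_Binding is
   --  Hand-written thin binding to the C library's puts(3);
   --  this is the declaration a binding generator would emit.
   function puts (S : char_array) return int;
   pragma Import (C, puts, "puts");

   Result : int;
begin
   --  To_C appends the terminating nul that C expects.
   Result := puts (To_C ("hello from Ada"));
end Hello_Binding;
```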

3) There are bindings for quite a few important things: GTK+, XML parsing,
Unicode, CORBA, databases...

Summarizing: the Ada programmer feels a bit of a pariah when he sees his
fellow C/C++/Java friends trying the latest and greatest version of some
library. Either it is unavailable for Ada, or it is not up to date, or he
has to invest some time in creating or tweaking a binding.

This is something that, as I said, may or may not be important at some
point, depending on what you need to do. Also, going against the majority
of your colleagues is a burden. In my lab almost all development is done
in Matlab, C or C++, and these are not all CS people but people from other
engineering branches too. It's a shock when you have to use others' code
and start to see random pointers in function declarations, arcane syntax
for complex datatypes (because typedef seems to be a forbidden word) and
so on. In my case, Ada isn't even an obscure language: it is taught in my
university and has good backing among several high-profile teachers. Even
so, I feel very alone... :)



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 10:38 ` Alex R. Mosteo
@ 2007-01-23 12:58   ` gautier_niouzes
  2007-01-23 21:56   ` Dr. Adrian Wrigley
  1 sibling, 0 replies; 397+ messages in thread
From: gautier_niouzes @ 2007-01-23 12:58 UTC (permalink / raw)


[about bindings - and their lack of]

For libraries that are no longer developed, like some compression or
image-format libraries, an alternative is to get an Ada translation and
no longer need a binding at all (a problem with a binding is that you
have to provide it and keep it up to date for each compiler/OS/CPU
architecture, and accept that the quality of the bound library fluctuates
over time...).
If you are lucky, there is a Pascal translation around and you can go
from there with P2Ada; it is much easier than translating from C, which
is very different.
It is also a good occasion to seriously debug these libraries.
______________________________________________________________
Gautier         -- http://www.mysunrise.ch/users/gdm/index.htm
Ada programming -- http://www.mysunrise.ch/users/gdm/gsoft.htm

NB: For a direct answer, e-mail address on the Web site!




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: AW: How come Ada isn't more popular?
  2007-01-23 10:31   ` Talulah
@ 2007-01-23 13:48     ` Anders Wirzenius
  2007-01-23 20:17     ` Jeffrey R. Carter
  1 sibling, 0 replies; 397+ messages in thread
From: Anders Wirzenius @ 2007-01-23 13:48 UTC (permalink / raw)


"Talulah" <paul.hills@uk.landisgyr.com> writes:

> Grein, Christoph (Fa. ESG) wrote:
> 
> > Mind you, I do not argue that there are strong reasons why some project
> > does not use Ada - we're talking popularity here.
> 
> Popularity is the key thing surely - the chicken and egg. As a software
> manager in a commercial business, I employ C programmers because there
> are so few Ada programmers around in the UK. There are so few Ada
> programmers in the UK because I employ C programmers! How do you break
> that chain?

Quality is not free - short term.
Quality costs are traditionally split into:
- preventive costs
- inspection costs
- error costs, which are split into:
  - external error costs
  - internal error costs

Investing in Ada people belongs to the preventive quality
costs which hopefully are paid back at a later stage.

So, breaking the chain is a long term process which in the very
beginning does not produce much more than costs.
But in the long term ...  

> 
> There are many examples in marketing history of inferior products
> becoming the more widespread, e.g. Betamax v VHS video recorders, MS-DOS
> v Concurrent CP/M-86. I guess this is just another one of them.

I don't know what "this" refers to. :(

-- 
Anders



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  6:37 ` adaworks
  2007-01-23  6:50   ` artifact.one
@ 2007-01-23 14:24   ` Arthur Evans Jr
  2007-01-23 20:11     ` Jeffrey R. Carter
  2007-01-23 15:23   ` Ed Falis
                     ` (2 subsequent siblings)
  4 siblings, 1 reply; 397+ messages in thread
From: Arthur Evans Jr @ 2007-01-23 14:24 UTC (permalink / raw)


In article <L2ith.12633$ji1.1337@newssvr12.news.prodigy.net>,
 <adaworks@sbcglobal.net> wrote:

> Ada suffered, in its early days, from a convergence of several
> things.

Richard Riehle wrote eloquently on this subject. I'll add one more point.

Ada came out at a time when the government in general and the Defense 
Department in particular were widely perceived as evil. Since Ada was 
intended to be used to write programs that would kill people, some 
perceived it as inherently evil. Many folks, myself included, made the 
argument that wrenches are used to build weapons; should we ban 
wrenches? Those who had already made up their minds couldn't or wouldn't 
hear that argument.

This argument alone wasn't a major factor in Ada's failure to 
become more popular, but then neither was any one of Richard's 
arguments. All of these arguments taken together, though, were too much 
at the critical time when Ada might have succeeded as intended.

Too bad.

Art Evans



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  6:37 ` adaworks
  2007-01-23  6:50   ` artifact.one
  2007-01-23 14:24   ` Arthur Evans Jr
@ 2007-01-23 15:23   ` Ed Falis
  2007-01-23 20:09   ` Jeffrey R. Carter
  2007-01-25 11:31   ` Ali Bendriss
  4 siblings, 0 replies; 397+ messages in thread
From: Ed Falis @ 2007-01-23 15:23 UTC (permalink / raw)


adaworks@sbcglobal.net wrote:
> Ada suffered, in its early days, from a convergence of several
> things.

It would make a fascinating book.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 10:02 ` Stephen Leake
@ 2007-01-23 16:49   ` adaworks
  2007-01-23 17:40     ` Markus E Leypold
  2007-01-23 20:10   ` Jeffrey R. Carter
  2007-01-23 21:19   ` Björn Persson
  2 siblings, 1 reply; 397+ messages in thread
From: adaworks @ 2007-01-23 16:49 UTC (permalink / raw)



"Stephen Leake" <stephen_leake@stephe-leake.org> wrote in message 
news:uy7nucb8e.fsf@stephe-leake.org...
>
> Or, in managers, "everyone else is using C, so it must be the best
> language". When I point out that far more programs are written in
> Visual Basic, or Excel, they look very puzzled :).
>
One of the long-forgotten success stories in Ada was at Xerox, another
company that has a history of grabbing defeat from the jaws of victory.

A team of developers at Xerox decided to use Ada for the software on
a new Xerox copier.   The project was a resounding success and every
team member was enthusiastic about the potential for software on other
Xerox projects.   There were analyses showing how Ada was more
cost-effective than C or other alternatives.  It looked as if Ada might
have found its niche in commercial software development.

Not so.  In spite of all the evidence in support of Ada, some idiot
higher up in management decreed that all software must be written
in C.   He had no understanding of Ada.  All he knew was that it
was a DoD language and he wanted no part of it.

This story has repeated itself over and over.  As noted, a lot of people
are reluctant to use a language designed for "killing and maiming."  It
is silly, of course, but programming aptitude has never been a good
predictor for sensible decision-making.

There is a shortage of Ada programmers, so Lockheed-Martin made
the decision to use C++ on some of our major weapon systems.  Not
a particularly wise decision.   They have discovered that, for the software
to be dependable, they must cripple C++ to the point where it is being
used as a "better C," and they have lost all the alleged benefits of C++
except one:  the larger population of university-trained C++ programmers.

Academia has been no better.  As long as the DoD funded projects related
to Ada, professors were happy to take the money.  Once the funding vanished,
those professors redirected their efforts to projects using newer whiz-bang
languages that looked good when they submitted papers for publication.

I recently had the opportunity to teach a beginning class in Java.  What I
discovered is that Java is not type safe, and includes a lot more
opportunities for programming errors than Ada.   It is not any better
designed than Ada, but it does have a lot of libraries.  Most important,
it is easier to get a paper published if it mentions Java than if it
mentions Ada.  A few years ago I was invited to submit a paper to a
conference by the conference chairperson.  I was told not to mention Ada.
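
[The post does not say which type-safety holes are meant; one commonly
cited gap in Java's static typing is array covariance, which defers a
type error to run time. An illustrative sketch, not from the original
post:]

```java
public class Covariance {
    // Java arrays are covariant: a String[] may be used where an
    // Object[] is expected, so this ill-typed store compiles and
    // only fails at run time with ArrayStoreException.
    static boolean triggersArrayStore() {
        Object[] objs = new String[1];     // legal: String[] is-an Object[]
        try {
            objs[0] = Integer.valueOf(42); // compiles, fails at run time
            return false;
        } catch (ArrayStoreException e) {
            return true;                   // the type error surfaces here
        }
    }

    public static void main(String[] args) {
        System.out.println("ArrayStoreException thrown: " + triggersArrayStore());
    }
}
```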

For a variety of reasons, there is a lot of ignorance and bias regarding Ada and
it will not be easy to overcome.   One bright spot is that SPARK has achieved
a high level of respectability and SPARK is an Ada-based environment.

Richard Riehle





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 16:49   ` adaworks
@ 2007-01-23 17:40     ` Markus E Leypold
  2007-01-24 12:51       ` Peter Hermann
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-23 17:40 UTC (permalink / raw)



<adaworks@sbcglobal.net> writes:

> I recently had the opportunity to teach a beginning class in Java.  What I
> discovered is that Java is not type safe, <...> 

How so? I'd be really interested to know.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  5:53 How come Ada isn't more popular? artifact.one
                   ` (3 preceding siblings ...)
  2007-01-23 10:38 ` Alex R. Mosteo
@ 2007-01-23 19:16 ` Tero Koskinen
  2007-01-23 21:12   ` Ludovic Brenta
  2007-01-24 20:10   ` Cesar Rabak
  2007-01-23 20:02 ` Jeffrey R. Carter
                   ` (3 subsequent siblings)
  8 siblings, 2 replies; 397+ messages in thread
From: Tero Koskinen @ 2007-01-23 19:16 UTC (permalink / raw)


Hi,

On 22 Jan 2007 21:53:32 -0800 artifact.one@googlemail.com wrote:
> 
> My question is: how come Ada isn't more popular?
> 

I would like to throw in yet another possible reason(*) for Ada not
being popular among the free software folks.

Currently, the only non-restricted free Ada compiler is FSF GCC/GNAT.
However, the GCC development team doesn't consider Ada to be
a release-critical language, so it gets less love whenever a new GCC
is released, and its quality is sometimes lower than that of the C and
C++ parts of GCC. In addition, the Ada part of GCC supports far fewer
platforms than the C and C++ parts. (**)

So, let's imagine that you are a lead developer in an open source
project (***). One of your goals is to produce software which will
run at least on the following systems:
 * Debian GNU/Linux 3.1: i386, IA-64, ARM, PowerPC, SPARC, MIPS
 * OpenSUSE 10.2: i386, x86-64
 * OpenBSD 4.0: i386, AMD64, ARM, PowerPC, SPARC
 * FreeBSD 6.2: i386, AMD64, Alpha

The question is: Which programming language do you choose?

Ada is ruled out, because of its limited support for non-mainstream
free operating systems.

-Tero

(*) Actually, only a guess
(**) For example, platforms like OpenBSD/{arm,sparc,ppc,amd64} are
totally unsupported.
(***) Like KDE, Subversion, GTK+, or Sendmail




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  5:53 How come Ada isn't more popular? artifact.one
                   ` (4 preceding siblings ...)
  2007-01-23 19:16 ` Tero Koskinen
@ 2007-01-23 20:02 ` Jeffrey R. Carter
  2007-01-24  7:18   ` adaworks
  2007-01-24 14:19   ` Alex R. Mosteo
  2007-01-23 21:36 ` How come Ada isn't more popular? kevin  cline
                   ` (2 subsequent siblings)
  8 siblings, 2 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-23 20:02 UTC (permalink / raw)


artifact.one@googlemail.com wrote:
> 
> My question is: how come Ada isn't more popular?

Much of what others have posted is good, but I also see:

Most developers (98% in my experience) are coders. Coders like languages 
like C that let them code. Then they enjoy spending more time debugging 
than they did coding.

The other 2% are SW engineers. They like languages like Ada.

Obviously, C and its ilk are going to be more popular than Ada.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  6:37 ` adaworks
                     ` (2 preceding siblings ...)
  2007-01-23 15:23   ` Ed Falis
@ 2007-01-23 20:09   ` Jeffrey R. Carter
  2007-01-24  8:50     ` Dmitry A. Kazakov
  2007-01-24 11:06     ` gautier_niouzes
  2007-01-25 11:31   ` Ali Bendriss
  4 siblings, 2 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-23 20:09 UTC (permalink / raw)


adaworks@sbcglobal.net wrote:
>>
> Ada suffered, in its early days, from a convergence of several
> things.  One is that the designers of the language did not anticipate
> the impact of the personal computer and the democratization of
> computing.   There were other factors, as well.

An excellent reply. Ada was also ahead of its time. Computers in 1983 
really weren't up to the demands of Ada.

>      Turbo Pascal and other alternatives were already in place and
>      much cheaper than Ada.   A few brave souls tried to compete
>      with products such as RR Software's Janus Ada and Meridian's
>      AdaVantage, but the full environment (e.g., integrated editors,
>      debuggers, etc.) was not in place as it was for Turbo Pascal.

There's always the question of why, given TP's widespread popularity, C 
became more popular.

> 6) Really good compilers began to appear around 1989.   By then Ada's reputation
>     for being slow, cumbersome, and hard to use had already been firmly set.

Actually, the DEC Ada compiler of 1984 was pretty good.

> 8) Grabbing defeat from the jaws of victory.   In the mid-90s, when Ada
>     became a powerful alternative to other languages, when tools were in
>     place, the language had been modernized, and the availability of
>     low-cost (or free) compilers could have made it attractive, the DoD
>     lost its nerve and gave the impression that Ada was no longer part
>     of the DoD language requirement.  A lot of people misinterpreted
>     this and thought the DoD had decided to abandon Ada entirely.

Windows 95 was the first widely used OS with support for tasking. Ada (95) 
was the only widely available language with support for tasking at the 
time. We probably lost a good opportunity to gain more acceptance of Ada 
by not including a standard windowing library and promoting Ada as the 
best language for taking advantage of Win95's features.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 10:02 ` Stephen Leake
  2007-01-23 16:49   ` adaworks
@ 2007-01-23 20:10   ` Jeffrey R. Carter
  2007-01-23 22:37     ` Frank J. Lhota
  2007-01-23 21:19   ` Björn Persson
  2 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-23 20:10 UTC (permalink / raw)


Stephen Leake wrote:
> 
> Or, in managers, "everyone else is using C, so it must be the best
> language". When I point out that far more programs are written in
> Visual Basic, or Excel, they look very puzzled :).

More programming is done in COBOL than any other language.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 14:24   ` Arthur Evans Jr
@ 2007-01-23 20:11     ` Jeffrey R. Carter
  2007-01-23 21:14       ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-23 20:11 UTC (permalink / raw)


Arthur Evans Jr wrote:
> 
> Ada came out at a time when the government in general and the defense 
> Department in particular were widely perceived as evil. Since Ada was 
> intended to be used to write programs that would kill people, some 
> perceived it as inherently evil. Many folks, myself included, made the 
> argument that wrenches are used to build weapons; should we ban 
> wrenches? Those who had already made up their minds couldn't or wouldn't 
> hear that argument.

These people never seemed to be very concerned about the DOD's 
involvement in COBOL, either.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: AW: How come Ada isn't more popular?
  2007-01-23 10:31   ` Talulah
  2007-01-23 13:48     ` Anders Wirzenius
@ 2007-01-23 20:17     ` Jeffrey R. Carter
  2007-01-23 20:43       ` Pascal Obry
  2007-01-24  9:42       ` Maciej Sobczak
  1 sibling, 2 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-23 20:17 UTC (permalink / raw)


Talulah wrote:
> 
> Popularity is the key thing surely - the chicken and egg. As a software
> manager in a commercial business, I employ C programmers because there
> are so few Ada programmers around in the UK. There are so few Ada
> programmers in the UK because I employ C programmers! How do you break
> that chain?

No one should be hiring X programmers. For long term employees, you 
should be looking for SW engineers, whom you then train if necessary. 
Identifying SW engineers isn't easy, but one clue is that SW engineers 
generally like Ada once they've been exposed to it.

This approach has long term cost savings, as Ada results in SW that is 
ready for deployment sooner and has far fewer post-deployment errors 
than SW in C. Your SW is ready before your competitors, and is higher 
quality.

Plus, your employees, liking Ada, won't be leaving to do C for your 
competitors.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: AW: How come Ada isn't more popular?
  2007-01-23 20:17     ` Jeffrey R. Carter
@ 2007-01-23 20:43       ` Pascal Obry
  2007-01-24  9:42       ` Maciej Sobczak
  1 sibling, 0 replies; 397+ messages in thread
From: Pascal Obry @ 2007-01-23 20:43 UTC (permalink / raw)
  To: Jeffrey R. Carter

Jeffrey R. Carter wrote:

> No one should be hiring X programmers. For long term employees, you
> should be looking for SW engineers, whom you then train if necessary.
> Identifying SW engineers isn't easy, but one clue is that SW engineers
> generally like Ada once they've been exposed to it.

How true! But sadly far from the common practice :(

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|              http://www.obry.net
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver wwwkeys.pgp.net --recv-key C1082595



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 19:16 ` Tero Koskinen
@ 2007-01-23 21:12   ` Ludovic Brenta
  2007-01-24  9:59     ` Maciej Sobczak
  2007-01-24 20:10   ` Cesar Rabak
  1 sibling, 1 reply; 397+ messages in thread
From: Ludovic Brenta @ 2007-01-23 21:12 UTC (permalink / raw)


Tero Koskinen writes:
> The Ada part of GCC supports far fewer platforms than the C and C++ parts.

Actually, I don't think that that's a result of Ada not being a
release-critical language for GCC.  I rather think that that's a
result of too few people contributing to the Ada part of GCC, which is
itself a result of too few people using Ada.  Chicken and egg,
catch-22.  But that also applies to other software; for example
OpenBSD's ports collection is much smaller than Debian's or FreeBSD's;
why is that?

As a counter-example, Aurélien Jarno single-handedly ported GNAT to
GNU/kFreeBSD, which is hardly a mainstream platform.  His patches,
initially available for several versions of GNAT, are now in the
Debian packages; you can review them if you like to get a feeling of
how hard it would be to support, say, OpenBSD.

I believe that it was Samuel Tardieu who contributed the sparc-linux
port back in the 3.13p or 3.14p days, but I may be wrong on this.

> So, let's imagine that you are a lead developer in an open source
> project (***). One of your goals is to produce software, which will
> run at least on the following systems:
>  * Debian GNU/Linux 3.1: i386, IA-64, ARM, PowerPC, SPARC, MIPS
>  * OpenSUSE 10.2: i386, x86-64
>  * OpenBSD 4.0: i386, AMD64, ARM, PowerPC, SPARC
>  * FreeBSD 6.2: i386, AMD64, Alpha
> 
> The question is: Which programming language do you choose?
> (***) Like KDE, Subversion, GTK+, or Sendmail

If your project consists of general-purpose libraries and you want
them available to as many developers as possible, then your best
choice is C; not only because of compiler availability but, more
importantly, because C makes it easy to call your libraries from other
languages.  It is a design goal of GNOME, for example, to support many
languages for application development, and that's why the GTK+ and
GNOME libraries are implemented in C, despite the fact that they are
object-oriented and so would have benefited from an object-oriented
language.

-- 
Ludovic Brenta.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 20:11     ` Jeffrey R. Carter
@ 2007-01-23 21:14       ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-23 21:14 UTC (permalink / raw)



"Jeffrey R. Carter" <jrcarter@acm.org> writes:

> Arthur Evans Jr wrote:
>> Ada came out at a time when the government in general and the
>> defense Department in particular were widely perceived as
>> evil. Since Ada was intended to be used to write programs that would
>> kill people, some perceived it as inherently evil. Many folks,
>> myself included, made the argument that wrenches are used to build
>> weapons; should we ban wrenches? Those who had already made up their
>> minds couldn't or wouldn't hear that argument.
>
> These people never seemed to be very concerned about the DOD's
> involvement in COBOL, either.

Different generation?

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 10:02 ` Stephen Leake
  2007-01-23 16:49   ` adaworks
  2007-01-23 20:10   ` Jeffrey R. Carter
@ 2007-01-23 21:19   ` Björn Persson
  2 siblings, 0 replies; 397+ messages in thread
From: Björn Persson @ 2007-01-23 21:19 UTC (permalink / raw)


Stephen Leake wrote:

> Because most people don't have the same attitude towards language
> evaluation that you do.
> 
> Most that I've actually asked have the attitude "C was what I learned
> in college, and it's good enough for me".

Lots of people also seem to pounce on the latest hype, thinking that latest
equals greatest. (I suppose that's what makes up a hype.) Those who realize
that C has serious problems typically hope for a new language to solve
those problems. It doesn't seem to occur to them that the problems might
have already been solved in some existing language. At least that's the
impression I get.

-- 
Björn Persson                              PGP key A88682FD
                   omb jor ers @sv ge.
                   r o.b n.p son eri nu



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  5:53 How come Ada isn't more popular? artifact.one
                   ` (5 preceding siblings ...)
  2007-01-23 20:02 ` Jeffrey R. Carter
@ 2007-01-23 21:36 ` kevin  cline
  2007-01-23 22:18   ` Martin Dowie
  2007-01-24 19:33   ` Arthur Evans Jr
  2007-01-24  0:12 ` JPWoodruff
  2007-03-05  2:19 ` Brian May
  8 siblings, 2 replies; 397+ messages in thread
From: kevin  cline @ 2007-01-23 21:36 UTC (permalink / raw)



artifact.one@googlemail.com wrote:
> Hello.
>
> I am a long time C programmer (10 years plus), having a look
> at Ada for the first time. From my (inadequate) testing, it seems
> that performance of average Ada code is on par with average
> C code, and there's a clear advantage in runtime safety. The
> GNU ada compiler makes pretty sure that there are very few
> platforms without easy access to Ada, so portability should be
> on at least an equal footing too.
>
> My question is: how come Ada isn't more popular?

1. Ada-83 simply sucked.  Expensive, inexpressive, with poor libraries
and no support for writing desktop applications.  Ada-83 was designed
for embedded development, and was OK for that purpose, but it was
hopeless for writing hosted applications.

2. For the same reason that Esperanto isn't more popular.  Ada was
designed by a committee to meet theoretical needs.  Most popular
languages have evolved to meet practical needs.  They grew from humble
beginnings to widespread acceptance.  Theoretically, they may be
abominations, but they get the job done.

3. For the same reason that Limburger cheese isn't more popular.  Most
programmers who have tried Ada didn't like it.  What makes a programmer
like a new language?  Usually, someone comes along and says something
like "Remember that program that we spent two weeks writing in C?
Here's a Perl implementation that I put together in three hours and
one-tenth the code."  That's never happened with Ada.



>
> This isn't intended to start a flame war, I'm genuinely interested.
> 
> thanks,
> MC




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 10:38 ` Alex R. Mosteo
  2007-01-23 12:58   ` gautier_niouzes
@ 2007-01-23 21:56   ` Dr. Adrian Wrigley
  2007-01-24 13:52     ` Alex R. Mosteo
                       ` (5 more replies)
  1 sibling, 6 replies; 397+ messages in thread
From: Dr. Adrian Wrigley @ 2007-01-23 21:56 UTC (permalink / raw)


On Tue, 23 Jan 2007 11:38:28 +0100, Alex R. Mosteo wrote:

> artifact.one@googlemail.com wrote:
> 
>> Hello.
>> 
>> I am a long time C programmer (10 years plus), having a look
>> at Ada for the first time. From my (inadequate) testing, it seems
>> that performance of average Ada code is on par with average
>> C code, and there's a clear advantage in runtime safety. The
>> GNU ada compiler makes pretty sure that there are very few
>> platforms without easy access to Ada, so portability should be
>> on at least an equal footing too.
>> 
>> My question is: how come Ada isn't more popular?
> 
> Others have given longer scoped responses, and I will concentrate on the
> hobbyist POV (I have felt an Ada hobbyist for a long time now): there is a
> catch-22 problem with Ada and it is the lack of libraries. This is a
> relative problem, consider these points.
> 
> 1) The standard library is really standard, so this is an advantage if it
> does all you need. Also some features (e.g. fixed point, bound checking,
> tasking!) are in the language so you don't need extra wrappers around the
> basic language or OS.
> 
> 2) There's no good, easy, almost automatic C binding generator, although the
> language has well defined mechanisms for C interfacing. Yes, there was some
> generator. No, it is not trivial at present to get it running in my
> experience. There's some effort to have Ada integrated into SWIG; this is
> promising and IMHO an important selling point to newcomers.

I think this is critical.  Why can't we just say:

with stdio;

pragma import (C, stdio, "stdio.h");

and be able to get structs, functions, constants, variables from C in
an obvious and reasonably reliable way?

Much of what is in C has direct analogs in Ada.  Some of it is via
fiddly #defines, but even a useful subset of these would be enough.

And of course compilers should spit out header files on request
matching an ada package via the "obvious" rules, so you can
#include it from C.
--
Adrian





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 21:36 ` How come Ada isn't more popular? kevin  cline
@ 2007-01-23 22:18   ` Martin Dowie
  2007-01-24  4:14     ` Alexander E. Kopilovich
                       ` (2 more replies)
  2007-01-24 19:33   ` Arthur Evans Jr
  1 sibling, 3 replies; 397+ messages in thread
From: Martin Dowie @ 2007-01-23 22:18 UTC (permalink / raw)


kevin cline wrote:
> 3. For the same reason that Limburger cheese isn't more popular.  Most
> programmers who have tried Ada didn't like it.  What makes a programmer
> like a new language?  Usually, someone comes along and says something
> like "Remember that program that we spent two weeks writing in C?
> Here's a Perl implementation that I put together in three hours and
> one-tenth the code."  That's never happened with Ada.
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

FUD!!

http://www.stsc.hill.af.mil/crosstalk/2000/08/mccormick.html



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 20:10   ` Jeffrey R. Carter
@ 2007-01-23 22:37     ` Frank J. Lhota
  2007-01-24  7:27       ` Jeffrey R. Carter
  0 siblings, 1 reply; 397+ messages in thread
From: Frank J. Lhota @ 2007-01-23 22:37 UTC (permalink / raw)


"Jeffrey R. Carter" <jrcarter@acm.org> wrote in message 
news:WYtth.318145$FQ1.108931@attbi_s71...
> More programming is done in COBOL than any other language.

Several decades ago, that was definitely true, but is it /still/ true today? 
If one measures things by job openings, then COBOL appears to be outpaced by 
C++, Java, and C#.





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  5:53 How come Ada isn't more popular? artifact.one
                   ` (6 preceding siblings ...)
  2007-01-23 21:36 ` How come Ada isn't more popular? kevin  cline
@ 2007-01-24  0:12 ` JPWoodruff
  2007-01-24 10:32   ` gautier_niouzes
  2007-01-25  1:01   ` Alexander E. Kopilovich
  2007-03-05  2:19 ` Brian May
  8 siblings, 2 replies; 397+ messages in thread
From: JPWoodruff @ 2007-01-24  0:12 UTC (permalink / raw)




On Jan 22, 9:53 pm, artifact....@googlemail.com wrote:

>
> My question is: how come Ada isn't more popular?
>
I have another hypothesis that involves the way many programmers got
started at a young age.  For some decades, classes of smart young
teenagers have had easy access to computers and amateur tools, and
have honed their skills at what most of them called "hacking".  They
learned to reason in low levels of abstraction.  They spent a lot of
time in thread-of-execution debugging.

I think that software engineers who started their understanding in
that paradigm are a hard sell for Ada.  They do have techniques that
work and there are plentiful examples of their success, but we Ada
guys prefer something different.

There are other ways to come to the software engineering mindset. One
way is to want to write interesting essays in the form of executable
programs.  Ada is one of the finest tools for this task - at least as
far as our kind of program is concerned.

I submit Richard as an example - he writes whereof he knows.  A few of
the other deponents on this conversation are no slouches either. 

John




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 22:18   ` Martin Dowie
@ 2007-01-24  4:14     ` Alexander E. Kopilovich
  2007-01-24  7:30       ` Jeffrey R. Carter
  2007-01-24  7:31     ` Jeffrey R. Carter
  2007-01-24  7:42     ` kevin  cline
  2 siblings, 1 reply; 397+ messages in thread
From: Alexander E. Kopilovich @ 2007-01-24  4:14 UTC (permalink / raw)
  To: comp.lang.ada

Martin Dowie wrote:

>>  What makes a programmer
>> like a new language?  Usually, someone comes along and says something
>> like "Remember that program that we spent two weeks writing in C?
>> Here's a Perl implementation that I put together in three hours and
>> one-tenth the code."  That's never happened with Ada.
>                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>
>FUD!!
>
>http://www.stsc.hill.af.mil/crosstalk/2000/08/mccormick.html

Is FUD a reserved word in Ada?

By the way, I think that the referenced (by above URL) article does not
contradict the apparently contested (under-marked) sentence, because
circumstances are far too different in several important aspects.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 20:02 ` Jeffrey R. Carter
@ 2007-01-24  7:18   ` adaworks
  2007-01-24 14:19   ` Alex R. Mosteo
  1 sibling, 0 replies; 397+ messages in thread
From: adaworks @ 2007-01-24  7:18 UTC (permalink / raw)



"Jeffrey R. Carter" <jrcarter@acm.org> wrote in message 
news:fRtth.363405$1i1.178883@attbi_s72...
> artifact.one@googlemail.com wrote:
>
> Most developers (98% in my experience) are coders. Coders like languages like 
> C that let them code. They then enjoy spending more time debugging than they 
> did coding.
>
> The other 2% are SW engineers. They like languages like Ada.
>
> Obviously, C and its ilk are going to be more popular than Ada.
>
Fast-food joints are more popular than regular restaurants.   Ada is
the equivalent of a nourishing meal rather than a greasy burger.  C++
is the equivalent of peanut-brittle.  It tastes good at first, then it gets
stuck in your teeth, and then it causes your molars to decay.

Richard Riehle 





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 22:37     ` Frank J. Lhota
@ 2007-01-24  7:27       ` Jeffrey R. Carter
  2007-01-24  9:50         ` Maciej Sobczak
  0 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-24  7:27 UTC (permalink / raw)


Frank J. Lhota wrote:
> 
> Several decades ago, that was definitely true, but is it /still/ true today? 
> If one measures things by job openings, then COBOL appears to be outpaced by 
> C++, Java, and C#.

The most recent survey of actual projects that I've seen (in CACM, IIRC) 
still had COBOL in 1st place.

-- 
Jeff Carter
"Every sperm is sacred."
Monty Python's the Meaning of Life
55



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  4:14     ` Alexander E. Kopilovich
@ 2007-01-24  7:30       ` Jeffrey R. Carter
  2007-01-24 20:15         ` Alexander E. Kopilovich
  0 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-24  7:30 UTC (permalink / raw)


Alexander E. Kopilovich wrote:
> 
> By the way, I think that the referenced (by above URL) article does not
> contradict the apparently contested (under-marked) sentence, because
> circumstances are far too different in several important aspects.

Could you elaborate? It seemed pretty close to a controlled experiment 
to me. The students were not specially selected; the problem was the 
same; only the language and amount of solution made available to the 
students changed.

-- 
Jeff Carter
"Every sperm is sacred."
Monty Python's the Meaning of Life
55



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 22:18   ` Martin Dowie
  2007-01-24  4:14     ` Alexander E. Kopilovich
@ 2007-01-24  7:31     ` Jeffrey R. Carter
  2007-01-24  7:42     ` kevin  cline
  2 siblings, 0 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-24  7:31 UTC (permalink / raw)


Martin Dowie wrote:
> 
> FUD!!

The whole post is FUD. It's an obvious troll, full of outright lies, and 
hardly worth the effort of a reply.

-- 
Jeff Carter
"Every sperm is sacred."
Monty Python's the Meaning of Life
55



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 22:18   ` Martin Dowie
  2007-01-24  4:14     ` Alexander E. Kopilovich
  2007-01-24  7:31     ` Jeffrey R. Carter
@ 2007-01-24  7:42     ` kevin  cline
  2007-01-24  8:07       ` Ludovic Brenta
                         ` (3 more replies)
  2 siblings, 4 replies; 397+ messages in thread
From: kevin  cline @ 2007-01-24  7:42 UTC (permalink / raw)




On Jan 23, 4:18 pm, Martin Dowie <martin.do...@btopenworld.remove.com>
wrote:
> kevin cline wrote:
> > 3. For the same reason that Limburger cheese isn't more popular.  Most
> > programmers who have tried Ada didn't like it.  What makes a programmer
> > like a new language?  Usually, someone comes along and says something
> > like "Remember that program that we spent two weeks writing in C?
> > Here's a Perl implementation that I put together in three hours and
> > one-tenth the code."  That's never happened with Ada.
>                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>
> FUD!!
>
> http://www.stsc.hill.af.mil/crosstalk/2000/08/mccormick.html

Yes, I've read that article.  It would really be sad if Ada were not
superior to C for a toy problem in embedded control system development,
since Ada was designed specifically for that purpose.  But the point
was that expressiveness drives programmers to new languages, and Ada
isn't particularly expressive.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  7:42     ` kevin  cline
@ 2007-01-24  8:07       ` Ludovic Brenta
  2007-01-24 12:12         ` Markus E Leypold
                           ` (2 more replies)
  2007-01-24 16:14       ` adaworks
                         ` (2 subsequent siblings)
  3 siblings, 3 replies; 397+ messages in thread
From: Ludovic Brenta @ 2007-01-24  8:07 UTC (permalink / raw)


kevin  cline writes:
> But the point was that expressiveness drives programmers to new
> languages, and Ada isn't particularly expressive.

On the contrary, I think that Ada is the most expressive language
around.  Consider:

procedure Set_Bit_In_Register (At_Address : in System.Address) is
   type Register is array (1 .. 32) of Boolean;
   pragma Pack (Register);
   for Register'Bit_Order use System.High_Order_First;
   pragma Volatile (Register);
   
   R : Register;
   for R'Address use At_Address;
begin
   R (4) := True;
end;

versus

void set_bit_in_register (volatile unsigned long * at_address)
{
   *at_address |= 2 << 3;
}

The Ada version makes explicit many more things that are assumed and
implicit in C; for example, the size of the register, the fact that
the parameter is an address and not a pointer (*), the endianness, and
which bit is being set.  As 64-bit architectures become prevalent, the
hidden assumption that C's "unsigned long" is 32 bits wide is more and
more likely to be incorrect.

(*) consider that when we increment the address by one, it then
references the next byte; whereas if we increment the pointer by one,
it points to the next "unsigned long", i.e. 2, 4 or 8 bytes and not 1
byte further.  C makes no distinction between addresses and pointers,
lacking expressiveness in a crucial area.

When calling the subprogram, we get:

Set_Bit_In_Register (At_Address => To_Address (16#DEADBEEF#));

versus

set_bit_in_register (0xDEADBEEF);

Again, at the call site, the Ada version gives more information to the
human programmer, i.e. is more expressive.

Expressiveness is not to be confused with conciseness.

-- 
Ludovic Brenta.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 20:09   ` Jeffrey R. Carter
@ 2007-01-24  8:50     ` Dmitry A. Kazakov
  2007-01-24 20:23       ` Jeffrey R. Carter
  2007-01-24 11:06     ` gautier_niouzes
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-01-24  8:50 UTC (permalink / raw)


On Tue, 23 Jan 2007 20:09:48 GMT, Jeffrey R. Carter wrote:

> adaworks@sbcglobal.net wrote:
> 
>>      Turbo Pascal and other alternatives were already in place and
>>      much cheaper than Ada.   A few brave souls tried to compete
>>      with products such as RR Software's Janus Ada and Meridian's
>>      AdaVantage, but the full environment (e.g., integrated editors,
>>      debuggers, etc.) were not in place as they were for Turbo Pascal.
> 
> There's always the question of why, given TP's widespread popularity, C 
> became more popular.

Because C was sufficiently worse. At some point Visual Basic came and
superseded both... (:-))

>> 6) Really good compilers began to appear around 1989.   By then Ada's reputation
>>     for being slow, cumbersome, and hard to use had already been firmly set.
> 
> Actually, the DEC Ada compiler of 1984 was pretty good.

Yes!
 
-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: AW: How come Ada isn't more popular?
  2007-01-23 20:17     ` Jeffrey R. Carter
  2007-01-23 20:43       ` Pascal Obry
@ 2007-01-24  9:42       ` Maciej Sobczak
  2007-01-24 20:48         ` Jeffrey R. Carter
  1 sibling, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-01-24  9:42 UTC (permalink / raw)


Jeffrey R. Carter wrote:

> For long term employees, you 
> should be looking for SW engineers, whom you then train if necessary.

Yes.

> Identifying SW engineers isn't easy

Especially when the management (the hiring guys) are not SWEs themselves.

> but one clue is that SW engineers 
> generally like Ada once they've been exposed to it.

Sorry, but this is made up of very thin air.
What about SWEs that were never exposed to Ada?
What about coders that were exposed to Ada but still have no clue?

The only thing that backs up your statement is that an average Ada 
programmer is probably more competent than an average C programmer, but 
even though this correlation is statistically true, it is not necessarily 
related to the virtues of any language, but rather to the fact that it 
requires more self-determination (and therefore professional discipline) 
to learn Ada in the world where Ada just does not sell (see "How come 
Ada isn't more popular" thread). Learning C or Java comes for free and 
the rest is just statistics, not the rule. There *are* SWEs that use 
other languages.

> This approach has long term cost savings, as Ada results in SW that is 
> ready for deployment sooner and has far fewer post-deployment errors 
> than SW in C.

As if these were the only programming languages in the world. There are 
~2500, according to some very conservative estimations, so there is no 
need to keep comparing just these two. "Ada is good, because it's better 
than C" - is this the only thing that Ada can offer? :-) With ~2500 
languages around just being better than C does not count as any 
advantage, so I don't understand why you use this as an argument so 
often.
If you want to sell Ada, compare it to Java or C++ or C#, for example.

> Your SW is ready before your competitors, and is higher 
> quality.

Just to flame a bit, I can write a database client library in C++ faster 
than in Ada without compromising its quality (see my recent posts 
concerning how much fun I've had with [Limited_]Controlled). ;-)

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  7:27       ` Jeffrey R. Carter
@ 2007-01-24  9:50         ` Maciej Sobczak
  2007-01-24 20:25           ` Jeffrey R. Carter
  0 siblings, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-01-24  9:50 UTC (permalink / raw)


Jeffrey R. Carter wrote:

>> Several decades ago, that was definitely true, but is it /still/ true 
>> today? If one measures things by job openings, then COBOL appears to 
>> be outpaced by C++, Java, and C#.
> 
> The most recent survey of actual projects that I've seen (in CACM, IIRC) 
> still had COBOL in 1st place.

Did it cover open-source projects as well?
COBOL was popular at the time when development was centralized in big 
companies, so it was easier to count the number of lines. Today every 
kid is coding something and it's even hard to estimate how much code is 
written every day that is just unnoticed. Just having Windows as a major 
operating system (with milions of development shops shipping software 
for it) gives a hint that COBOL might not be a winner any longer.

--
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 21:12   ` Ludovic Brenta
@ 2007-01-24  9:59     ` Maciej Sobczak
  2007-01-24 18:22       ` Yves Bailly
  2007-01-24 19:18       ` Markus E Leypold
  0 siblings, 2 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-01-24  9:59 UTC (permalink / raw)


Ludovic Brenta wrote:

> If your project consists of general-purpose libraries and you want
> them available to as many developers as possible, then your best
> choice is C

That's a misconception as well, quite common in an open-source world.

> not only because of compiler availability because, more
> importantly, because C makes it easy to call your libraries from other
> languages.

 > It is a design goal of GNOME, for example, to support many
> languages for application development, and that's why the GTK+ and
> GNOME libraries are implemented in C, despite the fact that they are
> object-oriented and so would have benefited from an object-oriented
> language.

Windows was implemented in C++, and it has a C API.
Encapsulation is what separates the language choice for the interface
from the one for the implementation - well, at least to some extent,
and provided that both languages are easily "bindable". You can even
have a C API for an Ada implementation - pragma Export is just as
useful as pragma Import!
In other words, you don't need to use C for the implementation even if
you want to have a C API for reasons of usability. This is exactly what
the open-source guys don't seem to get right.

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  0:12 ` JPWoodruff
@ 2007-01-24 10:32   ` gautier_niouzes
  2007-01-25  1:01   ` Alexander E. Kopilovich
  1 sibling, 0 replies; 397+ messages in thread
From: gautier_niouzes @ 2007-01-24 10:32 UTC (permalink / raw)


> > My question is: how come Ada isn't more popular?
>
> I have another hypothesis that involves the way many programmers got
> started at a young age.  For some decades, classes of smart young
> teenagers have had easy access to computers and amateur tools, and
> have honed their skills at what most of them called "hacking".  They
> learned to reason in low levels of abstraction.  They spent a lot of
> time in thread-of-execution debugging.
>
> I think that software engineers who started their understanding in
> that paradigm are a hard sell for Ada.  They do have techniques that
> work and there are plentiful examples of their success, but we Ada
> guys prefer something different.
[...]

Probably one hurdle for Ada is that the "Ada guys", self-styled "we,
the true Software Engineers", want to keep "their" language for
themselves and discourage the "young hackers" from even taking a look
at it when they mature... :-)
You are missing some aspects:
- a hacking teenager (I was one) is able to evolve and to see that some
previous "hacks" stop working sooner, because there is too much
entanglement between I/O, GUI, system and libraries, or because the
code was too cryptic
- it is possible to hack in Ada; no surprise, such programs are the
only ones that still compile and run after a ten-year sleep, on a new
environment; you are happy there was no conditional compilation in the
source, whereas other hacks in another language stop the compiler at
line 2 or 3...
- not having hacked at a young age does not help one program better.
If you look at code from any era, you see that untalented people
program exactly as poorly whatever they did when young. You see the
same huge chunks of copy-paste style instruction blocks in old Fortran
code or in recent code in any language, with a mix of
interactive/non-interactive code and a mix of abstraction levels; these
people find that a subprogram is a kind of magic, so they prefer to
activate the copy-paste machine, which they think is safer...
If you succeeded in your effort at a generation split (drawing Ada on
the "old" side, then into the coffin), yes, there would be trouble for
that language...
______________________________________________________________
Gautier         -- http://www.mysunrise.ch/users/gdm/index.htm
Ada programming -- http://www.mysunrise.ch/users/gdm/gsoft.htm

NB: For a direct answer, e-mail address on the Web site!




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 20:09   ` Jeffrey R. Carter
  2007-01-24  8:50     ` Dmitry A. Kazakov
@ 2007-01-24 11:06     ` gautier_niouzes
  2007-01-24 19:25       ` tmoran
  1 sibling, 1 reply; 397+ messages in thread
From: gautier_niouzes @ 2007-01-24 11:06 UTC (permalink / raw)


Jeffrey R. Carter:

> >      Turbo Pascal and other alternatives were already in place and
> >      much cheaper than Ada.   A few brave souls tried to compete
> >      with products such as RR Software's Janus Ada and Meridian's
> >      AdaVantage, but the full environment (e.g., integrated editors,
> >      debuggers, etc.) were not in place as they were for Turbo Pascal.
> There's always the question of why, given TP's widespread popularity, C
> became more popular.

It has to do with the deep unportability of Pascal and, as a
consequence, the fragmentation of Pascal into incompatible dialects.
At the time you had Amigas, Ataris and Macs; you had MS Windows coming
to replace DOS, so a DOS-oriented Pascal dialect had little chance
against C, except for a short time.
TP was an extremely fast compiler producing unoptimized code (apart
from some trivial XOR AX,AX's), but with CPU frequencies rising quickly
around 1990, the interest shifted toward profiting from that speed in
the compiled code and away from having a couple million more LoC
compiled per second.
...
> Windows 95 was the 1st widely used OS with support for tasking. Ada (95)
> was the only widely available language with support for tasking at the
> time. We probably lost a good opportunity to gain more acceptance of Ada
> by not including a standard windowing library and promoting Ada as the
> best language for taking advantage of Win95's features.

Mmmh, I think it was a good idea *not* to include a standard windowing
library: otherwise Ada would now be stuck with an outdated standard
windowing library. There was also another problem then: the lack of a
good but cheap or free compiler.
Don't be so pessimistic, Ada's qualities only appear with time - and of
course with the effort of brave souls.
If you say "I'm a smart software engineer, Ada is for me and not for
you", you won't help Ada.
If you make good, visible, useful open-source software with Ada, you
will help.
______________________________________________________________
Gautier         -- http://www.mysunrise.ch/users/gdm/index.htm
Ada programming -- http://www.mysunrise.ch/users/gdm/gsoft.htm

NB: For a direct answer, e-mail address on the Web site!




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  8:07       ` Ludovic Brenta
@ 2007-01-24 12:12         ` Markus E Leypold
  2007-01-24 12:48           ` Ludovic Brenta
  2007-01-24 13:40           ` Pascal Obry
  2007-01-24 16:25         ` Adam Beneschan
  2007-02-06  9:54         ` Dave Thompson
  2 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-24 12:12 UTC (permalink / raw)



Hi Ludovic,

Ludovic Brenta <ludovic@ludovic-brenta.org> writes:

> kevin  cline writes:
>> But the point was that expressiveness drives programmers to new
>> languages, and Ada isn't particularly expressive.
>
> On the contrary, I think that Ada is the most expressive language
> around.  

If I were in the business of language advocacy as some people in this
thread obviously are, I'd now cry: "FUD!!"

Anyway, I have to disagree. You'd have to restrict the scope
of your statement a bit (to a special application area or a specific
subset of all programming languages) for it to become true.

I stipulate that languages with a Hindley-Milner type system and/or
functional languages are, in many respects, more expressive. Even more
so if they have modules, functors and classes, as OCaml does.

Or consider Haskell which is a VERY expressive language.

Of course it all depends a bit on how you define "expressiveness".

I do not want to denigrate Ada here. But I think judging the place of
Ada in the world right is more important (or useful to Ada or the
community) than claiming ALL the superlatives for Ada.


> procedure Set_Bit_In_Register (At_Address : in System.Address) is
>    type Register is array (1 .. 32) of Boolean;
>    pragma Pack (Register);
>    for Register'Bit_Order use System.High_Order_First;
>    pragma Volatile (Register);
>    
>    R : Register;
>    for R'Address use At_Address;
> begin
>    Register (4) := True;
> end;
>
> versus
>
> void set_bit_in_register (volatile unsigned long * at_address)
> {
>    *at_address |= 2 << 3;
> }


<cynical mode>

You're sure you're not confusing verbosity with "expressiveness"? :-)

</cynical mode>


Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 12:12         ` Markus E Leypold
@ 2007-01-24 12:48           ` Ludovic Brenta
  2007-01-24 14:49             ` Markus E Leypold
  2007-01-24 13:40           ` Pascal Obry
  1 sibling, 1 reply; 397+ messages in thread
From: Ludovic Brenta @ 2007-01-24 12:48 UTC (permalink / raw)


Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> Hi Ludovic,
>
> Ludovic Brenta <ludovic@ludovic-brenta.org> writes:
>
>> kevin  cline writes:
>>> But the point was that expressiveness drives programmers to new
>>> languages, and Ada isn't particularly expressive.
>>
>> On the contrary, I think that Ada is the most expressive language
>> around.  
>
> If I were in the business of language advocacy as some people in this
> thread obviously are, I'd now cry: "FUD!!"
[...]
> I do not want to denigrate Ada here. But I think judging the place of
> Ada in the world right is more important (or useful to Ada or the
> community) than claming ALL the superlatives for Ada.

OK, I'll take back what I said above, and replace with "Ada is the
most expressive language I know of."  I can't comment on Haskell or
OCaml because I don't know them well enough.

>> procedure Set_Bit_In_Register (At_Address : in System.Address) is
>>    type Register is array (1 .. 32) of Boolean;
>>    pragma Pack (Register);
>>    for Register'Bit_Order use System.High_Order_First;
>>    pragma Volatile (Register);
>>    
>>    R : Register;
>>    for R'Address use At_Address;
>> begin
>>    Register (4) := True;
>> end;
>>
>> versus
>>
>> void set_bit_in_register (volatile unsigned long * at_address)
>> {
>>    *at_address |= 2 << 3;
>> }
>
> <cynical mode>
> You're sure you're not confusing verbosity with "expressiveness"? :-)
> </cynical mode>

No, but there is bound to be some correlation.  Expressiveness is the
ability to carry a lot of information across to the human programmer
as well as to the compiler.  Verbosity, or its opposite conciseness,
is the density of that information, as in "information units per line
of code" or some such ill-defined measure.

Ada is more expressive than C because it allows programmers to express
more information.  In a way, it is also more concise in that Ada
compilers insert all sorts of implicit checks, and in that Ada has
built-in constructs like tasking, array slices and return of
dynamically-sized objects that require much more lines of code to
achieve in C.
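Ludovic's point about dynamically-sized results is easy to see from the C side: what an Ada function returning an unconstrained array does in one return statement takes explicit heap allocation and a hand-made length convention in C. A minimal sketch, with function and variable names of my own invention (nothing below is from the thread):

```c
#include <stdlib.h>
#include <string.h>

/* In Ada one can write:  function Greeting (Name : String) return String;
   and let the result's bounds be computed at run time.  The C analogue
   must manage length and ownership by hand: the caller receives a
   malloc'd buffer it is responsible for freeing. */
static char *make_greeting(const char *name, size_t *out_len)
{
    const char *prefix = "hello, ";
    size_t n = strlen(prefix) + strlen(name);
    char *buf = malloc(n + 1);
    if (buf == NULL) {
        *out_len = 0;
        return NULL;
    }
    memcpy(buf, prefix, strlen(prefix));
    memcpy(buf + strlen(prefix), name, strlen(name) + 1);
    *out_len = n;
    return buf;
}
```

Every caller then has to repeat the check-and-free discipline that the Ada compiler and runtime handle mechanically.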

But C is more concise than Ada in other ways; for example "volatile
unsigned long *" does not require a separate type definition, and "{}"
takes only 11.765% of the space of "begin; null; end;"
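The 11.765% figure checks out: "{}" is 2 characters and "begin; null; end;" is 17, and 2/17 is 11.765%. A throwaway check (the function is mine, not from the thread):

```c
#include <string.h>

/* Percentage of the Ada-style empty body occupied by the C one:
   100 * len("{}") / len("begin; null; end;") = 100 * 2 / 17. */
static double empty_body_percentage(void)
{
    return 100.0 * (double)strlen("{}")
                 / (double)strlen("begin; null; end;");
}
```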

-- 
Ludovic Brenta.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 17:40     ` Markus E Leypold
@ 2007-01-24 12:51       ` Peter Hermann
  2007-01-24 14:42         ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Peter Hermann @ 2007-01-24 12:51 UTC (permalink / raw)


Markus E Leypold wrote:
> <adaworks@sbcglobal.net> writes:
> > discovered is that Java is not type safe, <...> 
> 
> How so? I'd be really interested to know.

Java adopted a type system (at least for its scalar types)
which was about 20 years outdated at the time of Java's creation.
There is a lack of an important layer of abstraction
resulting from a C and Assembler mindset.

-- 
--Peter.Hermann@ihr.uni-stuttgart.de        (+49)0711-685-872-44(Fax79)
--Nobelstr.19 Raum 0.030, D-70569 Stuttgart IHR Hoechstleistungsrechnen
--http://www.ihr.uni-stuttgart.de/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 12:12         ` Markus E Leypold
  2007-01-24 12:48           ` Ludovic Brenta
@ 2007-01-24 13:40           ` Pascal Obry
  2007-01-24 14:50             ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Pascal Obry @ 2007-01-24 13:40 UTC (permalink / raw)
  To: Markus E Leypold

Markus E Leypold a écrit :
> <cynical mode>
> 
> You're sure you're not confusing verbosity with "expressiveness"? :-)
> 
> </cynical mode>

Let's try :

   pragma Remote_Call_Interface;

as an expressiveness example :)

I won't even try to give the equivalent C/C++ code!

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|              http://www.obry.net
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver wwwkeys.pgp.net --recv-key C1082595



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 21:56   ` Dr. Adrian Wrigley
@ 2007-01-24 13:52     ` Alex R. Mosteo
  2007-01-24 19:25     ` tmoran
                       ` (4 subsequent siblings)
  5 siblings, 0 replies; 397+ messages in thread
From: Alex R. Mosteo @ 2007-01-24 13:52 UTC (permalink / raw)


Dr. Adrian Wrigley wrote:

>> 2) There's no good, easy, almost automatic C binding generator, although
>> the language has well defined mechanisms for C interfacing. Yes, there
>> was some generator. No, it is not trivial at present to get it running in
>> my experience. There's some effort to have Ada integrated into SWIG; this
>> is promising and IMHO an important selling point to newcomers.
> 
> I think this is critical.  Why can't we just say:
> 
> with stdio;
> 
> pragma import (C, stdio, "stdio.h");
> 
> and be able to get structs, functions, constants, variables from C in
> an obvious and reasonably reliable way?

Ah, that sounds even better... One can dream...



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 20:02 ` Jeffrey R. Carter
  2007-01-24  7:18   ` adaworks
@ 2007-01-24 14:19   ` Alex R. Mosteo
  2007-01-24 15:27     ` Poll on background of Ada people (was: How come Ada isn't more po) Larry Kilgallen
  1 sibling, 1 reply; 397+ messages in thread
From: Alex R. Mosteo @ 2007-01-24 14:19 UTC (permalink / raw)


Jeffrey R. Carter wrote:

> artifact.one@googlemail.com wrote:
>> 
>> My question is: how come Ada isn't more popular?
> 
> Much of what others have posted is good, but I also see:
> 
> Most developers (98% in my experience) are coders. Coders like languages
> like C that let them code. Then they enjoy spending more time debugging
> than they did coding.
> 
> The other 2% are SW engineers. They like languages like Ada.
> 
> Obviously, C and its ilk are going to be more popular than Ada.

Perhaps we could run an informal poll to see the background of Ada people
here.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 12:51       ` Peter Hermann
@ 2007-01-24 14:42         ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-24 14:42 UTC (permalink / raw)



Peter Hermann <ica2ph@csv.ica.uni-stuttgart.de> writes:

> Markus E Leypold wrote:
>> <adaworks@sbcglobal.net> writes:
>> > discovered is that Java is not type safe, <...> 
>> 
>> How so? I'd be really interested to know.
>
> Java adopted a type system (at least for its scalar types)
> which was about 20 years outdated at the time of Java's creation.
> There is a lack of an important layer of abstraction
> resulting from a C and Assembler mindset.

So the type system is not rich enough, at least if you come from the
Ada world with all that interesting subtyping machinery.

But how is Java not _type safe_? 

I believe Luca Cardelli gave a good definition of what type safe means in his paper

    http://research.microsoft.com/Users/luca/Papers/TypeSystems.pdf

But I fail to see how "execution errors" in the sense given there can
occur in Java.  Shall I elaborate? 

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 12:48           ` Ludovic Brenta
@ 2007-01-24 14:49             ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-24 14:49 UTC (permalink / raw)



Ludovic Brenta <ludovic@ludovic-brenta.org> writes:

> Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>
>> Hi Ludovic,
>>
>> Ludovic Brenta <ludovic@ludovic-brenta.org> writes:
>>
>>> kevin  cline writes:
>>>> But the point was that expressiveness drives programmers to new
>>>> languages, and Ada isn't particularly expressive.
>>>
>>> On the contrary, I think that Ada is the most expressive language
>>> around.  
>>
>> If I were in the business of language advocacy as some people in this
>> thread obviously are, I'd now cry: "FUD!!"
> [...]
>> I do not want to denigrate Ada here. But I think judging the place of
>> Ada in the world right is more important (or useful to Ada or the
>> community) than claiming ALL the superlatives for Ada.
>
> OK, I'll take back what I said above, and replace with "Ada is the
> most expressive language I know of."  I can't comment on Haskell or
> OCaml because I don't know them well enough.

Fine. I agree it's the most expressive one of the "classic" imperative
family, under which I count FORTRAN, C, C++, Modula, Pascal and so on
(forgive me, there is a similarity, so I think it makes sense for a
rough classification to put all those languages into one big
super-family, as opposed to, for example, the more or less functional
languages from LISP to Haskell).


>>
>> <cynical mode>
>> You're sure you're not confusing verbosity with "expressiveness"? :-)
>> </cynical mode>
>
> No, but there is bound to be some correlation.  Expressiveness is the
> ability to carry a lot of information across to the human programmer
> as well as to the compiler.  Verbosity, or its opposite conciseness,
> is the density of that information, as in "information units per line
> of code" or some such ill-defined measure.

Difficult. With time I've learnt to like the type inference and
annotations of OCaml + Haskell and am starting to get annoyed by the
Pascal style. Here verbosity buys me hardly anything and leaves me
with the necessity to state and restate the same thing everywhere. So
no: there is a correlation between verbosity and expressiveness, but
not a very strict one.

>
> Ada is more expressive than C because it allows programmers to express
> more information.  In a way, it is also more concise in that Ada

I admit it has a richer type system which allows one to express more details.

> compilers insert all sorts of implicit checks, and in that Ada has
> built-in constructs like tasking, array slices and return of

Those I wouldn't count towards expressiveness, only as more
functionality in the standard runtime.

> dynamically-sized objects that require much more lines of code to
> achieve in C.

More power.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 13:40           ` Pascal Obry
@ 2007-01-24 14:50             ` Markus E Leypold
  2007-01-24 17:22               ` Pascal Obry
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-24 14:50 UTC (permalink / raw)



Pascal Obry <pascal@obry.net> writes:

> Markus E Leypold a écrit :
>> <cynical mode>
>> 
>> You're sure you're not confusing verbosity with "expressiveness"? :-)
>> 
>> </cynical mode>
>
> Let's try :
>
>    pragma Remote_Call_Interface;
>
> as an expressiveness example :)
>
> I won't even try to give the equivalent C/C++ code!


So what is expressiveness in your definition? :-)

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: Poll on background of Ada people (was: How come Ada isn't more po)
  2007-01-24 14:19   ` Alex R. Mosteo
@ 2007-01-24 15:27     ` Larry Kilgallen
  0 siblings, 0 replies; 397+ messages in thread
From: Larry Kilgallen @ 2007-01-24 15:27 UTC (permalink / raw)


In article <51p863F1k9di7U2@mid.individual.net>, "Alex R. Mosteo" <devnull@mailinator.com> writes:

> Perhaps we could run an informal poll to see the background of Ada people
> here.

(in reverse order) Pascal, Bliss, various assembly languages.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  7:42     ` kevin  cline
  2007-01-24  8:07       ` Ludovic Brenta
@ 2007-01-24 16:14       ` adaworks
  2007-01-25  0:22         ` kevin  cline
  2007-01-25  8:27         ` Harald Korneliussen
  2007-01-25  4:50       ` Alexander E. Kopilovich
  2007-01-27  5:43       ` Charles D Hixson
  3 siblings, 2 replies; 397+ messages in thread
From: adaworks @ 2007-01-24 16:14 UTC (permalink / raw)



"kevin cline" <kevin.cline@gmail.com> wrote in message
> Yes, I've read that article.  It would really be sad if Ada were not
> superior to C for a toy problem in embedded control system development,
> since Ada was designed specifically for that purpose.  But the point
> was that expressiveness drives programmers to new languages, and Ada
> isn't particularly expressive.
>
It depends on what you want to express.   Expressiveness, like beauty,
is often "in the eye of the beholder."  For many software professionals,
the absence of curly-braces makes a language less expressive.  For
others, their presence is a problem.   For some, brevity is a sign of
expressiveness.  For others, ease of understanding on the part of a
reader is important to expressiveness.

I do know a lot of languages, including many in the C family.  I have
programmed in COBOL, Fortran, PL/I, BASIC, Python, and many
others.   I teach a class in comparative programming languages.

An important distinction to be made is expressibility versus expressiveness.
The fact that I can express a solution in Fortran that is better suited to being
expressed in COBOL does not mean that Fortran is expressive.   When
a solution is possible, that makes it expressible.  When the language is
designed to express such solutions, that makes it expressive -- for that
kind of problem.

Languages evolve to become more expressive.  Fortran has evolved.  COBOL
has evolved.  Ada has evolved.  Even Java continues to evolve.  Some languages
seem not to evolve and remain stagnant.  PL/I comes to mind.   Some programmers
do not evolve, and they also become out-of-date.


As languages evolve, some tend to get better at being more directly expressive
of a larger range of solutions.  Not all languages evolve well.   In my view, C++
has not become better with its most recent "improvements."

Ada is expressive of a large number of solutions that are more difficult to
express in C or other languages.  However, the base language is not as
expressive as Perl for regular expressions.  That has required a separate
library unit.  It is not as directly expressive of mathematical solutions as
Fortran.  That too requires library units.    All the fundamental constructs
are in place to create powerful library units.  This capability allows the
language to be easily extended to accommodate a large number of kinds
of solutions.  Expressiveness is, for some kinds of solutions, through the
creation of such libraries, not by burdening the language.

The type model of Ada does enhance expressiveness since it allows a
lot of extensions.   There are certainly things I would do differently in
that type system, and future languages will come along that improve on
Ada.  At present, that has not happened -- certainly not with a type
model that I like.   The visibility rules of Ada allow some powerful
constructs for designing large-scale software systems.   None of the
C family, including C++, is as expressive in this regard.

Expressiveness needs to be considered beyond toy programs.  Real
software systems also evolve.  The fact that I can program myself
into a corner with some simple code that needs to be rewritten from
scratch each time the problem changes a little, can give me the
illusion of expressiveness.  Real expressiveness includes the ability to
easily adapt the software in-place to the changing realities of the
environment in which it is deployed.

With all its shortcomings, and it does have shortcomings, Ada seems to
have the capacity for being expressive of most kinds of practical
software solutions.  For many of those solutions, without supporting
libraries, it is characterized by expressibility.  When one adds libraries
written in Ada, we see greater expressiveness.

Expressiveness is not always hampered by the language design but
by those who have not yet learned how to use that design as it was
intended to be used.

Richard Riehle 





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  8:07       ` Ludovic Brenta
  2007-01-24 12:12         ` Markus E Leypold
@ 2007-01-24 16:25         ` Adam Beneschan
  2007-01-24 17:03           ` Niklas Holsti
  2007-01-25 15:37           ` Bob Spooner
  2007-02-06  9:54         ` Dave Thompson
  2 siblings, 2 replies; 397+ messages in thread
From: Adam Beneschan @ 2007-01-24 16:25 UTC (permalink / raw)




On Jan 24, 12:07 am, Ludovic Brenta <ludo...@ludovic-brenta.org> wrote:
> kevin  cline writes:
> > But the point was that expressiveness drives programmers to new
> > languages, and Ada isn't particularly expressive.
>
> On the contrary, I think that Ada is the most expressive language
> around.  Consider:
>
> procedure Set_Bit_In_Register (At_Address : in System.Address) is
>    type Register is array (1 .. 32) of Boolean;
>    pragma Pack (Register);
>    for Register'Bit_Order use System.High_Order_First;
>    pragma Volatile (Register);
>
>    R : Register;
>    for R'Address use At_Address;
> begin
>    Register (4) := True;
> end;
>
> versus
>
> void set_bit_in_register (volatile unsigned long * at_address)
> {
>    *at_address |= 2 << 3;
>
> }

I think maybe I understand how C might be considered "more expressive"
to some programmers.  If you want to set the third bit of this word,
you can just type in what you want, without going through a whole lot
of rigmarole like defining types and telling the compiler what you
intend to do and what order you think the bits are in and stuff like
that.  You can just say it.  Of course, if there's a misunderstanding
between you and the compiler about how your data is laid out, perhaps,
then your code doesn't work and it crashes your PC and introduces a
vulnerability that some clever hacker can take advantage of to turn
your computer into a subservient automaton whose sole purpose in life
is to send millions of "performance-enhancing drug" ads to *my* e-mail
address---but hey, at least you got to express yourself.   C lets you
do this without having to do as much thinking or actual work, which
mean old Ada makes you do.

Perhaps we should just concede that C is a "more expressive
language"---with about as much benefit as there is to teaching math
students to be "more expressive" as opposed to getting the right
answer.

                               -- Adam




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 16:25         ` Adam Beneschan
@ 2007-01-24 17:03           ` Niklas Holsti
  2007-01-25 15:37           ` Bob Spooner
  1 sibling, 0 replies; 397+ messages in thread
From: Niklas Holsti @ 2007-01-24 17:03 UTC (permalink / raw)


Without wishing anyone ill, I want to sound my whistle to point out some 
fouls in this battle...

Adam Beneschan wrote:
> 
> On Jan 24, 12:07 am, Ludovic Brenta <ludo...@ludovic-brenta.org> wrote:
> 
>>kevin  cline writes:
>>
>>>But the point was that expressiveness drives programmers to new
>>>languages, and Ada isn't particularly expressive.
>>
>>On the contrary, I think that Ada is the most expressive language
>>around.  Consider:
>>
>>procedure Set_Bit_In_Register (At_Address : in System.Address) is
>>   type Register is array (1 .. 32) of Boolean;
>>   pragma Pack (Register);
>>   for Register'Bit_Order use System.High_Order_First;

According to RM 13.5.3, the Bit_Order attribute can be specified only 
for record types, not array types like Register, so this Bit_Order 
clause is nonstandard. But this is perhaps not central to the argument...

>>   pragma Volatile (Register);
>>
>>   R : Register;
>>   for R'Address use At_Address;
>>begin
>>   Register (4) := True;
>>end;

If the High_Order_First clause were allowed for type Register, this would 
set the 4th highest bit, that is the bit for 2**28. If the bit indices 
run the other way, this would set the 4th lowest bit, the bit for 2**3.
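The two readings can be checked with plain shifts: for a 32-bit word indexed 1 .. 32, element 4 corresponds to bit 28 (value 2**28) when element 1 is the most significant bit, and to bit 3 (value 2**3) when element 1 is the least significant bit. A quick sanity check of my own, not from the thread:

```c
#include <stdint.h>

/* Mask for element i (1-based) of a width-bit word when element 1 is
   the most significant bit, as with System.High_Order_First. */
static uint32_t high_order_mask(unsigned i, unsigned width)
{
    return (uint32_t)1u << (width - i);
}

/* Mask for element i when element 1 is the least significant bit. */
static uint32_t low_order_mask(unsigned i)
{
    return (uint32_t)1u << (i - 1u);
}
```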

>>versus
>>
>>void set_bit_in_register (volatile unsigned long * at_address)
>>{
>>   *at_address |= 2 << 3;
>>
>>}

That sets the 5th lowest bit, the bit for 2**4, i.e. 16. Note that 
the shifted thing is 2#10#, not 2#1#.
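That arithmetic is easy to confirm: 2 << 3 shifts binary 10 three places left, giving 10000 in binary, i.e. 16 = 2**4, whereas a mask for "the third bit" in the usual zero-based sense would be 1 << 3 = 8. A two-assertion check (function names are mine, not from the thread):

```c
/* The constant from the quoted C code: 2 shifted left by 3. */
static unsigned quoted_mask(void)   { return 2u << 3; }  /* 16, bit 4 */

/* The mask a reader of "set the third bit" would expect instead. */
static unsigned expected_mask(void) { return 1u << 3; }  /*  8, bit 3 */
```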

> I think maybe I understand how C might be considered "more expressive"
> to some programmers.  If you want to set the third bit of this word,
> you can just type in what you want, without going through a whole lot
> of rigmarole like defining types and telling the compiler what you
> intend to do and what order you think the bits are in and stuff like
> that.  You can just say it.  Of course, if there's a misunderstanding
> between you and the compiler...

There is a misunderstanding somewhere, because the oh-so-expressive C 
code is not doing the same thing as the Ada code (in either Bit_Order), 
nor the same thing as you say it does.

And of course the same formula can be written in Ada with modular types, 
Shift_Left, "or". More characters to type, but less line-noise style 
and no need to check the well-thumbed page of my C manual that shows the 
C operator precedence table. This page has had a yellow post-it tab on 
it for a long time... :-)
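That precedence page earns its post-it tab: in C, == binds more tightly than &, so a mask test written without parentheses silently becomes a different expression. A small illustration (function names are mine, not from the thread):

```c
/* "Is bit 1 clear?" written naively: x & 2 == 0 parses as
   x & (2 == 0), i.e. x & 0, which is 0 for every x -- so the
   test can never report "clear". */
static unsigned naive_bit1_clear(unsigned x)
{
    return x & 2 == 0;            /* == binds tighter than & */
}

/* The intended test needs parentheses around the masking. */
static unsigned correct_bit1_clear(unsigned x)
{
    return (x & 2) == 0;
}
```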

-- 
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 14:50             ` Markus E Leypold
@ 2007-01-24 17:22               ` Pascal Obry
  2007-01-24 17:56                 ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Pascal Obry @ 2007-01-24 17:22 UTC (permalink / raw)
  To: Markus E Leypold

Markus E Leypold a écrit :

> So what is expressiveness in your definition? :-)

It is so expressive that I won't be able to comment here. See Annex-E in
the Ada reference manual :)

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|              http://www.obry.net
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver wwwkeys.pgp.net --recv-key C1082595



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 17:22               ` Pascal Obry
@ 2007-01-24 17:56                 ` Markus E Leypold
  2007-01-24 18:09                   ` Pascal Obry
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-24 17:56 UTC (permalink / raw)



Pascal Obry <pascal@obry.net> writes:

> Markus E Leypold a écrit :
>
>> So what is expressiveness in your definition? :-)
>
> It is so expressive that I won't be able to comment here. See Annex-E in
> the Ada reference manual :)

No, I asked, how you define expressiveness (generally), not, what is
so expressive in this pragma. But never mind. It's not important.

Regards -- Markus





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 17:56                 ` Markus E Leypold
@ 2007-01-24 18:09                   ` Pascal Obry
  2007-01-24 19:37                     ` Markus E Leypold
  2007-01-25  7:52                     ` Harald Korneliussen
  0 siblings, 2 replies; 397+ messages in thread
From: Pascal Obry @ 2007-01-24 18:09 UTC (permalink / raw)
  To: Markus E Leypold

Markus E Leypold a écrit :

> No, I asked, how you define expressiveness (generally), not, what is
> so expressive in this pragma. But never mind. It's not important.

Sorry I did not understand that! For me, high expressiveness in a
programming language means that it is easy/simple to declare something
that can be quite complex underneath.

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|              http://www.obry.net
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver wwwkeys.pgp.net --recv-key C1082595



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  9:59     ` Maciej Sobczak
@ 2007-01-24 18:22       ` Yves Bailly
  2007-01-24 19:18       ` Markus E Leypold
  1 sibling, 0 replies; 397+ messages in thread
From: Yves Bailly @ 2007-01-24 18:22 UTC (permalink / raw)


Maciej Sobczak wrote:
> [...]
> implementation - pragma Export is just as useful as pragma Import!.
> In other words, you don't need to use C for implementation part even if
> you want to have C API for reasons of usability. This is exactly what
> the open-source guys don't seem to get right.
  ^^^
Please, replace that "the" by "some", or even "many" if you like ;-)
I'm an open-source guy, currently creating an Ada binding to the
C++ library Qt through a C layer... pragma Export, Import and exern "C"
everywhere :-)

Regards,

-- 
(o< | Yves Bailly  : http://kafka-fr.net   | -o)
//\ | Linux Dijon  : http://www.coagul.org | //\
\_/ |                                      | \_/`



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  9:59     ` Maciej Sobczak
  2007-01-24 18:22       ` Yves Bailly
@ 2007-01-24 19:18       ` Markus E Leypold
  2007-01-25  8:37         ` Maciej Sobczak
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-24 19:18 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> both languages are easily "bindable". You can even have C API for Ada
> implementation - pragma Export is just as useful as pragma Import!.
> In other words, you don't need to use C for implementation part even
> if you want to have C API for reasons of usability. This is exactly
> what the open-source guys don't seem to get right.

So Ada = Closed Source = Good and C = Open Source = Bad or something
like this? I don't understand it.

Apart from that, it seems to me it would be a bit difficult to have a C API
to some Ada library, since Ada requires quite a lot of runtime support
for tasking, so it certainly doesn't interact too well with C code that also
uses signals, longjmp etc.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 11:06     ` gautier_niouzes
@ 2007-01-24 19:25       ` tmoran
  2007-01-25  4:46         ` Gautier
  0 siblings, 1 reply; 397+ messages in thread
From: tmoran @ 2007-01-24 19:25 UTC (permalink / raw)


> > Windows 95 was the 1st widely used OS with support for tasking. Ada (95)
> > was the only widely available language with support for tasking at the
> > time. We probably lost a good opportunity to gain more acceptance of Ada
> > by not including a standard windowing library and promoting Ada as the
> > best language for taking advantage of Win95's features.
>
> Mmmh I think it was a good idea *not* to include a standard windowing
> library: otherwise Ada would now be stuck with an outdated standard
> windowing library. There was also another problem then: the lack of a
> good but cheap or free compiler.

    Actually some of us did try to make a standard Windows library and
promote Ada as the best language for Windows programming (see "CLAW, a
High Level, Portable, Ada 95 Binding for Microsoft Windows" TriAda 1997).
As a matter of fact, it seems everybody who wanted access to the Windows
API designed a standard windowing library - eg GWindows et al.
    Also, there was the Janus Ada compiler which was pretty cheap at
something like $100, and the early versions of the free Gnat compiler were
coming out at that time.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 21:56   ` Dr. Adrian Wrigley
  2007-01-24 13:52     ` Alex R. Mosteo
@ 2007-01-24 19:25     ` tmoran
  2007-01-24 19:38     ` artifact.one
                       ` (3 subsequent siblings)
  5 siblings, 0 replies; 397+ messages in thread
From: tmoran @ 2007-01-24 19:25 UTC (permalink / raw)


> I think this is critical.  Why can't we just say:
>
> with stdio;
>
> pragma import (C, stdio, "stdio.h");
>
> and be able to get structs, functions, constants, variables from C in
> an obvious and reasonably reliable way?

  To some extent one can do this with APIs that come with typelibs.  But
having recently made an Ada binding to the Google Earth API, I observe
that the time spent finding out what the various calls *actually* do vs
what the documentation indicates they do, takes much more time than the
typing of a bunch of pragma imports etc.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 21:36 ` How come Ada isn't more popular? kevin  cline
  2007-01-23 22:18   ` Martin Dowie
@ 2007-01-24 19:33   ` Arthur Evans Jr
       [not found]     ` <egYth.15026$w91.10597@newsread1.news.pas.earthlink.net>
  1 sibling, 1 reply; 397+ messages in thread
From: Arthur Evans Jr @ 2007-01-24 19:33 UTC (permalink / raw)


In article <1169588206.234714.312650@k78g2000cwa.googlegroups.com>,
 "kevin  cline" <kevin.cline@gmail.com> wrote:

>                                                         Ada was
> designed by a committee to meet theroetical needs.

Not so. Before design of Ada started, about 1970, there was a lengthy 
period of gathering requirements. Input was solicited from all over, 
both inside DOD and out, about what should be in the language. DOD 
published in 1971 a Strawman proposal of language features, followed by 
a Tinman, an Ironman, and finally a Steelman. Each of these feature 
requirement documents was widely reviewed by anyone who chose to submit 
comments, and these comments were studied to produce the next document 
in the series. Comments came from language design theorists and 
practitioners, from folks who thought a big program was 5000 lines and 
from those who thought big was 5,000,000 lines, and from pretty much any 
one else who chose to participate. Many had extensive experience in 
large scale software development.

During all of this requirements development process, compiler 
implementation started. In response to an RFP, four teams were chosen to 
implement the language. (I think Tinman was then the latest requirement 
document.) The designs were evaluated publicly and the two best were 
told to go on, and later the better of the two completed implementation 
of what became Ada-83. I may have mis-remembered some of the details; 
it's been a long time.

Was the result perfect? Far from it, but I thought then and still think 
that Ada-83 was better for large mission-critical applications than any 
other language then available.

BUT: Ada was not designed by a committee, and the needs it was intended 
to meet were not theoretical. They were very practical.

All of that said, much of what was fine in Ada-83 was due to Jean 
Ichbiah, who led the team that did that implementation. I disagreed with 
some of Jean's decisions, some times vocally, but I later came to 
realize that he was usually right. Participation in the Ada design 
process was one of the high points of my career.

Art Evans
Distinguished Reviewer for Ada-83 and Ada-95



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 18:09                   ` Pascal Obry
@ 2007-01-24 19:37                     ` Markus E Leypold
  2007-01-24 19:52                       ` Pascal Obry
  2007-01-25  7:52                     ` Harald Korneliussen
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-24 19:37 UTC (permalink / raw)



Hi Pascal,

Pascal Obry <pascal@obry.net> writes:

> Markus E Leypold a écrit :
>
>> No, I asked, how you define expressiveness (generally), not, what is
>> so expressive in this pragma. But never mind. It's not important.
>
> Sorry I did not understand that! For me, a high expressiveness in a
> programming language is how easy/simple it is to declare something that
> can be quite complex underneath.

Ludovic's example was quite the opposite (verbose if precise). I think
that expressiveness cannot be defined or measured very well. Perhaps
as the number of programming patterns that are supported and their
flexibility. So object orientation introduces more expressiveness
compared with plain procedures, Hindley-Milner type systems compared
with languages that have no parameterized types, templates (C++) would
also introduce more expressiveness (don't flame me, we are talking
"expressiveness" now, not safety), range sub-typing Ada-style is also
added expressiveness (which, on the other hand, is absent from
Hindley-Milner type systems ...), and functions as closures and
first-class citizens allow one to abstract over iteration patterns
(fold, map, etc.), so this also introduces more expressiveness. And so on.

I do not agree with your definition that just measures the ratio
between "things happening underneath" and the construct you write in
your program. If I agreed, I would also have to concede that

 #include <stdio.h> 

is terribly expressive, because it pulls in such a lot of
functionality with one line (actually I think Fortran programmers
would agree :-) -- since there is no preprocessor defined in Fortran
(up to 77 at least), they abuse the C preprocessor to reuse data
definitions and the like).

Perhaps we should ask: Which programming patterns / paradigms does
the language support? I realize that is terribly vague.

To widen the scope of this discussion a bit: I've been using some
10-20 languages intensively in the last 2 decades and have looked at
perhaps twice as many. I'm amazed at the narrowing of perspective I
perceive in a number of participants in this discussion and the other
one (where someone asked for an Ada example for his book). Ada still
has its place in the world, I think, but it would be good to admit
that things have changed a bit during the last 25 years. There is a
lot to be learned from Ada. Exclusive focus on it, and ignoring that
not every piece of software fits into the Ada niche (which I perceive
to be reliable embedded programming), won't help to promote it. I
sometimes even doubt it can be rescued in the long run.

(That isn't directed against you; I'm just adding this note here
because I can't decide where else to put it.)


Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 21:56   ` Dr. Adrian Wrigley
  2007-01-24 13:52     ` Alex R. Mosteo
  2007-01-24 19:25     ` tmoran
@ 2007-01-24 19:38     ` artifact.one
  2007-01-26  2:50     ` Keith Thompson
                       ` (2 subsequent siblings)
  5 siblings, 0 replies; 397+ messages in thread
From: artifact.one @ 2007-01-24 19:38 UTC (permalink / raw)


On Jan 23, 9:56 pm, "Dr. Adrian Wrigley"
<a...@linuxchip.demon.co.uk.uk.uk> wrote:
>
>  Why can't we just say:
>
> with stdio;
>
> pragma import (C, stdio, "stdio.h");
>
> and be able to get structs, functions, constants, variables from C in
> an obvious and reasonably reliable way?
>

That would be one of the most beautiful things I'd ever seen in
computing, if it existed!

MC




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 19:37                     ` Markus E Leypold
@ 2007-01-24 19:52                       ` Pascal Obry
  2007-01-24 21:31                         ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Pascal Obry @ 2007-01-24 19:52 UTC (permalink / raw)
  To: Markus E Leypold

Markus E Leypold a écrit :

>  #include <stdio.h> 
> 
> is terribly expressive, because it pulls in such a lot of

Certainly not. It does nothing at the semantic level. But I agree that
it is not easy to have a common view about expressiveness, let's drop
this then.

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|              http://www.obry.net
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver wwwkeys.pgp.net --recv-key C1082595



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 19:16 ` Tero Koskinen
  2007-01-23 21:12   ` Ludovic Brenta
@ 2007-01-24 20:10   ` Cesar Rabak
  1 sibling, 0 replies; 397+ messages in thread
From: Cesar Rabak @ 2007-01-24 20:10 UTC (permalink / raw)


Tero Koskinen escreveu:
> Hi,
> 
> On 22 Jan 2007 21:53:32 -0800 artifact.one@googlemail.com wrote:
>> My question is: how come Ada isn't more popular?
>>
> 
> I would like to throw in yet another possible reason(*) for Ada being
> not popular (within the free software folks).

As a lot of interesting and complete info has already been posted
here, just my humble contribution on Tero's view:

> 
> Currently, the only non-restricted free Ada compiler is FSF GCC/GNAT.
> However, the GCC development team doesn't consider Ada to be
> a release critical language, so it gets less love whenever a new GCC
> is released and its quality is sometimes lower than C and C++ parts
> of GCC. In addition, Ada part of GCC supports far less platforms than
> C and C++ parts. (**)
[snipped]

The FSF actually, on its site
http://www.gnu.org/prep/standards/standards.html#Design-Advice ,
explicitly recommends use of the C language as *the* language for FSF
Open Source projects.

All other factors notwithstanding, they carry real clout in their realm.

--
Cesar Rabak



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  7:30       ` Jeffrey R. Carter
@ 2007-01-24 20:15         ` Alexander E. Kopilovich
  2007-01-25 22:16           ` Jeffrey R. Carter
  0 siblings, 1 reply; 397+ messages in thread
From: Alexander E. Kopilovich @ 2007-01-24 20:15 UTC (permalink / raw)
  To: comp.lang.ada

Jeffrey R. Carter wrote:

>> By the way, I think that the referenced (by above URL) article does not
>> contradict the apparently contested (under-marked) sentence, because
>> circumstances are far too different in several important aspects.
>
>Could you elaborate? It seemed pretty close to a controlled experiment 
>to me. The students were not specially selected; the problem was the 
>same; only the language and amount of solution made available to the 
>students changed.

The original statement (from kevin cline) was:

>>>  What makes a programmer
>>> like a new language?  Usually, someone comes along and says something
>>> like "Remember that program that we spent two weeks writing in C?
>>> Here's a Perl implementation that I put together in three hours and
>>> one-tenth the code."  That's never happened with Ada.

The article http://www.stsc.hill.af.mil/crosstalk/2000/08/mccormick.html
presents the case where use of Ada language in very specific circumstances
was much more effective than use of C language in the same circumstances.

Those circumstances include:

1) close support provided by teaching staff
2) full and precise specifications for the problem domain
3) stable general requirements for the task and at the same time relative
   freedom regarding details, and anyway, the absence of a stream of
   unexpected changes in requirements and/or scope and/or additional
   requirements

Compare these circumstances with those implied in the original statement
(quoted above). The latter certainly imply the absence of condition 1
(permanent close support); also, the freedom to choose a programming
language (or even an opportunity to discuss it) very rarely comes to
programmers together with conditions 2 and 3.








^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  8:50     ` Dmitry A. Kazakov
@ 2007-01-24 20:23       ` Jeffrey R. Carter
  0 siblings, 0 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-24 20:23 UTC (permalink / raw)


Dmitry A. Kazakov wrote:
> 
> Because C was sufficiently worse. At some point Visual Basic came and
> superseded both... (:-))

What a depressing idea ...

-- 
Jeff Carter
"Strange women lying in ponds distributing swords
is no basis for a system of government."
Monty Python & the Holy Grail
66



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  9:50         ` Maciej Sobczak
@ 2007-01-24 20:25           ` Jeffrey R. Carter
  2007-01-24 21:34             ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-24 20:25 UTC (permalink / raw)


Maciej Sobczak wrote:
> 
> Did it cover open-source projects as well?
> COBOL was popular at the time when development was centralized in big 
> companies, so it was easier to count the number of lines. Today every 
> kid is coding something and it's even hard to estimate how much code is 
> written every day that just goes unnoticed. Just having Windows as a major 
> operating system (with millions of development shops shipping software 
> for it) gives a hint that COBOL might not be a winner any longer.

It covered projects for which people were paid to develop SW. When you 
include open-source and teenage kids creating buffer-overflow errors in 
their bedrooms, you may get a different result.

-- 
Jeff Carter
"Strange women lying in ponds distributing swords
is no basis for a system of government."
Monty Python & the Holy Grail
66



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: AW: How come Ada isn't more popular?
  2007-01-24  9:42       ` Maciej Sobczak
@ 2007-01-24 20:48         ` Jeffrey R. Carter
  0 siblings, 0 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-24 20:48 UTC (permalink / raw)


Maciej Sobczak wrote:
> 
> Sorry, but this is made up of very thin air.
> What about SWEs that were never exposed to Ada?

You're going to expose them to Ada. They will like it. You'll know 
you've got a keeper.

> What about coders that were exposed to Ada but still have no clue?

They don't like Ada.

> As if these were the only programming languages in the world. There are 
> ~2500, according to some very conservative estimations, so there is no 
> need to keep comparing just these two. "Ada is good, because it's better 
> than C" - is this the only thing that Ada can offer? :-) With ~2500 
> languages around just being better than C does not count as any 
> advantage, so I don't understand why you use this as an argument so 
> often.
> If you want to sell Ada, compare it to Java or C++ or C#, for example.

The OP said he hired C programmers because he couldn't find Ada 
programmers, and he couldn't find Ada programmers because he hired C 
programmers. So my reply is in those terms.

-- 
Jeff Carter
"Strange women lying in ponds distributing swords
is no basis for a system of government."
Monty Python & the Holy Grail
66



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 19:52                       ` Pascal Obry
@ 2007-01-24 21:31                         ` Markus E Leypold
  2007-03-19  2:09                           ` adaworks
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-24 21:31 UTC (permalink / raw)



Pascal Obry <pascal@obry.net> writes:

> Markus E Leypold a écrit :
>
>>  #include <stdio.h> 
>> 
>> is terribly expressive, because it pulls in such a lot of
>
> Certainly not. It does nothing at the semantic level. But I agree that

I can see your point here.

> it is not easy to have a common view about expressiveness, let's drop
> this then.


OK. :-)

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 20:25           ` Jeffrey R. Carter
@ 2007-01-24 21:34             ` Markus E Leypold
  2007-01-25  9:23               ` Markus E Leypold
  2007-01-26  7:59               ` Maciej Sobczak
  0 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-24 21:34 UTC (permalink / raw)



"Jeffrey R. Carter" <jrcarter@acm.org> writes:

> Maciej Sobczak wrote:
>> Did it cover open-source projects as well?
>> COBOL was popular at the time when development was centralized in
>> big companies, so it was easier to count the number of lines. Today
>> every kid is coding something and it's even hard to estimate how
>> much code is written every day that just goes unnoticed. Just having
>> Windows as a major operating system (with millions of development
>> shops shipping software for it) gives a hint that COBOL might not be
>> a winner any longer.
>
> It covered projects for which people were paid to develop SW. When you
> include open-source and teenage kids creating buffer-overflow errors
> in their bedrooms, you may get a different result.

Teenage kids these days write c001 PHP web applications. No buffer
overflows there, but any amount of security holes.

BTW, what one can learn from that is that it is the absence of correct
models and the absence of encapsulation of state and representation
that make software bad (insecure / unsafe / whatever), not only buffer
overflows.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 16:14       ` adaworks
@ 2007-01-25  0:22         ` kevin  cline
  2007-01-25  6:04           ` adaworks
  2007-01-25 10:42           ` Dmitry A. Kazakov
  2007-01-25  8:27         ` Harald Korneliussen
  1 sibling, 2 replies; 397+ messages in thread
From: kevin  cline @ 2007-01-25  0:22 UTC (permalink / raw)




On Jan 24, 10:14 am, <adawo...@sbcglobal.net> wrote:
> "kevin cline" <kevin.cl...@gmail.com> wrote in message
> > Yes, I've read that article.  It would really be sad if Ada were not
> > superior to C for a toy problem in embedded control system development,
> > since Ada was designed specifically for that purpose.  But the point
> > was that expressiveness drives programmers to new languages, and Ada
> > isn't particularly expressive.
>
> It depends on what you want to express.   Expressiveness, like beauty,
> is often "in the eye of the beholder."  For many software professionals,
> the absence of curly-braces makes a language less expressive.  For
> others, their presence is a problem.   For some, brevity is a sign of
> expressiveness.  For others, ease of understanding on the part of a
> reader is important to expressiveness.
>
> I do know a lot of languages, including many in the C family.  I have
> programmed in COBOL, Fortran, PL/I, BASIC, Python, and many
> others.   I teach a class in comparative programming languages.
>
> An important distinction to be made is expressibility versus expressiveness.
> The fact that I can express a solution in Fortran that is better suited to being
> expressed in COBOL does not mean that Fortran is expressive.   When
> a solution is possible, that makes it expressible.  When the language is
> designed to express such solutions, that makes it expressive -- for that
> kind of problem.
>
> Languages evolve to become more expressive.  Fortran has evolved.  COBOL
> has evolved.  Ada has evolved.  Even Java continues to evolve.  Some languages
> seem not to evolve and remain stagnant.  PL/I comes to mind.   Some programmers
> do not evolve, and they also become out-of-date.
>
> As languages evolve some tend to get better at being more directly expressive of
> a larger range of solutions.  Not all languages evolve well.   In my view, C++
> has not become better with its most recent "improvements."
>
> Ada is expressive of a large number of solutions that are more difficult to
> express in C or other languages.  However, the base language is not as
> expressive as Perl for regular expressions.  That has required a separate
> library unit.  It is not as directly expressive of mathematical solutions as
> Fortran.  That too requires library units.    All the fundamental constructs
> are in place to create powerful library units.

Not as powerful as I would like, and definitely not as powerful as C++
templates.  Christoph Grein demonstrated this conclusively in his
paper on modeling scientific units in Ada.  C++ templates allow the
straightforward construction of a compile-time safe system of physical
units, allowing arbitrary computations, while Ada's explicit
instantiation model forced one to rely on run-time checking.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  0:12 ` JPWoodruff
  2007-01-24 10:32   ` gautier_niouzes
@ 2007-01-25  1:01   ` Alexander E. Kopilovich
  2007-01-26  5:01     ` JPWoodruff
  1 sibling, 1 reply; 397+ messages in thread
From: Alexander E. Kopilovich @ 2007-01-25  1:01 UTC (permalink / raw)
  To: comp.lang.ada

JPWoodruff wrote:

> classes of smart young
>teenagers have had easy access to computers and amateur tools, and
>have honed their skills at what most of them called "hacking".  They
>learned to reason in low levels of abstraction.  They spent a lot of
>time in thread-of-execution debugging.

Well, thread-of-execution is an extremely powerful abstraction, which
appears both as a high-level and as a low-level one. And
thread-of-execution debugging richly furnishes this abstraction with
emotionally colored and socially sharable practical cases. No wonder
it attracts some fraction of smart young teenagers.

> One way is to want to write interesting essays in the form of executable
>programs.  Ada is one of the finest tools for this task - at least as
>far as our kind of program is concerned.

The problem with this way is that it is hard to find a publishable subject,
which has sufficient potential of interest for "smart young teenagers" and
at the same time clearly demonstrates the strengths of Ada language.

That is because the Ada language isn't well adapted to vague outlining
of possibilities and opportunities. Its main strengths are in the
expression of very real and concrete things, which, when interesting,
unfortunately almost always are parts of (or contain elements of, or
are somehow connected with) military/commercial/trade secrets.





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 19:25       ` tmoran
@ 2007-01-25  4:46         ` Gautier
  2007-01-25  9:29           ` Markus E Leypold
                             ` (2 more replies)
  0 siblings, 3 replies; 397+ messages in thread
From: Gautier @ 2007-01-25  4:46 UTC (permalink / raw)


tmoran@acm.org wrote:
>>> Windows 95 was the 1st widely used OS with support for tasking. Ada (95)
>>> was the only widely available language with support for tasking at the
>>> time. We probably lost a good opportunity to gain more acceptance of Ada
>>> by not including a standard windowing library and promoting Ada as the
>>> best language for taking advantage of Win95's features.
>> Mmmh I think  it was a good idea *not* to include a standard windowing
>> library: then now Ada would be stuck with an outdated standard
>> windowing library. There was also another problem then: the lack of a
>> good but cheap or free compiler.
> 
>     Actually some of us did try to make a standard Windows library and
> promote Ada as the best language for Windows programming (see "CLAW, a
> High Level, Portable, Ada 95 Binding for Microsoft Windows" TriAda 1997).
> As a matter of fact, it seems everybody who wanted access to the Windows
> API designed a standard windowing library - eg GWindows et al.

The point is that neither CLAW nor GWindows was included in the Ada
standard, and that is a good thing. And promoting Ada for Windows
programming at an Ada conference is good, but that won't make the
language a lot more popular: you have to promote it outside the
insider circle...

>     Also, there was the Janus Ada compiler which was pretty cheap at
> something like $100, and the early versions of the free Gnat compiler were
> coming out at that time.

I'm afraid you read a bit too quickly: I was discussing finding a
(_good_) and (cheap or free) compiler in 1995. GNAT needed a few years
to become really good, IMHO.
______________________________________________________________
Gautier         -- http://www.mysunrise.ch/users/gdm/index.htm
Ada programming -- http://www.mysunrise.ch/users/gdm/gsoft.htm

NB: For a direct answer, e-mail address on the Web site!



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  7:42     ` kevin  cline
  2007-01-24  8:07       ` Ludovic Brenta
  2007-01-24 16:14       ` adaworks
@ 2007-01-25  4:50       ` Alexander E. Kopilovich
  2007-01-27  5:43       ` Charles D Hixson
  3 siblings, 0 replies; 397+ messages in thread
From: Alexander E. Kopilovich @ 2007-01-25  4:50 UTC (permalink / raw)
  To: comp.lang.ada

kevin cline wrote:

> the point 
>was that expressiveness drives programmers to new languages, and Ada
>isn't particularly expressive.

This is a substantially inaccurate statement.

Ada is a very expressive language, perhaps the most expressive overall
among programming languages for which more than one commercially
supported compiler exists.

But there is indeed a big issue with expressiveness in Ada, which
hampers its popularity: the part of Ada's expressiveness that is
really outstanding pertains to concrete information (both for data and
for program structure), which is likely to be proprietary in too many
cases and thus not sharable.

At the same time, Ada has poor expressiveness for incomplete (in one
sense or another) programs and components -- thus making ad-hoc
collective development almost impossible.






^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  0:22         ` kevin  cline
@ 2007-01-25  6:04           ` adaworks
  2007-01-25 10:37             ` Maciej Sobczak
  2007-01-25 10:42           ` Dmitry A. Kazakov
  1 sibling, 1 reply; 397+ messages in thread
From: adaworks @ 2007-01-25  6:04 UTC (permalink / raw)



"kevin cline" <kevin.cline@gmail.com> wrote in message 
news:1169684558.876074.40530@s48g2000cws.googlegroups.com...
>
>
> On Jan 24, 10:14 am, <adawo...@sbcglobal.net> wrote:
>> "kevin cline" <kevin.cl...@gmail.com> wrote in message
>> > Yes, I've read that article.  It would really be sad if Ada were not
>> > superior to C for a toy problem in embedded control system development,
>> > since Ada was designed specifically for that purpose.  But the point
>> > was that expressiveness drives programmers to new languages, and Ada
>> > isn't particularly expressive.
>>
>> It depends on what you want to express.  Expressiveness, like beauty,
>> is often "in the eye of the beholder."  For many software professionals,
>> the absence of curly-braces makes a language less expressive.  For
>> others, their presence is a problem.   For some, brevity is a sign of
>> expressiveness.  For others, ease of understanding on the part of a
>> reader is important to expressiveness.
>>
>> I do know a lot of languages, including many in the C family.  I have
>> programmed in COBOL, Fortran, PL/I, BASIC, Python, and many
>> others.   I teach a class in comparative programming languages.
>>
>> An important distinction to be made is expressibility versus expressiveness.
>> The fact that I can express a solution in Fortran that is better suited to 
>> being
>> expressed in COBOL does not mean that Fortran is expressive.   When
>> a solution is possible, that makes it expressible.  When the language is
>> designed to express such solutions, that makes it expressive -- for that
>> kind of problem.
>>
>> Languages evolve to become more expressive.  Fortran has evolved.  COBOL
>> has evolved.  Ada has evolved.  Even Java continues to evolve.  Some 
>> languages
>> seem not to evolve and remain stagnant.  PL/I comes to mind.   Some 
>> programmers
>> do not evolve, and they also become out-of-date.
>>
>> As languages evolve some tend to get better at being more directly 
>> expressive of
>> a larger range of solutions.  Not all languages evolve well.   In my view, 
>> C++
>> has not become better with its most recent "improvements."
>>
>> Ada is expressive of a large number of solutions that are more difficult to
>> express in C or other languages.  However, the base language is not as
>> expressive as Perl for regular expressions.  That has required a separate
>> library unit.  It is not as directly expressive of mathematical solutions as
>> Fortran.  That too requires library units.    All the fundamental constructs
>> are in place to create powerful library units.
>
> Not as powerful as I would like, and definitely not as powerful as C++
> templates.

But safer than C++, overall.  And safer than C++ templates.

>Christoph Grein demonstrated this conclusively in his
> paper on modeling scientific units in Ada.  C++ templates allow the
> straightforward construction of a compile-time safe system of physical
> units, allowing arbitrary computations, while Ada's explicit
> instantiation model forced one to rely on run-time checking.
>
I have not read Christopher Grein's paper, and from what I know
of C++ and C++ templates, I am skeptical of the notion of it being
compile-time safe under any circumstances.   I suppose one could
write some very careful code to make that true, but the language
itself is not inherently safe.

I have not had the experience of having to rely on run-time checking
for templates.   In fact, my own experience of generics is that they
tend to enforce a great deal of checking at compile-time.  That being
said, C++ templates do allow more complex variations on genericity,
and that does not contribute to type-safe software.

The whole point of Ada is to allow the maximum amount of checking
at compile-time.   In fact, the point of the language design is to maximize
the amount of checking possible over the entire system as early in the
development as possible.   C++ does not support anything close to
that goal except in a few isolated places within the language design.
We would want to evaluate the language as a whole, not one or
two features.   We can compare C++ to Visual Basic and make
Visual Basic come out ahead.  We can compare Ada to COBOL
and make COBOL come out ahead.   In both cases, it will depend
on our criteria and what we decide to compare.

As to power.  With Ada you get more power, safely, than you do in
C++ or Java.   Empowering a language, while making it less safe
is not a good thing.   As for Grein's paper, which I will take the time
to read, it is my experience that one can make a case quite nicely
by stacking the evidence in a particular way.    I will try to find a
copy of the paper and make my own judgement about that.

Richard Riehle








^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 18:09                   ` Pascal Obry
  2007-01-24 19:37                     ` Markus E Leypold
@ 2007-01-25  7:52                     ` Harald Korneliussen
  1 sibling, 0 replies; 397+ messages in thread
From: Harald Korneliussen @ 2007-01-25  7:52 UTC (permalink / raw)




On 24 Jan, 19:09, Pascal Obry <pas...@obry.net> wrote:
> For me, a high expressiveness in a
> programming language is how easy/simple it is to declare something that
> can be quite complex underneath.
>

But does that really matter all that much, as long as there are good
means of abstraction? I'd say that functional languages do perhaps have
an advantage there --- from the little I've tried, it seems that they
have some genuinely different ways of gluing things together.
Combinator-based parsing libraries would be an example. Amazing stuff,
which is not feasible in Ada or C++ (you could probably torture
templates into doing it in C++, but I don't count that as feasible).
But you can't easily do low-level stuff in them, as you can in Ada,
and although the type system in Ada is less extensible, it also has
some lovely features like the Positive numeric type, which can
probably be emulated in Haskell, but usually isn't.
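A minimal sketch of the range subtyping mentioned here (Positive and Natural are predefined in package Standard; Line_Count is an invented example):

```ada
procedure Ranges_Demo is
   --  Predefined: subtype Positive is Integer range 1 .. Integer'Last;
   subtype Line_Count is Natural range 0 .. 10_000;

   N : Positive   := 1;
   L : Line_Count := 0;
begin
   L := L + 1;   --  fine: 1 lies within 0 .. 10_000
   N := N - 1;   --  compiles, but raises Constraint_Error at run
                 --  time, since 0 is outside Positive's range
end Ranges_Demo;
```

The violation is caught at run time rather than compile time, but the constraint lives in the type declaration itself.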




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 16:14       ` adaworks
  2007-01-25  0:22         ` kevin  cline
@ 2007-01-25  8:27         ` Harald Korneliussen
  1 sibling, 0 replies; 397+ messages in thread
From: Harald Korneliussen @ 2007-01-25  8:27 UTC (permalink / raw)


Richard Riehle wrote:
> The type model of Ada does enhance expressiveness since it allows a
> lot of extensions.   There are certainly things I would do differently in
> that type system, and future languages will come along that improve on
> Ada.  At present, that has not happened -- certainly not with a type
> model that I like.
Just wondering: what do you think is the problem with H-M type systems,
like ML and Haskell use? It seems to me they are a wonderful tool for
extending the type system to check properties at compile time that
would otherwise have to be checked at run time. I recently read about
an XML combinator library for Haskell that some researcher had written
that ensured that only valid XML could be generated - and it guaranteed
this at compile time!




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 19:18       ` Markus E Leypold
@ 2007-01-25  8:37         ` Maciej Sobczak
  2007-01-25  9:40           ` Markus E Leypold
  2007-01-25 10:13           ` Harald Korneliussen
  0 siblings, 2 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-01-25  8:37 UTC (permalink / raw)


Markus E Leypold wrote:

>> both languages are easily "bindable". You can even have C API for Ada
>> implementation - pragma Export is just as useful as pragma Import!.
>> In other words, you don't need to use C for implementation part even
>> if you want to have C API for reasons of useability. This is exacly
>> what the open-source guys don't seem to get right.
> 
> So Ada = Closed Source = Good and C = Open Source = Bad or something
> like this? I don't understand it.

Of course not. There is a significant amount of extremely good open 
source software around, and I also suppose that some crappy Ada code 
exists as well (it's harder to find it, but for the sake of argument I 
can write some ;-) ).
My point is that the majority of the open-source world seems to be stuck 
with C as its main development language for reasons I don't really 
understand. I understand the use of C at the interface level (everything 
can bind to C, so it's the "least common denominator" for interfacing), 
but internally many projects would greatly benefit from using just about 
anything else. Developers reject this idea on the grounds that C is 
*the* programming language for open source, end of story. I think that 
GNU luminaries (Stallman in particular) add to this mindset by 
publishing web pages promoting C as the language of choice, and the crowd 
follows.

> Apart from that, me seems it would be a bit difficult to have a C API
> to some Ada library, since Ada requires quite a lot of runtime support
> for tasking, so certainly doesn't interact to well C-code which also
> use signals, longjmp etc.

Your argument can be applied in the other direction as well. How about 
binding Ada to C libraries that use non-Ada runtime internally? It's 
enough if the C library uses POSIX threads and its interaction with 
tasking in Ada is already outside of what AARM specifies.
Binding (in any direction) requires some assumptions about the two 
environments. There is no way to escape them.
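
The point about putting a C API over a non-C implementation can be
sketched concretely. In Ada the tool is pragma Export; the same idea is
shown here with C++'s extern "C", purely as an illustration (all names
are invented for the example):

```cpp
#include <string>

namespace impl {
    // The implementation is free to use any non-C features internally.
    std::string greet(const std::string& who) {
        return "hello, " + who;
    }
}

// Only this flat, unmangled function is visible to C callers -- the
// same role pragma Export (C, ..., "external_name") plays in Ada.
extern "C" int greeting_length(const char* who) {
    return static_cast<int>(impl::greet(who).size());
}
```

A C program then declares `int greeting_length(const char *);` and links
against the library without knowing, or caring, what language implements it.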


-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 21:34             ` Markus E Leypold
@ 2007-01-25  9:23               ` Markus E Leypold
  2007-01-26  7:59               ` Maciej Sobczak
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-25  9:23 UTC (permalink / raw)




Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> "Jeffrey R. Carter" <jrcarter@acm.org> writes:
>
>> Maciej Sobczak wrote:
>>> Did it cover open-source projects as well?
>>> COBOL was popular at the time when development was centralized in
>>> big companies, so it was easier to count the number of lines. Today
>>> every kid is coding something and it's even hard to estimate how
>>> much code is written every day that is just unnoticed. Just having
>>> Windows as a major operating system (with milions of development
>>> shops shipping software for it) gives a hint that COBOL might not be
>>> a winner any longer.
>>
>> It covered projects for which people were paid to develop SW. When you
>> include open-source and teenage kids creating buffer-overflow errors
>> in their bedrooms, you may get a different result.
>
> Teanage kids these days write c001 PHP web applications. Now buffer
> overflows there, but any amount of security holes.

s/Teanage/Teenage/;s/Now/No/

Sorry.

>
> BTW, what one can learn from that, is, that it is the absence of
> correct models and absence encapsulation of state and representation
> that makes software bad (insecure / unsafe / whatever), not only the
> buffer overflows.
>
> Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  4:46         ` Gautier
@ 2007-01-25  9:29           ` Markus E Leypold
  2007-01-27 16:59             ` Stephen Leake
  2007-01-25 21:42           ` Randy Brukardt
  2007-01-25 22:21           ` Jeffrey R. Carter
  2 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-25  9:29 UTC (permalink / raw)



Gautier <gautier@fakeaddress.nil> writes:

>>     Also, there was the Janus Ada compiler which was pretty cheap at
>> something like $100, and the early versions of the free Gnat compiler were
>> coming out at that time.
>
> I'm afraid you read a bit too quickly: I discussed about finding a
> (_good_) and (cheap or free) compiler in 1995. GNAT needed a few years
> to become really good, IHMO.

Given that I recently found the following in 3.15p (which is not the newest one, I know):

  - Really bad bugs handling Read and Write of discriminated records,

  - Race conditions in the runtime system when trying to catch
    interrupts,

  - No way to catch the console break under Windows with the interrupt
    mechanism (a design error, in my opinion),


I wonder whether GNAT was good even in, say, 2002 (or whatever 3.15p's
release date was).

And all those problems are really expensive to circumvent (partly
because the runtime system insists on fiddling with the signal
handlers).


Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  8:37         ` Maciej Sobczak
@ 2007-01-25  9:40           ` Markus E Leypold
  2007-01-26  8:52             ` Ludovic Brenta
  2007-01-27 16:56             ` Stephen Leake
  2007-01-25 10:13           ` Harald Korneliussen
  1 sibling, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-25  9:40 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> Markus E Leypold wrote:
>
>>> both languages are easily "bindable". You can even have C API for Ada
>>> implementation - pragma Export is just as useful as pragma Import!.
>>> In other words, you don't need to use C for implementation part even
>>> if you want to have C API for reasons of useability. This is exacly
>>> what the open-source guys don't seem to get right.
>> So Ada = Closed Source = Good and C = Open Source = Bad or something
>> like this? I don't understand it.
>
> Of course not. There is a significant amount of extremely good open
> source software around, and I also suppose that some crappy Ada code
> exists as well (it's harder to find it, but for the sake of argument I
> can write some ;-) ).

Well. I remember having read some of it. :-). The language does not
guarantee good design.

> My point is that the majority of open source world seems to get stuck
> with C as a main development language for reasons which I don't really
> understand. I understand the use of C on the interface level

I share your not-understanding. My understanding of the reason for
this situation was always that it has something to do with (a)
premature optimization and (b) the lowest common denominator: a C
library can be bound to different language APIs, but a, say, Python
library cannot. I doubt an Ada library could.

Apropos Python: there are a lot of languages that are rather popular
in OSS development apart from C. KDE is bound to Qt, so it cannot leave
C++. GNOME, as far as I understand, will slowly shift to Mono as its
application run time and thus will have more and more C# code (which
also shows what the OSS guys perceive as their main problem: pointers
and memory management, neither of which has really been solved in Ada
to an extent desirable for application programming (as opposed to
embedded)).

> (everything can bind to C, so it's the "least common denominator" for

Ooops. I didn't read so far. We agree :-).

> interfacing), but internally many projects would greatly benefit from
> using just about anything else. Developers reject this idea on the
> grounds that C is *the* progamming language for open source, end of

More likely, it's "the available compiler".

> story. I think that GNU luminaries (Stallman in particular) add to
> this mindset by publishing web pages promoting C as the language of
> choice and the crowd follows.

I don't think so. Nobody really has a problem to contradict RMS :-).

>> Apart from that, me seems it would be a bit difficult to have a C API
>> to some Ada library, since Ada requires quite a lot of runtime support
>> for tasking, so certainly doesn't interact to well C-code which also
>> use signals, longjmp etc.
>
> Your argument can be applied in the other direction as well. How about
> binding Ada to C libraries that use non-Ada runtime internally? It's

Yes, this is a problem. Therefore the usual approach is, to avoid this
or at least to make the runtime separable and exchangeable.

> enough if the C library uses POSIX threads and its interaction with
> tasking in Ada is already outside of what AARM specifies.

Exactly. But you can write code in C that just does something with
things on the stack, doesn't fiddle with threads or signals, and thus
interacts minimally with the host runtime. That's the reason C is
so popular for libraries: you can.

> Binding (in any direction) requires some assuptions about the two
> environments. There is way to escape them.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  8:37         ` Maciej Sobczak
  2007-01-25  9:40           ` Markus E Leypold
@ 2007-01-25 10:13           ` Harald Korneliussen
  2007-01-25 12:54             ` Markus E Leypold
                               ` (3 more replies)
  1 sibling, 4 replies; 397+ messages in thread
From: Harald Korneliussen @ 2007-01-25 10:13 UTC (permalink / raw)




On 25 Jan, 09:37, Maciej Sobczak <no.s...@no.spam.com> wrote:
> Markus E Leypold wrote:
> I think that
> GNU luminaries (Stallman in particular) add to this mindset by
> publishing web pages promoting C as the language of choice and the crowd
> follows.

I'm pretty sure Stallman has more love for Lisp and Scheme than C,
given his background. I don't think it's a coincidence that the lexer
and parser generators, Flex and Bison, were among the first things made
for GNU, and that gcc, pretty uniquely among C compilers, has support for
tail-call elimination and nested functions. Raymond is another case, and
quite explicitly anti-Ada. (I may be wrong, but it seems to me that
Larry Wall, the author of Perl, has something of a love/hate
relationship with Ada. Didn't he first publish the diff algorithm in
Ada way back when?)

Anyway, I think individuals are less important than culture. You could
read this straight out of the wikipedia page for C (until I changed it
slightly ...): "the safe, effective use of C requires more programmer
skill, experience, effort, and attention to detail than is required for
some other programming languages." So if you use C, that means you are
skilled, experienced and attentive, by some people's logic. It's a
macho thing, "If you can't handle the power, don't use it!".

Klingon programmers if there ever were any.

It's of course a symptom of lack of professionalism: construction
workers who are proud of what they do wear their helmets.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  6:04           ` adaworks
@ 2007-01-25 10:37             ` Maciej Sobczak
  2007-01-25 23:36               ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-01-25 10:37 UTC (permalink / raw)


adaworks@sbcglobal.net wrote:

> I have not read Christopher Grein's paper, and from what I know
> of C++ and C++ templates, I am skeptical of the notion of it being
> compile-time safe under any circumstances.

What is there in C++ templates that makes them unsafe?

> C++ templates do allow more complex variations on genericity,
> and that does not contribute to type-safe software.

Could you elaborate on this?
Do you mean that expressiveness and flexibility go against type-safety?


-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  0:22         ` kevin  cline
  2007-01-25  6:04           ` adaworks
@ 2007-01-25 10:42           ` Dmitry A. Kazakov
  1 sibling, 0 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-01-25 10:42 UTC (permalink / raw)


On 24 Jan 2007 16:22:38 -0800, kevin  cline wrote:

> Christopher Grein demonstrated this conclusively in his
> paper on modeling scientific units in Ada.  C++ templates allow the
> straightforward construction of a compile-time safe system of physical
> units, allowing arbitrary computations, while Ada's explicit
> instantiation model forced one to rely on run-time checking.

There exist numerous applications where run-time unit checking is a
requirement, for instance in automation and control, data acquisition and
distribution, and HMI. Templates fail there.

BTW, I am afraid that the solution you referred to is unable to handle °C
and °F. Not very promising, I would say...

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  6:37 ` adaworks
                     ` (3 preceding siblings ...)
  2007-01-23 20:09   ` Jeffrey R. Carter
@ 2007-01-25 11:31   ` Ali Bendriss
  2007-01-27  5:12     ` Charles D Hixson
  4 siblings, 1 reply; 397+ messages in thread
From: Ali Bendriss @ 2007-01-25 11:31 UTC (permalink / raw)
  To: comp.lang.ada

On Tuesday 23 January 2007 06:37, adaworks@sbcglobal.net wrote:
> <artifact.one@googlemail.com> wrote in message
> news:1169531612.200010.153120@38g2000cwa.googlegroups.com...
>
> > My question is: how come Ada isn't more popular?
>
> Ada suffered, in its early days, from a convergence of several
> things.  One is that the designers of the language did not anticipate
> the impact of the personal computer and the democratization of
> computing.   There were other factors, as well.
> [...]

Is the fact that there is no "public" operating system written in Ada one 
of these factors? 
I am thinking of the close link between the C language and *nix systems.


-- 
Ali 



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 10:13           ` Harald Korneliussen
@ 2007-01-25 12:54             ` Markus E Leypold
  2007-01-26  7:03               ` Harald Korneliussen
  2007-01-25 13:08             ` Markus E Leypold
                               ` (2 subsequent siblings)
  3 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-25 12:54 UTC (permalink / raw)



"Harald Korneliussen" <vintermann@gmail.com> writes:

> On 25 Jan, 09:37, Maciej Sobczak <no.s...@no.spam.com> wrote:
>> Markus E Leypold wrote:
>> I think that
>> GNU luminaries (Stallman in particular) add to this mindset by
>> publishing web pages promoting C as the language of choice and the crowd
>> follows.

It was not me who wrote this, but Maciej. The quoting looks a bit
strange this way.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 10:13           ` Harald Korneliussen
  2007-01-25 12:54             ` Markus E Leypold
@ 2007-01-25 13:08             ` Markus E Leypold
  2007-01-25 22:36             ` Jeffrey R. Carter
  2007-01-27  5:30             ` Charles D Hixson
  3 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-25 13:08 UTC (permalink / raw)




"Harald Korneliussen" <vintermann@gmail.com> writes:

> skill, experience, effort, and attention to detail than is required for
> some other programming languages." So if you use C, that means you are
> skilled, experienced and attentive, by some people's logic. It's a
> macho thing, "If you can't handle the power, don't use it!".
>
> Klingon programmers if there ever were any.

That, of course, is a general problem in IT: public
criticism/discussion of "popular" (read: hyped) paradigms or tools is
hardly possible, because criticism seems to imply that the critic is just
not able to use the tool ("What, you can't program safely in C? I can!"). I
might go further and note that within teams and projects criticism is often
disliked: what cheap management literature calls a 'can-do' attitude is
considered important. ("According to my analysis,
we cannot write this piece of software in 2 months, since ..." --
"What, are you so incompetent? Your colleague / competitor says he can
do it. So I'll give it to him.")

So in my opinion you only point out a sub-aspect of a quite larger
problem.

> It's of course a symptom of lack of professionalism: construction
> workers who are proud of what they do wear their helmets.

Quite right. But in computer science (especially software engineering)
hard data is often missing, so the participants in these
discussions replace it with belief. Whereas nobody expects a proud
construction worker to wear amulets and nobody seriously suggests it,
because the majority agrees that helmets protect and amulets probably
do not, the situation is different for development paradigms, processes
and tools. There is usually no hard data available, so even those
who act wrongly from your point of view can continue believing that
they are doing the right thing. 

There is also the aspect of marketing which tends to further distort
reality (or whatever goes for it in IT).

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 16:25         ` Adam Beneschan
  2007-01-24 17:03           ` Niklas Holsti
@ 2007-01-25 15:37           ` Bob Spooner
  1 sibling, 0 replies; 397+ messages in thread
From: Bob Spooner @ 2007-01-25 15:37 UTC (permalink / raw)



"Adam Beneschan" <adam@irvine.com> wrote in message 
news:1169655934.028861.51000@j27g2000cwj.googlegroups.com...
> ...
> Perhaps we should just concede that C is a "more expressive
> language"---with about as much benefit as there is to teaching math
> students to be "more expressive" as opposed to getting the right
> answer.
>
>                               -- Adam
>
My view of expressiveness in a computer language is that it is inversely 
proportional to the distance between the problem space and the solution 
space. In other words, it is a measure of how high a level of abstraction 
you can work at when you use it.

Bob 





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  4:46         ` Gautier
  2007-01-25  9:29           ` Markus E Leypold
@ 2007-01-25 21:42           ` Randy Brukardt
  2007-01-28 19:32             ` Gautier
  2007-01-25 22:21           ` Jeffrey R. Carter
  2 siblings, 1 reply; 397+ messages in thread
From: Randy Brukardt @ 2007-01-25 21:42 UTC (permalink / raw)


"Gautier" <gautier@fakeaddress.nil> wrote in message
news:45b8361a_5@news.bluewin.ch...
...
> I'm afraid you read a bit too quickly: I discussed about finding a
(_good_)
> and (cheap or free) compiler in 1995. GNAT needed a few years to become
really
> good, IHMO.

"Good" is highly subjective, and what you think of as "good" may differ
widely from what other people think. I know Tom thinks Janus/Ada was pretty
good, and long before 1995...

Your opinion of good may differ, depending on what you want to do.

                      Randy.





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 20:15         ` Alexander E. Kopilovich
@ 2007-01-25 22:16           ` Jeffrey R. Carter
  2007-01-25 23:32             ` Markus E Leypold
  2007-01-26 22:05             ` Alexander E. Kopilovich
  0 siblings, 2 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-25 22:16 UTC (permalink / raw)


Alexander E. Kopilovich wrote:
> 
> The original statement (from kevin cline) was:
> 
>>>>  What makes a programmer
>>>> like a new language?  Usually, someone comes along and says something
>>>> like "Remember that program that we spent two weeks writing in C?
>>>> Here's a Perl implementation that I put together in three hours and
>>>> one-tenth the code."  That's never happened with Ada.
> 
> The article http://www.stsc.hill.af.mil/crosstalk/2000/08/mccormick.html
> presents the case where use of Ada language in very specific circumstances
> was much more effective than use of C language in the same circumstances.

The circumstances differed in this respect: The C students were given 
60% of the teacher's solution. Initially, the Ada students were given 
10% (or maybe less).

> Those circumstances include:
> 
> 1) close support provided by teaching staff
> 2) full and precise spefications for the problem domain
> 3) stable general requirements for the task and at the same time relative
>    freedom regarding details, and anyway, the absence of a stream of
>    unexpected changes in requirements and/or scope and/or additional
>    requirements

I agree that 1. doesn't apply to the OP's statement. However, I consider 
a running program to be a very precise and stable specification, so 2. 
and 3. appear to apply. And it's definitely a case of "Remember that 
program you couldn't get working in C in an entire semester even though 
you were given 60% of the code? Here's an Ada implementation that I got 
working in a semester when given 10% of the code."

-- 
Jeff Carter
"From this day on, the official language of San Marcos will be Swedish."
Bananas
28



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  4:46         ` Gautier
  2007-01-25  9:29           ` Markus E Leypold
  2007-01-25 21:42           ` Randy Brukardt
@ 2007-01-25 22:21           ` Jeffrey R. Carter
  2 siblings, 0 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-25 22:21 UTC (permalink / raw)


Gautier wrote:
> 
> The point is that neither CLAW nor GWindows were included in the Ada 
> standard, and it is a good thing. And promoting Ada for Windows 
> programming in an Ada conference is good, but that won't make that 
> language a lot more popular: you have to make promotion outside the 
> insider circle...

I think I may have been misunderstood. I wasn't talking about Ada 95 
including a standard library for MS Windows; I was talking about a 
portable standard windowing library. On MS Windows platforms, that 
library would target MS Windows.

> I'm afraid you read a bit too quickly: I discussed about finding a 
> (_good_) and (cheap or free) compiler in 1995. GNAT needed a few years 
> to become really good, IHMO.

Janus Ada was, and I'm sure still is, a good compiler. I used it for 
many years after winning a copy at the Tri-Ada programming contest.

-- 
Jeff Carter
"From this day on, the official language of San Marcos will be Swedish."
Bananas
28



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
       [not found]     ` <egYth.15026$w91.10597@newsread1.news.pas.earthlink.net>
@ 2007-01-25 22:34       ` Jeffrey R. Carter
  2007-01-25 22:55         ` Robert A Duff
  2007-01-27  3:54         ` Randy Brukardt
  0 siblings, 2 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-25 22:34 UTC (permalink / raw)


Dennis Lee Bieber wrote:
>>
> 	Well... as I recall, it ("Green") became DOD standard 1815 in
> December of 1980. It took another few years to become an ISO standard.

1980 Dec: MIL-STD-1815, Ada 80
1983 Feb: ANSI/MIL-STD-1815A, Ada 83 (adopted as an ISO standard 1987)
1995 Jan: ISO/IEC 8652:1995, Ada 95 (Technical Corrigendum 1 adopted in 
2000)

Hopefully 2007 will see Ada 0X become a standard. On the Ada-comment 
mailing list there's a discussion of revising the wording for 
"equivalence" for ordered containers (originally discussed on c.l.a), so 
it may be a while yet.

-- 
Jeff Carter
"From this day on, the official language of San Marcos will be Swedish."
Bananas
28



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 10:13           ` Harald Korneliussen
  2007-01-25 12:54             ` Markus E Leypold
  2007-01-25 13:08             ` Markus E Leypold
@ 2007-01-25 22:36             ` Jeffrey R. Carter
  2007-01-25 23:26               ` Markus E Leypold
  2007-01-26  7:16               ` Harald Korneliussen
  2007-01-27  5:30             ` Charles D Hixson
  3 siblings, 2 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-25 22:36 UTC (permalink / raw)


Harald Korneliussen wrote:
> 
> Anyway, I think individuals are less important than culture. You could
> read this straight out of the wikipedia page for C (until I changed it
> slightly ...): "the safe, effective use of C requires more programmer
> skill, experience, effort, and attention to detail than is required for
> some other programming languages." So if you use C, that means you are
> skilled, experienced and attentive, by some people's logic. It's a
> macho thing, "If you can't handle the power, don't use it!".

The only safe use of C is as a target language for code generators (such 
as the SofCheck Ada -> C compiler). The continuing creation of 
buffer-overflow errors in C shows that, in practice, it is impossible 
for humans to create safe C.
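
The error class under discussion is a bounds overrun, which raw C-style
arrays never check. A minimal illustration (in C++, using std::array's
checked at(); the function name is invented) of the difference between
unchecked and checked access:

```cpp
#include <array>
#include <stdexcept>

// With a raw array "int raw[4];", writing raw[4] compiles and silently
// corrupts memory. Checked access turns the same one-past-the-end
// mistake into a detectable error, as Ada's index checks do by default.
bool overrun_detected() {
    std::array<int, 4> a = {0, 1, 2, 3};
    try {
        a.at(4) = 99;          // index 4 is out of the 0..3 range
    } catch (const std::out_of_range&) {
        return true;           // overrun caught, not silently executed
    }
    return false;
}
```

The point upthread is quantitative: decades of C practice show this check
being omitted often enough that the overruns keep shipping.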

-- 
Jeff Carter
"From this day on, the official language of San Marcos will be Swedish."
Bananas
28



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 22:34       ` Jeffrey R. Carter
@ 2007-01-25 22:55         ` Robert A Duff
  2007-01-26 19:59           ` Jeffrey R. Carter
  2007-01-27  3:54         ` Randy Brukardt
  1 sibling, 1 reply; 397+ messages in thread
From: Robert A Duff @ 2007-01-25 22:55 UTC (permalink / raw)


"Jeffrey R. Carter" <jrcarter@acm.org> writes:

> Dennis Lee Bieber wrote:
>>>
>> 	Well... as I recall, it ("Green") became DOD standard 1815 in
>> December of 1980. It took another few years to become an ISO standard.
>
> 1980 Dec: MIL-STD-1815, Ada 80

I don't think the so-called Ada 80 standard was used for much of
anything.  Was it?

> 1983 Feb: ANSI/MIL-STD-1815A, Ada 83 (adopted as an ISO standard 1987)

That was the first real standard.

> 1995 Jan: ISO/IEC 8652:1995, Ada 95 (Technical Corrigendum 1 adopted in
> 2000)

> Hopefully 2007 will see Ada 0X become a standard.

I think Ada 2005 will become an ISO standard in 2007.
Even if ISO red tape holds it up for another 50 years,
it's still Ada 2005, in common parlance, and just "Ada"
in official parlance.

>... On the Ada-comment
> mailing list there's a discussion of revising the wording for
> "equivalence" for ordered containers (originally discussed on c.l.a), so
> it may be a while yet.

No.  Such minor bugs in the Standard will not delay it.
If a fix is necessary, it will be a correction to Ada 2005.

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 22:36             ` Jeffrey R. Carter
@ 2007-01-25 23:26               ` Markus E Leypold
  2007-01-26  4:23                 ` Jeffrey R. Carter
  2007-01-26  7:21                 ` Harald Korneliussen
  2007-01-26  7:16               ` Harald Korneliussen
  1 sibling, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-25 23:26 UTC (permalink / raw)



"Jeffrey R. Carter" <jrcarter@acm.org> writes:

> Harald Korneliussen wrote:
>> Anyway, I think individuals are less important than culture. You
>> could
>> read this straight out of the wikipedia page for C (until I changed it
>> slightly ...): "the safe, effective use of C requires more programmer
>> skill, experience, effort, and attention to detail than is required for
>> some other programming languages." So if you use C, that means you are
>> skilled, experienced and attentive, by some people's logic. It's a
>> macho thing, "If you can't handle the power, don't use it!".
>
> The only safe use of C is as a target language for code generators
> (such as the SofCheck Ada -> C compiler). The continuing creation of
> buffer-overflow errors in C shows that, in practice, it is impossible
> for humans to create safe C.

Not to promote C, but purely from a logical perspective: The
"continuing creation of buffer-overflow errors in C" only shows that
there exist programmers/humans that don't always create safe programs
in C.

Since I assume that the same thing (unsafe programs) applies to Ada
(say with respect to unhandled exceptions (Ariane, for instance ...)
or with respect to memory leaks or priority inversions in tasking),
the only remaining line along which an argument about the usability of C
vs. that of Ada can develop is quantitative, not qualitative.

I'd just like to point out that the world even in this respect is
neither black nor white (or at least you haven't demonstrated that
convincingly yet), but shades of gray. I concede the gray might be
lighter in the Ada sector and darker in the C sector, but I'd prefer
quantitative arguments / reasoning and some effort to estimate the
difference between those shades over striking but logically flawed
reasoning. If I want absolutes, I'll turn to religion. :-).

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 22:16           ` Jeffrey R. Carter
@ 2007-01-25 23:32             ` Markus E Leypold
  2007-01-26  8:50               ` AW: " Grein, Christoph (Fa. ESG)
  2007-01-26  8:56               ` Ludovic Brenta
  2007-01-26 22:05             ` Alexander E. Kopilovich
  1 sibling, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-25 23:32 UTC (permalink / raw)



"Jeffrey R. Carter" <jrcarter@acm.org> writes:

> Alexander E. Kopilovich wrote:
>> The original statement (from kevin cline) was:
>>
>>>>>  What makes a programmer
>>>>> like a new language?  Usually, someone comes along and says something
>>>>> like "Remember that program that we spent two weeks writing in C?
>>>>> Here's a Perl implementation that I put together in three hours and
>>>>> one-tenth the code."  That's never happened with Ada.
>> The article
>> http://www.stsc.hill.af.mil/crosstalk/2000/08/mccormick.html
>> presents the case where use of Ada language in very specific circumstances
>> was much more effective than use of C language in the same circumstances.
>
> The circumstances differed in this respect: The C students were given
> 60% of the teacher's solution. Initially, the Ada students were given
> 10% (or maybe less).

So it was not really a comparable situation. One might argue that
pre-supplied partial solutions demotivate and that they have dangers
of their own (they often draw people down a wrong path of thinking). I've
actually seen people given a partial solution perform worse than
those who had to do without.

Mind you, I do not suggest that really is the reason for the
differences found (I haven't even read the study yet), but I see a
problem with a study where the two groups compared are not really
starting from the same point. The only factor that differs should be the
programming language, not the rest of the setup.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 10:37             ` Maciej Sobczak
@ 2007-01-25 23:36               ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-25 23:36 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> adaworks@sbcglobal.net wrote:
>
>> I have not read Christopher Grein's paper, and from what I know
>> of C++ and C++ templates, I am skeptical of the notion of it being
>> compile-time safe under any circumstances.
>
> What is there in C++ templates that makes them unsafe?

Yes, the "under any circumstance" makes me rather
suspicious. Overgeneralizing? There might be (indeed are) good reasons
for disliking C++ templates, but that statement, I think, will not hold
under examination as a sufficient one.

>
>> C++ templates do allow more complex variations on genericity,
>> and that does not contribute to type-safe software.
>
> Could you elaborate on this?
> Do you mean that expressiveness and flexibility go against type-safety?

'adaworks' has already suggested that Java is not type safe. I fear he
does not agree with my definition of type safe.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 21:56   ` Dr. Adrian Wrigley
                       ` (2 preceding siblings ...)
  2007-01-24 19:38     ` artifact.one
@ 2007-01-26  2:50     ` Keith Thompson
  2007-01-26  5:29     ` Gautier
  2007-01-27  5:22     ` Charles D Hixson
  5 siblings, 0 replies; 397+ messages in thread
From: Keith Thompson @ 2007-01-26  2:50 UTC (permalink / raw)


"Dr. Adrian Wrigley" <amtw@linuxchip.demon.co.uk.uk.uk> writes:
[...]
> I think this is critical.  Why can't we just say:
> 
> with stdio;
> 
> pragma import (C, stdio, "stdio.h");
> 
> and be able to get structs, functions, constants, variables from C in
> an obvious and reasonably reliable way?
[...]

Probably because there is no "obvious and reasonably reliable way" to
do this.

Presumably the above is intended to import the contents of the C
standard header <stdio.h>.  The problem is that, assuming it works by
processing the actual file (typically "/usr/include/stdio.h" on
Unix-like systems), it's going to pick up a huge amount of irrelevant
cruft.

For example, files in C are handled via the type "FILE*", which is a
pointer to an object of type FILE.  The C standard doesn't say what's
in a FILE object (it only requires it to be an object type), and the
actual contents can vary from one implementation to another.  C
programs are expected to use only FILE* objects, and only via the
manipulation functions defined by the standard (fopen, fclose, etc.).
In effect, FILE is an opaque type (and C programmers actually tend to
avoid writing code that depends on the unspecified internal details).

But C has no good way to express this in the language, so you're
probably going to import the actual definition of type FILE for the
current system, translated into an Ada record declaration.  You really
want
    type FILE is private;
but I don't see how you can get it.
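The opaque-use discipline Keith describes can be sketched in C itself. This fragment is illustrative only (the helper name `count_bytes` is mine, not from the thread): the code obtains a FILE*, hands it to the standard functions, and releases it, never looking at the implementation-defined fields inside the FILE object -- which is exactly the view `type FILE is private;` would enforce on the Ada side.

```c
#include <stdio.h>

/* Illustrative helper: FILE is treated as opaque.  The FILE* is only
 * ever passed to the standard manipulation functions; the pointer is
 * never dereferenced and no struct member is ever touched. */
static long count_bytes(const char *path)
{
    FILE *f = fopen(path, "rb");
    long n = 0;

    if (f == NULL)
        return -1;              /* no such file, or not readable */
    while (fgetc(f) != EOF)
        n++;
    fclose(f);
    return n;
}
```

An automatic binding generator, by contrast, sees the full struct behind FILE in the actual header, so a mechanical Ada translation would expose those system-specific fields rather than the opaque view this fragment follows by convention.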

Ideally, I suppose you'd like to be able to automatically translate
the contents of <stdio.h> into an Ada interface that's about as clean
as the C interface *as defined in the C standard*.  I'm not convinced
that it's practical to do that automatically.

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
San Diego Supercomputer Center             <*>  <http://users.sdsc.edu/~kst>
We must do something.  This is something.  Therefore, we must do this.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 23:26               ` Markus E Leypold
@ 2007-01-26  4:23                 ` Jeffrey R. Carter
  2007-01-26 11:35                   ` Markus E Leypold
  2007-01-28 20:32                   ` adaworks
  2007-01-26  7:21                 ` Harald Korneliussen
  1 sibling, 2 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-26  4:23 UTC (permalink / raw)


Markus E Leypold wrote:
> 
> Not to promote C, but purely from a logical perspective: The
> "continuing creation of buffer-overflow errors in C" only shows that
> there exist programmers/humans that don't always create safe programs
> in C.

Don't you get logical on me here. I maintain that it is impossible for 
humans to write safe C. For proof by blatant assertion, see my previous 
post.

> I'd like just to point out that the world even in this respect is
> neither black and white (or at least you haven't demonstrated that
> convincingly yet), but more shades of gray. I concede the gray might
> be lighter in Ada sector and darker in the C sector, but I'd prefer
> quantitative arguments / reasoning in this world and some effort to
> estimate the difference between those shades over striking but
> logically flawed reasoning. If I want absolutes, I'll turn to religion. :-).

All together now: Put your hands on the monitor. Say, "I believe in Ada!"

-- 
Jeff Carter
"From this day on, the official language of San Marcos will be Swedish."
Bananas
28



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  1:01   ` Alexander E. Kopilovich
@ 2007-01-26  5:01     ` JPWoodruff
  0 siblings, 0 replies; 397+ messages in thread
From: JPWoodruff @ 2007-01-26  5:01 UTC (permalink / raw)




On Jan 24, 5:01 pm, "Alexander E. Kopilovich" <a...@VB1162.spb.edu>
wrote:
> JPWoodruff wrote:
> > classes of smart young
> >teenagers

> And thread-of-execution debugging
> richly furnishes this abstraction with emotionally colored and socially
> sharable practical cases. No wonder that it attracts some fraction of smart
> young teenagers.

I take your point.  For all I know that's a good thing.  But my remark
was a claim that Ada won't capture those guys' attention easily.  Ada -
the way I've practiced her - has little use for execution tracing.

Still, the popularity topic is an open issue.

After all these years - thanks Arthur, Richard, et al for laying out
the history so clearly.  I like to believe the influence of that early
politics is waning.  Technical and economic arguments can be heard
above the echo of those early decisions (right?).

Maybe it's time to engage a social dimension in programming
experience.  Not for the first time, but this time on behalf of the
technical benefits of Ada.

I suggest that my hypothesis points to a way to engage next-generation
programmers.

I'd like there to be an opportunity to influence young learners toward
the "story telling" aspect of programming.  It would be nice to have
starter exercises with *narrative*.  These would accentuate the
program's content nouns above its behavior verbs.  Sad to say I have
absolutely no idea - nor any aptitude - how to make this happen.  So
it's just another pipe dream unless an idea lights the way.

> It is because Ada language isn't adapted well for vague outlining of
> possibilities and opportunities. Its main strengths are in expression of
> very real and actual things <...>

From my experience, Ada does well at outlining, but that's a different
topic - maybe take it up later.

John




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 21:56   ` Dr. Adrian Wrigley
                       ` (3 preceding siblings ...)
  2007-01-26  2:50     ` Keith Thompson
@ 2007-01-26  5:29     ` Gautier
  2007-01-27  5:22     ` Charles D Hixson
  5 siblings, 0 replies; 397+ messages in thread
From: Gautier @ 2007-01-26  5:29 UTC (permalink / raw)


Dr. Adrian Wrigley:

> I think this is critical.  Why can't we just say:
> 
> with stdio;
> 
> pragma import (C, stdio, "stdio.h");
> 
> and be able to get structs, functions, constants, variables from C in
> an obvious and reasonably reliable way?
> 
> Much of what is in C has direct analogs in Ada.  Some of it is via
> fiddly #defines, but even a useful subset of these would be e

A problem is that you will want to import something other than "stdio.h", for 
instance "XYZ.h", which contains

#if defined(_MSC_VER) || defined(__CYGWIN__) || defined(__MINGW32__)
#   define WIN32_LEAN_AND_MEAN
#   define NO_MIN_MAX
#    include <windows.h>
#   undef min
#   undef max
...

or worse messes... You would need the whole #include and #define preprocessing, 
accept that "with ABC, XYZ;" may well behave differently from "with XYZ, ABC;", 
and plenty of other funny things! In addition, I guess that with this "feature" each 
Ada compiler and compiler version would still behave slightly differently in 
that respect, with big differences at the bottom line.
That way you quickly lose the most interesting aspects of Ada: non-flat 
modularity, strong typing (Keith's reply), portability and probably more.
It is a bit too much retro-computing for Ada, I'm afraid.
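The order-dependence Gautier points at is easy to reproduce. Below is a small self-contained C sketch (the header contents and the name BUFSZ are invented for illustration): two "headers" both guard-define the same constant, and whichever is included first silently wins -- the behavior that "with ABC, XYZ;" vs. "with XYZ, ABC;" would inherit if Ada imported headers this way.

```c
/* Inline stand-ins for two hypothetical headers, abc.h and xyz.h,
 * that both guard-define BUFSZ.  Swapping the two sections (i.e.
 * "including" xyz.h first) would change BUFSZ to 20 with no
 * diagnostic at all. */

/* --- what abc.h might contain --- */
#ifndef BUFSZ
#define BUFSZ 10
#endif

/* --- what xyz.h might contain --- */
#ifndef BUFSZ
#define BUFSZ 20   /* silently ignored: abc.h already defined BUFSZ */
#endif

int effective_bufsz(void)
{
    return BUFSZ;   /* 10 here; would be 20 if the sections were swapped */
}
```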

You'd better write a preprocessor that takes XYZ.h plus the 
predefined #define's as input, and spits out XYZ.ads with the Import 
pragmata. You can take a look at the preprocessor I made for Borland Pascal / 
Delphi, BP2P, in newp2ada.zip there:

http://www.mysunrise.ch/users/gdm/gsoft.htm#p2ada

> And of course compilers should spit out header files on request
> matching an ada package via the "obvious" rules, so you can
> #include it from C.

For that, again, you don't need to wait for an (IMHO) unlikely change in the 
standard, especially because the "obvious" rules are maybe not so easy to define 
- or are they?
You have all the ingredients to make a nice tool: tons of Ada yacc grammars on the 
Internet, even an up-to-date ayacc/aflex if you want to do your tool in Ada 
(see again newp2ada.zip).

The "obvious and reasonably reliable way" of importing C stuff is certainly 
more something soft (just like C itself), better expressed in 
import/export tools than as something solid defined in an Ada standard...
______________________________________________________________
Gautier         -- http://www.mysunrise.ch/users/gdm/index.htm
Ada programming -- http://www.mysunrise.ch/users/gdm/gsoft.htm

NB: For a direct answer, e-mail address on the Web site!



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 12:54             ` Markus E Leypold
@ 2007-01-26  7:03               ` Harald Korneliussen
  0 siblings, 0 replies; 397+ messages in thread
From: Harald Korneliussen @ 2007-01-26  7:03 UTC (permalink / raw)




On 25 Jan, 13:54, Markus E Leypold
<development-2006-8ecbb5cc8aREMOVET...@ANDTHATm-e-leypold.de> wrote:
> It was not me, that wrote this, but Maciej. Looks a bit strange this
> way to quote.
> 
> Regards -- Markus

Oops, sorry.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 22:36             ` Jeffrey R. Carter
  2007-01-25 23:26               ` Markus E Leypold
@ 2007-01-26  7:16               ` Harald Korneliussen
  1 sibling, 0 replies; 397+ messages in thread
From: Harald Korneliussen @ 2007-01-26  7:16 UTC (permalink / raw)




On 25 Jan, 23:36, "Jeffrey R. Carter" <jrcar...@acm.org> wrote:
> The only safe use of C is as a target language for code generators (such
> as the SofCheck Ada -> C compiler). The continuing creation of
> buffer-overflow errors in C shows that, in practice, it is impossible
> for humans to create safe C.
>
Not to promote C myself, but there exist tools which may be useful.
Run-time checking such as can be achieved with profilers and
malloc replacements doesn't cut it, IMO, since it only finds errors in
the most trodden paths of the program. Static analysers, like the one
promoted by Coverity, now there's something valuable. Also,
annotation-based analyzers like splint can probably make programs as
safe as Ada if the developers actually take the time to use them.
Another interesting approach is that taken by CCured, of transforming C
programs by classifying pointers according to usage, and then inserting
run-time checks to guarantee no access violations or memory corruption.
These are interesting, practical approaches for those of us who have to
maintain C code. The downside is that the further your program strays
from ANSI C (not to say into C++!), the fewer tools are available.
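To make the annotation-based approach concrete: splint's annotations live inside ordinary C comments, so annotated code still compiles unchanged with any C compiler. The `/*@notnull@*/` and `maxSet` bounds clause below are real splint syntax, but the function itself and the exact clause are my own illustration, not something from the thread.

```c
#include <stddef.h>

/* Illustrative splint-annotated function (invented example).  A C
 * compiler ignores the annotations; splint uses them to check,
 * statically, that callers pass non-null pointers and a destination
 * buffer with room for at least n characters. */

/*@requires maxSet(dst) >= (n - 1)@*/
static void bounded_copy(/*@notnull@*/ char *dst,
                         /*@notnull@*/ const char *src,
                         size_t n)
{
    size_t i;

    /* copy at most n-1 characters, always NUL-terminate */
    for (i = 0; i + 1 < n && src[i] != '\0'; i++)
        dst[i] = src[i];
    dst[i] = '\0';
}
```

Running splint over a caller that passes a too-small buffer would flag the call site; the run-time behavior is unchanged either way, which is what makes the technique usable on existing C code.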




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 23:26               ` Markus E Leypold
  2007-01-26  4:23                 ` Jeffrey R. Carter
@ 2007-01-26  7:21                 ` Harald Korneliussen
  1 sibling, 0 replies; 397+ messages in thread
From: Harald Korneliussen @ 2007-01-26  7:21 UTC (permalink / raw)




On 26 Jan, 00:26, Markus E Leypold
<development-2006-8ecbb5cc8aREMOVET...@ANDTHATm-e-leypold.de> wrote:
> "Jeffrey R. Carter" <jrcar...@acm.org> writes:

> Since I assume that the same thing (unsafe programs) applies to Ada
> (say with respect to unhandled exceptions (Ariane, for instance ...)
> or with respect to memory leaks or priority inversions in tasking),
> the only remaining line on which an argument of the usability of C
> vs. that of Ada can develop is quantitatively, not qualitatively.
>
A line is usually drawn at languages whose type systems (and possibly
inserted runtime checks) don't prevent invalid memory accesses. For
instance the paper linked to above, about type systems, draws this
line, and argues why it matters. Great paper, by the way.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 21:34             ` Markus E Leypold
  2007-01-25  9:23               ` Markus E Leypold
@ 2007-01-26  7:59               ` Maciej Sobczak
  2007-01-26 20:05                 ` Jeffrey R. Carter
  1 sibling, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-01-26  7:59 UTC (permalink / raw)


Markus E Leypold wrote:

> Teenage kids these days write c001 PHP web applications. No buffer
> overflows there, but any amount of security holes.
> 
> BTW, what one can learn from that is that it is the absence of
> correct models and the absence of encapsulation of state and representation
> that makes software bad (insecure / unsafe / whatever), not only the
> buffer overflows.

Exactly.

http://www.owasp.org/index.php/OWASP_Top_Ten_Project#Top_Ten_Overview

Just changing the implementation language from C to whatever else (Ada 
included) can rule out only one (buffer overflows) of the top 10 
security flaws - and even that not always (especially when a binding to 
some C code is used, where the buffer overflow can happen at the 
language border).

Security holes are not about just language choices.

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* AW: How come Ada isn't more popular?
  2007-01-25 23:32             ` Markus E Leypold
@ 2007-01-26  8:50               ` Grein, Christoph (Fa. ESG)
  2007-01-26 11:52                 ` Markus E Leypold
  2007-01-26  8:56               ` Ludovic Brenta
  1 sibling, 1 reply; 397+ messages in thread
From: Grein, Christoph (Fa. ESG) @ 2007-01-26  8:50 UTC (permalink / raw)
  To: comp.lang.ada

Von: comp.lang.ada-bounces@ada-france.org
[mailto:comp.lang.ada-bounces@ada-france.org] Im Auftrag von Markus E
Leypold

>> The circumstances differed in this respect: The C students were given
>> 60% of the teacher's solution. Initially, the Ada students were given
>> 10% (or maybe less).

> So it was not really a comparable situation. One might argue that
> already given partial solutions demotivate and that they have dangers
> of its own (often draw people on a wrong path of thinking). I've
> actually seen people given a partial solution perfoming worse than
> those who had to do without.

> ... (I haven't even read the study yet), but ...

You really should have looked before commenting... :-(

Fact is: the students were not able to finish the job, so they were
given more and more source. And even with 60%, they were still not able to.

With Ada, the situation changed. Disaster was expected, but to everyone's big
surprise the students finished the work with only 10% of the code given.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  9:40           ` Markus E Leypold
@ 2007-01-26  8:52             ` Ludovic Brenta
  2007-01-26 11:40               ` Markus E Leypold
  2007-01-27 16:56             ` Stephen Leake
  1 sibling, 1 reply; 397+ messages in thread
From: Ludovic Brenta @ 2007-01-26  8:52 UTC (permalink / raw)


Markus E Leypold writes:
> Apropos Python: There are a lot of languages that are rather popular
> in OSS development apart from C. KDE is bound to Qt so cannot leave
> C.

C++, actually.  And there are bindings in Python and a few other
languages.  Yves Bailly is even busy writing an Ada binding for Qt,
using an intermediate C binding.  GNOME has more language bindings
because it is written directly in C.

> GNOME AFAI understand will slowly shift to Mono as application run
> time and thus will have more and more C# code (that also shows what the
> OSS guys perceive as their main problem: pointers and memory
> management, both of which have not been really solved in Ada to an
> extent desirable for application programming (as opposed to
> embedded)).

There seem to be disagreements among the GNOME developers.  Some
promote the shift towards C#, but others would rather stay with C in
order to avoid the need for the large C# library and interpreter
infrastructure.

>>> Apart from that, it seems to me it would be a bit difficult to have a C API
>>> to some Ada library, since Ada requires quite a lot of runtime support
>>> for tasking, so it certainly doesn't interact too well with C code which
>>> also uses signals, longjmp etc.
>>
>> Your argument can be applied in the other direction as well. How about
>> binding Ada to C libraries that use non-Ada runtime internally? It's
>
> Yes, this is a problem. Therefore the usual approach is, to avoid this
> or at least to make the runtime separable and exchangeable.

The usual approach is to either stick to C (see e.g. Evolution), or
accept the overhead of another language's runtime because the cost is
spread over a large code base (see the new GNOME applications written
in C#).  As I said, both approaches have their proponents within the
GNOME community.  The run-time is always separate (i.e. in shared
libraries), but not necessarily exchangeable, and exchangeability is
not a design goal.

-- 
Ludovic Brenta.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 23:32             ` Markus E Leypold
  2007-01-26  8:50               ` AW: " Grein, Christoph (Fa. ESG)
@ 2007-01-26  8:56               ` Ludovic Brenta
  2007-01-26 11:49                 ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Ludovic Brenta @ 2007-01-26  8:56 UTC (permalink / raw)


Markus E Leypold writes:
> "Jeffrey R. Carter" writes:
>
>> Alexander E. Kopilovich wrote:
>>> The original statement (from kevin cline) was:
>>>
>>>>>>  What makes a programmer
>>>>>> like a new language?  Usually, someone comes along and says something
>>>>>> like "Remember that program that we spent two weeks writing in C?
>>>>>> Here's a Perl implementation that I put together in three hours and
>>>>>> one-tenth the code."  That's never happened with Ada.
>>> The article
>>> http://www.stsc.hill.af.mil/crosstalk/2000/08/mccormick.html
>>> presents the case where use of Ada language in very specific circumstances
>>> was much more effective than use of C language in the same circumstances.
>>
>> The circumstances differed in this respect: The C students were given
>> 60% of the teacher's solution. Initially, the Ada students were given
>> 10% (or maybe less).
>
> So it was not really a comparable situation. One might argue that
> already given partial solutions demotivate and that they have dangers
> of its own (often draw people on a wrong path of thinking). I've
> actually seen people given a partial solution perfoming worse than
> those who had to do without.

In the first years of the course, students started with 0% of the code
given to them, and the instructor started giving them parts of the
solution, gradually increasing to 60% over the years, only because all
students failed to complete the assignment.  So, your argument about
motivation does not hold.

> Mind you, I do not suggest that really is the reason for the
> differences found (I haven't even read the study yet), but I see a
> problem with a study where the 2 groups compared are not really
> starting from the same point. The only different factor should be the
> programming language, not the rest of the setup.

After you read the report, you will find that that was indeed the
case.  It is premature to comment before you read the report.

-- 
Ludovic Brenta.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-26  4:23                 ` Jeffrey R. Carter
@ 2007-01-26 11:35                   ` Markus E Leypold
  2007-01-26 20:22                     ` Jeffrey R. Carter
  2007-01-28 20:32                   ` adaworks
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-26 11:35 UTC (permalink / raw)



"Jeffrey R. Carter" <jrcarter@acm.org> writes:

> Markus E Leypold wrote:
>> Not to promote C, but purely from a logical perspective: The
>> "continuing creation of buffer-overflow errors in C" only shows that
>> there exist programmers/humans that don't always create safe programs
>> in C.
>
> Don't you get logical on me here. I maintain that it is impossible for
> humans to write safe C. For proof by blatant assertion, see my
> previous post.

:-)

>
>> I'd like just to point out that the world even in this respect is
>> neither black and white (or at least you haven't demonstrated that
>> convincingly yet), but more shades of gray. I concede the gray might
>> be lighter in Ada sector and darker in the C sector, but I'd prefer
>> quantitative arguments / reasoning in this world and some effort to
>> estimate the difference between those shades over striking but
>> logically flawed reasoning. If I want absolutes, I'll turn to religion. :-).
>
> All together now: Put your hands on the monitor. Say, "I believe in Ada!"

Yeees, that's more like it. If we all do this and are pure of spirit
(no C-thought!!) we'll overwhelm THEM with the pure power of our mind
fields.

I'm glad you take this with a bit of humor. But I would like to
emphasize once again that -- not only in the Ada vs. C game, but in
most if not every other programming language / development paradigm
advocacy -- I'm missing hard data.

(I say this again here, since a number of my posts from yesterday
where I explained all that seem to have fallen into a black hole).

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-26  8:52             ` Ludovic Brenta
@ 2007-01-26 11:40               ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-26 11:40 UTC (permalink / raw)



Ludovic Brenta <ludovic@ludovic-brenta.org> writes:

> Markus E Leypold writes:
>> Apropos Python: There are a lot of languages that are rather popular
>> in OSS development apart from C. KDE is bound to Qt so cannot leave
>> C.
>
> C++, actually.  And there are bindings in Python and a few other

Ooops, yes :-).

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-26  8:56               ` Ludovic Brenta
@ 2007-01-26 11:49                 ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-26 11:49 UTC (permalink / raw)



Ludovic Brenta <ludovic@ludovic-brenta.org> writes:

> Markus E Leypold writes:
>> "Jeffrey R. Carter" writes:
>>
>>> Alexander E. Kopilovich wrote:
>>>> The original statement (from kevin cline) was:
>>>>
>>>>>>>  What makes a programmer
>>>>>>> like a new language?  Usually, someone comes along and says something
>>>>>>> like "Remember that program that we spent two weeks writing in C?
>>>>>>> Here's a Perl implementation that I put together in three hours and
>>>>>>> one-tenth the code."  That's never happened with Ada.
>>>> The article
>>>> http://www.stsc.hill.af.mil/crosstalk/2000/08/mccormick.html
>>>> presents the case where use of Ada language in very specific circumstances
>>>> was much more effective than use of C language in the same circumstances.
>>>
>>> The circumstances differed in this respect: The C students were given
>>> 60% of the teacher's solution. Initially, the Ada students were given
>>> 10% (or maybe less).
>>
>> So it was not really a comparable situation. One might argue that
>> already given partial solutions demotivate and that they have dangers
>> of its own (often draw people on a wrong path of thinking). I've
>> actually seen people given a partial solution perfoming worse than
>> those who had to do without.
>
> In the first years of the course, students started with 0% of the code
> given to them, and the instructor started giving them parts of the
> solution, gradually increasing to 60% over the years, only because all
> students failed to complete the assignment.  So, your argument about
> motivation does not hold.

OK. This was not intended as an argument really (I didn't read the
paper) but rather as food for thought. I've seen beginners playing Go
(a board game) worse when they got a bigger head start of handicap
stones (which is supposed to help beginners and level the playing
field against a more advanced partner). My theory was that this is
because they do not know how to use the extra stones properly and get
sidetracked into a less than optimal game. The suggestion was that the
same might have applied here to a certain degree: given parts of the
solution might keep people from finding their own, and tie them up in
efforts to fit their probably different approach to the pieces they got.

Again: I do not seriously put that forward as an argument about the given
study, since I haven't read it. But my experience is that help is not
always helpful, so it should be a well-controlled factor in a study,
and arguments of the kind "they got more help and still came less far"
are flawed as a general approach, because the help could well have been
the factor hindering them.

>
>> Mind you, I do not suggest that really is the reason for the
>> differences found (I haven't even read the study yet), but I see a
>> problem with a study where the 2 groups compared are not really
>> starting from the same point. The only different factor should be the
>> programming language, not the rest of the setup.

> After you read the report, you will find that that was indeed the
> case.  It is premature to comment before you read the report.

OK. Some time soon I'll read it. I still find the different amounts of
"help" they got irritating.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: AW: How come Ada isn't more popular?
  2007-01-26  8:50               ` AW: " Grein, Christoph (Fa. ESG)
@ 2007-01-26 11:52                 ` Markus E Leypold
  2007-01-29  6:16                   ` AW: " Grein, Christoph (Fa. ESG)
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-26 11:52 UTC (permalink / raw)



"Grein, Christoph (Fa. ESG)" <Christoph.Grein@eurocopter.com> writes:

> Von: comp.lang.ada-bounces@ada-france.org
> [mailto:comp.lang.ada-bounces@ada-france.org] Im Auftrag von Markus E
> Leypold
>
>>> The circumstances differed in this respect: The C students were given
>>> 60% of the teacher's solution. Initially, the Ada students were given
>>> 10% (or maybe less).
>
>> So it was not really a comparable situation. One might argue that
>> already given partial solutions demotivate and that they have dangers
>> of its own (often draw people on a wrong path of thinking). I've
>> actually seen people given a partial solution perfoming worse than
>> those who had to do without.
>
>> ... (I haven't even read the study yet), but ...
>
> You really should have looked before commenting... :-(

Yes, yes.

>
> Fact is: The students were not able to finish the job, so they were
> given more and more source. And even with 60%, they were still not able.

See my comment on the helpfulness of help in another article. I've
only been trying to put forward a general point against the idea
that more help brings the students nearer to success in every case.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 22:55         ` Robert A Duff
@ 2007-01-26 19:59           ` Jeffrey R. Carter
  0 siblings, 0 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-26 19:59 UTC (permalink / raw)


Robert A Duff wrote:
> 
> I don't think the so-called Ada 80 standard was used for much of
> anything.  Was it?

IIRC, the 1st delivered Ada system was a payroll system in Ada 80.

> I think Ada 2005 will become an ISO standard in 2007.
> Even if ISO red tape holds it up for another 50 years,
> it's still Ada 2005, in common parlance, and just "Ada"
> in official parlance.

I'll call it Ada 07 when I need to distinguish it.

> No.  Such minor bugs in the Standard will not delay it.
> If a fix is necessary, it will be a correction to Ada 2005.

That's good to hear.

-- 
Jeff Carter
"People called Romanes, they go the house?"
Monty Python's Life of Brian
79



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-26  7:59               ` Maciej Sobczak
@ 2007-01-26 20:05                 ` Jeffrey R. Carter
  2007-01-26 22:43                   ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-26 20:05 UTC (permalink / raw)


Maciej Sobczak wrote:
> 
> Just changing the implementation language from C to whatever else (Ada 
> including) can rule out only one (buffer overflows) of the top 10 
> security flaws - and even that not always (especially when binding to 
> some C code is used, where the buffer overflow can happen on the 
> language border).

I recall reading that buffer overflows account for about 50% of actually 
exploited vulnerabilities in networking SW. I'm not sure if that's still 
true when one considers "web applications". Also, it documents 
exploitation, not existence.

-- 
Jeff Carter
"People called Romanes, they go the house?"
Monty Python's Life of Brian
79



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-26 11:35                   ` Markus E Leypold
@ 2007-01-26 20:22                     ` Jeffrey R. Carter
  2007-01-26 23:04                       ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-26 20:22 UTC (permalink / raw)


Markus E Leypold wrote:
> 
> I'm glad you take this with a bit of humor. But I would like to
> emphasize once again that -- not only in the game Ada vs. C, but
> most/every other programming language / development paradigm advocacy
> -- I'm missing hard data. 

Hard data are hard to come by. Few organizations have the resources or 
the inclination to do such studies. Where we do have hard data (that I 
know of) are 2 studies, 1 by Rational available at adaic.org, and the 
other by P&W, posted here some years ago by Condic; McCormick's results; 
results published by Praxis on the C-130J project; and a controlled 
study at the US Military Academy comparing Ada and Pascal as 1st-course 
languages published in /Ada Letters/.

The 1st 2 both showed that Ada, compared to C, offered a factor of 2 
improvement in cost of reaching deployment, a factor of 4 improvement in 
post-deployment errors, and a factor of 10 improvement in cost of 
correcting a post-deployment error.

The 3rd we've discussed adequately here, I think.

The 4th showed a factor of 10 improvement in post-deployment errors 
compared to C (and a further factor of 10 improvement of SPARK over Ada).

The 5th concluded that Ada was a better 1st-course language than Pascal.

-- 
Jeff Carter
"People called Romanes, they go the house?"
Monty Python's Life of Brian
79



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 22:16           ` Jeffrey R. Carter
  2007-01-25 23:32             ` Markus E Leypold
@ 2007-01-26 22:05             ` Alexander E. Kopilovich
  1 sibling, 0 replies; 397+ messages in thread
From: Alexander E. Kopilovich @ 2007-01-26 22:05 UTC (permalink / raw)
  To: comp.lang.ada

Jeffrey R. Carter wrote:
 
>> The original statement (from kevin cline) was:
>> 
>>>>>  What makes a programmer
>>>>> like a new language?  Usually, someone comes along and says something
>>>>> like "Remember that program that we spent two weeks writing in C?
>>>>> Here's a Perl implementation that I put together in three hours and
>>>>> one-tenth the code."  That's never happened with Ada.
>> 
>> The article http://www.stsc.hill.af.mil/crosstalk/2000/08/mccormick.html
>> presents the case where use of Ada language in very specific circumstances
>> was much more effective than use of C language in the same circumstances.
>
>...
>
>> Those circumstances include:
>> 
>> 1) close support provided by teaching staff
>> 2) full and precise spefications for the problem domain
>> 3) stable general requirements for the task and at the same time relative
>>    freedom regarding details, and anyway, the absence of a stream of
>>    unexpected changes in requirements and/or scope and/or additional
>>    requirements
>
>I agree that 1. doesn't apply to the OP's statement. However, I consider 
>a running program to be a very precise and stable specification, so 2. 
>and 3. appear to apply.

Well, of course, after the program is completed and accepted, it 
represents a kind of precise specification. But in the case described in
the article those full specifications were known in advance, which, for
the common programmer, does not happen very often.

When, in the original statement, the teller points at a particular program,
he certainly means that the information will probably be relevant not just
to that particular program, but to some wide class of programs, and it is
quite possible that many programs in this class have only partial and vague
specifications at the start of their development. 

> And it's definitely a case of "Remember that 
>program you couldn't get working in C in an entire semester even though 
>you were given 60% of the code? Here's an Ada implementation that I got 
>working in a semester when given 10% of the code."

Still, note that it was the teacher of the course presenting his comparative
conclusion to the readers of a professional journal, and not one of his students
to another student. Actually, that article does not mention any opportunity
for those students to make that comparison among themselves.

Therefore it is not exactly the case described in the original statement, but
rather a somewhat similar case, no more than that. There is a teacher telling
other teachers (or project leaders or managers) how he, by switching to Ada,
enabled his students to do things which were too hard for them with C.
Consequently (and in full agreement with the original statement) this article
may contribute to Ada's popularity among teachers, software development managers
or, perhaps, project leaders - but that is not the audience implied in the
original statement.








^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-26 20:05                 ` Jeffrey R. Carter
@ 2007-01-26 22:43                   ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-26 22:43 UTC (permalink / raw)



"Jeffrey R. Carter" <jrcarter@acm.org> writes:

> Maciej Sobczak wrote:
>> Just changing the implementation language from C to whatever else
>> (Ada including) can rule out only one (buffer overflows) of the top
>> 10 security flaws - and even that not always (especially when
>> binding to some C code is used, where the buffer overflow can happen
>> on the language border).
>
> I recall reading that buffer overflows account for about 50% of
> actually exploited vulnerabilities in networking SW. 

> I'm not sure if that's still true when one considers "web
> applications". Also, it documents exploitation, not existence.

I don't think it's true any more. Bugs that have to do with processing
unchecked user input (esp. quoting and unquoting to/from HTML and SQL
statements) in "scripted" web applications seem to be the majority
now. Of course buffer overflow is still a concern (e.g. in gpg or mail
processing applications), but it's not restricted to networking
software, since users very often process foreign data/"documents". See,
for example, the recent flurry of bugs in MS Word and PDF viewers (and
even those are often a bit more complicated than a simple overflow, as
the WMF bug demonstrates).

I don't have any quantitative data, though, but one can try to count
the security alerts on the full-disclosure list or in relevant security
forums. My impression: buffer overflows still happen, but they are not
the main problem any more. The main problem IMHO is bad software
engineers / programmers who create monolithic applications (no
modules, no abstraction boundaries) without contracts.
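
The quoting/unquoting bugs described here come down to passing user text into another context (HTML, SQL) without escaping it. A minimal sketch of the escaping involved, in C with hypothetical names -- shown only to illustrate the bug class; real code should use a vetted library:

```c
#include <string.h>

/* Minimal HTML escaper: rewrites the characters that let user input
   break out of an HTML text context. Stops early rather than overflow
   the output buffer. */
static void html_escape(const char *in, char *out, size_t out_size) {
    size_t used = 0;
    if (out_size == 0)
        return;
    for (; *in != '\0'; in++) {
        char single[2] = { *in, '\0' };
        const char *rep;
        switch (*in) {
        case '<': rep = "&lt;";   break;
        case '>': rep = "&gt;";   break;
        case '&': rep = "&amp;";  break;
        case '"': rep = "&quot;"; break;
        default:  rep = single;   break;
        }
        size_t len = strlen(rep);
        if (used + len + 1 > out_size)
            break;              /* truncate rather than overflow */
        memcpy(out + used, rep, len);
        used += len;
    }
    out[used] = '\0';
}
```

Forgetting (or mis-implementing) exactly this step on one code path is what turns a "scripted" web application into a cross-site-scripting or injection vector, independent of the implementation language.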

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-26 20:22                     ` Jeffrey R. Carter
@ 2007-01-26 23:04                       ` Markus E Leypold
  2007-01-27 19:57                         ` Frank J. Lhota
  2007-01-28 20:43                         ` adaworks
  0 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-26 23:04 UTC (permalink / raw)




"Jeffrey R. Carter" <jrcarter@acm.org> writes:

> Markus E Leypold wrote:
>> I'm glad you take this with a bit of humor. But I would like to
>> emphasize once again that -- not only in the game Ada vs. C, but
>> most/every other programming language / development paradigm advocacy
>> -- I'm missing hard data.

> Hard data are hard to come by. 

Yes, but -- computer science and software engineering strive to be
_science_. So there should be a bit more proof and a bit less advocacy
(I've elaborated on this in one of the posts that went into a black
hole).

> Few organizations have the resources or the inclination to do such
> studies.

Simply put: science, esp. software engineering, should. A whole
scientific discipline (materials science) is dedicated to evaluating,
testing, and describing the properties of new and old materials.
Another discipline (education science) is dedicated to researching how
to teach things and how people learn (not such a hard science, but the
results are overall still better and more reliable than just having
Joe Sixpack say his piece about how education should be done).

I think that there should be a discipline in computer science that
researches and quantitatively classifies the tools and methods which
are used to create new programs. Actually there is such a discipline
(software engineering) but there is also a lot of advocacy and
philosophy there and not so much hard data.

Presently I note that new paradigms are put forward more in a
religious manner, and if they succeed they just produce more opportunity
for research on how to support the paradigm with given tools -- whereas
the first proof that the new paradigm / technique actually buys anything
is usually a bit thin.


> Where we do have hard data (that I know of) are 2 studies, 1 by
> Rational available at adaic.org, and the other by P&W, posted here
> some years ago by Condic; McCormick's results; results published by
> Praxis on the C-130J project; and a controlled study at the US
> Military Academy comparing Ada and Pascal as 1st-course languages
> published in /Ada Letters/.

Amazing. Two studies for 20 years of Ada :-).

> The 1st 2 both showed that Ada, compared to C, offered a factor of 2
> improvement in cost of reaching deployment, a factor of 4 improvement
> in post-deployment errors, and a factor of 10 improvement in cost of
> correcting a post-deployment error.

Actually I know the study by Rational.

> The 3rd we've discussed adequately here, I think.

Yes.

> The 4th showed a factor of 10 improvement in post-deployment errors
> compared to C (and a further factor of 10 improvement of SPARK over
> Ada).


Impressive :-).

> The 5th concluded that Ada was a better 1st-course language than Pascal.

I'm a bit surprised, actually. I would have thought Pascal simpler to
learn.

I'll have to try to find those studies some time in the future and
read up on them. 

Unfortunately, even agreeing that Ada is the "better language" in most
aspects as a programming language (and I can imagine a number of
scenarios where some of the Ada stuff really gets in the way, e.g.
when I want to write something without the Ada runtime: I can write a
GNU Pascal procedure and link it into a C program, but can I do that
with something compiled by GNAT?), whatever -- even agreeing Ada is
the "better language", I've tried to point out in other posts
(hopefully not all of which went into the black hole) that -- coming
back to the topic of the thread -- real-world decisions for tools are
not always influenced by technical merit alone. Availability of
libraries, cost of entering a market, existing code base, available
tools + compilers, flexibility after deciding on certain tools (read:
alternative tools and second-source vendors) are also important. Some
people here have argued along the line that "people simply don't know
better". I find that difficult to accept.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 22:34       ` Jeffrey R. Carter
  2007-01-25 22:55         ` Robert A Duff
@ 2007-01-27  3:54         ` Randy Brukardt
  1 sibling, 0 replies; 397+ messages in thread
From: Randy Brukardt @ 2007-01-27  3:54 UTC (permalink / raw)


"Jeffrey R. Carter" <jrcarter@acm.org> wrote in message
news:Jfauh.240347$aJ.979@attbi_s21...
...
> Hopefully 2007 will see Ada 0X become a standard. On the Ada-comment
> mailing list there's a discussion of revising the wording for
> "equivalence" for ordered containers (originally discussed on c.l.a), so
> it may be a while yet.

Those sorts of bugs will go in a corrigendum or amendment to come down the
road.

The Amendment completed its last approval stage (the JTC1 vote) this week.
(There is a press release in preparation.) The only remaining task is for
the Amendment to be officially published. Can't say how long that will take,
but we now know that there won't be any substantial changes to it.

                        Randy Brukardt.





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 11:31   ` Ali Bendriss
@ 2007-01-27  5:12     ` Charles D Hixson
  2007-01-27  9:52       ` Markus E Leypold
  2007-01-29 23:56       ` Randy Brukardt
  0 siblings, 2 replies; 397+ messages in thread
From: Charles D Hixson @ 2007-01-27  5:12 UTC (permalink / raw)


Ali Bendriss wrote:
> On Tuesday 23 January 2007 06:37, adaworks@sbcglobal.net wrote:
>> <artifact.one@googlemail.com> wrote in message
>> news:1169531612.200010.153120@38g2000cwa.googlegroups.com...
>>
>>> My question is: how come Ada isn't more popular?
>> Ada suffered, in its early days, from a convergence of several
>> things.  One is that the designers of the language did not anticipate
>> the impact of the personal computer and the democratization of
>> computing.   There were other factors, as well.
>> [...]
> 
> Does the fact that there is no "public" operating system written in Ada is one 
> of this factor ? 
> I think about the closest link between the C language and the *nix systems.
> 
> 
That's a factor, but only a small one, and getting smaller as 
time goes on.  I, personally, think that one factor IS the 
lack of a standard garbage collector.  Related to this is the 
awkwardness of dealing with strings of varying lengths.

There are many features which aren't significant after you've 
learned how to work your way around them that can be sizeable 
blockages at the start.

Remember that at the start, Ada was competing solely against 
C.  C++ barely existed, and what did exist didn't bear that 
much resemblance to what we now call C++.  (I used to use C++ 
rather than C solely because of the typed constants.)  At that 
point C had to be cut down to compile on a micro-computer. 
Look up BSD C or Lifeboat C.  These were SUBSETS of C, but 
they could be used, and they could call themselves C.  (Once 
you started using them, you became well aware that they were 
subsets...works in progress as it were.)  Ada subsets couldn't 
use the name Ada.  Janus Ada couldn't call itself Ada for 
quite a while.  And even Janus Ada couldn't run on most CP/M 
machines.  Too resource intensive.  I bought an Apple ][ to 
run UCSD Pascal, and was so disappointed that I saved up and 
installed a CP/M card so that I could run C.  Ada wasn't a 
possibility.  (I never seriously considered Basic.  It was too 
non-portable.  If I wanted non-portable, I'd go for assembler.)

Ada was HUGE and C++ was only a slight bit larger than C. 
(The times they DO change!)  At that time I was a PL/I 
programmer on a mainframe, and Ada had a reputation that 
caused me both to lust after it and to fear its 
complexities.  Actually, however, it was never a realistic 
possibility.  I couldn't run it on my home machine, and work 
sure wasn't going to pay to have the service bureau install 
it.  So I dreamed about it...and Algol68, and Snobol, and IPL, 
and APL, and LISP...and didn't take any of those dreams 
seriously.  But of those only Ada and Algol were frightening 
as well as lust-provoking.





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23 21:56   ` Dr. Adrian Wrigley
                       ` (4 preceding siblings ...)
  2007-01-26  5:29     ` Gautier
@ 2007-01-27  5:22     ` Charles D Hixson
  5 siblings, 0 replies; 397+ messages in thread
From: Charles D Hixson @ 2007-01-27  5:22 UTC (permalink / raw)


Dr. Adrian Wrigley wrote:
> On Tue, 23 Jan 2007 11:38:28 +0100, Alex R. Mosteo wrote:
> 
>> artifact.one@googlemail.com wrote:
>>
>>> ...
> I think this is critical.  Why can't we just say:
> 
> with stdio;
> 
> pragma import (C, stdio, "stdio.h");
> 
> and be able to get structs, functions, constants, variables from C in
> an obvious and reasonably reliable way?
> 
> Much of what is in C has direct analogs in Ada.  Some of it is via
> fiddly #defines, but even a useful subset of these would be e
> 
> And of course compilers should spit out header files on request
> matching an ada package via the "obvious" rules, so you can
> #include it from C.
> --
> Adrian
> 
> 
That would be very nice.  Notice that it's divided into two 
pieces:
1) import c header files, and
2) export header files for C
While both would be extremely useful, and 1, perhaps, more 
than 2, it seems to me that 2 would be much more readily 
accomplished.
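
As a sketch of point 2, this is roughly what a compiler-emitted C header for a small Ada spec could look like under such "obvious" rules (all names here are hypothetical; the stand-in definition takes the place of the compiled Ada body so the example is self-contained):

```c
/* Header a compiler could emit for an Ada spec such as:
 *
 *    package Counters is
 *       function Next return Integer;
 *       pragma Export (C, Next, "counters_next");
 *    end Counters;
 *
 * under the obvious mapping: Ada Integer maps to C int. */
int counters_next(void);

/* Stand-in definition, standing in for the compiled Ada body. */
static int counter = 0;
int counters_next(void) { return ++counter; }
```

With a header like this generated on request, C code could `#include` it and call the Ada-exported function directly, which is direction 2 of the proposal above.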



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 10:13           ` Harald Korneliussen
                               ` (2 preceding siblings ...)
  2007-01-25 22:36             ` Jeffrey R. Carter
@ 2007-01-27  5:30             ` Charles D Hixson
  3 siblings, 0 replies; 397+ messages in thread
From: Charles D Hixson @ 2007-01-27  5:30 UTC (permalink / raw)


Harald Korneliussen wrote:
> 
> On 25 Jan, 09:37, Maciej Sobczak <no.s...@no.spam.com> wrote:
>> Markus E Leypold wrote:
>> ...
> 
> ...
> Anyway, I think individuals are less important than culture. You could
> read this straight out of the wikipedia page for C (until I changed it
> slightly ...): "the safe, effective use of C requires more programmer
> skill, experience, effort, and attention to detail than is required for
> some other programming languages." So if you use C, that means you are
> skilled, experienced and attentive, by some people's logic. It's a
> macho thing, "If you can't handle the power, don't use it!".
> 
> Klingon programmers if there ever were any.
> 
> It's of course a symptom of lack of professionalism: construction
> workers who are proud of what they do wear their helmets.
> 
I think you picked up the right reason by the wrong handle.  C 
was the language of DEC computers, and of Kernighan and 
Ritchie, so everyone associated with Unix started using C.  So 
everyone associated with Unix and Linux uses C.  It's more 
historic than anything else.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  7:42     ` kevin  cline
                         ` (2 preceding siblings ...)
  2007-01-25  4:50       ` Alexander E. Kopilovich
@ 2007-01-27  5:43       ` Charles D Hixson
  2007-01-27  8:38         ` Dmitry A. Kazakov
  2007-01-27 13:06         ` Gautier
  3 siblings, 2 replies; 397+ messages in thread
From: Charles D Hixson @ 2007-01-27  5:43 UTC (permalink / raw)


kevin cline wrote:
> 
> On Jan 23, 4:18 pm, Martin Dowie <martin.do...@btopenworld.remove.com>
> wrote:
>> kevin clinewrote:
>>> 3. For the same reason that Limburger cheese isn't more popular.  Most
>>> programmers who have tried Ada didn't like it.  What makes a programmer
>>> like a new language?  Usually, someone comes along and says something
>>> like "Remember that program that we spent two weeks writing in C?
>>> Here's a Perl implementation that I put together in three hours and
>>> one-tenth the code."  That's never happened with Ada.
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>> FUD!!
>>
>> http://www.stsc.hill.af.mil/crosstalk/2000/08/mccormick.html
> 
> Yes, I've read that article.  It would really be sad if Ada were not
> superior to C for a toy problem in embedded control system development,
> since Ada was designed specifically for that purpose.  But the point
> was that expressiveness drives programmers to new languages, and Ada
> isn't particularly expressive.
> 
Ada is quite expressive, but it can be very clumsy when you 
want to write a flexible routine.  You can do it, and you can 
do it with much greater safety, but it takes more work.

Just, for instance, look at the Ada version of:
#include  <stdio.h>
void main()  {  print ("Hello", " ", "World!");	exit(0);  }

(I may have gotten that wrong, my main language is Python. 
I'm intending to move to Ada for a particular project that 
I've got in mind, because I despise the way C uses pointers.)



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27  5:43       ` Charles D Hixson
@ 2007-01-27  8:38         ` Dmitry A. Kazakov
  2007-01-28 12:11           ` Michael Bode
  2007-01-27 13:06         ` Gautier
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-01-27  8:38 UTC (permalink / raw)


On Sat, 27 Jan 2007 05:43:28 GMT, Charles D Hixson wrote:

> Ada is quite expressive, but it can be very clumsy when you 
> want to write a flexible routine.  You can do it, and you can 
> do it with much greater safety, but it takes more work.
> 
> Just, for instance, look at the Ada version of:
> #include  <stdio.h>
> void main()  {  print ("Hello", " ", "World!");	exit(0);  }
> 
> (I may have gotten that wrong, my main language is Python. 
> I'm intending to move to Ada for a particular project that 
> I've got in mind, because I despise the way C uses pointers.)

The example is wrong C, but the point is that such examples show little if
anything. Who is designing console applications today? What about "Hello
World" in X11, Windows API, GTK+ or similar with a requirement of some
definite look-and-feel? If we looked at C, Ada, Python code of that, we
would find all them far from being expressive.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27  5:12     ` Charles D Hixson
@ 2007-01-27  9:52       ` Markus E Leypold
  2007-01-27 22:01         ` Charles D Hixson
  2007-01-29 23:56       ` Randy Brukardt
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-27  9:52 UTC (permalink / raw)



Charles D Hixson <charleshixsn@earthlink.net> writes:

> goes on.  I, personally, think that one factor IS the lack of a
> standard garbage collector.  Related to this is the awkwardness of
> dealing with strings of varying lengths.

Actually that is rather good compared to C or C++. The different
string packages make it possible to often use only stack-allocated
storage, or to get the comfort of unbounded strings. After a while I
really started to like this.
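
For contrast, this is the bookkeeping C requires for the varying-length strings that Ada's Ada.Strings.Unbounded handles internally -- a minimal sketch with hypothetical names:

```c
#include <stdlib.h>
#include <string.h>

/* What Ada.Strings.Unbounded gives for free: a string that grows on
   append. In C the capacity management and error handling are manual. */
typedef struct {
    char  *data;
    size_t len;
    size_t cap;
} ustring;

static int ustring_append(ustring *s, const char *tail) {
    size_t tlen = strlen(tail);
    if (s->len + tlen + 1 > s->cap) {
        size_t new_cap = (s->cap ? s->cap * 2 : 16);
        while (new_cap < s->len + tlen + 1)
            new_cap *= 2;
        char *p = realloc(s->data, new_cap);
        if (p == NULL)
            return -1;                 /* caller must handle failure */
        s->data = p;
        s->cap  = new_cap;
    }
    memcpy(s->data + s->len, tail, tlen + 1);  /* copies the NUL too */
    s->len += tlen;
    return 0;
}
```

Every call site must check the return value and eventually free `data`; in Ada, assignment and `&` on Unbounded_String do all of this behind the scenes.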

> There are many features which aren't significant after you've learned
> how to work your way around them that can be sizeable blockages at the
> start.
>
> Remember that at the start, Ada was competing solely against C.  C++
> barely existed, and what did exist didn't bear that much resemblance
> to what we now call C++.  (I used to use C++ rather than C solely
> because of the typed constants.)  At that point C had to be cut down
> to compile on a micro-computer. Look up BSD C or Lifeboat C.  These
> were SUBSETS of C, but they could be used, and they could call
> themselves C.  


Yes. That is the availability issue again. I personally would have
wished that there were a suitable subset of Ada (without tasking) that
could be linked with C (or FORTRAN). That would have helped Ada survive
in a mixed environment and furthered a slow migration from an existing
code base.

> (Once you started using them, you became well aware
> that they were subsets...works in progress as it were.)  

> Ada subsets couldn't use the name Ada.  Janus Ada couldn't call
> itself Ada for quite awhile.  And even Janus Ada couldn't run on
> most CP/M machines.  Too resource intensive.  I bought an Apple ][
> to run UCSD Pascal, and was so disappointed that I saved up and
> installed a CP/M card so that I could run C.  Ada wasn't a
> possibility.  (I never seriously considered Basic.  It was too
> non-portable.  If I wanted non-portable, I'd go for assembler.)


> Ada was HUGE and C++ was only a slight bit larger than C. (The times
> they DO change!)  At that time I was a PL/I programmer on a mainframe,
> and Ada had a reputation that caused me to both lust after it, and to
> fear it's complexities.  Actually, however, it was never a realistic
> possibility.  I couldn't run it on my home machine, and work sure
> wasn't going to pay the have the service bureau install it.  So I
> dreamed about it...and Algol68, and Snobol, and IPL, and APL, and
> LISP...and didn't take any of those dreams seriously.  But of those
> only Ada and Algol were frightening as well as lust provoking.

:-) Nice account.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27  5:43       ` Charles D Hixson
  2007-01-27  8:38         ` Dmitry A. Kazakov
@ 2007-01-27 13:06         ` Gautier
  2007-01-27 16:28           ` Ludovic Brenta
  2007-01-28  0:55           ` Charles D Hixson
  1 sibling, 2 replies; 397+ messages in thread
From: Gautier @ 2007-01-27 13:06 UTC (permalink / raw)


Charles D Hixson:

> Ada is quite expressive, but it can be very clumsy when you want to 
> write a flexible routine.  You can do it, and you can do it with much 
> greater safety, but it takes more work.
> 
> Just, for instance, look at the Ada version of:
> #include  <stdio.h>
> void main()  {  print ("Hello", " ", "World!");    exit(0);  }

with Ada.Text_IO;
procedure pied is begin Ada.Text_IO.Put("Hello" & " " & "World!"); end;

Is it so clumsy ?
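
(As a side note: the C fragment quoted above is not valid C -- `print` is not a standard function, and `main` must return `int`. A conforming sketch, with the formatting factored into a hypothetical helper so it can be checked separately from the printing:

```c
#include <stdio.h>
#include <string.h>

/* Builds the message into a caller-supplied buffer. */
static void build_greeting(char *out, size_t out_size) {
    snprintf(out, out_size, "%s%s%s", "Hello", " ", "World!");
}

/* A complete, conforming program would be:
   int main(void) { char m[32]; build_greeting(m, sizeof m); puts(m); return 0; }
*/
```

which is no shorter than the Ada version.)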
______________________________________________________________
Gautier         -- http://www.mysunrise.ch/users/gdm/index.htm
Ada programming -- http://www.mysunrise.ch/users/gdm/gsoft.htm

NB: For a direct answer, e-mail address on the Web site!



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27 13:06         ` Gautier
@ 2007-01-27 16:28           ` Ludovic Brenta
  2007-01-28  0:55           ` Charles D Hixson
  1 sibling, 0 replies; 397+ messages in thread
From: Ludovic Brenta @ 2007-01-27 16:28 UTC (permalink / raw)


Gautier writes:
>> #include  <stdio.h>
>> void main()  {  print ("Hello", " ", "World!");    exit(0);  }
>
> with Ada.Text_IO;
> procedure pied is begin Ada.Text_IO.Put("Hello" & " " & "World!"); end;

LOL!

For those who don't know French: "main" means "hand", "pied" means
"foot".

-- 
Ludovic Brenta.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  9:40           ` Markus E Leypold
  2007-01-26  8:52             ` Ludovic Brenta
@ 2007-01-27 16:56             ` Stephen Leake
  2007-01-27 19:58               ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Stephen Leake @ 2007-01-27 16:56 UTC (permalink / raw)


Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> But you can writ Code in C which just does something with
> things on the stack, doesn't fiddle with threads or signals and thus
> interacts minimally with the host runtime. That's the reason why C is
> so popular for libraries: You can.

You can do that in Ada as well. It is perhaps not as easy in Ada as in
C, because you must be more aware of what language constructs
require run-time support. 

For example, fixed-point types require run-time support. If you use
fixed-point types in C, it is obvious from the source code that the
run-time must include a fixed-point library, because all fixed-point
operations are function calls. In Ada, that's not so clear from the
source, because fixed-point operations are just operators.
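
A sketch of that point: in C, fixed-point arithmetic is a library of explicit calls on scaled integers, whereas in Ada the same operations are written with ordinary operators and the compiler supplies the run-time calls. All names are hypothetical; Q15.16 scaling is assumed:

```c
#include <stdint.h>

/* A tiny fixed-point "library": values scaled by 2^16, every operation
   an explicit function call -- visible run-time support. */
typedef int32_t fix16;                 /* Q15.16 fixed point */

#define FIX16_ONE 65536

static fix16 fix16_from_int(int n)       { return (fix16)(n * FIX16_ONE); }
static fix16 fix16_add(fix16 a, fix16 b) { return a + b; }
static fix16 fix16_mul(fix16 a, fix16 b) { return (fix16)(((int64_t)a * b) >> 16); }
static int   fix16_to_int(fix16 a)       { return (int)(a >> 16); }
```

The Ada equivalent declares `type Fix is delta 2.0**(-16) range ...;` and then writes `A * B` directly, so nothing in the source reveals that a run-time library is involved.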

GNAT provides 'pragma No_Run_Time' for this; it makes source code that
requires run-time support illegal. I've written code with this pragma
turned on (for export to a spacecraft executive that was mostly
written in C), and it does make Ada feel more like C. Still far
preferable to actually using C, though!

In GNAT GPL-2006, 'pragma No_Run_Time' is listed as 'obsolescent'; you
are supposed to use a "configurable run-time" instead. I'm not sure how
that impacts this issue.

-- 
-- Stephe



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25  9:29           ` Markus E Leypold
@ 2007-01-27 16:59             ` Stephen Leake
  2007-01-27 20:40               ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Stephen Leake @ 2007-01-27 16:59 UTC (permalink / raw)


Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> Given, that I recently found in 3.15p (which is not the newest one, I know):
>
>
>   - Really bad bugs handling Read and Write of discrimated records,
>
>   - Race conditions in the runtime system when trying to catch
>     interrupts,
>
>   - No way to catch the console break under windows with the interupts
>     mechanism (i.e. a design error in my opinion),
>
>
> I wonder wether GNAT was good even in, say 2002 (or whatever was
> 3.15p's release date).

It was good, in my opinion; at that time, I was far more productive
writting real applications using GNAT 3.15p than Borland C++.

> And all those problems are really expensive to circumvent (partly
> because the runtime system insists on fiddling with the signal
> handlers).

But to be fair, you have to say how easy the solution was in Ada, vs
the solution to the same problem in some other language. 

What is the equivalent of Discriminated_Record'Write in C? The concept
doesn't even exist!
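
For comparison, here is a hand-written C approximation of what 'Write does for a discriminated record: serializing a tagged union field by field, with the discriminant written first. All types and names here are hypothetical:

```c
#include <string.h>

/* A tagged union, the C stand-in for an Ada discriminated record. */
typedef enum { SHAPE_CIRCLE, SHAPE_RECT } shape_kind;

typedef struct {
    shape_kind kind;                  /* the "discriminant" */
    union {
        struct { double radius; } circle;
        struct { double w, h;   } rect;
    } u;
} shape;

/* What Ada generates automatically for Shape'Write, written by hand:
   the discriminant decides which variant's fields follow. */
static size_t shape_write(const shape *s, unsigned char *buf) {
    size_t off = 0;
    memcpy(buf + off, &s->kind, sizeof s->kind); off += sizeof s->kind;
    switch (s->kind) {
    case SHAPE_CIRCLE:
        memcpy(buf + off, &s->u.circle.radius, sizeof(double)); off += sizeof(double);
        break;
    case SHAPE_RECT:
        memcpy(buf + off, &s->u.rect.w, sizeof(double)); off += sizeof(double);
        memcpy(buf + off, &s->u.rect.h, sizeof(double)); off += sizeof(double);
        break;
    }
    return off;                       /* bytes written */
}
```

Each new variant means editing this function by hand (and its matching reader), which is exactly the boilerplate the Ada attribute generates and keeps in sync with the type.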

-- 
-- Stephe



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-26 23:04                       ` Markus E Leypold
@ 2007-01-27 19:57                         ` Frank J. Lhota
  2007-01-28 20:43                         ` adaworks
  1 sibling, 0 replies; 397+ messages in thread
From: Frank J. Lhota @ 2007-01-27 19:57 UTC (permalink / raw)


"Markus E Leypold" 
<development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> wrote in 
message news:viveito0e5.fsf@hod.lan.m-e-leypold.de...
> Unfortunately, even agreeing Ada is the "better language" in most
> aspects as a programming language (and I can imagine a number of
> scenarios where some of the Ada stuff really get's in the way and that
> is e.g. when I want to write something without the Ada runtime: I can
> write a GNU Pascal procedure and link it into a C program, but can I
> do that with something compiled by GNAT?), whatever -- even agreeing
> Ada is the "better language",
Actually, the GCC people have done a really good job of supporting 
multi-language programming. Only the VMS people supported multiple 
high-level languages as well. 





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27 16:56             ` Stephen Leake
@ 2007-01-27 19:58               ` Markus E Leypold
  2007-01-28 17:12                 ` Ed Falis
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-27 19:58 UTC (permalink / raw)



Stephen Leake <stephen_leake@stephe-leake.org> writes:

> Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>
>> But you can writ Code in C which just does something with
>> things on the stack, doesn't fiddle with threads or signals and thus
>> interacts minimally with the host runtime. That's the reason why C is
>> so popular for libraries: You can.
>
> You can do that in Ada as well. It is perhaps not as easy in Ada as in
> C, because you must be more aware of what language constructs
> require run-time support. 

:-) good to know.

>
> For example, fixed-point types require run-time support. If you use
> fixed-point types in C, it is obvious from the source code that the
> run-time must include a fixed-point library, because all fixed-point
> operations are function calls. In Ada, that's not so clear from the
> source, because fixed-point operations are just operators.
>
> GNAT provides 'pragma No_Run_Time' for this; it makes source code that

Yes. I researched this some time ago and had the impression that this
is what No_Run_Time does, but for some reason I was not completely
convinced (the docs on it were very sparse), and since I didn't need it
at that time I just didn't look at it further. Of course one would
sometimes wish not to have tasking, and perhaps not even exceptions (I
think those require run-time support also, don't they), but to have
floating point.

> requires run-time support illegal. I've written code with this pragma
> turned on (for export to a spacecraft executive that was mostly
> written in C), and it does make Ada feel more like C. Still far
> preferable to actually using C, though!

I completely agree with this, especially if the compiler works (which
is another can of worms).

> In GNAT GPL-2006, 'pragma No_Run_Time' is listed as 'obsolescent'; you
> are supposed to use a "configurable run-time" instead. I'm not sure how
> that impacts this issue.

Perhaps even better: Might that provide for the option to select
exception support w/o getting tasking etc.?


Regards -- Markus






^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27 16:59             ` Stephen Leake
@ 2007-01-27 20:40               ` Markus E Leypold
  2007-01-27 21:19                 ` Markus E Leypold
  2007-01-29  8:56                 ` Maciej Sobczak
  0 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-27 20:40 UTC (permalink / raw)



Stephen Leake <stephen_leake@stephe-leake.org> writes:

> Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>
>> Given, that I recently found in 3.15p (which is not the newest one, I know):
>>
>>
>>   - Really bad bugs handling Read and Write of discriminated records,
>>
>>   - Race conditions in the runtime system when trying to catch
>>     interrupts,
>>
>>   - No way to catch the console break under Windows with the interrupt
>>     mechanism (i.e. a design error in my opinion),
>>
>>
>> I wonder whether GNAT was good even in, say 2002 (or whatever was
>> 3.15p's release date).
>
> It was good, in my opinion; at that time, I was far more productive
> writing real applications using GNAT 3.15p than with Borland C++.
>
>> And all those problems are really expensive to circumvent (partly
>> because the runtime system insists on fiddling with the signal
>> handlers).
>
> But to be fair, you have to say how easy the solution was in Ada, vs
> the solution to the same problem in some other language.

Unfortunately it has not been the case that Ada was cheaper here:

 - Read/Write Bug (see below).

 - Races with interrupts / catching break -- well, catching INT or
   TERM in C, or console breaks, is really dead easy. With GNAT you
   first have to remove all interrupt handling, then re-attach the
   interrupts, which introduces a race at the beginning of the program
   (and makes the program freeze -- aborting properly at that point
   would be OK, but a freeze is not). And catching the Windows console
   break in C just takes a C procedure call. Since in the GNAT RTS it's
   not mapped to an interrupt (a design error, IMHO), you have to write
   a C stub and do some magic there (see, C again, only more effort).


> What is the equivalent of Discriminated_Record'Write in C? The concept
> doesn't even exist!

Partly the circumvention was not to use discriminated records together
with controlled types, meaning that instead of having a string in a
discriminated part, the string was set to empty when it would have been
hidden by the discriminant. I could have done that in C. The other
part was to write 'Write myself and do the right thing. I could also
have done the same in C.

And C has union types. Of course they are not type safe (as C always
is), but the same effect can be achieved by writing a proper
handling procedure that depends on some discriminating field.

Look I do not want to promote C (far from it). And I'm not talking
about embedded programming or aerospace, but end user PC
applications. So in this not uncommon case / market my hypothesis is
(has been now for some time) that the ecological niche for Ada has
been closed by the development of languages and tools during the last
20 years:

  - Some low level stuff is still better done in C, since the
    operating systems and their interface libraries are written in C
    and often fit better to that language (I've the impression that
    there are some problems with the GNAT tasking runtime, since it
    introduces a new layer above the host operating system's
    philosophy).


  - Other languages are also type safe and have become really
    fast. They have a more "expressive" and powerful type system (I'm
    basically talking about the Hindley-Milner type system here and
    OCaml or SML, not about Java, mind you. If I look at the new Java
    generics (there's a paper by Wadler about it) and compare the new
    subtyping polymorphism there with the Hindley-Milner type system,
    I just begin to see how impossible it is to write really useful
    container libraries without polymorphism of that kind. No, Ada
    doesn't have it and C doesn't have it. And the absence of useful
    generics and the absolute impossibility to get those with
    preprocessing in C is in my eyes a much more important argument
    against C as an application programming language than the buffer
    overflow problems. And yes, that applies to Java up to 1.4 also).

  - Other languages have garbage collection.

  - Even C has acquired a number of new tools (splint, valgrind, C
    Cured) that make reasonably reliable programming (we are not
    talking about autopilots and not about rockets here, at least I
    am not!) in C much more feasible.


  - The vendor situation ...

Overall I'm haunted by the impression that C + some high level
language of your choice with a proper FFI makes more sense for day to
day development of (a) complete PC applications, (b) Web software and
(c) small tools (say: "scripting").

I'm not alone in that, AFAIK. But note: I don't say this to
disparage Ada. Ada, to me, is some kind of super-Pascal. But for
historical reasons that has not played out (between Turbo Pascal being
traditional on micros, having the community Ada didn't have at that
time, and the Ada compilers coming just a bit too late), and now the
situation has changed. The time of the Pascals is over. Their "mind
share" has been swallowed by Python, OCaml and Java, depending on
which paradigm of programming those people adhere to.
 
If you look at the number of books sold by O'Reilly, the big
contender is not C (and not C++ anymore). They are Java (basically for
people with a C++ mind who have seen the light and don't want to do
manual memory management any more), Python, Perl and Ruby (no pointers, no
types, but GC) and of course C#.

The trend I see, is that GC is a must, clumsy pointer handling is out
and types are an option for those that can understand them.

I admit I'm not completely sure how all that fits together. But the
perception that it is C vs. Ada is IMHO wrong. That particular fight
was -- in a sense -- already lost when Turbo Pascal and Modula lost
out against C. (Examining that part of history should answer the OP's
question.) Perhaps the answer in general is that unreliability
doesn't matter so much in most software (MS Office, e.g., has become
pretty stable despite being written in C, AFAIK), since it is not
embedded and not in aerospace.

Regards -- Markus










^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27 20:40               ` Markus E Leypold
@ 2007-01-27 21:19                 ` Markus E Leypold
  2007-01-28  8:44                   ` Ray Blaak
  2007-01-29  8:56                 ` Maciej Sobczak
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-27 21:19 UTC (permalink / raw)



Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> I admit I'm not completely sure how all that fits together. But the
> perception that it is C vs. Ada is IMHO wrong. That particular fight

To complement that: My impression (which I've already expressed in one
of those mails that went into the black hole) is that a number of
protagonists arguing for Ada here in this thread argue as if Ada were
the famous silver bullet of software engineering.

But as we all know (or should), that is not true:

 - There is no silver bullet.

 - The development process has a lot of influence on the quality of
   code produced.

 - You cannot just compare languages: You need to compare the language
   in the context of the existing culture / community, the market of
   tools provided for that language, and the quality of those tools.

 - There is no way to get really high quality code w/o a formal code
   review.


So now that the pressure is off to take Ada under every circumstance,
and we have recognized that Ada is just one of many factors
contributing to the quality and competitiveness of a project, we are
perhaps open to seeing why some projects might decide (or have decided)
against Ada -- or for that matter against any "exotic" language -- for
purely economic reasons.

Just one more consideration: It might be true that Ada avoids costs in
the long run. This doesn't help if the project has a limited budget
and the primary consideration needs to be getting the initial up-front
costs (tools, training) below a certain limit.

We have two scenarios:

  1. Aim for high quality, spend a lot of money at the beginning for tools and
     training, but after longish preparations bring a product onto the
     market which has hardly any bugs and doesn't cost much to maintain.

    Now you can have 

       (a) Success: All customers buy your product, install it and
           finally forget you, since the product works flawlessly.

       (b) Failure: Product doesn't fly. The users don't like
           it. You've lost lots of money.


  2. Aim for acceptable quality (however low that is). Spend some money
     for tools and hire programmers for a mainstream language. Some are
     not really good and have to be trained, but you can either fire
     them in time or train them. At least you don't have to train them
     much. Then push the product to the market as fast as possible.

   Now you can have 

       (a) Failure: You lost money, but not as much as in 1.b.

       (b) Success: Customers like your product. Unfortunately they
           have to patch it now and then, but that makes you look good
           actually: You provide a service for your customers and they
           remember you because they have an on-going relation to your
           company.

           Eventually they buy / have to buy new versions, and so become
           customers a second time (if you don't annoy them too much --
           that is just the definition of "acceptable quality").

           Of course the overall costs are higher, they have to be
           financed from the sales of the software. But that is money
           you now have.


And now pretend you're an investor: Which approach would you like
best, (1) or (2)? Note that success is mostly dependent on whether
the customer likes your product or not -- basically the same with
acceptable and with outstanding code quality ("likes" means: makes a
good impression at first look). So in (1) the risk*money is higher than
in (2).

Furthermore, consider that you don't have the money to pull through
(1). You can only go for (2) now. That of course costs in the long run,
but then you spend money later that you will have later, and don't try
to spend money now that you don't have (and perhaps won't get from an
investor).

I think that is basically what has been happening in the PC software
industry for the last 20 years. Many software engineers cannot
understand why one would not ensure quality as early as possible in
the development process, since fixing bugs later costs much more than
fixing them early. But this way of looking at things assumes that the
product succeeds. If you think in terms of risk, or sometimes only in
terms of liquid assets, the perspective changes.

No, I certainly don't like it very much. But I suggest one should take
these factors into consideration before wondering why Ada is not more
popular than C and attributing everything to the presumed fact that
the people who made the other decision are just morons.

 - There are other languages than C around. The success of C has been
   largely the result of historical circumstances.

 - It is not technical merit alone on which languages succeed (whether
   generally or in single projects).

Hope this text isn't too disjoint.


Regards -- Markus















^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27  9:52       ` Markus E Leypold
@ 2007-01-27 22:01         ` Charles D Hixson
  2007-01-27 23:24           ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Charles D Hixson @ 2007-01-27 22:01 UTC (permalink / raw)


Markus E Leypold wrote:
> Charles D Hixson <charleshixsn@earthlink.net> writes:
> 
>> goes on.  I, personally, think that one factor IS the lack of a
>> standard garbage collector.  Related to this is the awkwardness of
>> dealing with strings of varying lengths.
> 
> Actually that is rather good compared to C or C++. The different
> string packages make it possible to often use stack-allocated storage
> only, or to get the comfort of unbounded strings. After a while I really
> started to like this.
> 
Note that "After a while"?  The first hour, the first day, the 
first week, and the first month are the most important for 
forming lasting impressions.  I still remember the extreme 
frustration I experienced the first time I tried to print a 
string...it was an unbounded string, and I hadn't used 
Ada.Text_IO.Unbounded...and I didn't have anyone I could ask 
"What does that silly error message MEAN!?!?"

It's easier to do simple things in Fortran, C, Pascal, Modula 
II, PL/I or even Snobol.  Oh, yes, and BASIC, too.  (The other 
current contenders weren't around then, I never learned Cobol, 
and it isn't easier in any assembler I ever learned.)
>...
>> only Ada and Algol were frightening as well as lust provoking.
> 
> :-) Nice account.
> 
> Regards -- Markus
> 
Thanks



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27 22:01         ` Charles D Hixson
@ 2007-01-27 23:24           ` Markus E Leypold
  2007-01-28  9:14             ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-27 23:24 UTC (permalink / raw)




Charles D Hixson <charleshixsn@earthlink.net> writes:

> Markus E Leypold wrote:
>> Charles D Hixson <charleshixsn@earthlink.net> writes:
>>
>>> goes on.  I, personally, think that one factor IS the lack of a
>>> standard garbage collector.  Related to this is the awkwardness of
>>> dealing with strings of varying lengths.
>> Actually that is rather good compared to C or C++. The different
>> string packages make it possible to often use stack-allocated storage
>> only, or to get the comfort of unbounded strings. After a while I really
>> started to like this.
>>
> Note that "After a while"?  

Yes. I couldn't decide at first when it was better to use Fixed and when
Unbounded strings. The problem was aggravated by my attempts to stay
minimalist and use fixed strings wherever possible -- and also where
impossible. The insight that some algorithms are better not performed
with bounded buffers took a while to take hold.


> The first hour, the first day, the first week, and the first month
> are the most important for forming lasting impressions.

Yes, but it's not the "impressions" that count. I came from a Pascal
background (after learning programming in BASIC and assembler) and
looking at Ada was love at first sight -- as far as languages of
that family go. I often thought: Wow, there is a design decision where
everything was done right after a lot of thinking, and not decided in
an ad hoc fashion.

I hope I'm not limited to my first impressions, though.

> I still remember the extreme frustration I experienced the first
> time I tried to print a string...it was an unbounded string, and I
> hadn't used Ada.Text_IO.Unbounded...and I didn't have anyone I could
> ask "What does that silly error message MEAN!?!?"

 :-). Fun, yes. The error messages of GNAT (e.g.) are sometimes less
 than helpful. Fortunately I always had my copy of Barnes' book with
 me, and from C/C++/SML etc. I learned to compile often and morph
 programs through syntactically correct intermediate stages. So that
 couldn't trip me up too much :-).

> It's easier to do simple things in Fortran, C, Pascal, Modula II, PL/I
> or even Snobol.  Oh, yes, and BASIC, too.  (The other current

It's easier to do simple things with languages that have lists in
them, e.g. Lisp or Scheme (or Python for today's programmers). No, I
disagree: It's not even easy to do simple things in C: Every time I'm
astounded by the contortions I have to go through for anything involving
strings of varying length (OK, I can work in a large statically
allocated buffer, but this simply stinks, since it imposes arbitrary
limits). And very soon the wish comes up not to have to write out
specialized list processing code every time for 'list of ints', 'list
of floats', 'list of strings', 'list of lists of strings', 'list of
trees' etc. -- and then you start to suffer and it never ends. There is
no way to write generics in C, to fake them, or even to graft them onto
the language as an afterthought using a precompiler or a
preprocessor (I'm perhaps exaggerating here, but the pain is there
nonetheless. I've written those things (a generics "expander" for C
:-) and I think/hope I know what I'm talking about here).

> contenders weren't around then, I never learned Cobol, and it isn't
> easier in any assembler I ever learned.)

Well, that's what all my recent posts to this group try to assert: That
there are languages beyond assembler (including C) and the Pascalish
family (including Ada). Of course I'm posting to c.l.a., so I know that
this might develop into flame bait. But it wasn't me who started bashing
C (it was J R Carter :-), and when we start to discuss the respective
merits and demerits of various languages, then we can just as well go the
whole way and accept the fact that development has not stopped with
imperative compiled languages. I'm also pining for the days when I
almost understood everything the compiler did and there was no VM and
no JIT and no whatever-optimization. If you're a control freak like
me, you just feel better if you understand everything at least in
principle and can say things like "this statement/call expands to the
following instruction sequence but ..., so if you compare you can ..."
and if you can -- in principle -- recreate your own tools (which is of
course an illusion).

But those days are long gone and, if I think about it for a moment,
good riddance. Give me a modern functional language with lots of
complicated types and catamorphisms every day. I finally want to get
things done -- and I suggest comparing e.g. haskell.org or Jon
Harrop's demo source for OCaml with the list of "free projects" on the
ACT site (or wherever ...), and moreover the scope of ambition of
those projects, and I'm starting to anticipate what an expressive
language can be.

Let me put it like that: Trees and graphs are fundamental models in
computer science, mathematics and finally document processing (XML
...). An expressive language supports manipulating those easily,
without much ado, without wrapping objects or iterators or factory
patterns around a simple concept, without the need for explicit memory
management, preferably with recursive functions, which are the natural
form of expressing a transformation of those structures.

How does Ada stand there? How does C? Both bad. Fixing the argument
(Why/where does Ada fail?) on C vs. Ada misses an important
point. Both languages are relatively good in shoveling bits from one
place (bus, memory, ports, interfaces) to another and doing not too
complicated things to those bits.  But that is not where the really
interesting things happen these days.  Say, for example, "scene
graph". Can you feel the complexity :-)? Do you want to maintain it
with explicit pointers and memory management in an imperative fashion?
I don't.

But whatever - YMMV (that applies to everyone on c.l.a. :-)

Regards -- Markus













^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27 13:06         ` Gautier
  2007-01-27 16:28           ` Ludovic Brenta
@ 2007-01-28  0:55           ` Charles D Hixson
  2007-01-28  1:18             ` Ludovic Brenta
                               ` (2 more replies)
  1 sibling, 3 replies; 397+ messages in thread
From: Charles D Hixson @ 2007-01-28  0:55 UTC (permalink / raw)


Gautier wrote:
> Charles D Hixson:
> 
>> Ada is quite expressive, but it can be very clumsy when you want to 
>> write a flexible routine.  You can do it, and you can do it with much 
>> greater safety, but it takes more work.
>>
>> Just, for instance, look at the Ada version of:
>> #include  <stdio.h>
>> void main()  {  print ("Hello", " ", "World!");    exit(0);  }
> 
> with Ada.Text_IO;
> procedure pied is begin Ada.Text_IO.Put("Hello" & " " & "World!"); end;
> 
> Is it so clumsy ?
> ______________________________________________________________
> Gautier         -- http://www.mysunrise.ch/users/gdm/index.htm
> Ada programming -- http://www.mysunrise.ch/users/gdm/gsoft.htm
> 
> NB: For a direct answer, e-mail address on the Web site!
Sorry, I knew I shouldn't have used that example.  (Of course, 
maybe it's the texts I'm looking at for Ada...).

But I was intentionally NOT concatenating the strings. 
Perhaps if the central string were a number?  or an unbounded 
string instead of a fixed string?  Or perhaps my concerns are 
those of a neophyte...but I've got around 8 texts, and they 
all use that verbose form...Thus:

with Ada.Text_IO;         use Ada.Text_IO;
with Ada.Integer_Text_IO; use Ada.Integer_Text_IO;  -- needed for Put (3)
procedure pied is
begin
   Put ("Hello ");
   Put (3);
   Put ("World!");
   New_Line;
end pied;

I acknowledge that this vertical separation is optional ... 
but doesn't the Put (3) need to be a separate print statement?
(And now that my C is coming back to me a bit, the C print 
statement should have been:
printf("Hello, %d %s\n", 3, "World");

As shown by the silly mistake that I made in C, it's not my 
favorite language.  But when I compare a book on expert 
systems written in C and one written in Ada (well, Ada83), the 
book in Ada is both thicker and has more pages devoted to code 
than the book in C, and the expert systems are approximately 
of equivalent power (i.e., toy systems).   Well, the type size 
is a trifle larger, and the paper a bit thicker...so the 
comparison isn't quite as straightforward as I'm making it 
out, but basically Ada83 appears to take approx 1.5 times as 
many lines to do the same thing.  (I didn't count the lines. 
It's a rough estimate.  Say somewhere between 1.1 and 1.9 
times as many lines.)  Possibly this is a comment on the 
skills of the authors, but the number of extensive comparisons 
I've encountered is very limited.  (Or maybe it's a comment on 
Ada83, and doesn't apply to Ada95.)



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28  0:55           ` Charles D Hixson
@ 2007-01-28  1:18             ` Ludovic Brenta
  2007-01-28 17:06             ` Jeffrey R. Carter
  2007-01-28 21:11             ` adaworks
  2 siblings, 0 replies; 397+ messages in thread
From: Ludovic Brenta @ 2007-01-28  1:18 UTC (permalink / raw)


Charles D Hixson writes:
> basically Ada83 appears to take approx 1.5 times as many lines to do
> the same thing.  (I didn't count the lines. It's a rough estimate.
> Say somewhere between 1.1 and 1.9 times as many lines.)  Possibly
> this is a comment on the skills of the authors, but the number of
> extensive comparisons I've encountered is very limited.  (Or maybe
> it's a comment on Ada83, and doesn't apply to Ada95.)

In my experience, the ratio is close to 1.0.  Ada's verbosity in some
areas (type declarations and explicit conversions, access types,
traditionally longer identifiers and keywords, etc.)  is compensated
for by more concise constructs in others like array slices, returning
objects of unconstrained types, generics, overloading, implicit rather
than explicit run-time checks, etc., so it all evens out.

Ada 95's main feature, object-oriented programming, does not have a
direct equivalent in C; so writing an object-oriented program in C,
complete with dynamic dispatching and run-time type identification,
would be much more verbose than in Ada.  For example, GTK+ and GNOME
are written in pure C but are object-oriented, and _quite_ verbose.
The same would apply to tasking and generics.

Maybe your observation is true for smaller programs.  It may be that
for very large programs, Ada is actually more concise than C...

-- 
Ludovic Brenta.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27 21:19                 ` Markus E Leypold
@ 2007-01-28  8:44                   ` Ray Blaak
  0 siblings, 0 replies; 397+ messages in thread
From: Ray Blaak @ 2007-01-28  8:44 UTC (permalink / raw)


Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
> No, I certainly don't like it very much. But I suggest one should take
> these factors into consideration before wondering why Ada is not more
> popular than C and attributing everything to the presumed fact that
> the people who made the other decision are just morons.
> 
>  - There are other languages than C around. The success of C has been
>    largely the result of historical circumstances.
> 
>  - It is not technical merit alone on which languages succeed (whether
>    generally or in single projects).
> 
> Hope this text isn't too disjoint.

Excellent actually. It is always good to be reminded that the real reasons
things succeed or fail often (usually?) have nothing to do with sound logical
principles, but instead everything to do with those confusing irrational
behaviours people tend to engage in: politics, emotions, first impressions,
the insanity of the marketplace in general.

It is not enough or even economical to be "right".

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27 23:24           ` Markus E Leypold
@ 2007-01-28  9:14             ` Dmitry A. Kazakov
  2007-01-28 15:06               ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-01-28  9:14 UTC (permalink / raw)


On Sun, 28 Jan 2007 00:24:27 +0100, Markus E Leypold wrote:

> Charles D Hixson <charleshixsn@earthlink.net> writes:

>> It's easier to do simple things in Fortran, C, Pascal, Modula II, PL/I
>> or even Snobol.  Oh, yes, and BASIC, too.  (The other current
> 
> It's easier to do simple things with languages that have lists in
> them, e.g. Lisp or Scheme (or Python for today's programmers). No, I
> disagree: It's not even easy to do simple things in C: Every time I'm
> astounded by the contortions I have to go through for anything involving
> strings of varying length (OK, I can work in a large statically
> allocated buffer, but this simply stinks, since it imposes arbitrary
> limits). And very soon the wish comes up not to have to write out
> specialized list processing code every time for 'list of ints', 'list
> of floats', 'list of strings', 'list of lists of strings', 'list of
> trees' etc. -- and then you start to suffer and it never ends. There is
> no way to write generics in C, to fake them, or even to graft them onto
> the language as an afterthought using a precompiler or a
> preprocessor (I'm perhaps exaggerating here, but the pain is there
> nonetheless. I've written those things (a generics "expander" for C
> :-) and I think/hope I know what I'm talking about here).

Generics is a wrong answer and always was. As are the built-in lists you
are praising, because what about trees of strings, trees of lists, etc.?
You cannot build every and each type of container in.

The right answer would be a more powerful type system than Ada presently has.
In my view there are three great innovations Ada made, which weren't
explored at full:

1. Constrained subtypes (discriminants)
2. Implementation inheritance without values (type N is new T;)
3. Typed classes (T /= T'Class)

P.S. All strings have fixed length. It is just so that you might not know
the length at some point... (:-))

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27  8:38         ` Dmitry A. Kazakov
@ 2007-01-28 12:11           ` Michael Bode
  2007-01-28 15:20             ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Michael Bode @ 2007-01-28 12:11 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> The example is wrong C, but the point is that such examples show little if
> anything. Who is designing console applications today? What about "Hello
> World" in X11, Windows API, GTK+ or similar with a requirement of some
> definite look-and-feel? If we looked at C, Ada, Python code of that, we
> would find all them far from being expressive.

I'd say such micro examples don't show anything even if you make them
GUI examples. A graphical "hello world" in Tcl/Tk would be:

pack [label .l -text "Hello World!"]

So why isn't  Tcl/Tk the most popular language today?

One hint as to why Ada isn't popular might be my own case. I'm an
autodidact in programming (I'd never dare to call myself a software
engineer). I learned programming at the age of 14 with BASIC; later I
discovered Turbo Pascal and liked it much better (I had to unlearn
some quite bad BASIC habits).

When I switched from DOS to OS/2 I found out what's so bad about
vendor lock-in: no Turbo Pascal for OS/2. So I looked for a better
standardized language and changed to C/C++, but never liked it
much. I found out about Ada only 3 years ago on Usenet. Finally
there was a (Turbo) Pascal done right. Had there only been more
advertising for Ada95 10 years ago, it could have saved me years of
C.

-- 
Michael Bode



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28  9:14             ` Dmitry A. Kazakov
@ 2007-01-28 15:06               ` Markus E Leypold
  2007-01-29 14:37                 ` Dmitry A. Kazakov
  2007-01-29 16:23                 ` How come Ada isn't more popular? Georg Bauhaus
  0 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-28 15:06 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Sun, 28 Jan 2007 00:24:27 +0100, Markus E Leypold wrote:
>
>> Charles D Hixson <charleshixsn@earthlink.net> writes:
>
>>> It's easier to do simple things in Fortran, C, Pascal, Modula II, PL/I
>>> or even Snobol.  Oh, yes, and BASIC, too.  (The other current
>> 
>> It's easier to do simple things with languages that have lists in
>> them, e.g. Lisp or Scheme (or Python for today's programmers). No, I
>> disagree: It's not even easy to do simple things in C: Every time I'm
>> astounded by the contortions I have to go through for anything involving
>> strings of varying length (OK, I can work in a large statically
>> allocated buffer, but this simply stinks, since it imposes arbitrary
>> limits). And very soon the wish comes up not to have to write out
>> specialized list processing code every time for 'list of ints', 'list
>> of floats', 'list of strings', 'list of lists of strings', 'list of
>> trees' etc. -- and then you start to suffer and it never ends. There is
>> no way to write generics in C, to fake them, or even to graft them onto
>> the language as an afterthought using a precompiler or a
>> preprocessor (I'm perhaps exaggerating here, but the pain is there
>> nonetheless. I've written those things (a generics "expander" for C
>> :-) and I think/hope I know what I'm talking about here).
>
> Generics is a wrong answer and always was. As are the built-in lists you
> are praising, because what about trees of strings, trees of lists, etc.?
> You cannot build every and each type of container in.

You missed my point. :-). A language with a Hindley-Milner type system
has a 'tree of something' type where the something can be anything in
every given instance of usage.

So you get 'tree of foo' from it and 'tree of bar' when you use it
like this, but no generics are instantiated. There are trade-offs,
of course (like that you probably can't design such a language
without GC and without losing a certain degree of control over the
representation of types), but since C D Hixson and I have been
talking about usefulness in a general sense, that doesn't count
here. A useful language needs to have lists, you must be able to write
recursive functions on lists, preferably in a functional way, and
you should not need an explicit instantiation declaration for every
instance (like 'of float', 'of int', 'of ...').

Languages of the Algol-FORTRAN-C-Pascal-Ada group are all far from
that ideal. Since a lot of programming these days is general list
manipulation, everyday jobs become painful. 

Has anybody here ever wondered about the popularity of "scripting"
languages like Perl, PHP, Python and so on? As I see it, the presence of
lists and hashtables/dictionaries in those languages, together with the
absence of manual memory management, is a factor in this popularity. Which,
I hope, if it doesn't prove my point, at least makes it plausible. :-).


>
> The right answer should be a more powerful type system than Ada presently has.

Yes.

> In my view there are three great innovations Ada made, which weren't
> explored at full:
>
> 1. Constrained subtypes (discriminants)
> 2. Implementation inheritance without values (type N is new T;)
> 3. Typed classes (T /= T'Class)


Here two things are missing: 

  - parameterized types (see Java generics and the ML type system; the
    former just borrows from the latter).

  - Separation of implementation and type, or, to put it differently,
    of inheritance and subtyping compatibility. See the OCaml type system.

    I admit the contracts are weaker when one is allowed to instantiate a
    generic with any package that has the "right type signature" as
    parameter, instead of requiring an instance of another specific
    generic.

    But what is absolutely annoying is that the compatibility of
    objects is determined by inheritance instead of by the type
    signature. This makes things like the factory pattern necessary,
    and it really doesn't work in the general case. (And yes, Java and C++
    share this defect.)

I suggest that anyone trying to improve on the type system of
languages of the Algol family should first acquire a knowledge of the
Hindley-Milner type system, then read the part about objects and
classes in the OCaml manual (this is an extension of Hindley-Milner) 

  http://caml.inria.fr/pub/docs/manual-ocaml/manual005.html

and finally read the tutorial on Java generics (introduced in 1.5):

  http://java.sun.com/j2se/1.5/pdf/generics-tutorial.pdf


To me those three together were an eye-opener. The Java generics
tutorial, in my eyes, documents two things: (a) what had been really
sorely missing from Java for 10 years and (b) that you can usefully
complement a type-safe Pascalish type system with subtyping and
parameterized types.
      
   
> P.S. All strings have fixed length. It is just so that you might not know
> the length at some point... (:-))

Ah, well, that's splitting hairs. We are talking about two different
lengths (of C strings) here: one is the allocated storage, the
other the distance from the start to the first occurrence of the '\0'
terminator. Since strlen() returns the latter ...

The problem I've been talking about above is, of course, how to manage
growing strings that might outgrow the allocated storage.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 12:11           ` Michael Bode
@ 2007-01-28 15:20             ` Markus E Leypold
  2007-01-29  9:44               ` Martin Krischik
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-28 15:20 UTC (permalink / raw)




Michael Bode <m.g.bode@web.de> writes:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>
>> The example is wrong C, but the point is that such examples show little if
>> anything. Who is designing console applications today? What about "Hello
>> World" in X11, Windows API, GTK+ or similar with a requirement of some
>> definite look-and-feel? If we looked at C, Ada, Python code of that, we
>> would find all them far from being expressive.
>
> I'd say such micro examples don't show anything even if you make them
> GUI examples. A graphical "hello world" in Tcl/Tk would be:
>
> pack [label .l -text "Hello World!"]
>
> So why isn't  Tcl/Tk the most popular language today?
>
> One hint on why Ada isn't popular might be my own case. I'm an
> autodidact in programming (I'd never dare to call me software
> engineer). I learned programming at the age of 14 with BASIC, later I
> discovered Turbo-Pascal and liked it much better (I had to unlearn
> some quite bad BASIC habits).
>
> When I switched from DOS to OS/2 I found out what's so bad about
> vendor lock in: no Turbo-Pascal for OS/2. So I looked for a better
> standardized language and changed to C/C++ but never liked it
> much. 

Almost sounds like my biography :-). I'd bet you're +/- 3 years my age
(which is not going to be disclosed here ...).

> I've found out about Ada only 3 years ago on Usenet. Finally
> there was a (Turbo-)Pascal done right. Had there only been more
> advertising for Ada95 10 years ago that could have saved me years of
> C.

One factor that hugely contributed to the success of Borland's Turbo
line of products (at least in Germany) was the excellently written
manuals, which documented everything -- from the language itself and the
library to the stack layout and the interfacing with C and assembler. If
documents like these had been available as a boxed set for, say,
GNAT in, say, 1990 or even 1995, then the world would perhaps look
different today. 

As it is, as a hobbyist, you still have to collect stuff from all over
the internet (libraries, documents and some stuff is really not
documented at all). And I still think that making it easy for
beginners (people who learn their first language and don't know much
about operating systems, linking and how all that fits together) is
absolutely necessary to achieve popularity.

In some other post somebody asked whether it was a factor contributing
to C's popularity that C was the language in which major
operating systems were written. I don't think so. I suggest that
the availability of Quick C and Turbo C for 8/16-bit micros was
perhaps a large, if not the major, factor.

Regards -- Markus





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28  0:55           ` Charles D Hixson
  2007-01-28  1:18             ` Ludovic Brenta
@ 2007-01-28 17:06             ` Jeffrey R. Carter
  2007-01-28 21:11             ` adaworks
  2 siblings, 0 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-28 17:06 UTC (permalink / raw)


Charles D Hixson wrote:
> 
> As shown by the silly mistake that I made in C, it's not my favorite 
> language.  But when I compare a book on expert systems written in C and 
> one written in Ada (well, Ada83), the book in Ada is both thicker and 
> has more pages devoted to code than the book in C, and the expert 
> systems are approximately of equivalent power (i.e., toy systems).   
> Well, the type size is a trifle larger, and the paper a bit thicker...so 
> the comparison isn't quite as straightforward as I'm making it out, but 
> basically Ada83 appears to take approx 1.5 times as many lines to do the 
> same thing.  (I didn't count the lines. It's a rough estimate.  Say 
> somewhere between 1.1 and 1.9 times as many lines.)  Possibly this is a 
> comment on the skills of the authors, but the number of extensive 
> comparisons I've encountered is very limited.  (Or maybe it's a comment 
> on Ada83, and doesn't apply to Ada95.)

That's a good thing. Ada emphasizes ease of reading over ease of 
writing, since code is read far more often than it is written. C's 
emphasis is the opposite.

-- 
Jeff Carter
"All citizens will be required to change their underwear
every half hour. Underwear will be worn on the outside,
so we can check."
Bananas
29



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27 19:58               ` Markus E Leypold
@ 2007-01-28 17:12                 ` Ed Falis
  2007-01-28 18:38                   ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Ed Falis @ 2007-01-28 17:12 UTC (permalink / raw)


Markus E Leypold wrote:
..
> Perhaps even better: Might that provide for the option to select
> exception support w/o getting tasking etc.?

Check out pragma Restrictions in Annex H and in the GNAT RM.

- Ed



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 17:12                 ` Ed Falis
@ 2007-01-28 18:38                   ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-28 18:38 UTC (permalink / raw)



Ed Falis <falis@verizon.net> writes:

> Markus E Leypold wrote:
> ..
>> Perhaps even better: Might that provide for the option to select
>> exception support w/o getting tasking etc.?
>
> Check out pragma Restrictions in Annex H and in the GNAT RM.

Yes. Thanks. That's cool. 

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-25 21:42           ` Randy Brukardt
@ 2007-01-28 19:32             ` Gautier
  2007-01-30 19:41               ` tmoran
  0 siblings, 1 reply; 397+ messages in thread
From: Gautier @ 2007-01-28 19:32 UTC (permalink / raw)


Randy Brukardt:

> "Good" is highly subjective, and what you think of as "good" may differ
> widely from other people. I know Tom thinks Janus/Ada was pretty good, and
> long before 1995...
> 
> Your opinion of good may differ, depending on what you want to do.
> 
>                       Randy.

I could not agree more. One could add that the quality (however subjective) 
of a given piece of software fluctuates with the changes occurring around it. 
What was good long before 1995 was perhaps less good for making a popular 
Windows 95 + Ada 95 product, since that is the subject of this discussion 
thread. I'm not a customer of Janus/Ada but I have some reliable 3rd-party 
comments :-).
On the "-" side: some limitations and leftovers from the DOS & Ada 83 version 
(I was told that the type Integer is 16 bits?!), persistent doubts about 
future developments, strange shipping media (floppy disks).
On the "+" side: a full compilation system for Ada, from source to 
executable, with smart linking, whereas GNAT sometimes has trouble with the 
gnu machinery doing fuzzy things in the basement (e.g. sometimes absurd 
trace-backs...) and needs a longish and unreliable workaround (gnatelim) to 
achieve what a smart linker does in no extra time.
Perhaps the way Janus/Ada (95) is presented on the RR Software site is also 
misleading and hides the real virtues of the product. It looks like an Ada 83 
compiler for DOS making a tentative trip towards Ada 95 (an "extension") and 
towards other operating systems.
Anyway, it is never too late, and Janus/Ada certainly has big potential that 
fixing relatively small details can release. A version of the CLAW demo that 
compiles with GNAT 3.15p (2002!) or later wouldn't hurt either, IMHO, since 
that's the piece of RR Software people can try.
______________________________________________________________
Gautier         -- http://www.mysunrise.ch/users/gdm/index.htm
Ada programming -- http://www.mysunrise.ch/users/gdm/gsoft.htm



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-26  4:23                 ` Jeffrey R. Carter
  2007-01-26 11:35                   ` Markus E Leypold
@ 2007-01-28 20:32                   ` adaworks
  2007-01-28 21:12                     ` Cesar Rabak
                                       ` (2 more replies)
  1 sibling, 3 replies; 397+ messages in thread
From: adaworks @ 2007-01-28 20:32 UTC (permalink / raw)



"Jeffrey R. Carter" <jrcarter@acm.org> wrote in message 
news:Ymfuh.321356$FQ1.7034@attbi_s71...
>
> Don't you get logical on me here. I maintain that it is impossible for humans 
> to write safe C. For proof by blatant assertion, see my previous post.
>
I would take issue with your blanket statement even though I am in
general agreement with the foundation that motivates its expression.

Every programming language is error-prone at different levels, including
Ada.   The question is how error-prone a language might be.  On a continuum
of more error-prone to less, Ada is at the end of the scale for less error-prone
_____________________________________________________________
More 
Less
Error-prone            ------------------> 
Error-prone

Assembler       C                           C++           Java    C#         Ada 
SPARK
                    COBOL-68       COBOL-85   Objective-C        Eiffel
                                   Early PL/I                        Recent PL/I
______________________________________________________________

A more detailed continuum could be developed that covers more languages.  I only
show a small representation here.

I show Eiffel and Ada at the same place on the continuum.   Many would argue
with this.   I don't think that anyone who knows both Ada and C++ really well
would argue with their relative positioning.   Java is more error-prone than Ada
due to some issues with its type model and the preservation of some dangerous
constructs.  C# design improves the dependability of some of those constructs.

As I have often noted in this forum and elsewhere, I often wonder why someone
who chooses to develop software in a language that is error-prone would expect
a software system that is error-free.   Although a language, by itself, cannot
guarantee a system that is error-free, one would expect an outcome that requires
less debugging time and would have fewer defects than with a language that is
error-prone.

While C++ may have some capabilities not found in other languages, it is still
a poor choice for software where dependability is important.   It is not a 
language
well-suited to safety-critical software.   On the other hand, I am impressed with
a lot of the design elements of C#.   It still falls short of being ideal for 
safety-critical
software, but it is an improvement over C++ and Java.

Richard Riehle






 





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-26 23:04                       ` Markus E Leypold
  2007-01-27 19:57                         ` Frank J. Lhota
@ 2007-01-28 20:43                         ` adaworks
  2007-01-28 22:57                           ` Markus E Leypold
  2007-01-29  1:04                           ` Jeffrey R. Carter
  1 sibling, 2 replies; 397+ messages in thread
From: adaworks @ 2007-01-28 20:43 UTC (permalink / raw)



"Markus E Leypold" <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> 
wrote in message >
>
> I'm a bit surprised, actually. I would have thought Pascal simpler to
> learn.
>
It depends on what you want students to learn.   At an equivalent
level to Pascal, Ada is easier to learn primarily due to its improved
consistency of syntax.   At the more advanced level, it is more difficult
because of the difference in design issues.

Ease of learning should not be an important factor when considering the
adoption of a programming language.   Python is amazingly easy to
learn.  So is Ruby.  Both are excellent languages. I like them both. Neither
is the right choice for safety-critical embedded systems.

C++ is not easy to learn.   To do C++ safely is really hard to learn. In
this case, it is easier to learn safe programming in Ada than in
C++.

Most of the time C++ and Ada are both taught badly.    That can make
them hard to learn.   The most frequently overlooked dimension of Ada
that is misunderstood by those who try to teach it is the scope and
visibility rules.    Yet that, not strong typing, is at the heart of the design
of the language.   It is sad that most instructors who try to teach Ada
don't understand those rules well enough to show how important they
are to being a good Ada programmer/designer.

With Ada, once someone understands Chapter Eight of the ALRM, the
rest of the language issues fall into place quite easily.

Richard Riehle





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28  0:55           ` Charles D Hixson
  2007-01-28  1:18             ` Ludovic Brenta
  2007-01-28 17:06             ` Jeffrey R. Carter
@ 2007-01-28 21:11             ` adaworks
  2 siblings, 0 replies; 397+ messages in thread
From: adaworks @ 2007-01-28 21:11 UTC (permalink / raw)



"Charles D Hixson" <charleshixsn@earthlink.net> wrote in message 
news:zwSuh.15379>


>  the book in Ada is both thicker and has more pages devoted to code than the 
> book in C, and the expert systems are approximately of equivalent power (i.e., 
> toy systems).

Not an expert systems book, but a thin book on Ada that has
most of what a newbie needs:

           Ada Distilled

Download it free from www.adaic.com

Richard Riehle






^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 20:32                   ` adaworks
@ 2007-01-28 21:12                     ` Cesar Rabak
  2007-01-28 22:43                       ` Markus E Leypold
  2007-01-28 22:38                     ` Markus E Leypold
  2007-01-29  1:02                     ` Jeffrey R. Carter
  2 siblings, 1 reply; 397+ messages in thread
From: Cesar Rabak @ 2007-01-28 21:12 UTC (permalink / raw)


adaworks@sbcglobal.net escreveu:
> "Jeffrey R. Carter" <jrcarter@acm.org> wrote in message 
> news:Ymfuh.321356$FQ1.7034@attbi_s71...
>> Don't you get logical on me here. I maintain that it is impossible for humans 
>> to write safe C. For proof by blatant assertion, see my previous post.
>>
> I would take issue with your blanket statement even though I am in
> general agreement with the foundation that motivates its expression.
> 
> Every programming language is error-prone at different levels, including
> Ada.  

Nice contribution, Richard, but the line wrapping mangled the diagram 
you propose to become almost unintelligible...

Would you mind posting it with the scale top to bottom?

>  The question is how error-prone a language might be.  On a continuum
> of more error-prone to less, Ada is at the end of the scale for less error-prone
> _____________________________________________________________
> More 
> Less
> Error-prone            ------------------> 
> Error-prone
> 
> Assembler       C                           C++           Java    C#         Ada 
> SPARK
>                     COBOL-68       COBOL-85   Objective-C        Eiffel
>                                    Early PL/I                        Recent PL/I
> ______________________________________________________________
> 
> A more detailed continuum could be developed that covers more languages.  I only
> show a small representation here.

I think is already a nice account of languages for this discussion.

> 
> I show Eiffel and Ada at the same place on the continuum.   Many would argue
> with this. 

I don't so I'll follow on.

>  I don't think that anyone who knows both Ada and C++ really well
> would argue with their relative positioning.   Java is more error-prone than Ada
> due to some issues with its type model and the preservation of some dangerous
> constructs.  C# design improves the dependability of some of those constructs.

Yes, now if one dares to put this in a two-axis space (like the 
'quadrants' certain consulting groups love to publish) with error 
proneness (or its inverse, 'safety'[?]) against 'market push', we'll 
arrive at something with Java and C# in the safer and 'market supported' 
space, and then everything stays the same in the market :-(

> 
> As I have often noted in this forum and elsewhere, I often wonder why someone
> who chooses to develop software in a language that is error-prone would expect
> a software system that is error-free. 

Because:

>  Although a language, by itself, cannot
> guarantee a system that is error-free, one would expect an outcome that requires
> less debugging time and would have fewer defects than with a language that is
> error-prone.

present beliefs in the SW industry are that the process (CMM sense) is 
what makes SW more reliable or not.

Upper management is used to navigating in the muddy waters of imperfect 
systems: unreliable OSes, virus/worm/trojan threats, so coping with a 
'popular' language is part of the (usual) business.

Another thing we tend to forget about C is the formidable ecosystem 
that exists to help companies get it right: lint-like tools, memory 
checkers, etc.

> 
> While C++ may have some capabilities not found in other languages, it is still
> a poor choice for software where dependability is important.   It is not a 
> language
> well-suited to safety-critical software.   On the other hand, I am impressed with
> a lot of the design elements of C#.   It still falls short of being ideal for 
> safety-critical
> software, but it is an improvement over C++ and Java.

Food for thought:

Let's see: if we pick a recent project (the Next-Generation James Webb Space 
Telescope), what would be the language proposed by IBM?




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 20:32                   ` adaworks
  2007-01-28 21:12                     ` Cesar Rabak
@ 2007-01-28 22:38                     ` Markus E Leypold
  2007-01-29 16:16                       ` adaworks
  2007-01-29  1:02                     ` Jeffrey R. Carter
  2 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-28 22:38 UTC (permalink / raw)



<adaworks@sbcglobal.net> writes:

> As I have often noted in this forum and elsewhere, I often wonder why someone
> who chooses to develop software in a language that is error-prone would expect
> a software system that is error-free.   

But I think that "they" don't expect a software system that is error
free. Generally the probability of errors is just one factor
contributing to the overall cost of a project. Other factors are the
cost of having to re-implement already existing libraries or software,
the cost of having to maintain teams for more than one language,
support fees for tools, the cost of not having some tools for this
language, the cost of hiring for "unusual" languages, the cost of not
having garbage collection, etc.

I think that it is this point of view that makes decisions against Ada
for some other language at least understandable, even if you would
have decided differently. 

As I have said repeatedly (probably in those posts lost to the
black hole), the decision against Ada is perhaps something you can
lament, but to call it irrational is, IMHO, itself irrational. One has to
understand the motives of other people's decisions to be able to
influence them. Of course calling them just ill-informed (as some posts
in this thread do, not necessarily yours) helps one feel secure in one's
own position, but it doesn't help towards changing other people's
positions.

> Although a language, by itself, cannot guarantee a system that is
> error-free, one would expect an outcome that requires less debugging
> time and would have fewer defects than with a language that is
> error-prone.

Language is only one factor of many that influence the "bug" rate in
the final product. The bug rate, on the other side, is only one of many
factors that make up the overall cost.

I'm not surprised that "error proneness" doesn't prove to be the final
decisive argument in favor of some language in all, or even the
majority of, cases. 

It is just the economy of software development and the customers'
priorities 

   (hello, you people here using Word -- why don't you use TeX, which
   _definitely_ has fewer bugs? -- see what I mean?)

which decide
which kind of development is finally paid for. See the state of Ada as
a summary of the state of demand by the market for certain software
attributes.


> While C++ may have some capabilities not found in other languages, it is still
> a poor choice for software where dependability is important.   

Exactly: "Where dependability is important". Which is not the case in consumer
software or most embedded software (who cares if MP3 players or mobiles
crash now and then?) etc. 


> It is not a language well-suited to safety-critical software.  On
> the other hand, I am impressed with a lot of the design elements of
> C#.  It still fall short of being ideal for safety-critical
> software, but it is an improvement over C++ and Java.

Exactly. Which will make it (as much as I'm sorry to condone anything
Microsoft had their grubby hands in) probably one of the languages of
the future as far as consumer software, ticket vending machines, etc. are
concerned. 


Regards -- Markus


PS: You didn't, I think, answer my question, why you don't think Java is type safe ... ?









^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 21:12                     ` Cesar Rabak
@ 2007-01-28 22:43                       ` Markus E Leypold
  2007-01-29 22:40                         ` Cesar Rabak
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-28 22:43 UTC (permalink / raw)



Cesar Rabak <csrabak@yahoo.com.br> writes:

> present beliefs in the SW industry are that the process (CMM sense) is
> what makes SW more reliable or not.

Well, that is not completely wrong, you know. Though in practice
code reviews are rarely done, despite the fact that they
are known to be the best instrument for creating code of high quality.


> Upper management is used to navigating in the muddy waters of imperfect
> systems: unreliable OSes, virus/worm/trojan threats, so coping
> with a 'popular' language is part of the (usual) business.

Which just means they recognize the value of the "things that already
exist" as opposed to "the things that would yet have to be
written". A wise attitude indeed.

> Another thing we tend to forget about C is the formidable ecosystem
> that exists to help companies get it right: lint-like tools,
> memory checkers, etc.

Exactly my point. This mitigates the "C is utter s***" judgment a bit.


>> While C++ may have some capabilities not found in other languages,
>> it is still a poor choice for software where dependability is
>> important.  It is not a language well-suited to safety-critical
>> software.  On the other hand, I am impresed with a lot of the
>> design elements of C#.  It still fall short of being ideal for
>> safety-critical software, but it is an improvement over C++ and
>> Java.


> Food for thought:
>
> Let's see: if we pick a recent project (the Next-Generation James Webb
> Space Telescope), what would be the language proposed by IBM?

Perhaps not C#. What do you suggest?

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 20:43                         ` adaworks
@ 2007-01-28 22:57                           ` Markus E Leypold
  2007-01-29  1:04                           ` Jeffrey R. Carter
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-28 22:57 UTC (permalink / raw)



<adaworks@sbcglobal.net> writes:

> "Markus E Leypold" <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> 
> wrote in message >
>>
>> I'm a bit surprised, actually. I would have thought Pascal simpler to
>> learn.
>>

> It depends on what you want students to learn.   At an equivalent
> level to Pascal, Ada is easier to learn primarily due to its improved
> consistency of syntax.   

OK. I never had many issues with the Pascal syntax, but then my memory
might be failing too.

> At the more advanced level, it is more difficult
> because of the difference in design issues.

>
> Ease of learning should not be an important factor when considering the
> adoption of a programming language.   

It wasn't me that brought that topic into the discussion. I only
commented as an aside that this finding -- quoted by J R Carter --
surprises me.

> Python is amazingly easy to learn.  So is Ruby.  Both are excellent
> languages. I like them both. Neither is the right choice for
> safety-critical embedded systems.

No, certainly not. But my impression was not that we are discussing
"How come Ada isn't more popular for safety critical embedded system?"
which would be a really small scope indeed, but more "How come Ada
isn't more popular?". Add to that that Ada was originally
proposed/designed to be a general "systems programming language", so
restricting the discussion to "safety-critical embedded systems" would
seem, at least to me, to dodge the issue.

But never mind. If the community could please come forward and say:
Yes Ada is the best language ever (but we are talking ONLY about
safety critical systems), I'd be prepared to accept that statement
:-). 

But then understand 2 things:

 - We shouldn't teach general programming in a fringe language at
   University.

 - The guy asking for the Ada translation of a C algorithm wrote a
   book about "embedded programming". He didn't say "safety
   critical". So the mini-outrage that somebody daring to write about
   embedded programming doesn't know Ada was quite uncalled for, wasn't it?


:-). So what will it be ...?


> C++ is not easy to learn.   To do C++ safely is really hard to learn. In
> this case, it is easier to learn safe programming in Ada than in
> C++.

I fully agree with that sentiment (controlling memory is especially
difficult). But usually projects that use C++ for embedded programming
seem to use only a really limited subset, often down to using
namespaces just because they make modularization a bit easier.

> Most of the time C++ and Ada are both taught badly.    That can make

Both, indeed.

> them hard to learn.   The most frequently overlooked dimension of Ada
> that is misunderstood by those who try to teach it is the scope and
> visibility rules.    Yet that, not strong typing, is at the heart of the design
> of the language.   It is sad that most instructors who try to teach Ada
> don't understand those rules well enough to show how important they
> are to being a good Ada programmer/designer.
>
> With Ada, once someone understands Chapter Eight of the ALRM, the
> rest of the language issues fall into place quite easily.

Right.

Regards -- Markus (who probably still has not understood everything about Ada :-)



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 20:32                   ` adaworks
  2007-01-28 21:12                     ` Cesar Rabak
  2007-01-28 22:38                     ` Markus E Leypold
@ 2007-01-29  1:02                     ` Jeffrey R. Carter
  2007-01-30  0:21                       ` Randy Brukardt
  2 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-29  1:02 UTC (permalink / raw)


adaworks@sbcglobal.net wrote:
> 
> Every programming language is error-prone at different levels, including
> Ada.   The question is how error-prone a language might be.  On a continuum
> of more error-prone to less, Ada is at the end of the scale for less error-prone
> _____________________________________________________________
> More                                                     Less
> Error-prone        ------------------------------> Error-prone
>
> Assembler  C           C++         Java   C#    Ada    SPARK
>            COBOL-68    COBOL-85    Objective-C   Eiffel
>            Early PL/I                     Recent PL/I
> ______________________________________________________________

Nice concept. Not a very good presentation on my newsreader. Maybe a 
vertical orientation would be better.

-- 
Jeff Carter
"All citizens will be required to change their underwear
every half hour. Underwear will be worn on the outside,
so we can check."
Bananas
29



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 20:43                         ` adaworks
  2007-01-28 22:57                           ` Markus E Leypold
@ 2007-01-29  1:04                           ` Jeffrey R. Carter
  1 sibling, 0 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-01-29  1:04 UTC (permalink / raw)


adaworks@sbcglobal.net wrote:
>>
> It depends on what you want students to learn.   At an equivalent
> level to Pascal, Ada is easier to learn primarily due to its improved
> consistency of syntax.   At the more advanced level, it is more difficult
> because of the difference in design issues.

Right. When you get into ';' as statement separator, the need for 
begin-end blocks everywhere, the dangling-else problem, and probably a 
couple of others I don't remember right now, the "Pascal subset" of Ada 
is cleaner and easier to use right than Pascal.

-- 
Jeff Carter
"All citizens will be required to change their underwear
every half hour. Underwear will be worn on the outside,
so we can check."
Bananas
29



^ permalink raw reply	[flat|nested] 397+ messages in thread

* AW: AW: How come Ada isn't more popular?
  2007-01-26 11:52                 ` Markus E Leypold
@ 2007-01-29  6:16                   ` Grein, Christoph (Fa. ESG)
  2007-01-29 14:31                     ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Grein, Christoph (Fa. ESG) @ 2007-01-29  6:16 UTC (permalink / raw)
  To: comp.lang.ada

Markus E Leypold wrote:
>> Fact is: The students were not able to finish the job, so they were
>> given more and more source. And even with 60%, they were still not able.

> See my comment on the helpfullness of help in another articel. I've
> only been trying to put forward a general point of view on the idea
> that more help brings the students nearer to success in any case. 

You're absolutely correct on this: neither no help nor more help, up to 60%, enabled the students to finish their work in C.

On the other hand, using Ada they were able to finish. This is the very definite result of the study - contrary to what was expected by the instructor.

(If you had read it, you wouldn't be talking like that :-)



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27 20:40               ` Markus E Leypold
  2007-01-27 21:19                 ` Markus E Leypold
@ 2007-01-29  8:56                 ` Maciej Sobczak
  2007-01-29 14:21                   ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-01-29  8:56 UTC (permalink / raw)


Markus E Leypold wrote:

 > They are Java (basically for
> people with a C++ mind that have seen the light and don't want to do
> manual memory management any more

Sorry, but that's a misconception - I don't remember when I last 
messed with manual memory management in regular C++ code. 
I estimate that in my current programming I call delete (or free or 
whatever) once in 5-10 kLOC.
Is Java going to save me from this *nightmare*? Wow, I'm impressed.

> The trend I see, is that GC is a must, clumsy pointer handling is out
> and types are an option for those that can understand them.

Indeed, looks like everybody is going in that direction.


-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 15:20             ` Markus E Leypold
@ 2007-01-29  9:44               ` Martin Krischik
  0 siblings, 0 replies; 397+ messages in thread
From: Martin Krischik @ 2007-01-29  9:44 UTC (permalink / raw)


Markus E Leypold schrieb:

> As it is, as a hobbyist, you still have to collect stuff from all over
> the internet (libraries, documents and some stuff is really not
> documented at all). And I still think that making it easy for
> beginners (people who learn their first language and don't know much
> about operating systems, linking and how all that fits together) is
> absolutely necessary to achieve popularity.

Exactly the problem the GNU Ada project [1] set out to solve.

Reminder: The project is not just about providing a compiler! It is 
about providing a compiler and libraries.

Only, we are short of manpower.

Martin

[1] http://gnuada.sf.net



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-29  8:56                 ` Maciej Sobczak
@ 2007-01-29 14:21                   ` Markus E Leypold
  2007-01-31  9:23                     ` Maciej Sobczak
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-29 14:21 UTC (permalink / raw)




Maciej Sobczak <no.spam@no.spam.com> writes:

> Markus E Leypold wrote:
>
>  > They are Java (basically for
>> people with a C++ mind that have seen the light and don't want to do
>> manual memory management any more
>
> Sorry, but that's a misconception - I don't remember when I was the
> last time I was messing with manual memory management in a regular C++
> code. I estimate that in my current programming I call delete (or free
> or whatever) once in 5-10 kLOC.

OK. I'll have to reconsider this statement. I usually couldn't trim
down 'automatic' allocation to that extent, but that might have been
my application area. What I remember, though, is the difficulty of
recovering from exceptions in the presence of automatic (scope-bound)
memory management. (I hope I'm making sense here; if not, I'd really have
to go back to my C++ mind and notes and try to retrieve the right
vocabulary and reasoning and -- well -- I don't really want to have a
C++ discussion in c.l.a. :-). If we must, let's shift that to personal
mail or to another group ...).

> Is Java going to save me from this *nightmare*? Wow, I'm impressed.

Good for you, if it is not a nightmare. But from what I remember of
1997 and 1998 (when there still were problems with the STL,
exceptions and the string libraries in C++, when there was no
standard yet and Java was new), this was one of the motivations for
people shifting to Java (either from C++ or from C). The other
motivation was the "portable GUI", which, I think, mostly disappointed
the expectations.

Of course I might be wrong. This is just the impression I got "from
the trenches" and I might be missing a more global point of view. It
perhaps does not apply today, where C++ and the understanding of C++
have matured a bit (there is even an embedded subset of C++ which will
annoy folks here no end :-).


>> The trend I see, is that GC is a must, clumsy pointer handling is out
>> and types are an option for those that can understand them.
>
> Indeed, looks like everybody is going in that direction.

And certainly. Why should advances in hardware only buy more spiffy
GUIs and not something to ease the everyday pain for the everyday
software developer :-).

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: AW: AW: How come Ada isn't more popular?
  2007-01-29  6:16                   ` AW: " Grein, Christoph (Fa. ESG)
@ 2007-01-29 14:31                     ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-29 14:31 UTC (permalink / raw)




"Grein, Christoph (Fa. ESG)" <Christoph.Grein@eurocopter.com> writes:

> Markus E Leypold wrote:

>>> Fact is: The students were not able to finish the job, so they were
>>> given more and more source. And even with 60%, they were still not able.
>
>> See my comment on the helpfullness of help in another articel. I've

Wow, I hate to see the number of spelling errors NNTP introduces in my
articles every time. I swear they were word perfect when they left my
news reader. I swear!!!

(Note to self: Never forget to use flyspell-mode ...)


>> only been trying to put forward a general point of view on the idea
>> that more help brings the students nearer to success in any case. 

> You're absolutely correct on this, no help and also more help up to
> 60% did not help the students finish their work in C.
>
> On the other hand, using Ada they were able to finish. This is the
> very definite result of the study - contrary to what was expected by
> the instructor.


> (If you had read it, you wouldn't be talking like that :-)

I definitely should read the study. As I said: Not now. I can't
continue the discussion on this branch any more, for lack of having
done my homework :-). So please excuse this student here. 

My other argument, that suitability of language is only a small
factor in the decision for a certain language / development system and
in the final quality of the system delivered, still stands. Yes, I'd wish
Linux were written in Ada, and yes, I'd wish people would use Ada
instead of C. But I also wish to be a bit more independent from
the vagaries of AdaCore and that there were more incentive or will
to actually release a high-quality free compiler (with a GMGPL
runtime, mind you). Etc. Never mind.

Regards -- Markus





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 15:06               ` Markus E Leypold
@ 2007-01-29 14:37                 ` Dmitry A. Kazakov
  2007-01-29 15:50                   ` Markus E Leypold
  2007-01-29 16:23                 ` How come Ada isn't more popular? Georg Bauhaus
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-01-29 14:37 UTC (permalink / raw)


On Sun, 28 Jan 2007 16:06:48 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> On Sun, 28 Jan 2007 00:24:27 +0100, Markus E Leypold wrote:

>> Generics is a wrong answer and always was. As well as built-in lists you
>> are praising is, because what about trees of strings, trees of lists etc.
>> You cannot build every and each type of containers in.
> 
> You missed my point. :-). A language with a Hindley-Milner type system
> has a 'tree of something' type where something can be anything in
> every given instance of usage.

You have X of Y. Granted, you can play with Y, but what about X?

The point is, the language should not have either X or Y built-in. What
should be built-in is "of". The type system should be able to handle
parallel type hierarchies of X's and Y's bound by the relation "of".

> So you get 'tree of foo' from it and 'tree of bar' when you use it
> like this, but there are no generics instantiated. There are trade
> offs, of course (like that you probably can't design such a language
> without GC and without loosing a certain degree of control of the
> representation of types),

It is not obvious. I have seen no good arguments for GC so far (the
topic tends to repeat in c.l.a and comp.object, with a lot of foam
around mouths... (:-)). Note that non-contiguous representation of
objects is a very different issue. It seems that you want the latter
when you refer to GC. To me GC is about indeterminable scopes, upward
closures and other things I don't want to have...

> but since C D Hixson and me have been
> talking about usefuleness in a general sense, that doesn't count
> here. A useful language needs to have lists, you must be able to write
> recursive functions on lists and preferably in a functional way and
> you should not need explicit instantiation declaration for every
> instance (like 'of float', 'of int', 'of ...').

Of course you need it. There cannot be a language with only first-class
objects. Let types-1 be values; make them values. Then they will need no
declarations, just literals. Soon you will discover some types-2 which
still require declarations. Make them values too. And so on. At some
level n you will have to stop. Now rename: type-n = "type", type-k<n =
"value". See? You are where you started.

> Languages of the Algol-FORTRAN-C-Pascal-Ada group are all far from
> that ideal. Since a lot of programming these days is general list
> manipulation, everyday jobs become painful. 

There always was Lisp, for those who prefer declarations spelt in the form
of bracket structures...

> Has anybody here aver wondered about the popularity of "scripting",
> like with Perl, PHP, Python and so on?

I did.

> As I see it, the presence of
> lists and hashtables/dictionaries in those languages together with the
> absence of memory management is a factor in this popularity. Which I
> hope, if not proves, at least makes my point plausible. :-).

What about Excel? I have a theory. Look, Turbo Pascal was hugely popular.
Why? That is simple, because it has 256 character long strings! Borland
made a great mistake lifting that limitation. Compare it with MS-Excel,
which is still limited to 32767 rows. That is the key to success! (:-))

>> In my view there are three great innovations Ada made, which weren't
>> explored at full:
>>
>> 1. Constrained subtypes (discriminants)
>> 2. Implementation inheritance without values (type N is new T;)
>> 3. Typed classes (T /= T'Class)
> 
> Here 2 things are missing: 
> 
>   - parameterized types (see Java generics and the ML type system, the
>     first is just borrowing from the latter).

See pos. 1. The constraint is the parameter. In rare cases where you
wanted different values of parameters to produce isolated types, you
would apply 2 to 1.

>   - Separation of implementation and type or to put it differently
>     inheritance and subtyping compatibility. See the ocaml type system.

That should be interface inheritance from concrete types. Yes, Ada misses
that.

>     I admit the contracts are weaker for allowing to instante a
>     generic with a package with the "right type signature" as
>     parameter instead of requiring an instance of another specific
>     generic.

There should be no generics at all...

>     But what is absolutely annoying, is, that the compatibility of
>     objects is determined by inheritance instead by the type
>     signature.

I see it otherwise. Because "compatibility" is undecidable (for both the
computer and the programmer), the language must be strongly and
manifestly typed. 

>    This makes things like the factory pattern necessary
>     and it really doesn't work in general case. (And yes, Java and C++
>     share this defect).

I am not sure what you mean, but when 3 is considered as 1, then
dispatching on bare type tags might become possible.

>> P.S. All strings have fixed length. It is just so that you might not know
>> the length at some point... (:-))
> 
> Ah, well, thats splitting hairs. We are talking about 2 different
> lengths (of strings in C) here: The one is the allocated storage, the
> other the distance from the start to the first occurrence of the '\0'
> delimiter. Since strlen() returns the latter ...
> 
> The problem I've been talking above, is of course to manage growing
> strings that might outgrow the allocated storage.

No, the problem is lack of abstract interfaces. I don't care about memory
if I can make my object an implementation of an abstract array interface.
There is no distance beyond the difference between two index values of the
array. You cannot (may not) associate elements of an array and its indices
with any memory locations.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-29 14:37                 ` Dmitry A. Kazakov
@ 2007-01-29 15:50                   ` Markus E Leypold
  2007-01-30 19:58                     ` Robert A Duff
  2007-01-31 10:55                     ` Dmitry A. Kazakov
  0 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-29 15:50 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Sun, 28 Jan 2007 16:06:48 +0100, Markus E Leypold wrote:
>
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> 
>>> On Sun, 28 Jan 2007 00:24:27 +0100, Markus E Leypold wrote:
>
>>> Generics is a wrong answer and always was. As well as built-in lists you
>>> are praising is, because what about trees of strings, trees of lists etc.
>>> You cannot build every and each type of containers in.
>> 
>> You missed my point. :-). A language with a Hindley-Milner type system
>> has a 'tree of something' type where something can be anything in
>> every given instance of usage.
>
> You have X of Y. Granted, you can play with Y, but what about X?

I don't understand the question ...

> The point is, the language should not have either X or Y built-in. What

Pursuing this argument further, a language also shouldn't have strings
built in, etc. :-). 

What I think is that having at least lists in the "standard library",
and by that I mean functional lists, not arrays, not "containers",
helps tremendously, because that is such a common use case.

> should be built-in is "of". The type system should be able to handle
> parallel type hierarchies of X's and Y's bound by the relation "of".

Yes. And exactly that is what a Hindley-Milner type system has built
in. The lists in Haskell and OCaml are just in the standard library
(conceptually -- knowing that they are there makes it easy to compile
them by special rules and get a more efficient implementation, or to
have special instructions in the underlying VM or reduction mechanism).
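To make that concrete, here is a minimal sketch (TypeScript standing in for the ML-family languages under discussion, since its generics are likewise inferred at use sites rather than explicitly instantiated; the names are invented for illustration):

```typescript
// A generic binary tree: one definition covers "tree of" anything.
type Tree<T> =
  | { kind: "leaf" }
  | { kind: "node"; left: Tree<T>; value: T; right: Tree<T> };

const leaf = { kind: "leaf" } as const;

function node<T>(left: Tree<T>, value: T, right: Tree<T>): Tree<T> {
  return { kind: "node", left, value, right };
}

// Count elements; works for Tree<number>, Tree<string>, ... with no
// explicit instantiation step, unlike an Ada generic package.
function size<T>(t: Tree<T>): number {
  return t.kind === "leaf" ? 0 : 1 + size(t.left) + size(t.right);
}

const numbers = node(node(leaf, 1, leaf), 2, leaf); // inferred Tree<number>
const words = node(leaf, "ada", leaf);              // inferred Tree<string>

console.log(size(numbers), size(words)); // 2 1
```

Both trees come out of the same single definition; an Ada generic would need one explicit instantiation per element type.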

>> So you get 'tree of foo' from it and 'tree of bar' when you use it
>> like this, but there are no generics instantiated. There are trade
>> offs, of course (like that you probably can't design such a language
>> without GC and without loosing a certain degree of control of the
>> representation of types),
>
> It is not obvious. I have seen no good arguments for GC so far (the
> topic tends to repeat in c.l.a and comp.object, with a lot of foam around mouths... (:-))

Foam, yes :-). But you have to admit that allowing a GC without ever
having an actual implementation (except Martin Krischik's adaptation of
the Boehm collector) is just too much teasing: You could have it, but
no, it's not there.

> Note that non-contiguous representations of objects is a way
> different issue. It seems that you want the latter when you refer to
> GC. 

Not necessarily, but I think it does help a lot towards efficient
garbage collection. I might be mistaken, since I'm not really an
expert in designing garbage collectors.

> To me GC is about indeterminable scopes, upward closures and
> other things I don't want to have...

If you have only downward scopes for "closures" and memory allocation,
this will, finally, interact badly with the fact that "implicit
(i.e. statically type-safe) casting" of classes is also only possible
downwards. My impression is that all these things together rule out
some useful designs that would otherwise be possible. Or to say it
differently: Object orientation w/o indeterminable scopes, upward
closures and GC doesn't work well. Some abstractions cannot be
implemented.

This, of course, is just a gut feeling. I do not know about research or
empirical studies that examine the influence these various
restrictions have on each other and how they act together.


>> but since C D Hixson and me have been
>> talking about usefuleness in a general sense, that doesn't count
>> here. A useful language needs to have lists, you must be able to write
>> recursive functions on lists and preferably in a functional way and
>> you should not need explicit instantiation declaration for every
>> instance (like 'of float', 'of int', 'of ...').

> Of course you need it. There cannot be a language with only first-class
> objects. 

Well, yes. In a sense Java (up to 1.4) was such a language. The only
feasible way to build a general container was casting up to the
"Object" type and casting back down later, with the possibility of
runtime casting errors. Bad.
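A small sketch of the difference (in TypeScript rather than Java, purely for illustration; `unknown[]` plays the role of a pre-generics container of `Object`):

```typescript
// Pre-generics style: a container of "Object" forces unchecked downcasts.
const rawBox: unknown[] = [];
rawBox.push("hello");
const n = rawBox[0] as number; // compiles, but the value is a string:
console.log(typeof n);         // the error surfaces only at runtime (or never)

// Parameterized style: the element type travels with the container,
// so the same mistake is rejected at compile time.
const typedBox: string[] = [];
typedBox.push("hello");
// const m: number = typedBox[0]; // compile-time error
```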

> Let types-1 be values, make them values. Then they will need no
> declarations, just literals. Soon you will discover some types-2 which
> still require declarations. Make them values too. An so on. At some level n
> you will have to stop. Now, rename type-n = "type". value, type-k<n =
> "value". See? You are where you have started.

I do not understand your argument here. Care to give some example? Then
I'll try to write down how it is done in e.g. OCaml. Perhaps we're
talking at cross purposes too, since I'm not sure I really want the
thing you insist I want. :-)


>> Languages of the Algol-FORTRAN-C-Pascal-Ada group are all far from
>> that ideal. Since a lot of programming these days is general list
>> manipulation, everyday jobs become painful. 
>
> There always was Lisp, for those who prefer declarations spelt in the form
> of bracket structures...

I'm not talking about syntax. I'm talking about types. Which,
strictly speaking, Lisp hasn't: there is only one (static) type in a
dynamic type system, which is one large union (discriminant) type.
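That "one large union type" can be made explicit; a rough sketch in TypeScript (the tag names are invented for illustration):

```typescript
// A dynamically typed language's single "value" type, made explicit:
type LispValue =
  | { tag: "num"; n: number }
  | { tag: "str"; s: string }
  | { tag: "cons"; car: LispValue; cdr: LispValue }
  | { tag: "nil" };

// Every primitive must dispatch on the tag at runtime -- exactly the
// checks a static type system would have discharged at compile time.
function car(v: LispValue): LispValue {
  if (v.tag !== "cons") throw new Error("car: not a pair");
  return v.car;
}

const pair: LispValue = {
  tag: "cons",
  car: { tag: "num", n: 1 },
  cdr: { tag: "nil" },
};

console.log(car(pair).tag); // "num"
```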

>> Has anybody here aver wondered about the popularity of "scripting",
>> like with Perl, PHP, Python and so on?
>
> I did.

And? What was the result of your considerations? :-)

>
>> As I see it, the presence of
>> lists and hashtables/dictionaries in those languages together with the
>> absence of memory management is a factor in this popularity. Which I
>> hope, if not proves, at least makes my point plausible. :-).

> What about Excel? I have a theory. Look Turbo Pascal was hugely popular.

Excel is just cancer of mind and organisation. People start to use it,
the "application" grows, and they miss the point where to stop. I've
seen many "applications" that ate up somewhere around 3-4 man-months
and would have been easy to write as web applications with, say,
Python or PHP and (God forgive me) MySQL in 1 month.

> Why? That is simple, because it has 256 character long strings! 

> Borland made a great mistake lifting that limitation. Compare it
> with MS-Excel, which is still limited to 32767 rows. That is the key
> to success! (:-))

You make me laugh :-). So what Ada should do is introduce a String256
with the appropriate restriction, and then we will succeed. Good
plan ... :-))).


>>> In my view there are three great innovations Ada made, which weren't
>>> explored at full:
>>>
>>> 1. Constrained subtypes (discriminants)
>>> 2. Implementation inheritance without values (type N is new T;)
>>> 3. Typed classes (T /= T'Class)
>> 
>> Here 2 things are missing: 
>> 
>>   - parameterized types (see Java generics and the ML type system, the
>>     first is just borrowing from the latter).
>
> See pos.1. The constraint is the parameter of. In rare cases you wanted
> different values of parameters producing isolated types you would apply 2
> to 1.

Again I do not know what you're denying here...


>>   - Separation of implementation and type or to put it differently
>>     inheritance and subtyping compatibility. See the ocaml type system.
>
> That should be interface inheritance from concrete types. Yes, Ada misses
> that.

No, what I want is even different. Two values / objects in the OCaml
style of objects are compatible if their type signatures (which are
calculated by the type inference engine) agree, or better, one is
contained in the other. This is a weaker contract than in Ada, where at
least a behavioural part of the contract is implied by inheriting the
implementation, but that part (except for generic packages) is useless,
since you can destroy behaviour by overriding a method with a
misbehaved procedure.

Actually, what are classes in other languages should perhaps better be
mapped to packages in Ada in the mind of the developer.


>>     I admit the contracts are weaker for allowing to instante a
>>     generic with a package with the "right type signature" as
>>     parameter instead of requiring an instance of another specific
>>     generic.
>
> There should be no generics at all...

I'm not sure. Generics provide a relatively strict contract model. I
like this. But the instantiation requirement is cumbersome compared
with the way parameterized types work in other languages (and that is
exactly what I'm pulling from generics most of the time: parameterized
types). Parameterized types are no 1:1 substitute for generics; ML
has functors too, for a good reason. But they are what makes everyday
life bearable.

>
>>     But what is absolutely annoying, is, that the compatibility of
>>     objects is determined by inheritance instead by the type
>>     signature.
>
> I see it otherwise. Because "compatibility" is undecidable (for both the
> computer and the programmer), the language must be strongly and
> manifestedly typed.

Since the contract can be broken by new methods anyway, the only thing
that counts from a type-safety point of view is not to break the
abstraction over the underlying processor / VM, that is, to be able to
call the methods with the right number of parameters and the right
representation of the parameters (at the machine level). So the type
signature is enough.

It's really bothersome that one cannot supply a class X which is
compatible with another class Y by just writing out the right methods. 
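That is precisely what structural typing gives; a minimal sketch (TypeScript used here only as a convenient structurally-typed stand-in for the OCaml object system; the class and method names are invented):

```typescript
// Nominal typing (Ada/Java/C++): compatibility requires declared
// inheritance. Structural typing: having the right methods is enough,
// so no factory pattern or common ancestor is needed.
interface Printer {
  print(msg: string): string;
}

// FilePrinter never mentions Printer, yet is accepted wherever a
// Printer is expected, purely on the strength of its signature.
class FilePrinter {
  print(msg: string): string {
    return `file: ${msg}`;
  }
}

function announce(p: Printer): string {
  return p.print("hello");
}

console.log(announce(new FilePrinter())); // "file: hello"
```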

And BTW -- if inheritance is done right (the keywords here are binary
methods and co- and contravariance -- have a look at the OCaml user's
manual), a class Y' derived from Y is not necessarily subtype
compatible. So inheritance of implementations doesn't guarantee
anything anyway. 


>>    This makes things like the factory pattern necessary
>>     and it really doesn't work in general case. (And yes, Java and C++
>>     share this defect).
>
> I am not sure what you mean, but when 3 is considered as 1, then
> dispatching on bare type tags might become possible.


3? 1? Please elaborate? Is "dispatching on bare type tags" a good or a
bad thing? You lost me there ... (my fault probably).


>
>>> P.S. All strings have fixed length. It is just so that you might not know
>>> the length at some point... (:-))
>> 
>> Ah, well, thats splitting hairs. We are talking about 2 different
>> lengths (of strings in C) here: The one is the allocated storage, the
>> other the distance from the start to the first occurrence of the '\0'
>> delimiter. Since strlen() returns the latter ...
>> 
>> The problem I've been talking above, is of course to manage growing
>> strings that might outgrow the allocated storage.

> No, the problem is lack of abstract interfaces. 

Ah! Here we agree! That is the deeper problem.

> I don't care about memory if I can make my object an implementation
> of an abstract array interface.

Yes. 

> There is no distance beyond the difference between two index values
> of the array. 

There is: (I1 - I2). But it's abstract, in numbers of elements, not in
units of memory.

> You cannot (may not) associate elements of an array and its indices
> with any memory locations.

ACK. 

But my dear Dmitry -- what does your sentence "All strings have fixed
length ..." mean then in this context, eh?

[laughing madly and leaving the stage]

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 22:38                     ` Markus E Leypold
@ 2007-01-29 16:16                       ` adaworks
  2007-01-29 16:35                         ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: adaworks @ 2007-01-29 16:16 UTC (permalink / raw)



"Markus E Leypold" <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> 
wrote in message news:ogy7nmsrog.fsf@hod.lan.m-e-leypold.de...
>
> PS: You didn't, I think, answer my question, why you don't think Java is type 
> safe ... ?
>
To start with, I don't like type promotion.

More later.

Richard 





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 15:06               ` Markus E Leypold
  2007-01-29 14:37                 ` Dmitry A. Kazakov
@ 2007-01-29 16:23                 ` Georg Bauhaus
  2007-01-29 16:56                   ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Georg Bauhaus @ 2007-01-29 16:23 UTC (permalink / raw)


On Sun, 2007-01-28 at 16:06 +0100, Markus E Leypold wrote:
> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>  Since a lot of programming these days is general list
> manipulation, everyday jobs become painful. 

That lots of programming these days is list manipulation is a
popular misunderstanding, I think. Witness the popularity
of what is known as Okasaki's book. "forall", "exists",
remove-member, map(filter'access, items), etc. do not require
lists. Some of these operations don't even need traversal
(iteration, recursion, whatever you call it).
On the contrary, old Lisp-style sets and maps have
always been known to be as inefficient as O(n) can be.
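A small sketch of the point (TypeScript, with a hash-based `Set`; the helper names are invented):

```typescript
// Membership, forall and exists over a hash-based Set: no list
// representation, and no O(n) Lisp-style member scan for lookups.
const items = new Set([2, 4, 6]);

console.log(items.has(4)); // true, expected O(1), not a linear scan

function forall<T>(s: Set<T>, p: (x: T) => boolean): boolean {
  for (const x of s) if (!p(x)) return false;
  return true;
}

function exists<T>(s: Set<T>, p: (x: T) => boolean): boolean {
  for (const x of s) if (p(x)) return true;
  return false;
}

console.log(forall(items, x => x % 2 === 0)); // true
console.log(exists(items, x => x > 5));       // true
```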

As you have said yourself,

"the presence of
> lists and hashtables/dictionaries in those languages together with the
> absence of memory management is a factor in this popularity. 

It's not lists you manipulate when using hashtables/dictionaries.

>     But what is absolutely annoying, is, that the compatibility of
>     objects is determined by inheritance instead by the type
>     signature.

Compatibility of objects or of references to objects? What about
assignment?

I think interface types are helpful here.

>  The Java generics
> tutorial in I my eyes documents 2 things: (a) what has been really
> sorely missing from Java for 10 years and (b) that you can complement
> a type safe pascalish type system usefully with subtyping and
> parameterized types.

Why isn't Eiffel popular? :-)





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-29 16:16                       ` adaworks
@ 2007-01-29 16:35                         ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-29 16:35 UTC (permalink / raw)



<adaworks@sbcglobal.net> writes:

> "Markus E Leypold" <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> 
> wrote in message news:ogy7nmsrog.fsf@hod.lan.m-e-leypold.de...
>>
>> PS: You didn't, I think, answer my question, why you don't think Java is type 
>> safe ... ?
>>
> To start with, I don't like type promotion.
>
> More later.

Yes, please continue. 

Type promotion is annoying with regard to the reader's ability to
easily see the semantics of the program, but it's IMHO not type-unsafe
in the sense of the definition given in Cardelli's paper, because it
doesn't break the abstraction of the representation of the program and
its values by the underlying "machine level stuff" (i.e. VM or CPU
instructions and machine words). Again, I hope that sentence is
understandable; if not, I'll be glad to elaborate :-).

Regards -- Markus





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-29 16:23                 ` How come Ada isn't more popular? Georg Bauhaus
@ 2007-01-29 16:56                   ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-29 16:56 UTC (permalink / raw)



Georg Bauhaus <bauhaus@arcor.de> writes:

> On Sun, 2007-01-28 at 16:06 +0100, Markus E Leypold wrote:
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> 
>>  Since a lot of programming these days is general list
>> manipulation, everyday jobs become painful. 

In this case it was really me that wrote this, not Dmitry. People,
what's with news readers? :-))

> That lots of programming these days is list manipulation is a
> popular misunderstanding I think. 

I disagree. It's not all there is, but certainly there are lists
everywhere. People enter lists of items into systems, they retrieve
lists of items. Lists of items are shown in GUI windows. Many loops
are actually processing lists of thingies (even if the list never
turns up explicitly -- it perhaps would in a functional language).


> Witness the popularity of what is known as Okasaki's Book.

That notwithstanding. 

> "forall", "exists", remove-member, map(filter'access, items),
> etc. do not require lists. Some of these operations don't even need
> traversal (iteration, recursion, whatever you call it).

> On the contrary, old Lisp style sets and maps have
> always known to be as inefficient as O(n) can be.

I agree here to a certain extent. A list is not always the best
representation, which is the reason that "hashes" or "dictionaries"
(actually: maps) are so prevalent in most modern scripting
languages. But with lists (for sequential processing) and maps (and
some sets) we're almost done for everyday data processing and are
vastly more powerful than C, ..., Ada.

Therefore my plea that a modern all purpose language should have lists
in the standard library (or in the language syntax), and of course I
should have added maps and sets.

I would like to add that I've been participating in teaching software
engineering for some time. There was one exercise to specify something
which should have been easy to do using sets. Unfortunately a number
of participants ended up trying to emulate list processing (in VDM-SL)
and writing loops instead of just writing x \in M. And these were
students almost at the end of their studies. I still wonder what I can
learn from that ... -- perhaps that during their occupation with C++
and Java (their main background) they unlearned how to think in the
proper abstractions (neither of those languages has a set primitive,
just clumsy containers).


>
> As you have said yourself,
>
> "the presence of
>> lists and hashtables/dictionaries in those languages together with the
>> absence of memory management is a factor in this popularity. 
>
> It's not lists you manipulate when using hashtables/dictionaries.

Ah, yes. I was a bit short with my original statement, as you have
noted. But since we came from "what should a language have to be
useful / successful", my statement still stands: it should have lists,
but of course ALSO sets and maps.

>
>>     But what is absolutely annoying, is, that the compatibility of
>>     objects is determined by inheritance instead by the type
>>     signature.
>
> Compatibility of objects or of references to objects? What about
> assignment?

Subtype polymorphic compatibility. And there are only references to
objects in OCaml (if you're asking about this). 

In Ada there are certainly various compatibilities

 - Assignment of objects not possible if not the same type -- no
   compatibility there.

 - Passing an A for a B is only possible if A is derived from B
   (whether by reference or not).

 - The same goes for casting

I'm actually missing what it buys you to differentiate "Compatibility
of objects or of references to objects?". Where there is compatibility
(subtype polymorphism) at all, it's tied to inheritance (of
implementation!). Perhaps I'm missing your point.


> I think interface types are helpful here.

A clumsy workaround, which also doesn't fix the co/contravariance
problem with binary methods. :-). 

>
>>  The Java generics
>> tutorial in I my eyes documents 2 things: (a) what has been really
>> sorely missing from Java for 10 years and (b) that you can complement
>> a type safe pascalish type system usefully with subtyping and
>> parameterized types.

> Why isn't Eiffel popular? :-)

I don't know. I do know why it didn't pass my evaluation: no bindings
to GUI toolkits (which is another matter), and the same problem with
the type system, as far as I can see.

Regards -- Markus





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 22:43                       ` Markus E Leypold
@ 2007-01-29 22:40                         ` Cesar Rabak
  2007-01-30  9:31                           ` Markus E Leypold
  2007-01-30 16:19                           ` adaworks
  0 siblings, 2 replies; 397+ messages in thread
From: Cesar Rabak @ 2007-01-29 22:40 UTC (permalink / raw)


Markus E Leypold escreveu:
> Cesar Rabak <csrabak@yahoo.com.br> writes:
> 
>> present beliefs in the SW industry are that the process (CMM sense) is
>> what makes SW more reliable or not.
> 
> Well, that is not completely wrong, you know. 

Yes I know!

> Though usually in
> practice code reviews are rarely done, despite the fact that they
> are known to be the best instrument for creating code of high quality.
> 
> 
>> Upper management is used to navigate in the muddy waters of non
>> perfect systems: unreliable OSs, virus/worm/trojan threats, so coping
>> with a 'popular' language is part of the (usual) business.
> 
> Which just means they recognize the value of the "things that already
> exist" as opposed to "the things that would have yet to be
> written". A wise attitude indeed.

My only issue with this is the lack of a quality metric to ascertain
the adequacy of the 'things that already exist'.

> 
>> Another question we tend to forget about C is the formidable ecosystem
>> that exists to help companies to get right with lint-like tools,
>> memory checkers, etc.
> 
> Exactly my point. This mitigates the "C is utter s***" judgment a bit.
> 

I think so, too, and for the worse (at least for me, when I try to use
Ada technology): one can even hear that "...in C the use of these tools
and their expenditure is at the user's will...whereas...".

[snipped]

>> Food for thought:
>>
>> Let's see, if we pick a recent project (Next-Generation James Webb
>> Space Telescope) what would be the proposed language by IBM?
> 
> Perhaps not C#. What do you suggest?

I don't have a clue, but even in the site from Rational (where there is 
mention to other IBM products) there is not a word about Ada...

> 
> Regards -- Markus
> 



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-27  5:12     ` Charles D Hixson
  2007-01-27  9:52       ` Markus E Leypold
@ 2007-01-29 23:56       ` Randy Brukardt
  1 sibling, 0 replies; 397+ messages in thread
From: Randy Brukardt @ 2007-01-29 23:56 UTC (permalink / raw)


"Charles D Hixson" <charleshixsn@earthlink.net> wrote in message
news:WaBuh.15096$pQ3.10501@newsread4.news.pas.earthlink.net...
...
> Ada subsets couldn't
> use the name Ada.  Janus Ada couldn't call itself Ada for
> quite awhile.  And even Janus Ada couldn't run on most CP/M
> machines.  Too resource intensive.  I bought an Apple ][ to
> run UCSD Pascal, and was so disappointed that I saved up and
> installed a CP/M card so that I could run C.  Ada wasn't a
> possibility.  (I never seriously considered Basic.  It was too
> non-portable.  If I wanted non-portable, I'd go for assembler.)

This is somewhat confused. Janus/Ada started out as an Ada subset on CP/M.
Indeed, Apple II CP/M boards were our number 1 platform for several years in
the 1980s. We never actually owned an Apple machine ourselves (we used the
machine of a friend for testing and copying of literally hundreds of floppy
disks). Janus/Ada for CP/M ran on any machine with a 56K TPA area, which was
the vast majority of them that had a full 64K of memory. (Early versions
used even less memory. Nowadays, it's hard to imagine a time when 16K memory
boards cost several hundred dollars each; a modern machine has 16,000 times
as much memory for the same money!) So Ada *was* a possibility on CP/M.

Our early IBM PC versions only required 256K machines. But a complete
implementation of Ada 83 a few years later did take a full 640K machine.

                Randy.





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-29  1:02                     ` Jeffrey R. Carter
@ 2007-01-30  0:21                       ` Randy Brukardt
  0 siblings, 0 replies; 397+ messages in thread
From: Randy Brukardt @ 2007-01-30  0:21 UTC (permalink / raw)


"Jeffrey R. Carter" <jrcarter@acm.org> wrote in message
news:HIbvh.370756$1i1.249719@attbi_s72...
> adaworks@sbcglobal.net wrote:
> >
> > Every programming language is error-prone at different levels,
> > including Ada.   The question is how error-prone a language might
> > be.  On a continuum of more error-prone to less, Ada is at the end
> > of the scale for less error-prone:
> > _____________________________________________________________
> > More                                                     Less
> > Error-prone           ------------------>          Error-prone
> >
> > Assembler    C       C++       Java   C#    Ada    SPARK
> >      COBOL-68    COBOL-85   Objective-C     Eiffel
> >                  Early PL/I       Recent PL/I
> > ______________________________________________________________
>
> Nice concept. Not a very good presentation on my newsreader. Maybe a
> vertical orientation would be better.

Richard needs a blog or other website for his pithy discussions of Ada and
programming. (Indeed, I tried to talk him into doing that some years ago.)
Plain text isn't the best format for his work.

                          Randy.





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-29 22:40                         ` Cesar Rabak
@ 2007-01-30  9:31                           ` Markus E Leypold
  2007-01-30 16:19                           ` adaworks
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-01-30  9:31 UTC (permalink / raw)



Cesar Rabak <csrabak@yahoo.com.br> writes:

> Markus E Leypold escreveu:
>> Cesar Rabak <csrabak@yahoo.com.br> writes:
>>
>>> present beliefs in the SW industry are that the process (CMM sense) is
>>> what makes SW more reliable or not.
>> Well, that is not completely wrong, you know.
>
> Yes I know!
>
>> Though usually in
>> practice code reviews are rarely done, despite the fact that they
>> are known to be the best instrument for creating code of high quality.
>>
>>> Upper management is used to navigate in the muddy waters of non
>>> perfect systems: unreliable OSs, virus/worm/trojan threats, so coping
>>> with a 'popular' language is part of the (usual) business.
>> Which just means they recognize the value of the "things that already
>> exist" as opposed to "the things that would have yet to be
>> written". A wise attitude indeed.
>
> My only issue with this is the lack of a quality metric to ascertain
> the adequacy of the 'things that already exist'.

This is not a question of "quality". It's only a question of "How much
money do I have to spend to recreate those already existing things?"
(Cost C1).

If I'm not going to do that, the other question I have to answer (when
shifting to another language / tool set or process) is "How much money
do I have to spend to integrate the existing things (process, tools,
code) with the new process / tools / code?" (Cost C2).

Quality only does come in here as C3: The _monetary_ costs (bugs,
maintenance, outright errors) avoided in future by creating better
"things" instead of the existing ones.

Either C1-C3 or C1-C2 is actually a suitable first measure of the
_value_ of the "existing things".

Quality, I insist, has value only as a factor in avoiding expensive
failures of the system/programs in the future.  So the value of
quality depends on context, and it's this consideration that is too
often missing from those language advocacy discussions.


>>> Another question we tend to forget about C is the formidable ecosystem
>>> that exists to help companies to get right with lint-like tools,
>>> memory checkers, etc.
>> Exactly my point. This mitigates the "C is utter s***" judgment a
>> bit.
>>
>
> I think so, too, and for the worse (at least for me, when I try to use
> Ada technology): one can even hear that "...in C the use of these tools
> and their expenditure is at the user's will...whereas...".


Yes. But the user's will is exactly what the development process
should be controlling. If we've gone as far as admitting that a proper
development process is necessary for a high quality of the final
product, we can as well go farther and compare not the languages per
se but the languages together with their tools and libraries. In my
opinion that will bring us much nearer to an answer to the question
"Why is Ada not more popular?".

>
> [snipped]
>
>>> Food for thought:
>>>
>>> Let's see, if we pick a recent project (Next-Generation James Webb
>>> Space Telescope) what would be the proposed language by IBM?
>> Perhaps not C#. What do you suggest?
>
> I don't have a clue, but even in the site from Rational (where there
> is mention to other IBM products) there is not a word about Ada...

Well ...

Regards -- Markus






^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-29 22:40                         ` Cesar Rabak
  2007-01-30  9:31                           ` Markus E Leypold
@ 2007-01-30 16:19                           ` adaworks
  2007-01-30 21:05                             ` Jeffrey Creem
  1 sibling, 1 reply; 397+ messages in thread
From: adaworks @ 2007-01-30 16:19 UTC (permalink / raw)



"Cesar Rabak" <csrabak@yahoo.com.br> wrote in message 
news:eplsvd$1uu$1@aioe.org...
>
> I don't have a clue, but even in the site from Rational (where there is 
> mention to other IBM products) there is not a word about Ada...
>
It is really sad that Rational has decided to hide the fact that it was
once an Ada company.   This trend began over ten years ago
when they began to avoid any mention of Ada at trade shows,
in advertisements, and elsewhere.   They still make money from
Ada, but they act as if they are ashamed of having anything to
do with it.

IBM has never been good with Ada.  Rarely did anything they did that
involved Ada turn out well.   Their commitment to their own Ada
products was wishy-washy, and their Ada projects never seemed to
get very far.   That being said, even their own language, PL/I, has
been allowed to spiral into decline.   It seems that IBM has no
programming language strategy, and Rational has adopted the IBM
position with regard to languages.

When I was active in Ada, up through 2000, there were a fair number
of Rational Ada users.   I wonder if there are any new projects using
Rational Ada.   My guess is, very few.   Rational just does not seem to
be interested in Ada anymore.   I would hope I am wrong about that,
but I suspect I'm not.

Richard Riehle 





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-28 19:32             ` Gautier
@ 2007-01-30 19:41               ` tmoran
  0 siblings, 0 replies; 397+ messages in thread
From: tmoran @ 2007-01-30 19:41 UTC (permalink / raw)


> A version of the CLAW demo that compiles
> with GNAT 3.15p (2002 !) or later wouldn't hurt, either,
   That's an error on the web page.  Of course Claw has worked with
Gnat 3.15p since the latter was released.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-29 15:50                   ` Markus E Leypold
@ 2007-01-30 19:58                     ` Robert A Duff
  2007-01-30 21:52                       ` Markus E Leypold
  2007-01-31 17:49                       ` Ed Falis
  2007-01-31 10:55                     ` Dmitry A. Kazakov
  1 sibling, 2 replies; 397+ messages in thread
From: Robert A Duff @ 2007-01-30 19:58 UTC (permalink / raw)


Markus E Leypold
<development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> If you have only downward scopes for "closures" and memory allocation
> this will, finally, interact badly with the fact that "implicit
> (i.e. static type safe) casting" of classes is also only possible
> downwards. My impression is that all these things together rule out
> some useful designs that would otherwise be possible. Or to say it
> differently: object orientation w/o indeterminable scopes, upward
> closures and GC doesn't work well. Some abstractions cannot be
> implemented.

Why do we call these closures "downward" and "upward" instead of the
other way around?  A downward closure (allowed in Ada 2005, and to some
extent in Ada 83) is a procedure that is passed IN to another procedure,
which is toward the TOP of the call stack (up!).  An upward closure (not
directly supported in Ada, but supported in Lisp et al) is a procedure
that is passed OUT (return value, 'out' parameter, setting a global
variable) -- which means it's being passed DOWN toward the BOTTOM of the
call stack.  (Of course, the call stack is not quite a stack anymore, if
upward closures are allowed!)

It seems to me "inward" and "outward" closures would be clearer
terminology, for what are normally called "downward" and "upward",
respectively.

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-30 16:19                           ` adaworks
@ 2007-01-30 21:05                             ` Jeffrey Creem
  2007-01-31  7:59                               ` AW: " Grein, Christoph (Fa. ESG)
  0 siblings, 1 reply; 397+ messages in thread
From: Jeffrey Creem @ 2007-01-30 21:05 UTC (permalink / raw)


adaworks@sbcglobal.net wrote:
> "Cesar Rabak" <csrabak@yahoo.com.br> wrote in message 
> news:eplsvd$1uu$1@aioe.org...
>> I don't have a clue, but even in the site from Rational (where there is 
>> mention to other IBM products) there is not a word about Ada...
>>
> It is really sad that Rational has decided to hide the fact that it was
> once an Ada company.   This trend began over ten years ago
> when they began to avoid any mention of Ada at trade shows,
> in advertisements, and elsewhere.   They still make money from
> Ada, but they act as if they are ashamed of having anything to
> do with it.
I agree the IBM site is not the easiest to navigate. But really it does 
not seem that hidden.
www.ibm.com, click products, software by category, Software Development, 
traditional programming languages & compilers.
Rational Ada Developer is then at the bottom of the page (in
alphabetical order) and still above the XC compiler.


I do see what you mean about going down through the www.rational.com
link. Tools like compilers don't even show up at that level, though I
think that is less about not liking Ada and more about the fact that a
simple IDE/compiler is not at the same level as the high-minded, fairly
useless 'craft you a solution' tools that IBM likes to sell.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-30 19:58                     ` Robert A Duff
@ 2007-01-30 21:52                       ` Markus E Leypold
  2007-01-31 22:49                         ` Robert A Duff
  2007-01-31 17:49                       ` Ed Falis
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-30 21:52 UTC (permalink / raw)



Robert A Duff <bobduff@shell01.TheWorld.com> writes:

> Markus E Leypold
> <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>
>> If you have only downward scopes for "closures" and memory allocation
>> this will, finally, interact badly with the fact that "implicit
>> (i.e. static type safe) casting" of classes is also only possible
>> downwards. My impression is that all these things together rule out
>> some useful designs that would otherwise be possible. Or to say it
>> differently: object orientation w/o indeterminable scopes, upward
>> closures and GC doesn't work well. Some abstractions cannot be
>> implemented.
>
> Why do we call these closures "downward" and "upward" instead of the
> other way around?  

I don't know. It's a pathological condition anyway :-). I'm using this
vocabulary as I find it.


> A downward closure (allowed in Ada 2005, and to some
> extent in Ada 83) is a procedure that is passed IN to another procedure,
> which is toward the TOP of the call stack (up!).  

It's passed downward along the call chain.


> An upward closure (not
> directly supported in Ada, but supported in Lisp et al) is a procedure
> that is passed OUT (return value, 'out' parameter, setting a global
> variable) -- which means it's being passed DOWN toward the BOTTOM of the
> call stack.  

> (Of course, the call stack is not quite a stack anymore, if
> upward closures are allowed!)

Depends. Just yesterday I've been imagining a system which basically
uses a stack, but when passing a "closure" upwards, copies parts of
the stack to the heap.

Chicken Scheme, I think, also uses a stack (in some unusual ways), but
still has closures.


> It seems to me "inward" and "outward" closures would be clearer
> terminology, for what are normally called "downward" and "upward",
> respectively.

What you call outward closures are just closures. There was a
discussion at comp.lang.functional which convinced me that closures
should better be called procedures (because that's rather the natural
way things should work) and what is presently called downward closures
should better be called something different, like, say, stack-bound
procedures (to indicate that their lifetime depends on the stack).

But it's certainly too late now to change established if misleading
terminology.

Regards -- Markus





^ permalink raw reply	[flat|nested] 397+ messages in thread

* AW: How come Ada isn't more popular?
  2007-01-30 21:05                             ` Jeffrey Creem
@ 2007-01-31  7:59                               ` Grein, Christoph (Fa. ESG)
  2007-02-03 16:33                                 ` Martin Krischik
  0 siblings, 1 reply; 397+ messages in thread
From: Grein, Christoph (Fa. ESG) @ 2007-01-31  7:59 UTC (permalink / raw)
  To: comp.lang.ada


>> Ada, but they act as if they are ashamed of having anything to
>> do with it.
> I agree the IBM site is not the easiest to navigate. But really it
> does not seem that hidden.
> www.ibm.com, click products, software by category, Software
> Development, traditional programming languages & compilers.
> Rational Ada Developer is then on the bottom of the page (In
> alphabetical order) and still above the XC compiler.

But if you're looking for Ada, you wouldn't expect it under R, would
you?



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-29 14:21                   ` Markus E Leypold
@ 2007-01-31  9:23                     ` Maciej Sobczak
  2007-01-31 10:24                       ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-01-31  9:23 UTC (permalink / raw)


Markus E Leypold wrote:

>> Sorry, but that's a misconception - I don't remember when I was the
>> last time I was messing with manual memory management in a regular C++
>> code. I estimate that in my current programming I call delete (or free
>> or whatever) once in 5-10 kLOC.
> 
> OK. I'll have to reconsider this statement. I usually couldn't trim
> down 'automatic' allocation to that extent, but that might have been
> my application area. What I remember though, is the difficulty to
> recover from exceptions in the presence of automatic (scope bound)
> memory management. (I hope I'm making sense here, else I'd really have
> to go back to my C++ mind and notes and try to retrieve the right
> vocabulary and reasoning and -- well -- I don't really want to have a
> C++ discussion in c.l.a. :-).

Why not have C++ discussions on the list where people claim that C++
sucks? :-)

> If we must, let's shift that to personal
> mail or to another group ...).

Yes, please feel free to contact me with regard to the above (follow the 
links in my signature).

> But from what I remember in
> the 1997s to 1998s

Most programming languages were terrible at that time, that's true.

> (that was when there still were problems with STLs,
> exceptions and the string libraries in C++ and when there was no
> standard and Java was new), that this was one of the motivations that
> people shifted to Java (either from C++ or from C).

Yes, I know that. And I will keep stating that this motivation resulted 
from common misconceptions, further amplified by Java marketing.
I've even heard that Java is better, because it has a String class and 
there is no need to use char* as in C++ (!). FUD can buy a lot.

> The other
> motivation was the "portable GUI"

Yes.

> which, I think, mostly disappointed
> the expectations.

Still, a GUI that sucks was better than no standard GUI at all for lots
of people. Both C++ and Ada are in the same camp in this aspect. I'm not
claiming that these languages should have a standard GUI, but not having
it definitely scared many.

> Of course I might be wrong. This is just the impression I got "from
> the trenches" and I might be missing a more global point of view. It
> perhaps does not apply today where C++

Well, there is still no standard GUI for C++, but the choice with 
non-standard ones is quite impressive:

http://www.free-soft.org/guitool/

> and the understanding of C++
> has matured a bit

Yes. Sadly, too late for those who already changed their mind.

> (there is even an embedded subset of C++ which will
> annoy folks here no end :-).

I think that at the end of the day the embedded C++ will disappear from 
the market as the "full" C++ gets wider compiler support on embedded 
platforms. There will simply be no motivation for using subsets.
Subsetting C++ would be beneficial in the sense similar to Ravenscar or 
by extracting some core and using it with formal methods (sort of 
"SPARK++"), but I doubt it will ever happen.

>>> The trend I see, is that GC is a must, clumsy pointer handling is out
>>> and types are an option for those that can understand them.
>> Indeed, looks like everybody is going in that direction.
> 
> And certainly. Why should advances in hardware only buy more spiffy
> GUIs and not something to ease the everyday pain for the everyday
> software developer :-).

The interesting thing is that memory management is *said* to be painful.
C++ and Ada are similar in this regard - the majority of the regular
code doesn't need manual memory management (local objects!) or can have
it encapsulated (containers!), so there are no problems that would need
to be solved. Reference-oriented languages have a completely different
ratio of "new per kLOC", so GC is not a feature there, it's a must. But
then the question is not whether GC is better, but whether
reference-oriented languages are better than value-oriented ones. Many
people get seduced by GC before they even start asking such questions.

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-31  9:23                     ` Maciej Sobczak
@ 2007-01-31 10:24                       ` Markus E Leypold
  2007-02-02  8:42                         ` Maciej Sobczak
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-31 10:24 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> Markus E Leypold wrote:
>
>>> Sorry, but that's a misconception - I don't remember when I was the
>>> last time I was messing with manual memory management in a regular C++
>>> code. I estimate that in my current programming I call delete (or free
>>> or whatever) once in 5-10 kLOC.
>> OK. I'll have to reconsider this statement. I usually couldn't trim
>> down 'automatic' allocation to that extent, but that might have been
>> my application area. What I remember though, is the difficulty to
>> recover from exceptions in the presence of automatic (scope bound)
>> memory management. (I hope I'm making sense here, else I'd really have
>> to go back to my C++ mind and notes and try to retrieve the right
>> vocabulary and reasoning and -- well -- I don't really want to have a
>> C++ discussion in c.l.a. :-).
>
> Why not having C++ discussions on the list where people claim that C++
> sucks? :-)

Because of the level of detail? Because I haven't really swapped in my
C++ personality yet, am wary of doing it, and fear I'll slip up in
public claiming the wrong things about C++? :-)

Jokes aside, I feel a bit off topic with this and the fervour in this
thread seems to have diminished somewhat anyway ...

>
>> But from what I remember in
>> the 1997s to 1998s
>
> Most programming languages were terrible at that time, that's true.

Not Ada 95 ... :-). Actually I think it was at that time that history
branched and Ada missed becoming mainstream, but that is not a
historian's claim, only a personal impression.

>
>> (that was when there still were problems with STLs,
>> exceptions and the string libraries in C++ and when there was no
>> standard and Java was new), that this was one of the motivations that
>> people shifted to Java (either from C++ or from C).
>
> Yes, I know that. And I will keep stating that this motivation
> resulted from common misconceptions, further amplified by Java
> marketing.

Oh yes, misconceptions perhaps. But I've only been talking about
people's motivations (which say a lot about their perceived problems).

> I've even heard that Java is better, because it has a String class and
> there is no need to use char* as in C++ (!). FUD can buy a lot.

As far as that goes, I have seen people getting tripped up really badly
by string::c_str(). And I think you need it if you don't program pure
C++, which at that time nobody did. The implementation I saw there
might have been strange/faulty though -- s.c_str() returned a pointer
into some internal data structure of s which promptly changed when s
was modified. The only "safe" way to use it was strdup(s.c_str()), and
that is not threadsafe, as anybody can see. I see the "need to use
char* in C++" rumour as the result of people having been burned by
similar quirks at that time.

>
>> The other
>> motivation was the "portable GUI"
>
> Yes.
>
>> which, I think, mostly disappointed
>> the expectations.
>
> Still, GUI that sucks was better than no standard GUI at all for lost
> of people. 

"portable" not "standard". There were always standard GUIs on Unix and
on Win32, only they where not exchangeable.

> Both C++ and Ada are in the same camp in this aspect. I'm
> not claiming that these languages should have standard GUI, but not
> having it definitely scared many.

Let's say it the other way round: Java certainly attracted a number of
people because of the "standard GUI". And since I firmly believe that
a GUI does NOT belong in a language standard but rather in a
separate standard for a runtime environment, I wonder a bit what kind
of people those were.

>> Of course I might be wrong. This is just teh impression I got "from
>> the trenches" and I might be missing a mor global point of view. It
>> perhaps does not apply today where C++
>
> Well, there is still no standard GUI for C++, but the choice with
> non-standard ones is quite impressive:

I haven't been talking about the standard GUI here, but about the
memory management issues I hinted at earlier. I see, I've been jumping
a bit here.

>
> http://www.free-soft.org/guitool/
>
>> and the understanding of C++
>> has matured a bit
>
> Yes. Sadly, too late for those who already changed their mind.
>
>> (there is even an embedded subset of C++ which will
>> annoy folks here no end :-).

> I think that at the end of the day the embedded C++ will disappear
> from the market as the "full" C++ gets wider compiler support on
> embedded platforms. 

That is not a question of compiler support, as I understand it, but of
verifiability and safety. A bit like Ravenscar, but -- of course --
not as highly integer (ahem ... :-).

> There will simply be no motivation for using subsets.

But there will -- see above.

> Subsetting C++ would be beneficial in the sense similar to Ravenscar
> or by extracting some core and using it with formal methods (sort of
> "SPARK++"), but I doubt it will ever happen.

It already did (and perhaps died)

   http://en.wikipedia.org/wiki/Embedded_C++


>>>> The trend I see, is that GC is a must, clumsy pointer handling is out
>>>> and types are an option for those that can understand them.
>>> Indeed, looks like everybody is going in that direction.
>> And certainly. Why should advances in hardware only buy more spiffy
>> GUIs and not something to ease the everyday pain for the everyday
>> software developer :-).
>
> The interesting thing is that memory management is *said* to be
> painful. C++ and Ada are similar in this regard - the majority of the
> regular code don't need manual memory management (local objects!) or
> can have it encapsulated (containers!), so there are no problems that
> would need to be solved. 

I disagree. The only-downward-closures style of C++ and Ada, which
only allows one to mimic "upward closures" by using classes, heavily
influences the way the programmer thinks. Higher-level abstractions
(as in functional languages) would require full closures -- and since
this means that memory lifetime cannot be bound to scope any more, this
would be the point where manual memory management becomes painful.

Furthermore I've been convinced that manual memory management hinders
modularity.

> Reference-oriented languages have completely
> different ratio of "new per kLOC" so GC is not a feature there, it's a
> must. 


I wonder, if it is really possible to do OO without being
reference-oriented. I somewhat doubt it.

> But then the question is not whether GC is better, but whether
> reference-oriented languages are better than value-oriented ones. Many
> people get seduced by GC before they even start asking such questions.

Value-oriented in my world would be functional -- languages which all
heavily rely on GC. 

I also admit being one of the seduced, but that is not surprising
since my main focus is not in embedded programming and in everything
else it's sheer folly not to have GC. The arguments against GC often
read like arguments against virtual memory, against high level
languages as opposed to assembler, against filesystems (yes there was
a time when some people thought that the application would best do
allocation of disc cylinders itself since it knows its access patterns
better than the FS).

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-29 15:50                   ` Markus E Leypold
  2007-01-30 19:58                     ` Robert A Duff
@ 2007-01-31 10:55                     ` Dmitry A. Kazakov
  2007-01-31 15:16                       ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-01-31 10:55 UTC (permalink / raw)


On Mon, 29 Jan 2007 16:50:16 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> On Sun, 28 Jan 2007 16:06:48 +0100, Markus E Leypold wrote:
>>
>>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>>> 
>>>> On Sun, 28 Jan 2007 00:24:27 +0100, Markus E Leypold wrote:
>>
>>>> Generics is a wrong answer and always was. As are the built-in lists you
>>>> are praising, because what about trees of strings, trees of lists, etc.?
>>>> You cannot build every and each type of container in.
>>> 
>>> You missed my point. :-). A language with a Hindley-Milner type system
>>> has a 'tree of something' type where something can be anything in
>>> every given instance of usage.
>>
>> You have X of Y. Granted, you can play with Y, but what about X?
> 
> I don't understand the question ...

X = tree, Y = something. What about X = something, Y = character.

Example: fixed size strings, unbounded strings, suffix tree. Here the
container (X) varies, the element does not. It is quite a common situation
where you wish to change the container on the fly.

>> The point is, the language should not have either X or Y built-in. What
> 
> Pursuing this argument further, a language also shouldn't have strings
> built in etc. :-). 

Yes, sure. The standard library can provide them, but there must be no
magic inside. The user should be able to describe the string type in
language terms without loss of either performance or compatibility.

> What I think, is, that having at least lists in the "standard library"
> and by that I mean functional lists, not arrays, not "containers",
> helps tremendously because that is such a common use case.
> 
>> should be built-in is "of". The type system should be able to handle
>> parallel type hierarchies of X's and Y's bound by the relation "of".
> 
> Yes. And exactly that is what a Hindley-Milner type system has built
> in. The lists in Haskell and Ocaml are just in the standard library
> (conceptually -- knowing that they are here, makes it easy to compile
> them by special rules and get a more efficient implementation or have
> special instructions in the underlying VM or reduction mechanism).

I doubt it can. I am too lazy to check. (:-)) But the uncomfortable
questions you (the type system) should answer are like:

Y1 <: Y2 => X of Y1 <: X of Y2
   is container of subtypes a subtype?

X1 <: X2 => X1 of Y <: X2 of Y
   is sub-container a subtype?

There is no universal answer. Sometimes yes, sometimes not. Consider yes
examples:

   String vs. Wide_String (case 1)
   Unbounded_String vs. String  (case 2)
   Unbounded_String vs. Wide_String  (mixed case)
   
>>> So you get 'tree of foo' from it and 'tree of bar' when you use it
>>> like this, but there are no generics instantiated. There are trade
>>> offs, of course (like that you probably can't design such a language
>>> without GC and without losing a certain degree of control of the
>>> representation of types),
>>
>> It is not obvious. I saw no good arguments for GC, so far. (it tends to
>> repeat in c.l.a and comp.object, a lot of foam around mouths... (:-))  
> 
> Foam, yes :-). But you have to admit, that allowing a GC without ever
> having an actual implementation (except Martin Krischik's adaptation of
> the Boehm collector) is just too much teasing: You could have it, but no,
> it's not there.

It was left open for those who might need it. It seems that nobody came...
(:-))

No seriously, I don't consider GC as a problem. Just give me "X of Y." Then
I would take X = access. And do whatever memory collection I want! See?
There is absolutely no need to have built-in GC, if you have abstract
referential (access) interfaces. Note additional advantage: you can have
different GC's handling objects of same types in the same application!

>> To me GC is about indeterminable scopes, upward closures and
>> other things I don't want to have...
> 
> If you have only downward scopes for "closures" and memory allocation
> this will, finally, interact badly with the fact that "implicit
> (i.e. static type safe) casting" of classes is also only possible
> downwards. My impression is, that all these things together rule out
> some useful designs, that would otherwise be possible. Or to say it
> differently: Object orientation w/o indeterminable scopes, upward
> closures and GC doesn't work well. Some abstractions cannot be
> implemented.

Umm, I cannot tell. It is an interesting point. I am not sure if it is
true, because we already can return T'Class, and surely we should develop
the language towards making X of Y'Class possible. As a simple intermediate
stage we could allow X (T: Tag) of Y'Class (T). In Ada syntax:

type Coherent_Array (Element_Tag : <syntax sugar for Element'Class>) is
   array (Integer range <>) of
      Element'Class (Element_Tag);

Here the discriminant of the array determines the specific types of all its
elements.

> This, of course is just a gut feeling. I do not know about research or
> empirical studies that examine the influence that these various
> restrictions have on each other and how they act together.

My feeling is that upward closures destroy the idea of type. However,
somehow we surely need type objects in some form, for making distributed
systems, for instance (how to marshal non-objects?)

>> Let types-1 be values, make them values. Then they will need no
>> declarations, just literals. Soon you will discover some types-2 which
>> still require declarations. Make them values too. And so on. At some level n
>> you will have to stop. Now, rename: type-n = "type", type-k<n =
>> "value". See? You are where you have started.
> 
> I do not understand your argument here. Care to give some example and
> I'll try write down how it is done in e.g. OCaml? Perhaps we're
> talking on cross purposes too, since I'm not sure I really wanted to
> desire the thing you insist I want. :-)

You have the following hierarchy:

values
type-1 = types (sets of values + operations)
type-2 = types of types (sets of types + operations to compose types)
type-3 = types of types of types (...)
...
type-n
...
type-oo

[ value-k = type-k-1 ]

As an example of type-2 you can consider parametrized types. Instances of
them are type-1 (=value-2). Types declared in generic Ada packages are
type-1. All of them considered together ("generic type") is a type-2.
Another example of type-2 in Ada is formal generic type:

generic
   type T is range <>;

"range <>" is type-2. The actual T is type-1 (=value-2).

>>> Languages of the Algol-FORTRAN-C-Pascal-Ada group are all far from
>>> that ideal. Since a lot of programming these days is general list
>>> manipulation, everyday jobs become painful. 
>>
>> There always was Lisp, for those who prefer declarations spelt in the form
>> of bracket structures...
> 
> I'm not talking about syntax. I'm talking about types. Which --
> strictly speaking -- Lisp hasn't. There is only one (static) type in a
> dynamic type system, which is one large union (discriminant) type.

I think one could argue that types in Lisp are (), (()), ((())), ... but I
don't believe it deserves much attention. (:-))

>>> Has anybody here ever wondered about the popularity of "scripting",
>>> like with Perl, PHP, Python and so on?
>>
>> I did.
> 
> And? What was the result of your considerations? :-)

The Original Sin is the reason. They are sent upon us as a punishment...
(:-))

[...]
> You make me laugh :-). So what Ada should do, is to introduce a
> String256 with appropriate restriction, and then we will succeed. Good
> plan ... :-))).

Not in this life ... Use Ada and be saved... (:-))

>>>> In my view there are three great innovations Ada made, which weren't
>>>> explored at full:
>>>>
>>>> 1. Constrained subtypes (discriminants)
>>>> 2. Implementation inheritance without values (type N is new T;)
>>>> 3. Typed classes (T /= T'Class)
>>> 
>>> Here 2 things are missing: 
>>> 
>>>   - parameterized types (see Java generics and the ML type system, the
>>>     first is just borrowing from the latter).
>>
>> See pos.1. The constraint is the parameter of. In rare cases you wanted
>> different values of parameters producing isolated types you would apply 2
>> to 1.
> 
> Again I do not know what you're denying here...

I deny them as types-2. The huge advantage of pos.1 is that the result is
type-1. The consequence: you can have common values. With type-2, values
(value-1) of different values (type-1) of type-2 are of different types =>
you cannot put them into one container.

>>>   - Separation of implementation and type or to put it differently
>>>     inheritance and subtyping compatibility. See the ocaml type system.
>>
>> That should be interface inheritance from concrete types. Yes, Ada misses
>> that.
> 
> No, what I want is even different. 2 values / objects in the OCaml way
> of objects are just compatible if their type signatures (which are
> calculated by the type inference engine) agree or better one is
> contained in the other. This is a weaker contract than in Ada where at
> least a behavioural part of the contract is implied by inheriting the
> implementation, but which (except for generic packages) is useless,
> since you can destroy behaviour by overwriting a method with a
> misbehaved procedure.

I don't see the difference yet. When you inherit only the interface, you
drop all the implementation or parts of it. This is one issue.

Type inference, if you mean that, is a different one. I doubt that
inference is the right way.

>>>     I admit the contracts are weaker for allowing to instante a
>>>     generic with a package with the "right type signature" as
>>>     parameter instead of requiring an instance of another specific
>>>     generic.
>>
>> There should be no generics at all...
> 
> I'm not sure. Generics provide a relatively strict contract model. I
> like this. But the instantiation requirement is cumbersome if compared
> with the way parameterized types work in other languages (and that is
> exactly what I'm pulling from generics most of the time: parameterized
> types). Parameterized types are no 1:1 substitute for generics. ML
> has functors too, for a good reason. But they are what makes everyday
> life bearable.

We don't need parametrized types (type-2). We can well do almost everything
with parametrized subtypes (type-1). That is the pos.1. Compare:

type String is array (Positive range <>) of Character;
   -- type-1, subtype is parametrized by the bounds

generic
   First : Positive;
   Last : Positive;
package Bounded_Strings is
   type String is array (First .. Last) of Character;
      -- type-2, type is parametrized by the bounds
end Bounded_Strings;

I don't want the latter. Everything that can be made within type-1 must be
done there.

>>>     But what is absolutely annoying, is, that the compatibility of
>>>     objects is determined by inheritance instead by the type
>>>     signature.
>>
>> I see it otherwise. Because "compatibility" is undecidable (for both the
>> computer and the programmer), the language must be strongly and
>> manifestly typed.
> 
> Since the contract can be broken by new methods anyway, the only thing
> that counts from a type safety point of view, is, not to break the
> abstraction to the underlying processor / VM, that is, to be able to
> call the methods with the right number of parameters and the right
> representation of the parameters (at the machine level). So the type
> signature is enough.

This is a strange argument. Yes, you cannot verify the semantics; exactly
therefore types come into play. A type only names the behavior, so that it
can be checked formally *without* looking into the actual behavior.

> It's really bothersome that one cannot supply a class X which is
> compatible to another class Y by just writing out the right methods. 

This is interface inheritance + supertyping + inheritance. It works as
follows:

Given: X, Y independent types.

Required: To use Y where the interface of X is expected.

You create a supertype Z for Y which is a subtype of X. The compiler will
check the contract and require you to implement necessary adaptors. Done.

>>>    This makes things like the factory pattern necessary
>>>     and it really doesn't work in general case. (And yes, Java and C++
>>>     share this defect).
>>
>> I am not sure what you mean, but when 3 is considered as 1, then
>> dispatching on bare type tags might become possible.
> 
> 3? 1? Please elaborate? Is "dispatching on bare type tags" a good or a
> bad thing? You lost me there ... (my fault probably).

You can consider the type tag as a discriminant, so pos.3 is a case of pos.1.

When you make an abstract factory it is usually so that you know somehow
the type you want to create (= you have its tag), but you don't yet have
any object of that type. I.e. you have to dispatch on the bare tag to the
factory function.

> But my dear Dmitry -- What does your sentence "All strings have fixed
> length ..." mean then in this context, eh?

That for any given X there exists a function X'Length. We should carefully
distinguish properties of values and types.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-31 10:55                     ` Dmitry A. Kazakov
@ 2007-01-31 15:16                       ` Markus E Leypold
  2007-02-01 14:22                         ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-01-31 15:16 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Mon, 29 Jan 2007 16:50:16 +0100, Markus E Leypold wrote:
>
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> 
>>> On Sun, 28 Jan 2007 16:06:48 +0100, Markus E Leypold wrote:
>>>
>>>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>>>> 
>>>>> On Sun, 28 Jan 2007 00:24:27 +0100, Markus E Leypold wrote:
>>>
>>>>> Generics is a wrong answer and always was. As are the built-in lists you
>>>>> are praising, because what about trees of strings, trees of lists, etc.?
>>>>> You cannot build every and each type of container in.
>>>> 
>>>> You missed my point. :-). A language with a Hindley-Milner type system
>>>> has a 'tree of something' type where something can be anything in
>>>> every given instance of usage.
>>>
>>> You have X of Y. Granted, you can play with Y, but what about X?
>> 
>> I don't understand the question ...
>
> X = tree, Y = something. What about X = something, Y = character.
>
> Example: fixed size strings, unbounded strings, suffix tree. Here the
> container (X) varies, the element does not. It is quite a common situation
> where you wish to change the container on the fly.

This situation -- in my experience -- is much less common. But if you
have it, at the place where you do the manipulation you still need a
common "access protocol" for all those containers, that is something
like

   for K in keys of container loop

     do something with container.item(K);

   end loop;

That is what classes are good for (different from parametrized
types). The types of K could well be different here with a
Hindley-Milner type system.

If I consider your challenge in the context of a functional programming
language, the situation you address might not arise as often, since
you'd abstract over the iteration as well as probably construct the item
sequence at the location of calling the iteration.

   let l = list_of_x ...;;
   let t = tree_of_x ...;;


   let do_something = ...;;                  (* single iteration step *)

   let fooify_items = fold do_something u ;; (* define the iteration  *)

   ...

   ... fooify_items l 
   ... fooify_items (Tree.fringe t)  (* extract the list of items from t and iterate *)

Efficiency concerns are usually countered (a) by a really good garbage
collection and (b) by laziness.
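Or, spelled out as compilable OCaml (the tree type, fringe, and all the names here are made up for the example; the standard library has lists but no tree):

```ocaml
(* One iteration step, defined once, then reused over a list and
   over the fringe of a tree. *)

type 'a tree = Leaf of 'a | Node of 'a tree * 'a tree

(* Flatten a tree into the list of its leaves. *)
let rec fringe = function
  | Leaf x -> [x]
  | Node (l, r) -> fringe l @ fringe r

(* The single iteration step: here, just accumulate a sum. *)
let do_something acc x = acc + x

(* Define the iteration ... *)
let fooify_items items = List.fold_left do_something 0 items

(* ... and apply it to both containers. *)
let from_list = fooify_items [1; 2; 3]
let from_tree = fooify_items (fringe (Node (Leaf 1, Node (Leaf 2, Leaf 3))))
```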

>>> The point is, the language should not have either X or Y built-in. What
>> 
>> Pursuing this argument further, a language also shouldn't have strings
>> built in etc. :-). 
>
> Yes, sure. The standard library can provide them, but there must be no
> magic in. 

Here I disagree a bit: Separate syntax is a good thing for the most
common constructs. Else you could as well argue: "A language can
provide constructs for conditional processing or loops, but there must
be no magic" -- meaning that you would want only GOTOs, and structured
programming would exist only in the mind of the user (I was programming
like this on micros twenty years ago ...).

And if I have extra syntax I can as well provide special optimization
rules.

> The user should be able to describe string type in language terms
> without loss of either performance or compatibility.

No. I think, strings and lists are so common, that their _semantics_
must be expressible in the core language, but the compiler should have
the license to provide special constructs and optimizations.

>> What I think, is, that having at least lists in the "standard library"
>> and by that I mean functional lists, not arrays, not "containers",
>> helps tremendously because that is such a common use case.
>> 
>>> should be built-in is "of". The type system should be able to handle
>>> parallel type hierarchies of X's and Y's bound by the relation "of".
>> 
>> Yes. And exactly that is what a Hindley-Milner type system has built
>> in. The lists in Haskell and Ocaml are just in the standard library
>> (conceptually -- knowing that they are here, makes it easy to compile
>> them by special rules and get a more efficient implementation or have
>> special instructions in the underlying VM or reduction mechanism).
>
> I doubt it can. I am too lazy to check. (:-)) But the uncomfortable

If it can't, then there is no need to forbid it, is there :-).

> questions you (the type system) should answer are like:
>
> Y1 <: Y2 => X of Y1 <: X of Y2
>    is container of subtypes a subtype?

No. Never.

> X1 <: X2 => X1 of Y <: X2 of Y
>    is sub-container a subtype?

Sub-container is a difficult concept. If it doesn't have binary
methods ... then the answer (off the top of my head) is yes.


> There is no universal answer. Sometimes yes, sometimes not. Consider yes
> examples:

Have a look at the Hindley-Milner type system and OCaml's OO
extensions. IMHO the answer doesn't vary at all: The only criterion is
whether the type safety (in the sense given in the Cardelli tutorial)
stays intact. Sometimes I think it is a mistake to mix interface
contracts with type systems: Type systems are too weak to express
contracts, and the ad hoc way in which contracts often work cripples type
systems (e.g. there has always been a minority who wanted termination
guarantees via the type system: Apart from the practical problems,
that doesn't belong into a type system, IMHO).


>    String vs. Wide_String (case 1)
>    Unbounded_String vs. String  (case 2)
>    Unbounded_String vs. Wide_String  (mixed case)
>    
>>>> So you get 'tree of foo' from it and 'tree of bar' when you use it
>>>> like this, but there are no generics instantiated. There are trade
>>>> offs, of course (like that you probably can't design such a language
>>>> without GC and without losing a certain degree of control of the
>>>> representation of types),
>>>
>>> It is not obvious. I saw no good arguments for GC, so far. (it tends to
>>> repeat in c.l.a and comp.object, a lot of foam around mouths... (:-))  
>> 
>> Foam, yes :-). But you have to admit, that allowing a GC without ever
>> having an actual implementation (except Martin Krischik's adaptation of
>> the Boehm collector) is just too much teasing: You could have it, but no,
>> it's not there.
>
> It was left open for those who might need it. It seems that nobody came...
> (:-))

My impression was that implementing a full Ada 95 compiler was
difficult enough. Then you don't want GC in embedded programming, and
Ada has more or less drawn back from general programming in the last
however many years. The absence of a GC implementation almost everywhere
is a clear indication of that in my eyes.

> No seriously, I don't consider GC as a problem. Just give me "X of Y." Then
> I would take X = access. And do whatever memory collection I want! See?

No?

> There is absolutely no need to have built-in GC, if you have abstract
> referential (access) interfaces. Note additional advantage: you can have
> different GC's handling objects of same types in the same application!

But there is. Only under very restricted circumstances can the necessity
of deallocation be decided locally. With reference sharing
(representation sharing) between values (think Lisp-like lists) -- and
I think you need that for efficiency reasons sooner or later -- you're
either back to a "manual" reference counting scheme (inefficient) or
you really need GC.

Let me repeat my mantra: Manual allocation and deallocation hinder
modularity. Not much, because you can survive without it in much the
same fashion as the old FORTRAN programmers survived with GOTOs only,
but ... the question is not whether I need it, but whether I want it in
2007. For, mind you, PC/workstation application programming (as
opposed to embedded in this case).

Ada could have been a nice language straddling a wide range of
application areas. But where other languages have developed, mutated
and, where necessary, spawned bastards (e.g. C#), with Ada I notice the
majority opinion seems to be that if I don't absolutely need it in
embedded programming, it should not be in the language. Well -- what
was the question again: "How come Ada isn't more popular?". Methinks
the answer is: Because it doesn't want to be any more.


>>> To me GC is about indeterminable scopes, upward closures and
>>> other things I don't want to have...
>> 
>> If you have only downward scopes for "closures" and memory allocation
>> this will, finally, interact badly with the fact that "implicit
>> (i.e. static type safe) casting" of classes is also only possible
>> downwards. My impression is, that all these things together rule out
>> some useful designs, that would otherwise be possible. Or to say it
>> differently: Object orientation w/o indeterminable scopes, upward
>> closures and GC doesn't work well. Some abstractions cannot be
>> implemented.

> Umm, I cannot tell. 

I think I can tell, but the discussion on this topic (what is
functional programming good for) does still rage on c.l.f and the
relevant mailing lists. I notice that nobody who has actually tried
FP doubts the superiority of the style in general (they are bitching
about efficiency, sometimes, and availability of libraries, more
often).

> It is an interesting point. 

> I am not sure if it is
> true, because we already can return T'Class, and surely we should develop
> the language towards making X of Y'Class possible. 

Yes, all languages in question are Turing complete. So you can always
write a Lisp interpreter and embed it, etc. That, though, is not the
point. What counts is not whether I can do things "in principle" but how
I can do them and how well the applied abstractions can be seen at a
glance by somebody reading the code later.

BTW -- another argument _for_ a builtin list syntax.

> As a simple intermediate
> stage we could allow X (T: Tag) of Y'Class (T). In Ada syntax:
>
> type Coherent_Array (Element_Tag : <syntax sugar for Element'Class>) is
>    array (Integer range <>) of
>       Element'Class (Element_Tag);
>
> Here the discriminant of the array determines the specific types of all its
> elements.
>
>> This, of course is just a gut feeling. I do not know about research or
>> empirical studies that examine the influence that these various
>> restrictions have on each other and how they act together.
>
> My feeling is that upward closures destroy the idea of type. However,

So how come OCaml and Haskell have it? Those languages have a
static type system. Without even RTTI ... :-)

> somehow we surely need type objects in some form, for making distributed
> systems, for instance (how to marshal non-objects?)

Alice ML does it well. Usually they're just marshalling a cluster of
blocks on the heap, but some languages (Alice ML) are more
sophisticated.


>
>>> Let types-1 be values, make them values. Then they will need no
>>> declarations, just literals. Soon you will discover some types-2 which
>>> still require declarations. Make them values too. And so on. At some level n
>>> you will have to stop. Now, rename: type-n = "type", type-k<n =
>>> "value". See? You are where you have started.
>> 
>> I do not understand your argument here. Care to give some example and
>> I'll try write down how it is done in e.g. OCaml? Perhaps we're
>> talking on cross purposes too, since I'm not sure I really wanted to
>> desire the thing you insist I want. :-)
>
> You have the following hierarchy:
>
> values

> type-1 = types (sets of values + operations)

type = types?

Either you have a type (IMHO) or types. I still fail to follow you here.

> type-2 = types of types (sets of types + operations to compose types)
> type-3 = types of types of types (...)
> ...
> type-n
> ...
> type-oo
>
> [ value-k = type-k-1 ]
>
> As an example of type-2 you can consider parametrized types. Instances of
> them are type-1 (=value-2). Types declared in generic Ada packages are
> type-1. All of them considered together ("generic type") is a type-2.
> Another example of type-2 in Ada is formal generic type:
>
> generic
>    type T is range <>;
>
> "range <>" is type-2. The actual T is type-1 (=value-2).


Perhaps the problem is that with parametrized types you try to express
a restriction of the kind

  List of something where something is in the following family of types
  ...


With parametrized types in a Hindley-Milner type system things are
vastly simpler: There is a 'general' 'anything' type and there are
concrete types. So we have:

 -  List of anything (really anything)

which becomes List of int, List of float, List of List of Tree of Int
at any given point of use.

Of course there are also type systems as you seem to describe them, but
I don't know much about them. Hindley-Milner is still simple enough to
be decidable and understandable, and buys you a good deal of things.
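A minimal (made-up) illustration of how the one "list of anything" type gets fixed at each point of use:

```ocaml
(* List.length has the single polymorphic type 'a list -> int;
   each use below instantiates 'a differently, with no generic
   instantiation written anywhere. *)
let a = List.length [1; 2; 3]        (* 'a list becomes int list *)
let b = List.length [1.0; 2.0]       (* ... float list           *)
let c = List.length [[1]; [2; 3]]    (* ... int list list        *)
```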

If on the other side you want to express structured types (i.e. types
with associated operations), the things to use in ML are modules and
functors, which are not values but a "static", still typed, language
for constructing modules.

I fail to see how that brings me all back to the starting point. Are
we still talking about the same topic?


>>>> Languages of the Algol-FORTRAN-C-Pascal-Ada group are all far from
>>>> that ideal. Since a lot of programming these days is general list
>>>> manipulation, everyday jobs become painful. 
>>>
>>> There always was Lisp, for those who prefer declarations spelt in the form
>>> of bracket structures...
>> 
>> I'm not talking about syntax. I'm talking about types. Which --
>> strictly speaking -- Lisp hasn't. There is only one (static) type in a
>> dynamic type system, which is one large union (discriminant) type.
>
> I think one could argue that types in Lisp are (), (()), ((())), ... but I
> don't believe it deserves much attention. (:-))

No. () and ((())) are given values of the same (static) type in Lisp,
because there is no static type checking that keeps you from passing
() where you passed a ((())) just 2 lines before. See?
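To make that explicit, here is the "one large union type" written out in ML (my own encoding, just for illustration): both values below have the same static type, so no type checker separates them.

```ocaml
(* All S-expressions collapse into one static type. *)
type sexp = Nil | Cons of sexp * sexp

let empty  = Nil                            (* ()     *)
let nested = Cons (Cons (Nil, Nil), Nil)    (* ((())) *)

(* Any function over sexp happily accepts either value;
   nothing flags passing one where the other was meant. *)
let rec depth = function
  | Nil -> 0
  | Cons (car, _) -> 1 + depth car
```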

>
>>>> Has anybody here ever wondered about the popularity of "scripting",
>>>> like with Perl, PHP, Python and so on?
>>>
>>> I did.
>> 
>> And? What was the result of your considerations? :-)
>
> The Original Sin is the reason. They are sent upon us as a punishment...
> (:-))

I agree at least as far as PHP is concerned. Perl pleads for some
mitigating circumstances: It did not start as a Web language but as an
awk substitute, and it has striven hard to mitigate the problems it
causes, with safe mode and the like. The jury is still out on that. 

But PHP certainly: A punishment.


>>>>> In my view there are three great innovations Ada made, which weren't
>>>>> explored at full:
>>>>>
>>>>> 1. Constrained subtypes (discriminants)
>>>>> 2. Implementation inheritance without values (type N is new T;)
>>>>> 3. Typed classes (T /= T'Class)
>>>> 
>>>> Here 2 things are missing: 
>>>> 
>>>>   - parameterized types (see Java generics and the ML type system, the
>>>>     first is just borrowing from the latter).
>>>

>>> See pos.1. The constraint is the parameter of. In rare cases you wanted
>>> different values of parameters producing isolated types you would apply 2
>>> to 1.
>> 
>> Again I do not know what you're denying here...

> I deny them as types-2. The huge advantage of pos.1 is that the result is
> type-1. The consequence: you can have common values. With type-2, values
> (value-1) of different values (type-1) of type-2 are of different types =>
> you cannot put them into one container.

We must be talking at cross purposes. I admittedly do not understand
most of the terminology you're using here and certainly cannot apply
it here: how come the Hindley-Milner type systems have parametrized
types and don't seem to labor under that kind of problem?
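For readers unfamiliar with the terminology: a parametrized type in a Hindley-Milner system looks like this in OCaml (a sketch of my own, using OCaml since the thread keeps citing it). One 'a tree covers every element type, and polymorphic functions work uniformly over all instances, though int tree and string tree remain distinct types.

```ocaml
type 'a tree = Leaf | Node of 'a tree * 'a * 'a tree

(* size : 'a tree -> int works for any element type *)
let rec size = function
  | Leaf -> 0
  | Node (l, _, r) -> size l + 1 + size r

let ti = Node (Leaf, 1, Leaf)         (* ti : int tree *)
let ts = Node (Leaf, "one", Leaf)     (* ts : string tree *)

let _ = size ti + size ts             (* both accepted; result 2 *)
```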

>>>>   - Separation of implementation and type or to put it differently
>>>>     inheritance and subtyping compatibility. See the ocaml type system.

>>> That should be interface inheritance from concrete types. Yes, Ada misses
>>> that.


No, no, no. Inheritance should never ever decide on a subtype
relationship. It can't. In the most general case (again, see the OCaml
manual on objects) objects of a class B derived per inheritance from a
class A cannot be subtypes of the objects of class A (their parent
class).
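The OCaml behaviour referred to can be sketched like this (the class names are mine, for illustration): compatibility is decided by the method signatures alone, with no inheritance anywhere in sight.

```ocaml
class file_logger = object
  method log (s : string) = print_endline ("file: " ^ s)
end

class net_logger = object
  method log (s : string) = print_endline ("net: " ^ s)
end

(* announce accepts any object providing log : string -> unit,
   regardless of its class ancestry *)
let announce (l : < log : string -> unit ; .. >) = l#log "hello"

let () = announce (new file_logger)
let () = announce (new net_logger)
```

Conversely, a class obtained by inheritance can fail to be a subtype of its parent, e.g. when the parent has a binary method whose argument type changes in the child.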

>> 
>> No, what I want is even different. 2 values / objects in the OCaml way
>> of objects are just compatible if their type signatures (which are
>> calculated by the type inference engine) agree or better one is
>> contained in the other. This is a weaker contract than in Ada where at
>> least a behavioural part of the contract is implied by inheriting the
>> implementation, but which (except for generic packages) is useless,
>> since you can destroy behaviour by overwriting a method with a
>> misbehaved procedure.

> I don't see difference yet. When you inherit only interface, you drop all
> the implementation or its parts. This is one issue.

I don't even need an implementation at the start. One point is that
the type fitting into a slot in a functor or as a parameter of a
procedure might well never have been defined explicitly but is a
result of type inference. Why should anyone bother with inheriting
interfaces from an implementation (especially if that wouldn't give a
guarantee as far as subtyping compatibility goes)?


> Type inference, if you mean that, is a different one. I doubt that
> inference is a right way.

And I don't. I'm actually convinced it is the right way.

>
>>>>     I admit the contracts are weaker for allowing to instante a
>>>>     generic with a package with the "right type signature" as
>>>>     parameter instead of requiring an instance of another specific
>>>>     generic.
>>>
>>> There should be no generics at all...
>> 
>> I'm not sure. Generics provide a relatively strict contract model. I
>> like this. But the instantiation requirement is cumbersome compared
>> with the way parameterized types work in other languages (and that is
>> exactly what I'm pulling from generics most of the time: parameterized
>> types). Parameterized types are no 1:1 substitute for generics. ML
>> has functors too, for a good reason. But they are what makes everyday
>> life bearable.

> We don't need parametrized types (type-2). We can well do almost everything
> with parametrized subtypes (type-1). That is the pos.1. Compare:

"Almost". There was a time when I thought I could do everything in
Turbo Pascal. After a while one discovers abstracting over the elements
of a container. The need for generics is born. And so it goes on and
on and on. Abstraction and the need to implement concepts in a
reusable way drive the development. 




> type String is array (Positive range <>) of Character;
>    -- type-1, subtype is parametrized by the bounds
>
> generic
>    First : Positive;
>    Last : Positive;
> package
>    type String is array (First..Last) of Character;
>       -- type-2, type is parametrized by the bounds
>
> I don't want the latter. Everything that can be made within type-1 must be
> done there.
>
>>>>     But what is absolutely annoying, is, that the compatibility of
>>>>     objects is determined by inheritance instead by the type
>>>>     signature.
>>>
>>> I see it otherwise. Because "compatibility" is undecidable (for both the
>>> computer and the programmer), the language must be strongly and
>>> manifestly typed.
>> 
>> Since the contract can be broken by new methods anyway, the only thing
>> that counts from a type safety point of view, is, not to break the
>> abstraction to the underlying processor / VM, that is, to be able to
>> call the methods with the right number of parameters and the right
>> representation of the parameters (at the machine level). So the type
>> signature is enough.

> This is a strange argument. Yes, you cannot verify the semantics, exactly
> therefore types come into play. Type only names the behavior, so that it
> can be checked formally *without* looking into actual behavior.

But it can't: if B derives from A I have no guarantee, none, that B
behaves as an A.

>> It's really bothersome that one cannot supply a class X which is
>> compatible to another class Y by just writing out the right methods. 
>
> This is interface inheritance + supertyping + inheritance. It works as
> follows:

Really cumbersome. Why not just use the type signature to decide on
compatibility?


> Given: X, Y independent types.
>
> Required: To use Y where the interface of X is expected.
>
> You create a supertype Z for Y which is a subtype of X. The compiler will
> check the contract and require you to implement necessary adaptors. Done.



>>>>    This makes things like the factory pattern necessary
>>>>     and it really doesn't work in general case. (And yes, Java and C++
>>>>     share this defect).
>>>
>>> I am not sure what you mean, but when 3 is considered as 1, then
>>> dispatching on bare type tags might become possible.
>> 
>> 3? 1? Please elaborate? Is "dispatching on bare type tags" a good or a
>> bad thing? You lost me there ... (my fault probably).

> You can consider type tag as a discriminant so pos.3 is a case of pos.1.

I still don't see the positions -- where is the numbering?

> When you make an abstract factory it is usually so that you know somehow
> the type you want to create (=you have the tag of), but you don't have yet
> any object of that type. I.e. you have to dispatch on the bare tag to the
> factory function.

Yes, I know that. If you look into concrete applications it rarely works
(well) as a mechanism to abstract over implementation (i.e. the true
base classes).

>
>> But my dear Dmitry -- What does your sentence "All strings have fixed
>> length ..." mean then in this context, eh?
>
> That for any given X there exist a function X'Length. We should carefully
> distinguish properties of values and types.

And in C this function is?

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-30 19:58                     ` Robert A Duff
  2007-01-30 21:52                       ` Markus E Leypold
@ 2007-01-31 17:49                       ` Ed Falis
  2007-01-31 22:53                         ` Robert A Duff
  1 sibling, 1 reply; 397+ messages in thread
From: Ed Falis @ 2007-01-31 17:49 UTC (permalink / raw)


Robert A Duff wrote:

> Why do we call these closures "downward" and "upward" instead of the
> other way around?

Because certain microprocessor architectures manage their stacks from
high to low addresses as things are pushed onto them?  Stupid, but
understandable how it may have happened.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-30 21:52                       ` Markus E Leypold
@ 2007-01-31 22:49                         ` Robert A Duff
  2007-01-31 23:07                           ` (see below)
  2007-02-01  7:57                           ` Markus E Leypold
  0 siblings, 2 replies; 397+ messages in thread
From: Robert A Duff @ 2007-01-31 22:49 UTC (permalink / raw)


Markus E Leypold
<development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> Robert A Duff <bobduff@shell01.TheWorld.com> writes:
>> (Of course, the call stack is not quite a stack anymore, if
>> upward closures are allowed!)
>
> Depends. Just yesterday I've been imagining a system which basically
> uses a stack, but when passing a "closure" upwards, copies parts of
> the stack to the heap.
>
> Chicken Scheme, I think, also uses a stack (in some unusual ways), but
> still has closures.

Well, yeah, that's the sort of thing I mean by "not quite a stack".
It's sort of like a stack, except some more-recently-pushed parts of
it need to survive longer, so the implementation copies things
to the heap, or allocates activation records in the heap in the
first place, or <various implementation strategies>.

>> It seems to me "inward" and "outward" closures would be clearer
>> terminology, for what are normally called "downward" and "upward",
>> respectively.
>
> What you call outward closures are just closures.

Well, if a language supports both inward and outward (or downward and
upward, if you prefer), then people say it supports "full closures".
It doesn't make a lot of sense to support the outward/upward ones
without also supporting the inward/downward ones, too.

>... There was a
> discussion at comp.lang.functional which convinced me, that closures
> should better be called procedures...

Yes, I saw part of that discussion.  Yes, "closure" really just means
"procedure".  We use "closure" (or "lexical closure") to emphasize the
fact that a "procedure" includes some environmental information in
addition to just the code.

"Closure" also emphasizes that there's something interesting about that
environment -- namely, that NESTED procedures may be passed around the
place (whether inward only, or both inward and outward).

>... (because it's rather the natural way
> how things should work) and what is presently called downward closures
> should better be called something different, like, say, stack-bound
> procedures (to indicate that their lifetime depends on the stack).

Perhaps.  Pascal programmers just call them "procedural parameters".

> But it's certainly too late now to change established if misleading
> terminology.

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-31 17:49                       ` Ed Falis
@ 2007-01-31 22:53                         ` Robert A Duff
  0 siblings, 0 replies; 397+ messages in thread
From: Robert A Duff @ 2007-01-31 22:53 UTC (permalink / raw)


Ed Falis <falis@verizon.net> writes:

> Robert A Duff wrote:
>
>> Why do we call these closures "downward" and "upward" instead of the
>> other way around?
>
> Because certain microprocessor architectures manage their stacks from
> high to low addresses as things are pushed onto them?

I think pretty much all computer architectures have stacks that grow
downward in memory (if they have anything resembling a stack at all).
I still prefer to call the top of the stack "top", even if its address
is less than the address of the "bottom".

>...Stupid, but
> understandable how it may have happened.

Quite likely.

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-31 22:49                         ` Robert A Duff
@ 2007-01-31 23:07                           ` (see below)
  2007-01-31 23:18                             ` Robert A Duff
  2007-02-01  7:57                           ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: (see below) @ 2007-01-31 23:07 UTC (permalink / raw)


On 31/1/07 22:49, in article wcchcu6zu9d.fsf@shell01.TheWorld.com, "Robert A
Duff" <bobduff@shell01.TheWorld.com> wrote:

> Yes, I saw part of that discussion.  Yes, "closure" really just means
> "procedure".  We use "closure" (or "lexical closure") to emphasize the
> fact that a "procedure" includes some environmental information in
> addition to just the code.
> 
> "Closure" also emphasizes that there's something interesting about that
> environment -- namely, that NESTED procedures may be passed around the
> place (whether inward only, or both inward and outward).
...
> Pascal programmers just call them "procedural parameters".

Thread tie!

I invented the "procedural/functional parameter" terminology for
the BSI/ISO standard because we kept getting confused in committee
about what was meant when we spoke of "procedure parameters".
(Were "procedure parameters" the parameters OF procedures
or those that WERE procedures?)

-- 
Bill Findlay
<surname><forename> chez blueyonder.co.uk




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-31 23:07                           ` (see below)
@ 2007-01-31 23:18                             ` Robert A Duff
  2007-01-31 23:36                               ` (see below)
  0 siblings, 1 reply; 397+ messages in thread
From: Robert A Duff @ 2007-01-31 23:18 UTC (permalink / raw)


"(see below)" <yaldnif.w@blueyonder.co.uk> writes:

> On 31/1/07 22:49, in article wcchcu6zu9d.fsf@shell01.TheWorld.com, "Robert A
> Duff" <bobduff@shell01.TheWorld.com> wrote:
>
>> Yes, I saw part of that discussion.  Yes, "closure" really just means
>> "procedure".  We use "closure" (or "lexical closure") to emphasize the
>> fact that a "procedure" includes some environmental information in
>> addition to just the code.
>> 
>> "Closure" also emphasizes that there's something interesting about that
>> environment -- namely, that NESTED procedures may be passed around the
>> place (whether inward only, or both inward and outward).
> ...
>> Pascal programmers just call them "procedural parameters".
>
> Thread tie!
>
> I invented the "procedural/functional parameter" terminology for
> the BSI/ISO standard because we kept getting confused in committee
> about what was meant when we spoke of "procedure parameters".

Cool!

> (Were "procedure parameters" the parameters OF procedures
> or those that WERE procedures?)

Indeed.

Now tell me which of the following things called X is a "component of
type T"?

    type T is
        record
            X : Integer;
        end record;

    type Something is
        record
            X : T;
        end record;

;-)

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-31 23:18                             ` Robert A Duff
@ 2007-01-31 23:36                               ` (see below)
  0 siblings, 0 replies; 397+ messages in thread
From: (see below) @ 2007-01-31 23:36 UTC (permalink / raw)


On 31/1/07 23:18, in article wcc1wlazsy6.fsf@shell01.TheWorld.com, "Robert A
Duff" <bobduff@shell01.TheWorld.com> wrote:

> Now tell me which of the following things called X is a "component of
> type T"?
> 
>     type T is
>         record
>             X : Integer;
>         end record;
> 
>     type Something is
>         record
>             X : T;
>         end record;
> 
> ;-)

The RM seems to talk about components of objects rather than
components of types, so I'm *guessing* the second. 8-)

-- 
Bill Findlay
<surname><forename> chez blueyonder.co.uk





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-31 22:49                         ` Robert A Duff
  2007-01-31 23:07                           ` (see below)
@ 2007-02-01  7:57                           ` Markus E Leypold
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-01  7:57 UTC (permalink / raw)



Robert A Duff <bobduff@shell01.TheWorld.com> writes:

> Markus E Leypold
> <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>
>> Robert A Duff <bobduff@shell01.TheWorld.com> writes:
>>> (Of course, the call stack is not quite a stack anymore, if
>>> upward closures are allowed!)
>>
>> Depends. Just yesterday I've been imagining a system which basically
>> uses a stack, but when passing a "closure" upwards, copies parts of
>> the stack to the heap.
>>
>> Chicken Scheme, I think, also uses a stack (in some unusual ways), but
>> still has closures.
>
> Well, yeah, that's the sort of thing I mean by "not quite a stack".

Ah yes, of course. I did understand "not quite" as a cute way to say
"actually the stack is a linked set of frames in the heap", but now I
understand that you indicated your awareness of the various twists
this issue can take (or can be made to take).

> It's sort of like a stack, except some more-recently-pushed parts of
> it need to survive longer, so the implementation copies things
> to the heap, or allocates activation records in the heap in the
> first place, or <various implementation strategies>.

Exactly. :-)

>
>>> It seems to me "inward" and "outward" closures would be clearer
>>> terminology, for what are normally called "downward" and "upward",
>>> respectively.
>>
>> What you call outward closures are just closures.
>
> Well, if a language supports both inward and outward (or downward and
> upward, if you prefer), then people say it supports "full closures".

> It doesn't make a lot of sense to support the outward/upward ones
> without also supporting the inward/downward ones, too.

Yes. I think the whole upward/downward magic talk is simply due to the
fact that there is a case when people don't want GC and have a
traditional stack and therefore do something like closures the cheap
way: Closure data stays valid as long as the stack is not popped. This
is the way downward closures come into the world. 

>
>>... There was a
>> discussion at comp.lang.functional which convinced me, that closures
>> should better be called procedures...
>
> Yes, I saw part of that discussion.  Yes, "closure" really just means
> "procedure".  We use "closure" (or "lexical closure") to emphasize the
> fact that a "procedure" includes some environmental information in
> addition to just the code.

Which in a mathematical sense should be the default. If I say 

 - let a = ...
 - let f(x) = a * proolify (x) ^ a

and later (much later) say something like


 - let's call the coefficient (k1 * k7 ^ p4) a.

nobody (no mathematician) would expect f to change.
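Rendered as code, the analogy reads roughly as follows (a simplified sketch; `proolify` is the hypothetical function from the text, and I drop the exponent for brevity):

```ocaml
let proolify x = x + 1          (* stand-in definition *)

let a = 2
let f x = a * proolify x        (* f closes over the binding a = 2 *)

let a = 10                      (* a NEW binding that shadows the old a *)
let _ = f 3                     (* still 2 * proolify 3 = 8, not 40 *)
```

The later binding of `a` does not change `f`, exactly as the mathematician expects.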

> "Closure" also emphasizes that there's something interesting about that
> environment -- namely, that NESTED procedures may be passed around the
> place (whether inward only, or both inward and outward).

As should be the norm.

>>... (because it's rather the natural way
>> how things should work) and what is presently called downward closures
>> should better be called something different, like, say, stack-bound
>> procedures (to indicate that their lifetime depends on the stack).
>
> Perhaps.  Pascal programmers just call them "procedural parameters".

Even if they aren't passed around at the moment? Shouldn't a parameter
be something passed to a procedure call? 

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-31 15:16                       ` Markus E Leypold
@ 2007-02-01 14:22                         ` Dmitry A. Kazakov
  2007-02-01 15:18                           ` Markus E Leypold
                                             ` (2 more replies)
  0 siblings, 3 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-01 14:22 UTC (permalink / raw)


On Wed, 31 Jan 2007 16:16:20 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> On Mon, 29 Jan 2007 16:50:16 +0100, Markus E Leypold wrote:
>>
>>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>>> 
>>>> On Sun, 28 Jan 2007 16:06:48 +0100, Markus E Leypold wrote:
>>>>
>>>>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>>>>> 
>>>>>> On Sun, 28 Jan 2007 00:24:27 +0100, Markus E Leypold wrote:
>>>>
>>>>>> Generics is a wrong answer and always was. As well as built-in lists you
>>>>>> are praising is, because what about trees of strings, trees of lists etc.
>>>>>> You cannot build every and each type of containers in.
>>>>> 
>>>>> You missed my point. :-). A language with a Hinldey-Milner type system
>>>>> has a 'tree of something' type where something can be anything in
>>>>> every given instance of usage.
>>>>
>>>> You have X of Y. Granted, you can play with Y, but what about X?
>>> 
>>> I don't understand the question ...
>>
>> X = tree, Y = something. What about X = something, Y = character.
>>
>> Example: fixed size strings, unbounded strings, suffix tree. Here the
>> container (X) varies, the element does not. It is a quite common situation
>> when you wish to change the containers on the fly.
> 
> This situation -- in my experience is much less common. But if you
> have it, at the place where you do the manipulation you still need a
> common "access protocol" for all those containers, that is something
> like
> 
>    for K in keys of container; loop
> 
>      do something with container.item(K);
> 
>    end loop;
>
> That is what classes are good for (different from parametrized
> types). The types of K could well be different here with a
> Hindley-Milner type system.

I am not sure what you mean, but it looks quite wrong. (:-)) The iteration
above is based on the interface of an abstract array. Period.

(Not every container is an array, i.e. an ordered finite set with a mapping
to another ordered finite set called the index.)

> If I consider your challenge in the context of a functional programming
> language, the situation you address might not arise as often, since
> you'd abstract over the iteration as well as probably construct the item
> sequence at the location of calling the iteration.
> 
>    let l = list_of_x ...;;
>    let t = tree_of_x ...;;
> 
> 
>    let do_something = ...;;                  (* single iteration step *)
> 
>    let fooify_items = fold do_something u ;; (* define the iteration  *)
> 
>    ...
> 
>    ... fooify_items l 
>    ... fooify_items (Tree.fringe t)  (* extract the list of items from t and iterate *)
> 
> Efficiency concerns are usually countered (a) by a really good garbage
> collection and (b) by laziness.
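Filling in the hypothetical pieces, the sketch quoted above becomes runnable OCaml along these lines (my reconstruction: `fringe` flattens a tree to the list of its elements, and `do_something` here just sums):

```ocaml
type 'a tree = Leaf | Node of 'a tree * 'a * 'a tree

(* Tree.fringe of the sketch: in-order list of elements *)
let rec fringe = function
  | Leaf -> []
  | Node (l, x, r) -> fringe l @ (x :: fringe r)

let do_something acc x = acc + x            (* single iteration step *)
let fooify_items = List.fold_left do_something 0

let l = [ 1; 2; 3 ]
let t = Node (Node (Leaf, 1, Leaf), 2, Node (Leaf, 3, Leaf))

let _ = fooify_items l                      (* 6 *)
let _ = fooify_items (fringe t)             (* 6: same iteration reused *)
```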

But tree is not an array, it may have an interface of, but I don't see in
your program how that interface works.

As for the functional style I am not impressed. Array interfaces are much
more natural and easier to use. Consider a bunch of tasks concurrently
processing pieces of the same container, exchanging positions in it, etc.
But again, these are stylistic issues which are IMO irrelevant to the type
system.

>>>> The point is, the language should not have either X or Y built-in. What
>>> 
>>> Pursuing this argument further, a language also shouldn't have strings
>>> built in etc. :-). 
>>
>> Yes, sure. The standard library can provide them, but there must be no
>> magic in. 
> 
> Here I disgree a bit: Separate syntax is a good thing for the most
> common constructs. Else you could as well argue: "A language can
> provide constructs for conditional processing or loops, but there must
> be no magic" -- meaning that you would want only GOTOs and structured
> programming exists only in the mind of the user (I've been programming
> like this on micros twenty years ago ...).

And I would insist on doing this, if it were possible. But plain gotos don't
work. You need conditionals (arithmetic gotos or stuff like that) and you
need things to handle scopes (loop indices). The result would be a
different language rather than a subset of it.

> And if I have extra syntax i can as well provide special optimization
> rules.

... or lose some important optimizations.

>> The user should be able to describe string type in language terms
>> without loss of either performance or compatibility.
> 
> No. I think, strings and lists are so common, that their _semantics_
> must be expressible in the core language,

You mean *not* expressible. The only reason for having anything built-in is
because it cannot be expressed in other language terms. So it is postulated
there and expressed in the natural language of the Reference Manual.

>> questions you (the type system) should answer are like:
>>
>> Y1 <: Y2 => X of Y1 <: X of Y2
>>    is container of subtypes a subtype?
> 
> No. Never.

Which is a catastrophe in my eyes. Consider:

Y1 = Integer;
Y2 = Positive

X = access

Now we have

access Positive is unrelated to access Integer.

What would become of Ada if it were constructed this way?
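The standard justification for keeping such containers unrelated is mutation; OCaml's variance checking makes the point mechanically (a sketch of my own, not from the original posts):

```ocaml
(* An immutable container may be declared covariant: *)
type +'a box = Box of 'a                  (* accepted by the compiler *)

(* A mutable one may not: *)
(* type +'a cell = { mutable v : 'a } *)  (* rejected: if Positive <: Integer
   implied Positive cell <: Integer cell, code holding an Integer view
   of a Positive cell could store -5 into it *)
```

So Y1 <: Y2 can only safely give X of Y1 <: X of Y2 when X offers no way to write a Y2 in.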

>> X1 <: X2 => X1 of Y <: X2 of Y
>>    is sub-container a subtype?
> 
> Sub-container is a difficult concept. If it doesn't have binary
> methods ... then the answer (off the top of my head) is yes.

How are you going to pass string slices where strings are expected? And the
language I want is such that I could pass an array where a queue is
expected.

> The only criterion is
> wether the type safety (in the sense given in the Cardelli tutorial)
> stays intact. Sometimes I think it is a mistake to mix interface
> contracts with type systems: Type systems are too weak to express
> contracts, the ad hoc way in which contracts often work, cripple type
> systems 

Hmm, the language of contracts is not the object language where types live.
In that sense it is decoupled anyway. Therefore the point about the type
system is trivially true. Nothing in the object language is strong enough to
express the meta language. You could say the same about if-then-else or
integers. If you meant the program semantics/behavior when you talked about
contracts, then that is again true. Nothing can express program semantics
in full. In that sense, yes, LSP was an unrealistic program, which still
does not mean that types are useless. For all that, I don't see anything
useful proposed instead.

>> No seriously, I don't consider GC as a problem. Just give me "X of Y." Then
>> I would take X = access. An do whatever memory collection I want! See?
> 
> No?

You have a user-defined access type. You can control everything related to
that type. What else do you need to design a collector for such pointers?

>> There is absolutely no need to have built-in GC, if you have abstract
>> referential (access) interfaces. Note additional advantage: you can have
>> different GC's handling objects of same types in the same application!
> 
> But there is. Only under very restricted circumstances can the necessity
> of deallocation be decided locally. With reference sharing
> (representation sharing) between values (think Lisp-like lists -- and
> I think you need that for efficiency reasons sooner or later) -- you're
> either back to a "manual" reference counting scheme (inefficient) or
> you really need GC.

I don't see how "manual" is different from "implemented by the compiler
vendor." There is no magical GC hardware around... Each GC is reference
counting, only the ways of counting differ.

> I notice, that nobody that actually has tried
> FP doubts the superiority of the style in general (they are bitching
> about efficiency, sometimes, and availability of libraries, more
> often).

It is a very strong argument. FP is even more niche than Ada, for that
matter.

> Yes, all languages in question are Turing complete. So you can always
> write a Lisp-interpreter and embed it etc. That, though, is not the
> point. It does not count wether I can do things "in principle" but how
> I can do them and how well the applied abstractions can be seen at a
> glance by somebody reading the code later. 

Yes, I fully agree, AND I don't want Lisp!

> BTW -- another argument _for_ a builtin list syntax.

Hey, Ada already has ()-brackets. Maybe our Unicode fans would propose
special glyphs for )), ))), )))), ))))), )))))) etc. Is it what you mean as
special syntax? (:-)) 

>> My feeling is that upward closures destroy the idea of type. However,
> 
> So how come that OCaml and Haskell have it? Those languages have a
> static type system. Without even RTTI ... :-)

How can anything be statically checked when it is the result of run-time
computation?

[ I don't consider the implied appeal to popularity, for it is obviously
flawed. Example: Visual Basic. ]

>> somehow we surely need type objects in some form, for making distributed
>> systems, for instance (how to marshal non-objects?)
> 
> Alice ML does it well. Usually they're just marshalling a cluster of
> blocks on the heap, but some languages (Alice ML) are more
> sophisticated.

Sending bytes over a socket = does it well?

>>>> Let types-1 be values, make them values. Then they will need no
>>>> declarations, just literals. Soon you will discover some types-2 which
>>>> still require declarations. Make them values too. An so on. At some level n
>>>> you will have to stop. Now, rename type-n = "type". value, type-k<n =
>>>> "value". See? You are where you have started.
>>> 
>>> I do not understand your argument here. Care to give some example and
>>> I'll try to write down how it is done in e.g. OCaml? Perhaps we're
>>> talking at cross purposes too, since I'm not sure I really wanted to
>>> desire the thing you insist I want. :-)
>>
>> You have the following hierarchy:
>>
>> values
> 
>> type-1 = types (sets of values + operations)
> 
> type = types?
>
> Either you have a type (IMHO) or types. A still fail to follow you here.

An example of a type is Integer.
 
>> type-2 = types of types (sets of types + operations to compose types)
>> type-3 = types of types of types (...)
>> ...
>> type-n
>> ...
>> type-oo
>>
>> [ value-k = type-k-1 ]
>>
>> As an example of type-2 you can consider parametrized types. Instances of
>> them are type-1 (=value-2). Types declared in generic Ada packages are
>> type-1. All of them considered together ("generic type") is a type-2.
>> Another example of type-2 in Ada is formal generic type:
>>
>> generic
>>    type T is range <>;
>>
>> "range <>" is type-2. The actual T is type-1 (=value-2).
> 
> Perhaps the problem is that with parametrized types you try to express a restriction of the kind
> 
>   List of something where something is in the followin family of types
>   ...

Parametrizing is a mapping:

   X : P1 x P2 x...x PN -> T

Here Pi are parameters, T is the set of types (which was denoted as
types-1). Clearly X is not a member of T, X(P1, P2, ..., PN) is. So
parametrized types cannot be types, at least immediately. You need some
equilibristics to bring mappings or their equivalents into T, as well as
making Pi also members of T. I doubt it is possible in any mathematically
sane way.

> With parametrized types in a Hindley-Milner type system things are
> vastly more simple: There is a 'general' 'anything' type and there are
> concrete types. So we have:
> 
>  -  List of anything (really anything)

Which is pretty useless, because "really anything" has no operations =>
cannot be used in ANY way. To start with, anything has neither a "copy
constructor" nor "take pointer to." You cannot have a list of.

But in reality, it is something lesser than anything, and inability to
state what it is, in more or less clear way, is an obvious weakness of the
language.

>> I deny them as types-2. The huge advantage of pos.1 is that the result is
>> type-1. The consequence: you can have common values. With type-2 values
>> (value-1) of different values (type-1) of type-2 are of different types =>
>> you cannot put them into one container.
> 
> We must be talking at cross purposes. I admittedly do not understand
> most of the terminology you're using here and certainly cannot apply
> it here: how come the Hindley-Milner type systems have parametrized
> types and don't seem to labor under that kind of problem?

Hmm, I am not a mathematician, so it is just a guess. I suppose this is a
part of the problematic Russell-Whitehead type theory and the problems it
addresses, while Goedel's results prevent us from having "systems of
anything." All in all I think it is quite hopeless.
 
>>>> That should be interface inheritance from concrete types. Yes, Ada misses
>>>> that.
> 
> No, no, no. Inheritance should never ever decide on a subtype
> relationship. It can't.

It is!

>> I don't see difference yet. When you inherit only interface, you drop all
>> the implementation or its parts. This is one issue.
> 
> I don't even need an implementation at the start.

Nobody forces you. However, there are a lot of cases where you are given a
concrete type and want to adapt it to something else.

> One point is, that
> the type fitting into a slot in a functor or as a paramter of a
> procedure might well never have been defined explicitely but is a
> result of type inference.

Why bother with types if you can infer them? I suppose, because you only
think you can...

> Why should anyone bother with inheriting
> interfaces from an implementation (espcially if that wouldn't give a
> guarantee as far as subtyping compatibility goes).

I don't care about LSP. Nor do I consider Liskov's subtyping a useful
definition, for the reason you have mentioned: it is never satisfied.

>> Type inference, if you mean that, is a different one. I doubt that
>> inference is a right way.
> 
> And I don't. I'm actually convinced it is the right way.

I think it is a fundamentally flawed idea. You cannot infer anything
non-trivial. That's the first objection. The second is: even if you could,
then by definition of non-trivial, the reader of the program would be
unable to understand it. Ergo: either you don't need programmers or
inference [or else programs... (:-))]

>>> Since the contract can be broken by new methods anyway, the only thing
>>> that counts from a type safety point of view, is, not to break the
>>> abstraction to the underlying processor / VM, that is, to be able to
>>> call the methods with the right number of parameters and the right
>>> representation of the parameters (at the machine level). So the type
>>> signature is enough.
> 
>> This is a strange argument. Yes, you cannot verify the semantics, exactly
>> therefore types come into play. Type only names the behavior, so that it
>> can be checked formally *without* looking into actual behavior.
> 
> But it can't: If B derives from A I have no guarantee, none that B
> behaves a an A.

So? If B is declared as a subtype of A, then you, the programmer, are
responsible for making B behave as an A. The meaning of this is not the
language's business. Does 1 apple behave as 1 orange? Such questions are
meaningless.

>>> It's really bothersome that one cannot supply a class X which is
>>> compatible to another class Y by just writing out the right methods. 
>>
>> This is interface inheritance + supertyping + inheritance. It works as
>> follows:
> 
> Really cumbersome. Why not just use the type signatur to decide on the
> compatibility?

Because named compatibility is much simpler: A = A. As a programmer I don't
want to compare signatures; it adds nothing but complexity. Is 1+1=2? What
are 1, 1, =, +, 2? Does 2 have a signature?

>>>> I am not sure what you mean, but when 3 is considered as 1, then
>>>> dispatching on bare type tags might become possible.
>>> 
>>> 3? 1? Please elaborate? Is "dispatching on bare type tags" a good or a
>>> bad thing? You lost me there ... (my fault probably).
> 
>> You can consider type tag as a discriminant so pos.3 is a case of pos.1.
> 
> I still don't see the positions -- where is the numbering?

Because you commented them out! Here they are:

"...
1. Constrained subtypes (discriminants)
2. Implementation inheritance without values (type N is new T;)
3. Typed classes (T /= T'Class)
..."

>>> But my dear Dmitry -- What does your sentence "All strings have fixed
>>> length ..." mean than in this context, eh?
>>
>> That for any given X there exist a function X'Length. We should carefully
>> distinguish properties of values and types.
> 
> And in C this function is?

strlen

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 14:22                         ` Dmitry A. Kazakov
@ 2007-02-01 15:18                           ` Markus E Leypold
  2007-02-01 16:26                           ` Georg Bauhaus
  2007-02-01 19:31                           ` Ray Blaak
  2 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-01 15:18 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

<lots of things>

Dmitry -- I notice I'm short of time at the moment, and we're
losing more and more track of the argument since our respective
experiences differ so much (please try a functional language with a
Hindley-Milner type system some time :-). I'll try to come back to
parts of that within the next few days -- don't hesitate to nudge me if I
forget -- but just now I can't.

Regards -- Markus





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 14:22                         ` Dmitry A. Kazakov
  2007-02-01 15:18                           ` Markus E Leypold
@ 2007-02-01 16:26                           ` Georg Bauhaus
  2007-02-01 17:36                             ` Markus E Leypold
  2007-02-02  9:20                             ` Dmitry A. Kazakov
  2007-02-01 19:31                           ` Ray Blaak
  2 siblings, 2 replies; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-01 16:26 UTC (permalink / raw)


On Thu, 2007-02-01 at 15:22 +0100, Dmitry A. Kazakov wrote:
> On Wed, 31 Jan 2007 16:16:20 +0100, Markus E Leypold wrote:

> > I notice, that nobody that actually has tried
> > FP doubts the superiority of the style in general (they are bitching
> > about efficiency, sometimes, and availability of libraries, mor
> > often).

FP is superior to what? I'm saying this having had some fun writing
small OCaml command line tools. There are many good
libraries. And some things are just easy to write.

Hm. The compiler error messages are as helpful as you
would expect from inference machines (on a par with Hugs or C++
template error messages). GHC has been improved in this regard.

But how would you know that those who have tried FP don't doubt the
superiority of the style? Any statistics? What does "in general" mean
here?  Are they really thinking that the inferior crowd doesn't write
recursive functions? Well, perhaps that is true... GCC does try to
eliminate tail calls, though, so maybe we can see a shift from
loops and counter manipulation towards recursive subprograms some day.
;-)

One of the great new pieces of Ada 2007 is Ada.Containers in my view
precisely because it permits what is taken for granted with FP or
with scripting languages: lists, tables, and sets, and ways to
use or write generic algorithms involving all of them.
There isn't a lot of curry in there, but much else.

To me, many of the FP advocates seem to be mathematicians. If you
look closely, two phenomena appear frequently in e.g. OCaml programs:

(1) Either they use functions assembly language(!!!),
(2) or they use reference types and assignments almost exclusively.

-> (1) can be attributed to the programmer being a mathematician:
function assembly is a game in logic. Might be fun. Is it easy?
What about review, changes, maintenance?

-> (2) is a strange thing: These are imperative programs, written
in some allegedly functional language, and they are written this way
because they will then run faster most of the time.
What would Peter Naur have to say about this von Neumann style?

Among FP advocates, there is always someone who just forces computers
to be abstracted into a function instead of considering what a
programmer MUST do and what most actual programming is all about:
manipulating the machine in operation, not manipulating some model of
functions.

So why doesn't someone tell the FP people to advocate a theory of
functions that

(a) match empirical comput*ers* better than they match some model
of comput*ation*.

(b) are no harder to analyze WRT O(?) than plain old procedures?

Wait, Haskell is trying (a) by capturing imperative statements
in monads. The chapter on fusions in Bird's Haskell book lets me
think that you add simplicity by mixing FP with imperative style
when memory or speed are a consideration.


> > BTW -- another argument _for_ a builtin list syntax.
> 
> Hey, Ada already has ()-brackets. Maybe our Unicode fans would propose
> special glyphs for )), ))), )))), ))))), )))))) etc. Is it what you mean as
> special syntax? (:-)) 

Actually, a syntax that is used in a theory book is
{left/right paren}{subscript i}.
The different lists of statements of an Ada program are marked using
more or less different pairs of brackets (if .. end if, loop .. end
loop, do .. end, ...).

What does a list syntax buy us if there is no argument pattern matching,
or no data structure pattern matching in Ada? Brevity?


> > One point is, that
> > the type fitting into a slot in a functor or as a paramter of a
> > procedure might well never have been defined explicitely but is a
> > result of type inference.
> 
> Why bother with types if you can infer them? I suppose, because you only
> think you can...

It is considered good Haskell style to list the function
type before the function. I like being able to write OCaml code
that does the same:

let inc: int -> (int -> int) =
   fun k x ->
      x + k     (* (+) forces int anyway *)

Actually, to be more general and stylish, I could have written

# let inc binop increment arg = binop increment arg;;
val inc : ('a -> 'b -> 'c) -> 'a -> 'b -> 'c = <fun>

Hm. I wanted binop taking two 'as - the implementation might
change and require that binop be symmetric. When people
have used my inc with a binop taking different types,
they will have to change their programs too. Ouch.

So maybe I can think of some trick to make the inference
circuits assume binop takes two arguments of type 'a.
Or just write what I want and name the types!

How is this FP style superior, besides being brief and full of
assumptions?
Sure the function is a value that can be passed around freely,
and can be used in currying. OK. This is another technical
facility that is powerful.
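For what it's worth, the "function as value" point can be made concrete in a few lines of OCaml (a minimal sketch; `compose` is written out here, it is not a stdlib function):

```ocaml
(* Partial application: applying a curried function to some of its
   arguments yields a new function value. *)
let inc = ( + ) 1          (* int -> int *)
let double = ( * ) 2       (* int -> int *)

(* Function values can be combined like any other value. *)
let compose f g x = f (g x)
let inc_then_double = compose double inc

let () =
  assert (inc 41 = 42);
  assert (inc_then_double 5 = 12)   (* double (inc 5) *)
```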


generic
   type T is private;
   with function "+"(a, b: T) return T;
   increment: T;
function Inc(arg: T) return T;

function Inc(arg: T) return T is
begin
   return increment + arg;
end Inc;

Looks like more code, and I hear an FP advocate saying, "in FP,
you just ...". Any sentence that starts with "You just ..."
is suspicious :-)

Can I have this without generics, please, Dmitry? ;-)





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 16:26                           ` Georg Bauhaus
@ 2007-02-01 17:36                             ` Markus E Leypold
  2007-02-01 20:53                               ` Georg Bauhaus
  2007-02-05  0:39                               ` Robert A Duff
  2007-02-02  9:20                             ` Dmitry A. Kazakov
  1 sibling, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-01 17:36 UTC (permalink / raw)



Georg Bauhaus <bauhaus@futureapps.de> writes:

> On Thu, 2007-02-01 at 15:22 +0100, Dmitry A. Kazakov wrote:
>> On Wed, 31 Jan 2007 16:16:20 +0100, Markus E Leypold wrote:
>
>> > I notice, that nobody that actually has tried
>> > FP doubts the superiority of the style in general (they are bitching
>> > about efficiency, sometimes, and availability of libraries, mor
>> > often).
>
> FP is superior to what? 

Superior to not having "full" closures, WRT abstracting over parts
of an algorithm / solution / component. The full context of this
comment of mine was:

> > >> If yout have only downward scopes for "closures" and memory allocation
> > >> this will, finally, interact badly with the fact that "implicit
> > >> (i.e. static type safe) casting" of classes is also only possible
> > >> downwards. My impression is, that all these things together rule out
> > >> some useful designs, that would otherwise possible. Or to say it
> > >> differenty: Object orientation w/o indeterminable scopes, upward
> > >> closures and GC doesn't work well. Some abstractions cannot be
> > >> implemented.

> > > Umm, I cannot tell. 

> > I think I can tell, but the discussion on this topic (what is
> > functional programming good for) does still rage on c.l.f and the
> > relevant mailing lists. I notice, that nobody that actually has tried
> > FP doubts the superiority of the style in general (they are bitching
> > about efficiency, sometimes, and availability of libraries, mor
> > often).



> I'm saying this having had some fun writing
> small OCaml command line tools. There are many good
> libraries. And some things are just easy to write.
>
> Hm. The compiler error message are as helpful as you
> would expect from inference machines (on a par with Hugs or C++
> template error messages). GHC has been improved in this regard.

>
> But how would you know that those who have tried FP don't doubt the
> superiority of the style? 

Admittedly, I don't. It's just difficult to discuss the question whether
one would want to have e.g. closures with people who haven't tried
it. As I wrote elsewhere: all common programming languages are Turing
complete, so equivalent. There is nothing that can be done in one that
could not be done in the other -- in principle. So there is nothing
like a proof "you cannot do X in language L" or "without feature
F". There is just how "easy" it is to express things, and that is, to
an extent, highly subjective.

> Any statistics? What does "in general" mean here?

My private head count. OK. I haven't proved it. 

> Are they really thinking that the inferior crowd doesn't write
> recursive functions?

> Well, perhaps that is true... 

Well, see, ..


> GCC does try to
> eliminate tail calls, though, so maybe we can see a shift from
> loops and counter manipulation towards recursive subprograms some day.
> ;-)

... as you see yourself, the language has to provide some support to
make that efficient as a general technique.
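To make the tail-call point concrete, a small OCaml sketch: the accumulator version below ends in a tail call, which the compiler turns into a loop, so it runs in constant stack space; the naive version grows the stack with the list length.

```ocaml
(* Naive recursion: the addition happens AFTER the recursive call
   returns, so each element costs a stack frame. *)
let rec sum_naive = function
  | [] -> 0
  | x :: rest -> x + sum_naive rest

(* Tail-recursive version: the recursive call is the last thing done,
   so OCaml reuses the current stack frame (a loop, effectively). *)
let sum_tail xs =
  let rec go acc = function
    | [] -> acc
    | x :: rest -> go (acc + x) rest
  in
  go 0 xs

let () =
  assert (sum_naive [1; 2; 3] = 6);
  assert (sum_tail [1; 2; 3] = 6)
```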

> One of the great new pieces of Ada 2007 is Ada.Containers in my view
> precisely because it permits what is taken for granted with FP or
> with scripting languages: lists, tables, and sets, and ways to

Yes, but in an imperative way. Still, it's progress, I completely
agree.

> use or write generic algorithms involving all of them.

> There isn't a lot of curry in there, but much else.

Currying is somewhat overrated, but then it's hardly the essence of FP.


> To me, many of the FP advocates seem to be mathematicians. If you

I studied physics when I was young ... 

> look closely, two phenomena appear frequently in e.g. OCaml programs:
>
> (1) Either they use functions assembly language(!!!),

What do you mean by that?

> (2) or they use reference types and assignments almost exclusively.

Don't know who "they" is here. Certainly not me nor Richard Bird nor
the Haskell crowd in general.

Yes, OCaml, because of its imperative elements, has some potential to
get abused. Writing reusable components, though, is IMHO done by
embracing a functional style.


> -> (1) can be attributed to the programmer being a mathematician:
> function assembly is a game in logic. Might be fun. Is it easy?

It's rumored to be accessible to certain optimization techniques. I
assume you mean the so called "point free style"?

> What about review, changes, maintenance?

Since point-free style works well together with equational reasoning: a
big yes on all counts :-). Side-effect-free FP also makes it possible
to keep the old version for reference and, for testing, to slave the old
and the new version together:

  exception Testing_error of string

  let foo_old x y z = ... ;;
  let foo_opt x y z = ... ;;  (* new foo function, optimized *)

  let foo x y z =

      let old = foo_old x y z    (* the old version, not foo itself *)
      and opt = foo_opt x y z
      in
         if old <> opt then raise (Testing_error "foo_opt");
         opt


This technique is too expensive and cumbersome in imperative
programming (yes, I tried it, it sucks, even for simple stuff).

> -> (2) is a strange thing: These are imperative programs, written
> in some allegedly functional language, and they are written this way
> because they will then run faster most of the time.

No, they don't as a general rule, but don't tell them :-).

> What would Peter Naur have to say about this von Neumann style?
>
> Among FP advocates, there is always someone who just forces computers
> to be abstracted into a function instead of considering what a
> programmer MUST do and what most actual programming is all about:
> manipulating the machine in operation, not manipulating some model of
> functions.

Programming, IMO, is not manipulating the machine, but processing
data.

>
> So why doesn't someone tell the FP people to advocate a theory of
> functions that
>
> (a) match empirical comput*ers* better than they match some model
> of comput*ation*.

What's your problem with that? I actually don't even see the problem
you're trying to address. Care to elaborate?  And shall we shift that
thread to c.l.f :-)). It's a bit quiet there at the moment.

> (b) are no harder to analyze WRT O(?) than plain old procedures?

They aren't.

> Wait, Haskell is trying (a) by capturing imperative statements
> in monads. The chapter on fusions in Bird's Haskell book lets me
> think that you add simplicity by mixing FP with imperative style
> when memory or speed are a consideration.

No, you don't add simplicity. Monad style, by sequencing state through
the "operations", is just a ticket granting the underlying compilation
system the opportunity to do more optimization. Monads, IMO, are not
imperative. They are a cute trick to enforce data flow by the type
system in such a way that state (the state of the world in case of the
IO monad) can be updated destructively by the VM or run-time system
instead of having to retain the old value on the heap. Which would be
a bit difficult in the case of the world state, anyway.
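A rough sketch of that data-flow idea in OCaml (this is hand-rolled state passing, not Haskell's actual IO machinery; `tick` and `bind` are made-up names for illustration): each "operation" consumes the current state and returns a result paired with the successor state, so the dependency chain forces one sequential order of effects.

```ocaml
(* State-passing style: an operation is a function from the current
   state to (result, new state). *)
type state = int

let tick : state -> int * state = fun s -> (s, s + 1)

(* Sequencing combinator: run m, feed its result to f. *)
let bind m f s =
  let (x, s') = m s in
  f x s'

(* Two ticks in a row; the state threads through invisibly. *)
let program =
  bind tick (fun a ->
  bind tick (fun b ->
  fun s -> ((a, b), s)))

let () =
  let ((a, b), final) = program 0 in
  assert (a = 0 && b = 1 && final = 2)
```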


>> > BTW -- another argument _for_ a builtin list syntax.
>> 
>> Hey, Ada already has ()-brackets. Maybe our Unicode fans would propose
>> special glyphs for )), ))), )))), ))))), )))))) etc. Is it what you mean as
>> special syntax? (:-)) 

No.

>
> Actually, a syntax that is used in a theory book is
> {left/right paren}{subscript i}.
> The different lists of statements of an Ada program are marked using
> more or less different pairs of brackets (if .. end if, loop .. end
> loop, do .. end, ...).
>
> What does a list syntax buy us if there is no argument pattern matching,
> or no data structure pattern matching in Ada? Brevity?

Nothing. If you don't have the infrastructure to deal with lists in a
flexible way (functional lists, not containers) it doesn't buy you
much.


>> > One point is, that
>> > the type fitting into a slot in a functor or as a parameter of a
>> > procedure might well never have been defined explicitly but is a
>> > result of type inference.
>> 
>> Why bother with types if you can infer them? I suppose, because you only
>> think you can...

(I don't know what that is supposed to mean: The OCaml people must be
dreaming.)

> It is considered good Haskell style to list the function
> type before the function. I like it being able to write OCaml code
> that does the same,

>
> let inc: int -> (int -> int) =
>    fun k x ->
>       x + k     (* (+) forces int anyway *)


Yes, but you certainly do not declare the types of all intermediate
results. Annotating function types is a good style, but again not for
short functions like

  let dup = List.map (fun x -> (x, x))

>
> Actually, to be more general and stylish, I could have written
>
> # let inc binop increment arg = binop increment arg;;
> val inc : ('a -> 'b -> 'c) -> 'a -> 'b -> 'c = <fun>
>
> Hm. I wanted binop taking two 'as - the implementation might
> change and require that binop be symmetric. 

I have problems with that "require" :-). If you use 'inc' in a module
interface you'll have to state its type (it will not be inferred), so
you have to decide early on in this case. In the other case it's not
part of the module interface, so not part of a reusable component, and
nobody cares if it changes: "people" will not use it.

> When people have used my inc with a binop taking different types,
> they will have to change their programs too. Ouch.

> So maybe I can think of some trick to make the inference
> circuits assume binop takes two arguments of type 'a.
> Or just write what I want and name the types!

> How is this FP style superior, besides being brief and full of
> assumptions?

Have it your way: it's not. I don't want to make unwilling converts to
FP. FP, and more to the point the Hindley-Milner type systems, were only
a point to illustrate some things in my discussion with Dmitry, where
he said that things cannot be done differently than (...) etc. They
can; SML/OCaml/Haskell are the living proof. If anybody decides he/she
doesn't want it -- feel free to ignore FP. But we started out (somewhere
in the middle) with the statement of Dmitry that one "wouldn't want
closures" (I'm quoting from memory), and that, as a general statement,
doesn't hold water. Hence, to illustrate what you can do with
closures and a different type system, my continued references to OCaml.
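One concrete instance of the "upward closure" point (a sketch; `make_counter` is just an illustrative name): the returned function captures local mutable state that outlives the scope it was created in, which is exactly the pattern downward-only closures cannot express.

```ocaml
(* make_counter returns a closure over its own private ref cell.
   The cell survives after make_counter has returned. *)
let make_counter () =
  let n = ref 0 in
  fun () -> incr n; !n

let () =
  let c = make_counter () in
  assert (c () = 1);
  assert (c () = 2);
  let d = make_counter () in   (* independent state *)
  assert (d () = 1)
```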

If it didn't convince you -- probably my fault. Nonetheless I refuse
to discuss merits of languages at a micro level like:

> So maybe I can think of some trick to make the inference
> circuits assume binop takes two arguments of type 'a.
> Or just write what I want and name the types!

> How is this FP style superior, besides being brief and full of
> assumptions?

That is not FP. That is, essentially, your style when interacting with
OCaml and your problem.

> Sure the function is a value that can be passed around freely,
> and can be used in currying. OK. This is another technical
> facility that is powerful.

Yes, passing functions around is powerful. Having an 'a type is also
powerful. That's basically all I have been asserting.

It helps a tiny bit not to have to declare the types of lists (a
parameterized type) all the time like in

   let l1 = List.rev (...) in ...

or

   let l2 = 5 :: l1 in ...

which makes the use of parameterized types much less clumsy than if you
had to. So type inference is useful too, but of course good
annotations are good style. You're not forbidden to annotate where
you think it helps understandability.
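For instance (a minimal sketch): the element type of the lists below is inferred from use, and the annotation on `l2` is optional, purely documentary.

```ocaml
(* l1's type is inferred as int list from the literal;
   l2 carries an explicit, redundant annotation by choice. *)
let l1 = List.rev [1; 2; 3]
let l2 : int list = 5 :: l1

let () =
  assert (l1 = [3; 2; 1]);
  assert (l2 = [5; 3; 2; 1])
```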

>
> generic
>    type T is private;
>    with function "+"(a, b: T) return T;
>    increment: T;
> function Inc(arg: T) return T;
>
> function Inc(arg: T) return T is
> begin
>    return increment + arg;
> end Inc;
>
> Looks like more code, and I hear an FP advocate saying, "in FP,
> you just ...". Any sentence that starts with "You just ..."
> is suspicious :-)

And why is that so? I mean: my screen has only N (a finite number of)
lines, so more verbosity of that type actually diminishes my ability
to have a clear view of the code. Brevity can be a boon. (I don't say
it always is, but you can't dismiss it out of hand.)

To me, any sentence that effectively says "Look, I can do that in X,
too", where X is something from {C, Fortran, Ada, C++, Java, Lisp,
Scheme}, is suspicious. Because, as I tried to explain above, that is
not the point: all languages are Turing complete. What really counts
is HOW you do it. And especially the Ada community should know that,
since Ada cannot do MORE things than C or C++: it only makes doing
the same things less error prone, supposedly.


> Can I have this without generics, please, Dmitry? ;-)

:-)) hehe.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 14:22                         ` Dmitry A. Kazakov
  2007-02-01 15:18                           ` Markus E Leypold
  2007-02-01 16:26                           ` Georg Bauhaus
@ 2007-02-01 19:31                           ` Ray Blaak
  2007-02-01 22:54                             ` Randy Brukardt
  2 siblings, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-01 19:31 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> >> There is absolutely no need to have built-in GC, if you have abstract
> >> referential (access) interfaces. Note additional advantage: you can have
> >> different GC's handling objects of same types in the same application!
> > 
> > But there is. Only under very restricted circumstances the necessity
> > of deallocation can be decided locally. With reference sharing
> > (representation sharing) between values (think Lisp-like lists -- and
> > I think you need that for efficience reasons sooner or later -- you're
> > either back to a "manual" reference counting scheme (inefficient) or
> > you really need GC.
> 
> I don't see how "manual" is different from "implemented by the compiler
> vendor." There is no magical GC hardware around... Each GC is reference
> counting, only the ways of counting differ.

Are you saying this seriously? The difference, of course, is the amount of
programmer effort involved.

If the compiler does it, then GC is implemented "correctly" (or at least more
correctly than the programmer's efforts) without the programmer needing to
redo it all the time.

Even if you have a reusable GC library, I would still assert that you cannot
do your own GC properly at the application level. It is not that magical GC
hardware is needed, it is instead that the compiler has all the low level
hooks so as to provide the GC with the information needed to know when data is
in use or not.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 17:36                             ` Markus E Leypold
@ 2007-02-01 20:53                               ` Georg Bauhaus
  2007-02-01 21:57                                 ` Markus E Leypold
                                                   ` (3 more replies)
  2007-02-05  0:39                               ` Robert A Duff
  1 sibling, 4 replies; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-01 20:53 UTC (permalink / raw)


On Thu, 2007-02-01 at 18:36 +0100, Markus E Leypold wrote:
> Georg Bauhaus <bauhaus@futureapps.de> writes:
> 
> > On Thu, 2007-02-01 at 15:22 +0100, Dmitry A. Kazakov wrote:
> >> On Wed, 31 Jan 2007 16:16:20 +0100, Markus E Leypold wrote:
> >
> >> > I notice, that nobody that actually has tried
> >> > FP doubts the superiority of the style in general (they are bitching
> >> > about efficiency, sometimes, and availability of libraries, mor
> >> > often).
> >
> > FP is superior to what? 
> 
> Superior to not having "full" closures. WRT to abstracting over parts
> of an algorithm / solution / component. The full context of this
> comment of mine was:
> 
> > > >> If yout have only downward scopes for "closures" and memory allocation
> > > >> this will, finally, interact badly with the fact that "implicit
> > > >> (i.e. static type safe) casting" of classes is also only possible
> > > >> downwards. My impression is, that all these things together rule out
> > > >> some useful designs, that would otherwise possible. Or to say it
> > > >> differenty: Object orientation w/o indeterminable scopes, upward
> > > >> closures and GC doesn't work well. Some abstractions cannot be
> > > >> implemented.
> 
> > > > Umm, I cannot tell. 
> 
> > > I think I can tell, but the discussion on this topic (what is
> > > functional programming good for) does still rage on c.l.f and the
> > > relevant mailing lists. I notice, that nobody that actually has tried
> > > FP doubts the superiority of the style in general (they are bitching
> > > about efficiency, sometimes, and availability of libraries, mor
> > > often).
> 
> 
> 
> > I'm saying this having had some fun writing
> > small OCaml command line tools. There are many good
> > libraries. And some things are just easy to write.
> >
> > Hm. The compiler error message are as helpful as you
> > would expect from inference machines (on a par with Hugs or C++
> > template error messages). GHC has been improved in this regard.
> 
> >
> > But how would you know that those who have tried FP don't doubt the
> > superiority of the style? 
> 
> Admittedly, I don't. It's only difficult to dicuss the question wether
> one would want to have e.g. closures, with people who haven't tried
> it. As I wrote elsewhere: All common programming languages are Turing
> complet, so equivalent. There is nothing that can be done in one that
> could not be done in the other -- in principle. So there is nothing
> like a proof "you cannot do X in language L" or "without feature
> F". There is just how "easy" it is to express things and this is, to
> an extend highly subjective.
> 
> > Any statistics? What does "in general" mean here?
> 
> My private head count. OK. I haven't proved it. 
> 
> > Are they really thinking that the inferior crowd doesn't write
> > recursive functions?
> 
> > Well, perhaps that is true... 
> 
> Well, see, ..
> 
> 
> > GCC does try to
> > eliminate tail calls, though, so maybe we can see a shift from
> > loops and counter manipulation towards recursive subprograms some day.
> > ;-)
> 
> ... as you see yourself, the language has to provide some support to
> make that efficient as a general technique.
> 
> > One of the great new pieces of Ada 2007 is Ada.Containers in my view
> > precisely because it permits what is taken for granted with FP or
> > with scripting languages: lists, tables, and sets, and ways to
> 
> Yes, but in an imperative way. Still, it's progress, I completely
> agree.
> 
> > use or write generic algorithms involving all of them.
> 
> > There isn't a lot of curry in there, but much else.
> 
> Currying is somewhat overrated, but's it's hardly the essence of FP.
> 
> 
> > To me, many of the FP advocates seem to be mathematicians. If you
> 
> I studied physics when I were young ... 
> 
> > look closely, two phenomena appear frequently in e.g. OCaml programs:
> >
> > (1) Either they use functions assembly language(!!!),
> 
> What do you mean by that?
> 
> > (2) or they use reference types and assignments almost exclusively.
> 
> Don't know who "they" is here. Certainly not me nor Richard Bird nor
> the Haskell crowd in general.
> 
> Yes, Ocaml, because of it's imperative elements has some potential to
> get abused. Writing reusable components though, is, IMHO, done by
> embracing a functional style.
> 
> 
> > -> (1) can be attributed to the programmer being a mathematician:
> > function assembly is a game in logic. Might be fun. Is it easy?
> 
> It's rumored to be accessible to certain optimization techniques. I
> assume you mean the so called "point free style"?

Yes, though whether or not there are points and parentheses is not
important.

> Side effect free FP

This is a plus. However, if you mark the variables involved
in creating side effects, as is done in SPARK, you can match
the result with a synthesized pure function, I think: a reasoning
function that takes the global variable as input.


>   let foo x y z =
> 
>       let old = (foo x y z) 
>       and opt = (foo_opt x y z) 
>       in 
>          if not (old == opt) then raise (Testing_error "foo_opt");
>          opt
> 
> 
> This technique is too expensive and cumbersome in imperative
> programming (yes, I tried it, it sucks, even for simple stuff).

?

   function foo(x, y, z: natural) return Integer is
        old: integer renames foo_old(x, y, z);
        opt: integer renames foo_opt(x, y, z);
      begin
        if not (old = opt) then raise Testing_error with "foo_opt"; end if;
        return opt;
      end;


> Programming, IMO, is not, manipulating the machine, but processing
> data. 

Well, and what is processing data?


> > So why doesn't someone tell the FP people to advocate a theory of
> > functions that
> >
> > (a) match empirical comput*ers* better than they match some model
> > of comput*ation*.
> 
> What's your problem with that? I actually don't even see the problem
> you're trying to address. Care to elaborate?  And shall we shift that
> thread to c.l.f :-)). It's a bit quiet there at the moment.

de.c.l.f is probably even more silent, but there are some
important references to Ada here: the Ada language is about computers,
given all the systems programming stuff, all the rules about
what is happening and when, representation, order of elaboration,
addressing multiprocessor systems, time types, and so on.

OTOH, there is little reasoning about equations in the vicinity of
Ada, is there? And what _is_ reasoning about equations? It is us
at work. (When we are "computers" in the old sense of the word.)
We *operate* when we reason, even when we reason about equations,
we try to *follow* what is going on. There is always one of a number
of possible *orders* of "evaluation" in

foo r = pi * d
  where pi = 3.14159
        d = 2 * r

no matter whether we bother to concentrate on the where clause first
or start by concentrating on the main clause of the equation.

I hear the comput*ing* scientist and the Haskell fan say:
--- But this is the important thing in an equation, you needn't
    worry!
Then I say, 
--- But computers have 1 or more sequential processors, and one thing
    happens after another. There are *operations* inside a computer.
    They take time and space, and they have effects. I must coordinate
    these effects.
I hear,
--- Just use strict if you must! Or monads. Why do you want to reason
    about what is going on, anyway?
And here the comput*er* plays its part.
--- Because I don't have the luxury of abstracting time away. And not
    space either. If you can follow computing steps at leisure, fine.
    If you are done once you have declared the timeless static
    connection between two sets of values, and call it a function,
    fine. I'm not done yet.

The next answer I will get is, well, try fold_left, or fold_right,
depending on the situation, or try this finding, etc., we don't know
exactly what is going on but it usually suffices. And the result is
correct, provided it can be computed on this machine.
  Ada.Containers, or the STL are more helpful in this regard because
they allow some amount of predictability of runtime *behavior*.
That an input value corresponds to an output value of some function
is a necessary but not a sufficient precondition. It is not important
*that* it will be computed, but *how* it will be computed. Sad but
true. And not for reasons of optimization, but because time and
space are almost always an essential part of a problem specification.

Time and space are not normally considered the hallmark
of equational reasoning in FP. I think it is safe to say that
when you write a real-time application in Ada, you will be
thinking about time and space all the time, and you want this
task to be straightforward. As few implicit things as possible.

> 
> > (b) are no harder to analyze WRT O(?) than plain old procedures?
> 
> They aren't.

I meant that the behavior of lazy programs is harder to analyze,
or has there been an advance recently?

> > Wait, Haskell is trying (a) by capturing imperative statements
> > in monads. The chapter on fusions in Bird's Haskell book lets me
> > think that you add simplicity by mixing FP with imperative style
> > when memory or speed are a consideration.
> 
> No, you don't add simplicity.

I meant, if you write some of the I/O parts in a systems programming
language like Ada, you increase simplicity and efficiency of the
resulting program when compared to an FP only program that uses
fusions.  I think that Ada *and* Haskell will make an interesting
combination on .NET.

>  Monads, IMO, are not
> imperative. They are a cute trick [...]
> that state [...]
> can be updated destructively .

Uh, I thought this very issue is what von Neumann style is about?
Hence what imperative is about.


> > How is this FP style superior, besides being brief and full of
> > assumptions?
> 
> Have your way: It's not. [...] be free to ignore FP.

I'm serious about this. I don't ignore FP. When I have to
make a change, when the module structure needs to be reworked, when
the program fails for some inputs, am I really more productive
using FP? I can't say right now.
  A more important reason not to ignore functional programming
is that it teaches a different programming style. Recursion
and its approach to iteration is well worth being studied,
as it helps simplify some algorithms.

>  I refuse
> to discuss merits of languages at a micro level like:

The "micro" level is the level of production, change, and correction.


> > How is this FP style superior, besides being brief and full of
> > assumptions?
> 
> That is not FP. That is, essentially, your style when interacting with
> OCaml and your problem.

FP languages are brief and full of assumptions because they set out
to safely save the work of writing things down - technically.
It's also typical math style: what is "obvious" to the insider
need not be said. What is "obviously" a complete expression
need not be terminated. This leads to undecipherable error
messages when you forget the one token needed to complete
an expression that is the last thing in a function definition.
This is part of the problem that is *made* my problem by a
functional language such as ML. ML's syntax is so terribly broken
that even Andrew Appel has added a few paragraphs to his Critique
explaining what parts of ML should really be dumped and replaced.

This costs time and money.






^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 20:53                               ` Georg Bauhaus
@ 2007-02-01 21:57                                 ` Markus E Leypold
  2007-02-01 22:03                                 ` Markus E Leypold
                                                   ` (2 subsequent siblings)
  3 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-01 21:57 UTC (permalink / raw)




Georg Bauhaus <bauhaus@futureapps.de> writes:

<...>

Well, another long reply, which will have to be read and replied to
carefully. I won't have the time today, so just some asides that can
stay short.

> Time and space are not normally considered the hallmark
> of equational reasoning in FP. I think it is safe to say that
> when you write a real-time application in Ada, you will be
> thinking about time and space all the time, and you want this
> task to be straightforward. As few implicit things as possible.

Mayhaps when you write a real-time application. I, at least, am not
talking about real-time applications. AFAIR Ada was conceived as a
rather general programming language. If we restrict ourselves to
real-time programming today -- fine -- then the question "How come Ada
isn't more popular?" is already answered in a sense. And all that
"Uuuh, look, they (let's say: the open source people, as somebody in
this thread did) are using C or C++. My God, how can they." -- all
that is quite superfluous then.

Conversely, I conclude that Ada programming is not about real-time
programming only. At least, that seems to be the ambition a
significant number of people at c.l.a still have. At least in theory.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 20:53                               ` Georg Bauhaus
  2007-02-01 21:57                                 ` Markus E Leypold
@ 2007-02-01 22:03                                 ` Markus E Leypold
  2007-02-01 23:40                                 ` Markus E Leypold
  2007-02-02  7:17                                 ` Harald Korneliussen
  3 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-01 22:03 UTC (permalink / raw)



Georg Bauhaus <bauhaus@futureapps.de> writes:


>> > (b) are no harder to analyze WRT O(?) than plain old procedures?
>> 
>> They aren't.
>
> I meant that the behavior of lazy programs is harder to analyze,
> or has there been an advance recently?

I doubt that it was so difficult generally. But I do not have any
privileged information about that.

Let me say it like this: the O(?) behaviour of a lazy construction
can in general be analyzed (I think) with the same degree of
difficulty as the behaviour of an imperative program. Actually you'd
be surprised how many simple imperative algorithms are not easy and
straightforward to analyse. As far as exact numbers go, you might be
right in that it is difficult to calculate exact coefficients. But
again, that hardly matters: choosing an imperative language because
it might be easier to optimize or to analyse run-time behaviour is,
in my eyes, a typical case of premature optimization (as is any
choice among reasonable languages made on the basis of opportunities
to optimize).

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 19:31                           ` Ray Blaak
@ 2007-02-01 22:54                             ` Randy Brukardt
  2007-02-02  1:37                               ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
  0 siblings, 1 reply; 397+ messages in thread
From: Randy Brukardt @ 2007-02-01 22:54 UTC (permalink / raw)


"Ray Blaak" <rAYblaaK@STRIPCAPStelus.net> wrote in message
news:ubqkd8yk3.fsf@STRIPCAPStelus.net...
...
> Even if you have a reusable GC library, I would still assert that you
cannot
> do your own GC properly at the application level. It is not that magical
GC
> hardware is needed, it is instead that the compiler has all the low level
> hooks so as to provide the GC with the information needed to know when
data is
> in use or not.

I don't agree. Ada provides those hooks to the user code in the form of
controlled types and finalization. It's always possible for an object to
know that it is about to be destroyed. Combined with using local objects as
much as possible (so that they can be destroyed automatically) and avoiding
references as much as possible, there is no particular problem, and no GC is
necessary.

The only time you need GC is when the owner of an object loses any
connection to it without destroying it. I think that represents a program
bug - a manifestation of sloppy programming.

In any case, I think GC is just a stopover on the road to general
persistence. At some point in the future, we'll have enough space that we
won't ever destroy objects. That will give everything the value of a good
version control system -- older versions will always be available. GC is
unnecessary in this system (and quite possibly interferes with it), and
forcing support for it isn't helpful. It's like encoding other forms of
obsolete technology into your programming languages; because of
compatibility you can never get rid of them.

                                 Randy.





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 20:53                               ` Georg Bauhaus
  2007-02-01 21:57                                 ` Markus E Leypold
  2007-02-01 22:03                                 ` Markus E Leypold
@ 2007-02-01 23:40                                 ` Markus E Leypold
  2007-02-03 16:54                                   ` Georg Bauhaus
  2007-02-02  7:17                                 ` Harald Korneliussen
  3 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-01 23:40 UTC (permalink / raw)





Georg Bauhaus <bauhaus@futureapps.de> writes:

>> > Wait, Haskell is trying (a) by capturing imperative statements
>> > in monads. The chapter on fusions in Bird's Haskell book lets me
>> > think that you add simplicity by mixing FP with imperative style
>> > when memory or speed are a consideration.
>> 
>> No, you don't add simplicity.
>
> I meant, if you write some of the I/O parts in a systems programming
> language like Ada, you increase simplicity and efficiency of the

> resulting program when compared to an FP only program that uses
> fusions.  I think that Ada *and* Haskell will make an interesting
> combination on .NET.

I wonder why one wouldn't just use Monads in most cases?

>
>>  Monads, IMO, are not
>> imperative. They are a cute trick [...]
>> that state [...]
>> can be updated destructively .
>
> Uh, I thought this very issue is what von Neumann style is about?
> Hence what imperative is about.


IMHO: no. Of course every functional interpreter / VM runs in an
imperative world: After all the past is really lost, that is,
destroyed. The question is how to maintain the (useful) functional
illusion when interacting with systems that can only be updated
destructively (e.g. the world). The IO Monad and its relatives are the answer.

But those Monads are not just an imperative kludge on top of a
functional language, as a number of people seem to think. A Monad is a
beautiful thing. It contains only functions and rules to compose
functions and as such is purely functional. The typing just happens to
be arranged in a way that every world value can only be passed once to
a function and never be bound to an identifier. Thus it looks like
imperative programming but it is purely functional -- or to put it
differently: the type system restricts program construction to a
subset of the full functional language that is isomorphic to some
imperative language.
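
To make that less abstract, here is a minimal sketch of the idea in
OCaml (names and types are mine, not from any library): a "stateful"
computation is just a function from a state to a result paired with a
successor state, and composing such computations stays purely
functional.

```ocaml
(* A computation producing an 'a while threading a state 's. *)
type ('a, 's) st = 's -> 'a * 's

let return (x : 'a) : ('a, 's) st = fun s -> (x, s)

(* Compose two computations; the old state value is never mutated,
   it is simply no longer referred to. *)
let bind (m : ('a, 's) st) (f : 'a -> ('b, 's) st) : ('b, 's) st =
  fun s -> let (x, s') = m s in f x s'

let get : ('s, 's) st = fun s -> (s, s)
let put s = fun _ -> ((), s)

(* Reads like "old := n; n := n + 1; return old", but is only
   function composition over fresh values. *)
let tick : (int, int) st =
  bind get (fun n -> bind (put (n + 1)) (fun () -> return n))

let run (m : ('a, 's) st) (s0 : 's) = m s0
```

Whether this is "imperative" is then indeed just a question of how
the type system arranges the plumbing.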


>
>
>> > How is this FP style superior, besides being brief and full of
>> > assumptions?
>> 
>> Have your way: It's not. [...] be free to ignore FP.
>
> I'm serious about this. I don't ignore FP. When I have to
> make a change, when the module structure needs to be reworked, when
> the program fails for some inputs, am I really more productive
> using FP? I can't say right now.

I think I can say. I've written and deployed programs in Ada, C, C++,
Perl and OCaml (among others). And I've played around with Haskell. I
think I'm more productive with FP than with C and Ada. I don't say
this to denigrate Ada: But I often do ad-hoc programming driven by
what works at the moment. What really appeals to me under those
circumstances is the way I can 'morph' working OCaml programs into
other equivalent OCaml programs. What also appeals to me (that is not
a property of FP per se, though Haskell, OCaml, most Schemes, have it)
is the ability to interact with the code (modules, libraries) I've
written in a command line to immediately test the results of changes
(have I broken anything just now?).

>  A more important reason not to ignore functional programming
> is that it teaches a different programming style. Recursion
> and its approach to iteration is well worth being studied,
> as it helps simplifying some algorithms.

Recursion is the least of those reasons. What is IMHO more important
is that you start not to think about mutating variables but about
producing new values from given ones (a slight shift in perspective,
but with real impact in the long run).
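
A tiny OCaml illustration of that shift (the record and names are
hypothetical, purely for illustration):

```ocaml
(* Instead of mutating a balance in place, deposit produces a new
   value from a given one; the given one stays valid. *)
type account = { owner : string; balance : int }

let deposit amount acc = { acc with balance = acc.balance + amount }

let a0 = { owner = "markus"; balance = 100 }
let a1 = deposit 50 a0
(* a0 is untouched; a1 is a new value sharing the owner field. *)
```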


>>  I refuse
>> to discuss merits of languages at a micro level like:
>
> The "micro" level is the level of production, change, and correction.

I think you just did that at least one level too far down. Defining
'inc' or stuff like this doesn't give you any insight into functional
programming. I'm at a loss, though, to teach or tell you what to do to
get that insight. As you have noticed, FP appeals to
mathematicians. I've a mathematical background and some wildly
different pasts, one of which was concerned with formal
specification. FP just came naturally to me: it mirrors the way I
think about a problem and about the solution to problems. The rest is
optimization: eliminating double evaluations or even mutating
variables if that is needed for efficiency. But the first draft is
functional and stays the point of reference.

>
>
>> > How is this FP style superior, besides being brief and full of
>> > assumptions?
>> 
>> That is not FP. That is, essentially, your style when interacting with
>> OCaml and your problem.

> FP is brief and full of assumption because they set out
> to safely save the work of writing things down - technically.

I fear you're full of preconceptions. It's different from what you do in
Ada (or whatever your favourite imperative language is), so it must be
wrong or there must be something missing.

As an FP advocate, I suggest that the things not written down are
not necessary. So those "savings" you address as a defect are actually
a chance. 

But YMMV. As I said: I'm not in the business of convincing people WRT
FP. You already indicated that you would not accept the typical
reasoning of an FP protagonist. I can't offer you something different:
It's shorter, I can see the problem clearer, I can leave out redundant
information and so on.

Listening to you justifying that every single variable must be
declared with a type and all, one wonders how mathematics itself ever
could live without type declarations. But if

   Let V be a vector space. (* defining a name and its type *)

then

   Let v \in V

is just enough to know v is a vector. Saying

   Let v : Vector with v \in V.

is just superfluous. The same principle applies here: the type of a
symbol, the context in which it is embedded and the operations in
which it can occur can all be inferred. No need to declare more.

The same principle applies in FP. I fear it won't convince you.
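
For what it's worth, the same inference at work in OCaml (my example,
using nothing beyond the standard library):

```ocaml
(* No type is declared anywhere; the compiler infers everything from
   the operations in which each symbol occurs, much as "v \in V" alone
   tells the mathematician that v is a vector. *)
let dot xs ys =
  List.fold_left2 (fun acc x y -> acc +. x *. y) 0.0 xs ys

(* Inferred by the compiler:
   val dot : float list -> float list -> float *)
```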


> It's also typical math style: what is "obvious" to the insider
> needs not be said. What is "obviously" a complete expression

You can also hide what you want to express under a lot of superfluous
drivel (type declarations that can be inferred or at the worst be
queried from the interactive system). FP has set the cut off at a
different point than Ada. Question: Was that necessarily wrong? It
fits me. Does that make me a worse programmer / designer / software
engineer / mathematician? I don't think so.

> needs not be terminated. This leads to undecipherable error
> messages when you forget to place one missing token to complete
> an expression that is the last thing in a function definition.

I fear that hardly happens to me in OCaml. Partly because I compile
often and develop code over a number of intermediate scaffolding
stages which all type check properly. BTW: I had to use the same
approach with Ada, since in the presence of overloading and tagged
types the error messages can become quite misleading, at least with
GNAT.

> This is part of the problem that is *made* my problem by a
> functional language such as ML. ML's syntax is so terribly broken

You know, I rather use OCaml for practical reasons (I like the object
system). But my experience is that it is the beginners that are most
obsessed with the "broken syntax". Related to that are the repeating
attempts on c.l.f. or c.l.l to suggest a new surface syntax for Lisp
"without all those parentheses", implying that would hugely further the
proliferation of Lisp. There is, I think, already a c.l.l FAQ for
this. Though the attempts to reform ML syntax happen less often, they
happen and I count them under the same heading as those Lisp reform
attempts.

I think it would be rather simple to design a pascalish language whose
semantics is just declared by transformation to ML. I was sometimes
tempted to do it myself. But really: What would that buy me? Investing
the same time into understanding more ML / OCaml / Haskell will earn
me much more.

> that even Andrew Appel has added a few paragraphs to his Critique
> explaining what parts of ML should really be dumped and replaced.


Let me quote from the abstract of that paper:

   Standard ML is an excellent language for many kinds of
   programming. It is safe, efficient, suitably abstract and
   concise. There are many aspects of the language that work
   well. However, nothing is perfect: Standard ML has a few
   shortcomings. In some cases there are obvious solutions, and in
   other cases further research is required.

So we are talking about somebody intimately acquainted with the
language and the research on that language, striving for an
improvement. Taking that as grounds (as a mere novice!) to say "Nah,
that language is terrible, let's look for something different, at
least not start with this" strikes me as daring, even blind.

I suggest you really read the paper you quoted: He has some nice
things to say about the necessity of GC and the people who don't like
the "bizarre syntax" of ML. At the end of that paragraph he says: "But
in general, don't we have better things to argue about than syntax?".

Duh!

Actually the paper is a good defense of why one should use an FP
language (or something very similar) EXCEPT when he/she explicitly has
different requirements.

Your approach seems to be more the Olde FORTRAN Programmer's approach:
I can do it in [...] so why must I use/learn another language.

I fortunately have learned FP at a time when I could see it as fun,
instead of having to defend myself against having to learn it. On the
other side I'm one of the perverts that sit down in a quiet hour and
write down mathematical proofs just for the joy of it. (Proves your
hypothesis about mathematicians and FP, probably.)

> This costs time and money.

Well -- every legacy feature does. Tell me, Ada has none :-).

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-01 22:54                             ` Randy Brukardt
@ 2007-02-02  1:37                               ` Ray Blaak
  2007-02-02  9:35                                 ` Dmitry A. Kazakov
                                                   ` (3 more replies)
  0 siblings, 4 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-02  1:37 UTC (permalink / raw)


"Randy Brukardt" <randy@rrsoftware.com> writes:
> I don't agree. Ada provides those hooks to the user code in the form of
> controlled types and finalization. It's always possible for an object to
> know that it is about to be destroyed. Combined with using local objects as
> much as possible (so that they can be destroyed automatically) and avoiding
> references as much as possible, there is no particular problem, and no GC is
> necessary.

Doing GC by marking scope exits is a fundamentally inefficient way of doing
GC. This is the classical performance hit that reference counting schemes
suffer from, which can result in "waves" of cascading cleanups.

Modern GC algorithms do things by "finding" unused objects at collection
time. In between collections the GC is not running, which leads to faster
and more predictable execution times. And collection time does not
necessarily imply a huge performance spike either, as collection work is
amortized across future collections.

Furthermore, finalization-based GC only lets the programmer have control over
their own types. If GC is part of the runtime one has the confidence that all
memory in the system is being properly managed (outside of unsafe areas,
interfaces to foreign functions, etc.).

Modern GC systems simply do a better job at it than people do.

> The only time you need GC is when the owner of an object loses any
> connection to it without destroying it. I think that represents a program
> bug - a manifestation of sloppy programming.

It is only sloppy programming in the context of manual clean up. With a valid
GC you do not have to clean up at all. One simply stops using objects when
they are no longer needed, just "dropping them on the floor", leaving it up to
the GC to eventually collect them.

This considerably simplifies many algorithms and application code, and the
"sloppiness" in fact becomes an advantage: cleaner, easier to maintain code.

> In any case, I think GC is just a stopover on the road to general
> persistence. At some point in the future, we'll have enough space that we
> won't ever destroy objects. 

No way. As long as you have finite memories and long running algorithms, you
will need to clean up. Just leave that clean up to an automatic system.

Anyway, even "destroy nothing" is a form of GC, assuming that a process'
memory gets reclaimed by the OS when it terminates.

The point is that the programmer is freed from the error prone tedium of
explicitly managing memory.

> GC is unnecessary in this system (and quite possibly interferes with it),
> and forcing support for it isn't helpful. It's like encoding other forms of
> obsolete technology into your programming languages; because of
> compatibility you can never get rid of them.

GC is one of the modern advances of computing science, period, akin to the use
of high level languages vs assembly, lexical scoping vs dynamic scoping,
strong typing vs no typing, etc.

It should be used by default and turned off only in unusual situations.

Of course, in this group, those circumstances are probably the usual
case, what with the use of Ada for realtime and embedded programming.

Still, given recent realtime GC algorithms, I would consider the use of GC for
any system I was responsible for, and would give it up reluctantly, only if
timing and space constraints were too tight for a given application.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 20:53                               ` Georg Bauhaus
                                                   ` (2 preceding siblings ...)
  2007-02-01 23:40                                 ` Markus E Leypold
@ 2007-02-02  7:17                                 ` Harald Korneliussen
  3 siblings, 0 replies; 397+ messages in thread
From: Harald Korneliussen @ 2007-02-02  7:17 UTC (permalink / raw)


From what I have seen, I must concede that you (Bauhaus) are right in
this: space usage can be hard to predict in a lazy language, and time
usage is probably also easier to predict in a language closer to the
hardware, like Ada. Compiler error messages in Haskell are bad; I had
not considered before that the reason for this may be type inference,
but it makes sense. By comparison, the GNAT error messages are the
best I've come across.

But I suspect that the libraries Haskell offers can free me from
worries about space usage, if they are well designed and documented.
Haskell certainly offers some very interesting libraries.

Equational reasoning with Haskell code is surprisingly easy, and I
believe that for most applications, correctness is more important than
speed and memory usage. From my (admittedly small) experience with
Haskell, it seems that it's faster and easier to write correct
programs the first time. If it fails, though, it's back to making
sense of the compiler error messages, or much worse: trying to track
down that space leak!




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-31 10:24                       ` Markus E Leypold
@ 2007-02-02  8:42                         ` Maciej Sobczak
  2007-02-02  9:32                           ` Alex R. Mosteo
                                             ` (2 more replies)
  0 siblings, 3 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-02  8:42 UTC (permalink / raw)


Markus E Leypold wrote:

>>> But from what I remember in
>>> the 1997s to 1998s
>> Most programming languages were terrible at that time, that's true.
> 
> Not Ada 95 ... :-).

Ada 95 *is* terrible. It doesn't have containers nor unbounded strings 
and it cannot even return limited types from functions. Yuck! ;-)

> Oh yes, misconceptions perhaps. But I'v only been talking about
> peoples motivations (which say a lot about their perceived problems).

Everybody has full rights to use misconceptions as a basis for 
perception and motivation. That's a natural limitation of the human brain.

>> I've even heard that Java is better, because it has a String class and
>> there is no need to use char* as in C++ (!). FUD can buy a lot.
> 
> As far as that goes I have seen people getting tripped up really bad
> by string::c_str(). And I think you need it, if you don't program pure
> C++, which at that time nobody did.

Good point. It is true that people get tripped up when interfacing with 
old C code. What about interfacing with old C code *from Java*? Are 
there fewer opportunities for getting tripped up, or what?
Another misconception.
Interfacing to old C is tricky from Ada as well.

(Strangely, in "pure" C++ you have to use .c_str() even when opening a 
file stream, because fstream constructor does not understand string - 
that's real oops, but actually I've never seen anybody getting tripped 
here.)

> s.c_str() returned a pointer
> into some internal data structure of s which promptly changed when s
> was modified.

Yes.

> The only "safe" way to use it was strdup(s.c_str())

No, the only "safe" way to use it is making an immutable copy of the string:

string someString = "Hello";
const string safeCopy(someString);
some_old_C_function(safeCopy.c_str());

// modify someString here without influencing anything
// ...

 > and
> that is not threadsafe as anybody can see.

Why? There is nothing about threads here.

> I see "the need to use
> char* in C++" rumour as the result of people having been burned
> by similarly quirks at that time.

Yes, I understand it. Still, most of the rumour comes from misconceptions.


>> I think that at the end of the day the embedded C++ will disappear
>> from the market as the "full" C++ gets wider compiler support on
>> embedded platforms. 
> 
> That is no question of compiler support, as I understand it, but of
> verifiability and safety. A bit like Ravenscar, but -- of course --
> not as highly integer (ahem ... :-).

I agree with it, but restrictions for embedded C++ are not catering for 
the same objectives as those of Ravenscar. For example: EC++ does not 
have templates. Nor even namespaces (:-|). Does it have anything to do 
with schedulability or necessary runtime support? No. Result - some 
people got frustrated and invented this:

http://www.iar.com/p7371/p7371_eng.php

Funny?

That's why I believe that Embedded C++ will die.

>> Subsetting C++ would be beneficial in the sense similar to Ravenscar
>> or by extracting some core and using it with formal methods (sort of
>> "SPARK++"), but I doubt it will ever happen.
> 
> It already did (and perhaps died)
> 
>    http://en.wikipedia.org/wiki/Embedded_C++

Exactly. It will die, because it's just a subset. If it was a subset 
extended with annotations ("SPARK++") or with anything else, the 
situation would be different, because it would provide new possibilities 
instead of only limiting them.


>> The interesting thing is that memory management is *said* to be
>> painful.

> I disagree. The only-downward-closures style of C++ and Ada, which
> allows one only to mimic "upward closures" by using classes, heavily
> influences the way the programmer thinks. Higher level abstractions
> (as in functional languages) would require full closures -- and since
> this means that memory lifetime cannot be bound to scope any more, this
> would be the point where manual memory management becomes painful.

You can have it by refcounting function frames (and preserving some 
determinism of destructors). GC is not needed for full closures, as far 
as I perceive it (with all my misconceptions behind ;-) ).
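
To make the lifetime point concrete, a small OCaml sketch (hypothetical
names): the returned closure's environment must outlive the call that
created it, so whatever manages it -- GC or refcounted frames -- cannot
tie its lifetime to a stack frame.

```ocaml
(* make_counter returns a closure over the cell n; n must survive
   after make_counter itself has returned. *)
let make_counter () =
  let n = ref 0 in
  fun () -> incr n; !n

let c = make_counter ()  (* the creating "frame" is long gone here *)
```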

On the other hand, GC is convincing with some lock-free algorithms.
Now, *this* is a tough subject for Ada community, right? ;-)

> Furthermore I've been convinced that manual memory management hinders
> modularity.

Whereas I say that I don't care about manual memory management in my 
programs. You can have modularity without GC.

(And if I think about all these funny GC-related effects like 
resurrection of objects in Java, then I'm not sure what kind of 
modularity you are referring to. ;-) )

>> Reference-oriented languages have completely
>> different ratio of "new per kLOC" so GC is not a feature there, it's a
>> must. 
> 
> I wonder, if it is really possible to do OO without being
> reference-oriented. I somewhat doubt it.

Why? OO is about encapsulation and polymorphism, these don't need 
references everywhere.

>> But then the question is not whether GC is better, but whether
>> reference-oriented languages are better than value-oriented ones. Many
>> people get seduced by GC before they even start asking such questions.
> 
> Value-oriented in my world would be functional -- languages which all
> heavily rely on GC. 

What about maintainability and reasoning?

> I also admit being one of the seduced, but that is not surprising
> since my main focus is not in embedded programming and in everything
> else it's sheer folly not to have GC.

I disagree. I'm not seduced.

> The arguments against GC often
> read like arguments against virtual memory, against high level
> languages as opposed to assembler, against filesystems (yes there was
> a time when some people thought that the application would best do
> allocation of disc cylinders itself since it knows its access patterns
> better than the FS).

Valid points. Still, Blue Gene/L uses a real addressing scheme in each 
node, and more advanced database servers use raw disk access, bypassing 
the features provided by the FS. Guess why.


-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 16:26                           ` Georg Bauhaus
  2007-02-01 17:36                             ` Markus E Leypold
@ 2007-02-02  9:20                             ` Dmitry A. Kazakov
  2007-02-02 12:34                               ` Markus E Leypold
  2007-02-02 14:27                               ` Georg Bauhaus
  1 sibling, 2 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-02  9:20 UTC (permalink / raw)


On Thu, 01 Feb 2007 17:26:21 +0100, Georg Bauhaus wrote:

> On Thu, 2007-02-01 at 15:22 +0100, Dmitry A. Kazakov wrote:

>> On Wed, 31 Jan 2007 16:16:20 +0100, Markus E Leypold wrote:
> 
>>> BTW -- another argument _for_ a builtin list syntax.
>> 
>> Hey, Ada already has ()-brackets. Maybe our Unicode fans would propose
>> special glyphs for )), ))), )))), ))))), )))))) etc. Is it what you mean as
>> special syntax? (:-)) 
> 
> Actually, a syntax that is used in a theory book is
> {left/right paren}{subscript i}.

Great. The next step would be to introduce expressions in subscripts: )2+1 =
)3 = ))). Right? Then we proceed to )f(...) where f is a free function that
determines the number of brackets. OK? Now the question is, is it still
syntax? (:-))

> What does a list syntax buy us if there is no argument pattern matching,
> or no data structure pattern matching in Ada? Brevity?

There is no need for pattern matching. What is actually required in Ada is
abstract aggregates, i.e. the ability to have user-defined constructor
functions with the parameters specified in the form of a list.

> generic
>    type T is private;
>    with function "+"(a, b: T) return T;
>    increment: T;
> function Inc(arg: T) return T;
> 
> function Inc(arg: T) return T is
> begin
>    return increment + arg;
> end Inc;
> 
> Looks like more code, and I hear an FP advocate saying, "in FP,
> you just ...". Any sentence that starts with "You just ..."
> is suspicious :-)
> 
> Can I have this without generics, please, Dmitry? ;-)

Sure you can:

type Additive is abstract;  -- Should be enough for an interface
function "+" (A, B : Additive) return Additive is abstract;

Here I am not sure about "increment" variable, because it is not
initialized. Anyway:

function Increment return Additive is abstract;
   -- Better be a getter/setter pair. Unfortunately Ada does not have
   -- the "abstract variable interface" it should have

function Inc (Arg : Additive'Class) return Additive'Class is
begin
   return Increment + Arg;
end Inc;

There is a very simple rule for all this:

1. Formal generic subroutines -> abstract primitive operations
2. Subroutines in the body -> class-wide operations.

This follows from even simpler rule:

generic <--> class-wide
instance <--> specific
------------------------------
= polymorphic

BTW, can you do the same for the case where Increment were Integer and Arg
were Float? [ You would need multiple dispatch in "+", that is the case
where any system without subtypes would quickly collapse. For a vivid
example see String/Unbounded_String/Wide_String/... mess in Ada.]

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-02  8:42                         ` Maciej Sobczak
@ 2007-02-02  9:32                           ` Alex R. Mosteo
  2007-02-02 11:04                             ` Maciej Sobczak
  2007-02-02 13:57                           ` Markus E Leypold
  2007-02-09  8:01                           ` adaworks
  2 siblings, 1 reply; 397+ messages in thread
From: Alex R. Mosteo @ 2007-02-02  9:32 UTC (permalink / raw)


Maciej Sobczak wrote:

> Markus E Leypold wrote:
> 
>>>> But from what I remember in
>>>> the 1997s to 1998s
>>> Most programming languages were terrible at that time, that's true.
>> 
>> Not Ada 95 ... :-).
> 
> Ada 95 *is* terrible. It doesn't have containers nor unbounded strings
> and it cannot even return limited types from functions. Yuck! ;-)

Well, well, it *has* unbounded strings, you bad guy! ;)

> (snip)



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-02  1:37                               ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
@ 2007-02-02  9:35                                 ` Dmitry A. Kazakov
  2007-02-02 12:44                                   ` in defense of GC Markus E Leypold
  2007-02-02 18:15                                   ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
  2007-02-02 12:36                                 ` Markus E Leypold
                                                   ` (2 subsequent siblings)
  3 siblings, 2 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-02  9:35 UTC (permalink / raw)


On Fri, 02 Feb 2007 01:37:26 GMT, Ray Blaak wrote:

> "Randy Brukardt" <randy@rrsoftware.com> writes:

[...]
> Modern GC systems simply do a better job at it than people do.

Maybe, but that does not explain why such a GC should be hard-wired. Note
that you are talking about systems (plural), but only one could be wired in.
Why do you want to forbid me to use my own GC?

[...]
> The point is that the programmer is freed from the error prone tedium of
> explicitly managing memory.

This is a misconception. There are two fundamentally different issues:

1. Object scopes
2. Memory management

The second issue is less and less relevant as Randy pointed out. The first
issue is always relevant. It is a good design to consider where an object
exists. GC [and upward closures] is an attitude of making everything
potentially global. In fact it is worse than just global. It is "I don't
know where I need that damn thing."

This is A) a mess, B) sloppy programming, C) an impossible model in our
networked, distributed, relativistic world.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-02  9:32                           ` Alex R. Mosteo
@ 2007-02-02 11:04                             ` Maciej Sobczak
  0 siblings, 0 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-02 11:04 UTC (permalink / raw)


Alex R. Mosteo wrote:

>> Ada 95 *is* terrible. It doesn't have containers nor unbounded strings
>> and it cannot even return limited types from functions. Yuck! ;-)
> 
> Well, well, it *has* unbounded strings, you bad guy! ;)

Oops, indeed. :-)

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-02  9:20                             ` Dmitry A. Kazakov
@ 2007-02-02 12:34                               ` Markus E Leypold
  2007-02-03  9:45                                 ` Dmitry A. Kazakov
  2007-02-02 14:27                               ` Georg Bauhaus
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-02 12:34 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Thu, 01 Feb 2007 17:26:21 +0100, Georg Bauhaus wrote:
>
>> On Thu, 2007-02-01 at 15:22 +0100, Dmitry A. Kazakov wrote:
>
>>> On Wed, 31 Jan 2007 16:16:20 +0100, Markus E Leypold wrote:
>> 
>>>> BTW -- another argument _for_ a builtin list syntax.
>>> 
>>> Hey, Ada already has ()-brackets. Maybe our Unicode fans would propose
>>> special glyphs for )), ))), )))), ))))), )))))) etc. Is it what you mean as
>>> special syntax? (:-)) 
>> 
>> Actually, a syntax that is used in a theory book is
>> {left/right paren}{subscript i}.
>
> Great. The next step would be to introduce expression in subscripts: )2+1 =
> )3 = ))). Right? Then we proceed to )f(...) where f is a free function that
> determines the number of brackets. OK? Now the question is, is it still
> syntax? (:-))
>
>> What does a list syntax buy us if there is no argument pattern matching,
>> or no data structure pattern matching in Ada? Brevity?
>
> There is no need for pattern matching. What is actually required in Ada is
> abstract aggregates, i.e. the ability to have user-defined constructor
> functions with the parameters specified in the form of a list.
>
>> generic
>>    type T is private;
>>    with function "+"(a, b: T) return T;
>>    increment: T;
>> function Inc(arg: T) return T;
>> 
>> function Inc(arg: T) return T is
>> begin
>>    return increment + arg;
>> end Inc;
>> 
>> Looks like more code, and I hear an FP advocate saying, "in FP,
>> you just ...". Any sentence that starts with "You just ..."
>> is suspicious :-)
>> 
>> Can I have this without generics, please, Dmitry? ;-)
>
> Sure you can:
>
> type Additive is abstract;  -- Should be enough for an interface
> function "+" (A, B : Additive) return Additive is abstract;
>
> Here I am not sure about "increment" variable, because it is not
> initialized. Anyway:
>
> function Increment return Additive is abstract;
>    -- Better be getter/setter pair. Unfortunately Ada does not have
>    -- "abstract variable interface" it should have
>
> function Inc (Arg : Additive'Class) return Additive'Class is
> begin
>    return Increment + Arg;
> end Inc;
>
> There is a very simple rule for all this:
>
> 1. Formal generic subroutines -> abstract primitive operations
> 2. Subroutines in the body -> class-wide operations.


So you derive from Additive to get a specific implementation?

Like

  type Vector   is new Additive with ...;
  type CountVal is new Additive with ...;

Right? But then, you should note that with

  C: Countval;

  Inc(C)

returns an 'Additive', not a 'CountVal'. That is one problem. The
other is of course that Increment is not initialized. This is exactly
what generics are here to solve.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-02  1:37                               ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
  2007-02-02  9:35                                 ` Dmitry A. Kazakov
@ 2007-02-02 12:36                                 ` Markus E Leypold
  2007-02-02 21:50                                 ` in defense of GC (was Re: How come Ada isn't more popular?) Gautier
  2007-02-05  1:12                                 ` Robert A Duff
  3 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-02 12:36 UTC (permalink / raw)



Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:

<a rather lucid defense of GC>

Standing ovations here! I could not agree more.

Regards -- Markus








^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-02  9:35                                 ` Dmitry A. Kazakov
@ 2007-02-02 12:44                                   ` Markus E Leypold
  2007-02-03 10:13                                     ` Dmitry A. Kazakov
  2007-02-02 18:15                                   ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-02 12:44 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Fri, 02 Feb 2007 01:37:26 GMT, Ray Blaak wrote:

>> The point is that the programmer is freed from the error prone tedium of
>> explicitly managing memory.
>
> This is a misconception. There are two fundamentally different issues:
>
> 1. Object scopes
> 2. Memory management
>
> The second issue is less and less relevant as Randy pointed out. The first
> issue is always relevant. It is a good design to consider where an object
> exists. GC [and upward closures] is an attitude of making everything
> potentially global. In fact it is worse than just global. It is "I don't
> know where I need that damn thing."

Closures don't "make things global". They do so no more than returning an
Integer makes, GOD FORBID!, a value global (don't return Integers,
people, that makes a value global and we all know global is bad --
how nonsensical is that?). Closures provide exactly the opposite of
"making things global": they provide a way for a module (a capsule of
knowledge / implementation) to return a method (how to do things)
without disclosing the implementation (i.e. keeping up the abstraction
barrier).

I can't believe I even have to spell it out. On the other side -- why
do I argue? You're even opposing generics.

> This is A) a mess, B) sloppy programming, C) an impossible model in our
> networked, distributed, relativistic world.

There are BTW, even distributed garbage collection algorithms. This
probably will enrage you no end? :-).




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-02  8:42                         ` Maciej Sobczak
  2007-02-02  9:32                           ` Alex R. Mosteo
@ 2007-02-02 13:57                           ` Markus E Leypold
  2007-02-03  9:44                             ` Dmitry A. Kazakov
  2007-02-05  9:59                             ` Maciej Sobczak
  2007-02-09  8:01                           ` adaworks
  2 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-02 13:57 UTC (permalink / raw)




Maciej Sobczak <no.spam@no.spam.com> writes:

> Markus E Leypold wrote:
>
>>>> But from what I remember in
>>>> the 1997s to 1998s
>>> Most programming languages were terrible at that time, that's true.
>> Not Ada 95 ... :-).
>
> Ada 95 *is* terrible. It doesn't have containers nor unbounded strings
> and it cannot even return limited types from functions. Yuck! ;-)

Can't it? Wasn't there a trick with renaming somewhere? Like

   A : Limited_Type renames some_function(...);

I seem to remember something like this. I might be mistaken: I usually
end up eliminating limited types in my programs against my will,
since they play badly with non-limited base classes (like those found in
GtkAda).
   
>> Oh yes, misconceptions perhaps. But I'v only been talking about
>> peoples motivations (which say a lot about their perceived problems).
>
> Everybody has full rights to use misconceptions as a basis for
> perception and motivation. That's a natural limitation of human brain.

Exactly. And we talked about "Why isn't Ada more popular?". This is
basically about how Ada is perceived, not about "technical truth",
i.e. whether Ada really is as it is perceived.

>>> I've even heard that Java is better, because it has a String class and
>>> there is no need to use char* as in C++ (!). FUD can buy a lot.

>> As far as that goes I have seen people getting tripped up really bad
>> by string::c_str(). And I think you need it, if you don't program pure
>> C++, which at that time nobody did.
>
> Good point. It is true that people get tripped when interfacing with
> old C code. 


> What about interfacing with old C code *from Java*? Are

They don't. And that is the interesting thing. The people transitioning
C->Java were a totally different class from those doing (almost at the
same time, not exactly, but almost) the C->C++ transition.

The C++ adopters were often (though not always) motivated by
being able to integrate their existing code base and take their
C know-how with them. They felt that they didn't get more removed from
the system, but still had all the interfaces and libraries
available. In a sense this was a no-brainer for the applications
folks. They thought they got OO for free (no downside). Of course the
downside was that old-hand C programmers don't necessarily make good
C++ programmers or good OO developers.

The Java adopters on the other side knew that they were giving up
their legacy code (if they had any) but on the up side were rewarded
with a much more complete standard runtime library. Adopting Java
removed you a further step from your host system, so old know-how was
only partly applicable if at all (yes, there is JNI, but it wasn't
commonly used). So Java was adopted by (a) people who thought the win
worth the price they had to pay (basically starting from the beginning),
(b) people who realized that their existing code base was crap anyway
:-) and (c) newcomers (companies or students) who didn't have any
specific know-how yet and decided to get that know-how in Java.

The last point in my eyes accounts for the relatively high number of
clueless look-what-we-have-newly-invented (old wine in new bottles)
newbies in the Java sector who gave Java a bad name. I still have the
reflex to groan deeply when I hear the words: "Now we are trying to
re-implement this in Java". It's probably unjustified these days, but
some years ago that sentence to me was the typical hallmark of a
buzzword-chasing newbie who thought that by choosing his favourite
language (probably only his 1st or 2nd one), all problems would
magically go away.

(Sorry, no proofs or sources here, folks. I think the recent history of
programming language development and adoption would bear some more
research. You might apply at my e-mail address to sponsor this
research :-) )

When I talk about all those transitions, I see that there was no
C->Ada transition, at least no mass movement. So we come back to the
initial question: why not?

I think some of the posts here have already given answers to that:
Historical reasons. 

Those transitions would have had to happen around 1995-2000 which in
my eyes was a period where people were looking for new languages (GUI
development in C and all this became rather unfeasible at the
time). But a process of bringing the candidate languages into the
public awareness would have to have started earlier. Was the Ada 95
standard just a tiny bit too late (it is understandable that Ada 83
was not a serious contender for this, people were looking for OO
really urgently)? Or was it the vendor situation? GCC has had C++ for
some time, but did GNAT come too late? 

I think this is very much the case of being "at the right place at the
right time" -- when people were looking for ways out of their pain,
Java and C++ were (more or less) ready to at least promise
salvation. I wonder if Ada was ready ... -- at least it wasn't in
public discussion then.

> What about interfacing with old C code *from Java*? Are
> there less opportunities for getting tripped, or what?

As I said: It's less part of the overall migration strategy usually
associated with a transition to Java.

> Another misconception.  Interfacing to old C is tricky from Ada as
> well.

Yes :-). I never denied that. But then you're less tempted to mix Ada
and C freely than you are in C++/C. So in Ada (and in Java and in every
other language with a useful foreign function call interface) you get
a clear interface (in the original as in the programming sense of the
word) to C. In C++ the temptation / opportunity to get messed up is
much greater.


> (Strangely, in "pure" C++ you have to use .c_str() even when opening a
> file stream, because fstream constructor does not understand string -
> that's real oops, but actually I've never seen anybody getting tripped
> here.)

If you just do f(s.c_str()) and f _is_ properly behaved, that is, only
reads from the pointer or does a strdup(), everything is fine, but, I
note, not thread safe. I wouldn't exclude the possibility that the
resulting race condition is hidden in a nice number of C++ programs
out there.

>> s.c_str() returned a pointer
>> into some internal data structure of s which promptly changed when s
>> was modified.
>
> Yes.
>
>> The only "safe" way to use it was strdup(s.c_str())
>
> No, the only "safe" way to use it is making an immutable copy of the string:
>
> string someString = "Hello";
> const string safeCopy(someString);
> some_old_C_function(safeCopy.c_str());

Brr. Yes, that's another way to solve this problem.

>
> // modify someString here without influencing anything
> // ...
>
>  > and
>> that is not threadsafe as anybody can see.

> Why? There is nothing about threads here.

Your solution is thread-safe, if the strings package is (which it
wasn't in the past). My "solution" isn't, since if any other thread
holds a reference to the string in question and modifies it between
c_str() and strdup(), we're working not only with suddenly modified
data (which shouldn't happen), but with pointers to invalid memory.

That means: the race has the potential not to be just a race, but to
break type safety! That is an interaction between the presence of
threads and the semantics of a program which is just so bad bad bad.

>> I see "the need to use
>> char* in C++" rumour as the result of people having been burned
>> by similarly quirks at that time.
>

> Yes, I understand it. Still, most of the rumour comes from misconceptions.

I think this is not about whether you/someone _can_ handle C++
safely. It's about how probable that is to happen without having to
study up on arcane knowledge, by just doing the next best "reasonable"
thing. And that is the area where C++ will trip you up, yes, especially
the newcomer.


>>> I think that at the end of the day the embedded C++ will disappear
>>> from the market as the "full" C++ gets wider compiler support on
>>> embedded platforms.
>> That is no question of compiler support, as I understand it, but of
>> verifiability and safety. A bit like Ravenscar, but -- of course --
>> not as highly integer (ahem ... :-).
>
> I agree with it, but restrictions for embedded C++ are not catering
> for the same objectives as those of Ravenscar. For example: EC++ does
> not have templates. Nor even namespaces (:-|). Does it have anything
> to do with schedulability or necessary runtime support? No. Result -

But perhaps it has to do with trying to attach at least a feeble
resemblance of semantics to the remaining language and avoiding --
heuristically -- the most common handling errors (namespaces +
overloading + "last identifier defined wins" make a nice mess in C++).

> some people got frustrated and invented this:
>
> http://www.iar.com/p7371/p7371_eng.php

Obviously another market: Minus verifiability (well, of a sort), plus
the ability to compile to really small targets. Useful.

>
> Funny?
>
> That's why I believe that Embedded C++ will die.

That might be.


>>> Subsetting C++ would be beneficial in the sense similar to Ravenscar
>>> or by extracting some core and using it with formal methods (sort of
>>> "SPARK++"), but I doubt it will ever happen.
>> It already did (and perhaps died)
>>    http://en.wikipedia.org/wiki/Embedded_C++

> Exactly. It will die, because it's just a subset. If it was a subset
> extended with annotations ("SPARK++") or with anything else, the
> situation would be different, because it would provide new
> possibilities instead of only limiting them.

There is some truth in that.

>
>>> The interesting thing is that memory management is *said* to be
>>> painful.
>
>> I disagree. The only-downward-closures style of C++ and Ada, which
>> allows only to mimic "upward closures" by using classes, heavily
>> influences the way the programmer thinks. Higher level abstractions
>> (as in functional languages) would require full closure -- and since
>> this means that memory life time cannot be bound to scope any more, this
>> would be the point where manual memory management becomes painful.


> You can have it by refcounting function frames (and preserving some
> determinism of destructors). GC is not needed for full closures, as
> far as I perceive it (with all my misconceptions behind ;-) ).

Yes, one could do it like that. Ref-counting is rumoured to be
inefficient, but if you don't have too many closures that might just
work.


> On the other hand, GC is convincing with some lock-free algorithms.
> Now, *this* is a tough subject for Ada community, right? ;-)

:-).

>
>> Furthermore I've been convinced that manual memory management hinders
>> modularity.

> Whereas I say that I don't care about manual memory management in my
> programs. You can have modularity without GC.

Certainly. But you can have more with GC. Georg Bauhaus recently
referred to "A Critique of Standard ML" by Andrew W. Appel:

   http://www.cs.princeton.edu/research/techreps/TR-364-92

I re-read that paper cursorily and noticed that there are some nice
points about the desirability of GC in there (approx. 1 page). I
suggest you read it: it says better than I could that without
GC the responsibility for freeing/disposing of allocated storage is
always/often a difficult question in general.

People who don't have GC often say that they can do anything with
manual memory management. I humbly suggest that this might be because
they already think about their solutions in terms compatible with manual
memory management. Which means they are missing the perception of
those opportunities where GC would buy a vastly simpler architecture /
solution / whatever.

> (And if I think about all these funny GC-related effects like
> resurrection of objects in Java, then I'm not sure what kind of
> modularity you are referring to. ;-) )

Resurrection? You're talking about finalization in Java? Well -- the
way this is designed it's just a perversion.


>>> Reference-oriented languages have completely
>>> different ratio of "new per kLOC" so GC is not a feature there, it's a
>>> must.

>> I wonder, if it is really possible to do OO without being
>> reference-oriented. I somewhat doubt it.


> Why? OO is about encapsulation and polymorphism, these don't need
> references everywhere.

Yes, but -- you want to keep, say, a list of Shape(s). Those can be
Triangle(s), Circle(s) etc., which are all derived from class
Shape. How do you store this list? An array of Shape'Class is out of
the question because of the different allocation requirements for the
descendants of Shape.

>>> But then the question is not whether GC is better, but whether
>>> reference-oriented languages are better than value-oriented ones. Many
>>> people get seduced by GC before they even start asking such questions.

>> Value-oriented in my world would be functional -- languages which all
>> heavily rely on GC.

> What about maintainability and reasoning?

What about it? It's easy with value-oriented languages (i.e. languages
that just produce new values from old ones in a non-destructive
fashion). Functional languages do this, therefore reasoning is a well
developed art there. But the representations of all those values
(trees, lists, ...) (a) rely heavily on representation sharing and (b)
use references because of that. They need and use GC.

>> I also admit being one of the seduced, but that is not surprising
>> since my main focus is not in embedded programming and in everything
>> else it's sheer folly not to have GC.
>
> I disagree. I'm not seduced.

So stay pure. :-)

Reminds me a bit of the folks in my youth who didn't want to use high
level languages like (gasp) C or Pascal, and insisted on Assembler,
because they (a) wanted to be efficient all the time, (b) didn't trust
the compiler.

I've decided that if I want to deliver any interesting functionality to
the end user, my resources (developer time) being limited, I have to
leave everything I can to automation (i.e. compilers, garbage
collectors, even libraries) to be able to reach my lofty goals.

>> The arguments against GC often read like arguments against virtual
>> memory, against high level languages as opposed to assembler,
>> against file systems (yes there was a time when some people thought
>> that the application would best do allocation of disc cylinders
>> itself since it knows its access patterns better than the FS).

> Valid points. Still,

> Blue Gene/L uses real addressing scheme in each
> node and more advanced database servers use raw disk access bypassing
> the features provided by FS. Guess why.

Yes. Certainly. The point is to know when to optimise, not to do it
always. Like I said elsewhere: I advocate the use of a type-safe,
garbage-collected language mainly in the functional sector, probably
together with a good foreign function call interface and a real low
level language for interfacing and, perhaps, hot-spot
optimisation.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-02  9:20                             ` Dmitry A. Kazakov
  2007-02-02 12:34                               ` Markus E Leypold
@ 2007-02-02 14:27                               ` Georg Bauhaus
  2007-02-02 16:07                                 ` Dmitry A. Kazakov
  1 sibling, 1 reply; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-02 14:27 UTC (permalink / raw)


On Fri, 2007-02-02 at 10:20 +0100, Dmitry A. Kazakov wrote:
> On Thu, 01 Feb 2007 17:26:21 +0100, Georg Bauhaus wrote:

> > Actually, a syntax that is used in a theory book is
> > {left/right paren}{subscript i}.
> 
> Great. The next step would be to introduce expression in subscripts: )2+1 =
> )3 = ))). Right? Then we proceed to )f(...) where f is a free function that
> determines the number of brackets. OK? Now the question is, is it still
> syntax? (:-))

In Lisp, yes, every Lisp programmer creates his or her own
syntax anyway. :-) In the text book this only serves to point
out that a left bracket must match a right bracket on the
stack. The index prevents mixing parens.


> type Additive is abstract;  -- Should be enough for an interface
> function "+" (A, B : Additive) return Additive is abstract;
> 
> Here I am not sure about "increment" variable, because it is not
> initialized. Anyway:

That's the point: you create instances of functions
that return the sum of their one argument plus an Increment.
The Increment is bound to a value at the point of instantiation.


So that we have

  function next is
     new Inc(T => Integer, Increment => 1);
  function next_but_one is
     new Inc(T => Integer, Increment => 2);

  pragma assert(next_but_one(1) = 1 + next(1));

This isn't exactly like 

(define inc
 (lambda (Increment)
   (lambda (x)
     (+ Increment x))))

(define next (inc 1))
(define next-but-one (inc 2))

because next and next_but_one cannot be passed outwards in
Ada. (Which I understand you don't want anyway ;-)


Or similarly, but not creating function instances,

   type Section(Increment: Additive) is new Additive with private;
   function inc(x: Section) return Integer;

   next: Section(Increment => 1);
   next_but_one: Section(Increment => 2);

   pragma assert(inc(next_but_one) = 1 + inc(next));

And you cannot pass these objects outwards, either.

> function Increment return Additive is abstract;
>    -- Better be getter/setter pair. Unfortunately Ada does not have
>    -- "abstract variable interface" it should have

You could start by declaring

   type Integer is new Controlled
                   and Accessors
                   and Additive with private;


> BTW, can you do the same for the case where Increment were Integer and Arg
> were Float? [ You would need multiple dispatch in "+",

Why would I need multiple dispatch? I could just as well state
the requirement in the generic "+" function(s), taking arguments
of type Integer and Float.





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-02 14:27                               ` Georg Bauhaus
@ 2007-02-02 16:07                                 ` Dmitry A. Kazakov
  0 siblings, 0 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-02 16:07 UTC (permalink / raw)


On Fri, 02 Feb 2007 15:27:22 +0100, Georg Bauhaus wrote:

> So that we have
> 
>   function next is
>      new Inc(T => Integer, Increment => 1);
>   function next_but_one is
>      new Inc(T => Integer, Increment => 2);
> 
>   pragma assert(next_but_one(x) = 1 + next(x));
> 
> This isn't exactly like 
> 
> (define inc
>  (lambda (Increment)
>    (lambda (x)
>      (+ Increment x))))
> 
> (define next (inc 1))
> (define next-but-one (inc 2))
> 
> because next and next_but_one cannot be passed outwards in
> Ada. (Which I understand you don't want anyway ;-)

That depends on what you mean.
>
> Or similarly, but not creating function instances,
> 
>    type Section(Increment: Additive) is new Additive with private;
>    function inc(x: Section) return Integer;
> 
>    next: Section(Increment => 1);
>    next_but_one: Section(Increment => 2);
> 
>    pragma assert(inc(next_but_one) = 1 + inc(next));
> 
> And you cannot pass these objects outwards, either.

Why should I pass them outwards? It seems like three mixed issues:

1. Types of subprograms (Ada should have them, but it does not)
2. Specialization, a sugar for wrappers (It could be useful too)
3. Upward closures (a bad idea: why should next be visible in places where
1 does not exist?)

>> function Increment return Additive is abstract;
>>    -- Better be getter/setter pair. Unfortunately Ada does not have
>>    -- "abstract variable interface" it should have
> 
> You could start by declaring
> 
>    type Integer is new Controlled
>                    and Accessors
>                    and Additive with private;

You don't need to specify all interfaces in advance:

type Integer_For_Foo is new Integer and Interface_For_Foo;

(Multiple inheritance from concrete types ought to exist.)

>> BTW, can you do the same for the case where Increment were Integer and Arg
>> were Float? [ You would need multiple dispatch in "+",
> 
> Why would I need multiple dispatch? I could just as well state
> the requirement in the generic "+" function(s), taking arguments
> of type Integer and Float.

Geometric explosion of variants. Then you defeat the very idea of generic
programming, which was to talk in terms of Additive, rather than of all the
concrete instances in this concrete program.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-02  9:35                                 ` Dmitry A. Kazakov
  2007-02-02 12:44                                   ` in defense of GC Markus E Leypold
@ 2007-02-02 18:15                                   ` Ray Blaak
  2007-02-02 19:35                                     ` Adam Beneschan
  2007-02-02 20:04                                     ` Dmitry A. Kazakov
  1 sibling, 2 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-02 18:15 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> On Fri, 02 Feb 2007 01:37:26 GMT, Ray Blaak wrote:
> > Modern GC systems simply do a better job at it than people do.
> 
> Maybe, but that does not explain why such GC should be hard-wired. Note you
> are talking about systems (plural), but wired could be only one. Why do you
> want to forbid me to use my GC?

Well, I am not forbidding anyone. Certainly there are times where it is
justified to disable GC, as I already pointed out. So, presumably a fully
general language environment would be able to run with it disabled if
necessary.

If nothing else, you are free to choose the language you like. Currently that
means Ada. For me, that would be C#.

The original context of this thread is about the popularity of Ada. My
position is that Ada would be a better fit for general use if GC was the
default, not the exception.

> 
> [...]
> > The point is that the programmer is freed from the error prone tedium of
> > explicitly managing memory.
> 
> This is a misconception. There are two fundamentally different issues:
> 
> 1. Object scopes
> 2. Memory management
> 
> The second issue is less and less relevant as Randy pointed out. The first
> issue is always relevant. It is a good design to consider where an object
> exists. GC [and upward closures] is an attitude of making everything
> potentially global. In fact it is worse than just global. It is "I don't
> know where I need that damn thing."

GC does not affect visibility or scoping. That is an orthogonal issue, and
still quite properly under the control of the programmer. Upward closures do
not make an object more global unless the closure explicitly exposes the
object for external use.

All GC is really about is reclaiming objects that are no longer in use.
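In a tracing collector, "no longer in use" is approximated by "no longer reachable"; a minimal Python sketch of that criterion (names are illustrative only):

```python
import gc
import weakref

class Resource:
    pass

r = Resource()
probe = weakref.ref(r)       # observe the object without keeping it alive

gc.collect()
assert probe() is not None   # still referenced, so never collected

r = None                     # drop the last strong reference
gc.collect()                 # the collector may now reclaim it
assert probe() is None
```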

As to memory management being less relevant, I am clearly disagreeing with
Randy. As long as memory is finite, it matters. We have huge memories on our
computers today as compared to, say, the 80's. Memory management matters more
than ever.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-02 18:15                                   ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
@ 2007-02-02 19:35                                     ` Adam Beneschan
  2007-02-02 20:04                                     ` Dmitry A. Kazakov
  1 sibling, 0 replies; 397+ messages in thread
From: Adam Beneschan @ 2007-02-02 19:35 UTC (permalink / raw)


On Feb 2, 10:15 am, Ray Blaak <rAYbl...@STRIPCAPStelus.net> wrote:

> As to memory management being less relevant, I am clearly disagreeing with
> Randy. As long as memory is finite, it matters. We have huge memories on our
> computers today as compared to, say, the 80's. Memory management matters more
> than ever.

Yep, sounds like Parkinson's Law is still in effect . . . .

                               -- Adam





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-02 18:15                                   ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
  2007-02-02 19:35                                     ` Adam Beneschan
@ 2007-02-02 20:04                                     ` Dmitry A. Kazakov
  2007-02-02 22:40                                       ` Ray Blaak
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-02 20:04 UTC (permalink / raw)


On Fri, 02 Feb 2007 18:15:05 GMT, Ray Blaak wrote:

> GC does not affect visibility or scoping.

But scoping does affect GC. If the scope of each object is known, then there
is nothing to collect.

A further argument, beyond those I have already made, is that "object in use"
is a semantic concept which cannot be inferred (you wanted automatic
collection). "In use" /= "has references."

It is the programmer who expresses "in use" in language terms. It could be
scoped names or a bunch of controlled pointers distributed across the
program. There must be very strong reasons for choosing the latter.
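The gap between "in use" and "has references" is what GC'd programs experience as a logical leak: a forgotten reference defeats the collector. A short Python sketch (hypothetical names):

```python
import gc
import weakref

class Session:
    pass

registry = []                # some long-lived structure in the program

s = Session()
registry.append(s)           # a reference nobody remembers making
probe = weakref.ref(s)
del s                        # the program is logically done with it

gc.collect()
assert probe() is not None   # still "has references", so GC must keep it

registry.clear()             # only the programmer can express "not in use"
gc.collect()
assert probe() is None       # now reachability and intent coincide
```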

(I don't think that gaining popularity for a programming language is a good
reason for spaghetti programming. (:-))

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-02  1:37                               ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
  2007-02-02  9:35                                 ` Dmitry A. Kazakov
  2007-02-02 12:36                                 ` Markus E Leypold
@ 2007-02-02 21:50                                 ` Gautier
  2007-02-04  8:19                                   ` Ray Blaak
  2007-02-05  1:12                                 ` Robert A Duff
  3 siblings, 1 reply; 397+ messages in thread
From: Gautier @ 2007-02-02 21:50 UTC (permalink / raw)


Ray Blaak:

> It is only sloppy programming in the context of manual clean up. With a valid
> GC you do not have to clean up at all. One simply stops using objects when
> they no longer need them, just "dropping them on the floor", leaving it up to
> the GC to eventually collect it.

Sorry for my ignorance in this field, but from a real household point of view, 
it seems to me that there is a big difference between
  (1) "stop using an object"
and
  (2) "drop an object on the floor".

In case (1), I would hate for my object to be taken from the table and thrown
away by my garbage-collecting robot GeeCee. Hey, does he know I really won't
use my object anymore?! In case (2), it's OK that GeeCee takes it away,
but then there is an action (Drop_on_the_floor) and I can do it in Ada
(procedure Drop_on_the_floor_and_even_directly_throw_away is new
Ada.Unchecked_Deallocation).

Thanks for an explanation...
______________________________________________________________
Gautier         -- http://www.mysunrise.ch/users/gdm/index.htm
Ada programming -- http://www.mysunrise.ch/users/gdm/gsoft.htm

NB: For a direct answer, e-mail address on the Web site!



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-02 20:04                                     ` Dmitry A. Kazakov
@ 2007-02-02 22:40                                       ` Ray Blaak
  2007-02-03 10:00                                         ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-02 22:40 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> On Fri, 02 Feb 2007 18:15:05 GMT, Ray Blaak wrote:
> > GC does not affect visibility or scoping.
> 
> But scoping does affect GC. If scope of each object is known, then there is
> nothing to collect.

If one is using explicit pointers at all, even within a single scope there is
still manual work to free them, unless you want to force the use of some sort
of smart pointer controlled types (which I find tedious compared to using
native pointers).

GC alleviates this work even within a single scope.

> It is the programmer who expresses "in use" in language terms. It could be
> scoped names or a bunch of controlled pointers distributed across the
> program. There must be very strong reasons for choosing the latter.

I am happy to relax the criteria to be just "practical". With GC, one no
longer needs to have such a strong concern precisely because the consequences
are not dire.

> (I don't think that gaining popularity for a programming language is a good
> reason for spaghetti programming. (:-))

I agree :-). I just strongly disagree that the use of shared objects in
conjunction with GC is spaghetti programming.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-02 13:57                           ` Markus E Leypold
@ 2007-02-03  9:44                             ` Dmitry A. Kazakov
  2007-02-03 14:51                               ` Markus E Leypold
  2007-02-05  9:59                             ` Maciej Sobczak
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-03  9:44 UTC (permalink / raw)


On Fri, 02 Feb 2007 14:57:17 +0100, Markus E Leypold wrote:

> Can't it? Wasn't there a trick with renaming somewhere? Like
> 
>    A : Limited_Type renames some_function(...);

You can return a limited object within the same scope.
 
> I seem to remember something like this. Might be mistaken: I usually
> end up eliminating limited types in my programs against my will,
> since they play badly with unlimited base classes (like those found in
> GtkAda).

Yes, I still can't understand why they made collected objects (widgets)
non-limited. For packing my limited objects into widgets I am using handles
too. A handle is non-limited. This is quite in the GTK+ spirit.

> When I talk about all those transitions, I see, that there was no
> C->Ada transition, at least no mass movement. So we come back to the
> C->initial question: Why not? 

How popular was C at that time? I am asking this question because I
learned C after Ada. My personal transition was FORTRAN-IV/PL/1 -> Ada 83.

> I think some of the posts here have already given answers to that:
> Historical reasons. 
> 
> Those transitions would have had to happen around 1995-2000 which in
> my eyes was a period where people were looking for new languages (GUI
> development in C and all this became rather unfeasible at the
> time). But a process of bringing the candidate languages into the
> public awareness would have to have started earlier. Was the Ada 95
> standard just a tiny bit too late (it is understandable that Ada 83
> was not a serious contender for this, people were looking for OO
> really urgently)? Or was it the vendor situation? GCC has had C++ for
> some time, but did GNAT come too late? 

I think so. GNAT was a quite poor compiler for too long. Another important
thing was (is) the lack of a good IDE. C++ vendors paid much attention to
IDE design, much more than to compiler quality... This is the first thing a
newcomer sees.

>> Why? OO is about encapsulation and polymorphism, these don't need
>> references everywhere.
> 
> Yes, but -- you want to keep, say, a list of Shape(s). Those can be
> Triangle(s), Circle(s) etc, which are all derived from class
> Shape. How do you store this list? An array of Shape'Class is out of
> question because of the different allocation requirements for the
> descendants of Shape(s).

Why should this worry you (and me)? It should Randy and Robert! (:-))

The language does not require array implementation to be contiguous. Randy
once said that Janus/Ada is practically ready for

   type X is array (...) of Y'Class;

>> What about maintainability and reasoning?
> 
> What about it? It's easy with value-oriented languages (i.e. languages
> that just produce new values from old ones in a non-destructive
> fashion). Functional languages do this therefore reasoning is a well
> developed art there. But the representations of all those values
> (trees, lists, ...) (a) rely heavily on representation sharing and (b)
> use references because of that. They need and use GC.

You are mixing by-value vs. by-reference semantics with no-identity vs.
has-identity. These are two semantically different things. One is about
implementation, the other is about properties of the domain. "Premature
optimization" is you know what... If identity is in question, then objects
should be made limited and could then be accessed through referential
objects. But that alone does not require GC. I might also need no
reference objects if self-recursive types were supported.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-02 12:34                               ` Markus E Leypold
@ 2007-02-03  9:45                                 ` Dmitry A. Kazakov
  2007-02-03 14:16                                   ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-03  9:45 UTC (permalink / raw)


On Fri, 02 Feb 2007 13:34:21 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> Sure you can:
>>
>> type Additive is abstract;  -- Should be enough for an interface
>> function "+" (A, B : Additive) return Additive is abstract;
>>
>> Here I am not sure about "increment" variable, because it is not
>> initialized. Anyway:
>>
>> function Increment return Additive is abstract;
>>    -- Better be getter/setter pair. Unfortunately Ada does not have
>>    -- "abstract variable interface" it should have
>>
>> function Inc (Arg : Additive'Class) return Additive'Class is
>> begin
>>    return Increment + Arg;
>> end Inc;
>>
>> There is a very simple rule for all this:
>>
>> 1. Formal generic subroutines -> abstract primitive operations
>> 2. Subroutines in the body -> class-wide operations.
> 
> So you derive from Additive to get a specific implementation?
> 
> Like
> 
>   type Vector   is Additive with ...;
>   type CountVal is Additive with ...;
> 
> Right? But then, you should note that with
> 
>   C: Countval;
> 
>   Inc(C).
> 
> returns an 'Additive', not a 'CountVal'.

No, it does Additive'Class! Thus, no problem.

(note that the goal was polymorphic Inc, if you wanted it covariant that
would automatically exclude polymorphism)

> That is one problem. The
> other is of course that Increment is not initialized. This is exactly
> what generics are here to solve.

As it was in Georg's example. But with inheritance, a covariant Increment
would be required to be overridden (praised be Ada). So an initialization
will be enforced anyway.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-02 22:40                                       ` Ray Blaak
@ 2007-02-03 10:00                                         ` Dmitry A. Kazakov
  2007-02-03 14:30                                           ` in defense of GC Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-03 10:00 UTC (permalink / raw)


On Fri, 02 Feb 2007 22:40:51 GMT, Ray Blaak wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> On Fri, 02 Feb 2007 18:15:05 GMT, Ray Blaak wrote:
>>> GC does not affect visibility or scoping.
>> 
>> But scoping does affect GC. If scope of each object is known, then there is
>> nothing to collect.
> 
> If one is using explicit pointers at all, even within a single scope there is
> still manual work to free them, unless you want to force the use of some sort
> of smart pointer controlled types (which I find tedious compared to using
> native pointers).

Why? The language should make no difference between them! BTW, Ada 83 was
designed much in this spirit. It had pragma Controlled.

> GC alleviates this work even within a single scope.

But it is much simpler to hide the original object and use controlled
pointers as proxies. Again, the Ada 83 design had that in mind by making
access types almost transparent. One should just have made them fully
transparent.

>> It is the programmer who expresses "in use" in language terms. It could be
>> scoped names or a bunch of controlled pointers distributed across the
>> program. There must be very strong reasons for choosing the latter.
> 
> I am happy to relax the criteria to be just "practical". With GC, one no
> longer needs to have such a strong concern precisely because the consequences
> are not dire.

Premature or belated finalization can be extremely dire. One of the worst
problems I had in C++ was fighting elaboration/finalization order of static
library objects. I suppose you propose to express such things using hard
references scattered across all objects? BTW, if GC were the answer, why
would anybody require weak references?

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-02 12:44                                   ` in defense of GC Markus E Leypold
@ 2007-02-03 10:13                                     ` Dmitry A. Kazakov
  2007-02-03 14:28                                       ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-03 10:13 UTC (permalink / raw)


On Fri, 02 Feb 2007 13:44:42 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> On Fri, 02 Feb 2007 01:37:26 GMT, Ray Blaak wrote:
> 
>>> The point is that the programmer is freed from the error prone tedium of
>>> explicitly managing memory.
>>
>> This is a misconception. There are two fundamentally different issues:
>>
>> 1. Object scopes
>> 2. Memory management
>>
>> The second issue is less and less relevant as Randy pointed out. The first
>> issue is always relevant. It is a good design to consider where an object
>> exists. GC [and upward closures] is an attitude of making everything
>> potentially global. In fact it is worse than just global. It is "I don't
>> know where I need that damn thing."
> 
> Closure don't "make things global". They do it as much as returning an
> Integer, makes, GOD FORBID!, a value global (don't return Integers
> people, that makes a value global and we all know, global is bad --
> how nonsensical is that?).

You return a value of a type (Integer) whose scope encloses the
subprogram returning that value.

> Closures provide exactly the opposite of
> "making things global". They provide a way for a module (a capsule of
> knowledge / implementation) to return a method (how to do things)
> without disclosing the implementation (i.e. keeping up the abstraction
> barrier).

I have no problem with that. But it is not yet an upward closure when the
type of the returned object is local.

I am a long-time proponent of procedural types for Ada. Now consider: a
procedure is a limited object. Limited objects can be returned in Ada 2005.
What else do you need? [Note, this is still not an upward closure, which I am
opposing.]

> I can't believe I even have to spell it out. On the other side -- why
> do I argue? You're even opposing generics.

Yes I do! (:-))

>> This is A) mess, B) sloppy programming, C) impossible model in our
>> networking distributed relativist world.
> 
> There are BTW, even distributed garbage collection algorithms. This
> probably will enrage you no end? :-).

Yes I know, they are usually called trojans...  (:-))

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-03  9:45                                 ` Dmitry A. Kazakov
@ 2007-02-03 14:16                                   ` Markus E Leypold
  2007-02-04 19:33                                     ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-03 14:16 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Fri, 02 Feb 2007 13:34:21 +0100, Markus E Leypold wrote:
>
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> 
>>> Sure you can:
>>>
>>> type Additive is abstract;  -- Should be enough for an interface
>>> function "+" (A, B : Additive) return Additive is abstract;
>>>
>>> Here I am not sure about "increment" variable, because it is not
>>> initialized. Anyway:
>>>
>>> function Increment return Additive is abstract;
>>>    -- Better be getter/setter pair. Unfortunately Ada does not have
>>>    -- "abstract variable interface" it should have
>>>
>>> function Inc (Arg : Additive'Class) return Additive'Class is
>>> begin
>>>    return Increment + Arg;
>>> end Inc;
>>>
>>> There is a very simple rule for all this:
>>>
>>> 1. Formal generic subroutines -> abstract primitive operations
>>> 2. Subroutines in the body -> class-wide operations.
>> 
>> So you derive from Additive to get a specific implementation?
>> 
>> Like
>> 
>>   type Vector   is Additive with ...;
>>   type CountVal is Additive with ...;
>> 
>> Right? But then, you should note that with
>> 
>>   C: Countval;
>> 
>>   Inc(C).
>> 
>> returns an 'Additive', not a 'CountVal'.
>
> No, it does Additive'Class! Thus, no problem.

I've been expressing myself sloppily. It should return a CountVal, not
an Additive'Class. CountVal(s) could not be added to Vectors -- there
is a difference.

> (note that the goal was polymorphic Inc, if you wanted it covariant that

No. The goal was to have a mechanism to write an algorithm once and
be able to pull various actual operators/procedures from it without
replicating the specification of the algorithm. I really think
you're mixing something up here, but the example is not really useful
to demonstrate the advantages of generics.

Let me suggest one other example for the usefulness of generics, and then
one example where generics (Ada-style) help, but parametric polymorphism
(what I have been talking about as parametrized types) would actually
bring an advantage.

Example 1:

  An abstract vector space is defined as a set of vectors u,v,w,...
  and scalars a,b,c, ... with a number of operations:
   
    - scalars form a field, i.e. have addition and multiplication with certain properties
    - there is a multiplication between scalars and vectors: a*v
    - there is an addition between vectors.

  A normed vector space introduces the idea of distance between
  vectors. Again certain laws apply. For the programmer: there is a
  dist function which takes 2 vectors as parameters.

  You can do a really huge amount of mathematics with these
  axioms. Examples of vector spaces with those properties would be
  certain subsets of functions, finite vectors (i.e. arrays of N
  components), polynomials, etc.

  One application would be, given a function on the vector space f : V
  -> Real_Numbers, to find minima of this function.

  There is an algorithm that is not very efficient but works in a vast
  number of cases without having to use specific knowledge of the
  underlying vector space.

With Ada generics I'd define the vector space interface as a generic and
instantiate accordingly. The algorithm to find minima would be
implemented as a generic and take a vector space package as a generic
parameter.
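The shape of such a generic can be sketched in Python by passing the vector-space operations (`add`, `scale`, plus a `sample` that yields random directions) as parameters to a deliberately naive shrinking random search; everything here is illustrative, not an efficient algorithm:

```python
import random

def minimize(f, start, add, scale, sample, steps=5000, radius=1.0):
    """Greedy random search over any 'vector space' given by
    add(u, v), scale(a, v) and sample() (a random direction)."""
    best, fbest = start, f(start)
    r = radius
    for _ in range(steps):
        cand = add(best, scale(r, sample()))     # best + r * direction
        fc = f(cand)
        if fc < fbest:
            best, fbest = cand, fc
        r *= 0.999                               # slowly shrink the step size
    return best

# "Instantiation" for R^2, much as an Ada generic would be instantiated:
random.seed(0)  # fixed seed so the sketch is reproducible
add2 = lambda u, v: (u[0] + v[0], u[1] + v[1])
scale2 = lambda a, v: (a * v[0], a * v[1])
sample2 = lambda: (random.uniform(-1, 1), random.uniform(-1, 1))

f = lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2
m = minimize(f, (0.0, 0.0), add2, scale2, sample2)
# m ends up near the minimum at (3, -1)
```

The same `minimize` works unchanged for polynomials, finite vectors, or function subsets, provided the three operations are supplied.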


Example 2

  Write a quicksort algorithm on arrays which can be reused for arrays
  of almost arbitrary elements, if a suitable order relationship is
  defined. Note that I might want to sort _the same_ elements with
  different orders (like insurance number or alphabetically ...).
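A Python sketch of this example, with the order relation passed in as a parameter (the record values are made up for illustration):

```python
def quicksort(xs, less):
    """Sort a list using the supplied order relation `less`."""
    if len(xs) <= 1:
        return list(xs)
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if less(x, pivot)], less)
            + [pivot]
            + quicksort([x for x in rest if not less(x, pivot)], less))

# The same elements, sorted under two different orders:
people = [("Meier", 17), ("Abel", 42), ("Zorn", 5)]
by_name = quicksort(people, lambda a, b: a[0] < b[0])
by_number = quicksort(people, lambda a, b: a[1] < b[1])
# by_name   -> [("Abel", 42), ("Meier", 17), ("Zorn", 5)]
# by_number -> [("Zorn", 5), ("Meier", 17), ("Abel", 42)]
```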


> would automatically exclude polymorphism)
>
>> That is one problem. The
>> other is of course that Increment is not initialized. This is exactly
>> what generics are here to solve.
>
> As it was in Georg's example. 

Yes. But Georg used generics, so it worked. You said you don't want
generics, but then it doesn't work. I fail to see how Georg could
select an example that makes your misguided hypothesis -- that
generics are not needed -- hold up. Quite the opposite.

> But with inheritance covariant Increment
> would be required to be overridden (praised be Ada). So an initialization
> will be enforced anyway.

I don't see how. And what is more: with generics I could instantiate a
Large_Step (increment by 100) and a Small_Step (increment by 1)
function from the generic. That would not work at all with your method.

I suspect you're missing the respective application areas of generics
vs. inheritance + overriding. As a software engineer I'd always prefer
generics, since they avoid polymorphism where no polymorphism is
required or intended. That is good, since it's nearer to the contract.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-03 10:13                                     ` Dmitry A. Kazakov
@ 2007-02-03 14:28                                       ` Markus E Leypold
  2007-02-04 18:38                                         ` Dmitry A. Kazakov
  2007-02-05  0:23                                         ` Robert A Duff
  0 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-03 14:28 UTC (permalink / raw)




"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Fri, 02 Feb 2007 13:44:42 +0100, Markus E Leypold wrote:
>
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> 
>>> On Fri, 02 Feb 2007 01:37:26 GMT, Ray Blaak wrote:
>> 
>>>> The point is that the programmer is freed from the error prone tedium of
>>>> explicitly managing memory.
>>>
>>> This is a misconception. There are two fundamentally different issues:
>>>
>>> 1. Object scopes
>>> 2. Memory management
>>>
>>> The second issue is less and less relevant as Randy pointed out. The first
>>> issue is always relevant. It is a good design to consider where an object
>>> exists. GC [and upward closures] is an attitude of making everything
>>> potentially global. In fact it is worse than just global. It is "I don't
>>> know where I need that damn thing."
>> 
>> Closure don't "make things global". They do it as much as returning an
>> Integer, makes, GOD FORBID!, a value global (don't return Integers
>> people, that makes a value global and we all know, global is bad --
>> how nonsensical is that?).
>
> You return a value of a type (Integer) whose scope encloses the
> subprogram returning that value.

Same applies to closures when they are returned. Where is the problem?
Scope is defined as the visibility of identifiers. There is no
introduction of global scope in the case of returning closures. You
seem to confuse this with the lifetime of memory....


>
>> Closures provide exactly the opposite of
>> "making things global". They provide a way for a module (a capsule of
>> knowledge / implementation) to return a method (how to do things)
>> without disclosing the implementation (i.e. keeping up the abstraction
>> barrier).
>
> I have no problem with that. But it is not yet an upward closure when the
> type of the returned object is local.

?. Do you confuse the type of the enclosed objects with the type of
the closure?


Forgive me Dmitry, but I'm on the verge of giving up on you. Your
terminology is, well, unusual. I suggest you read up on some FP, and
especially do some, and then we restart this argument (or not).


> I am a long proponent of procedural types for Ada. Now consider. A
> procedure is a limited object. Limited objects can be returned in Ada 2005.
> What else you need? [Note, this is still not an upward closure, I am
> opposing.]

I do need that

  function Make_a_Stepper (K:Integer) return ... is

      N : Integer := Complicated_Function(K);

      function Stepper(F:Foo) return Foo is
    
        begin
          
          return F + N;

        end;

    begin
      return Stepper;
    end;


would work. And no nitpicking, please, if I made syntax error here: The
intention should be clear.

If it works, we have closures. If it doesn't, I fail to see what you
mean by 'returning procedures'. You seem to be a long-term proponent of
something you're arguing hard is bad / not useful / whatever ... --
ahem, fighting yourself :-).


>> I can't believe I even have to spell it out. On the other side -- why
>> do I argue? You're even opposing generics.
>
> Yes I do! (:-))

Yes, I notice. That does make you avant-garde, certainly. Even the Java
people learned that there is no life without generics / functors.


>>> This is A) mess, B) sloppy programming, C) impossible model in our
>>> networking distributed relativist world.
>> 
>> There are BTW, even distributed garbage collection algorithms. This
>> probably will enrage you no end? :-).
>
> Yes I know, they are usually called trojans...  (:-))

?.

Regards -- Markus





* Re: in defense of GC
  2007-02-03 10:00                                         ` Dmitry A. Kazakov
@ 2007-02-03 14:30                                           ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-03 14:30 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> references scattered across all objects? BTW, if GC were answer why would
> anybody require weak references?

For caching. Collect when necessary, reuse when possible.
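As a concrete illustration of the caching use, here is a sketch with Python's weakref module (the Bitmap class and names are made up for the example):

```python
import weakref

class Bitmap:
    """Stand-in for some expensive-to-build, weak-referenceable object."""
    def __init__(self, name):
        self.name = name

_cache = weakref.WeakValueDictionary()

def load_bitmap(name):
    bmp = _cache.get(name)
    if bmp is None:          # never loaded, or already collected
        bmp = Bitmap(name)
        _cache[name] = bmp   # the cache entry does not keep bmp alive
    return bmp
```

While any strong reference to a bitmap exists, repeated loads reuse it; once the last strong reference is gone, the collector may reclaim it and the entry silently vanishes: collect when necessary, reuse when possible.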

Regards -- Markus






* Re: How come Ada isn't more popular?
  2007-02-03  9:44                             ` Dmitry A. Kazakov
@ 2007-02-03 14:51                               ` Markus E Leypold
  2007-02-04 17:55                                 ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-03 14:51 UTC (permalink / raw)




"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Fri, 02 Feb 2007 14:57:17 +0100, Markus E Leypold wrote:

>> When I talk about all those transitions, I see, that there was no
>> C->Ada transition, at least no mass movement. So we come back to the
>> C->initial question: Why not? 
>
> How much popular was C that time? I am asking this question because I

Really popular, at least in Europe and the US, I think. Pascal was a
real contender on micros in the eighties, but it had practically lost
out by the nineties, despite there still being a good number of Delphi
shops/people/developers around.

> learned C after Ada. My personal transition was FORTRAN-IV/PL/1 -> Ada 83.

> I think so. GNAT was a quite poor compiler for too long. Another important

GNAT is still annoying the hell out of me in fringe areas of the
language. And the errors are so fundamental that I begin to think
it will take a long time to smoke them out.

Furthermore I believe there is simply no incentive for AdaCore (who as
I understand maintain most of the GNAT code in the GCC repository) to
establish a good or stable baseline in the publicly accessible
repository.

> thing was (is) a lack of good IDE. C++ vendors paid much attention to
> design IDE, much more than to compiler quality... This is the first thing a
> newcomer sees.

There is something in that. Additionally, there is/was a number of
additional tools missing, like lint/splint/cscope etc.


>>> Why? OO is about encapsulation and polymorphism, these don't need
>>> references everywhere.
>> 
>> Yes, but -- you want to keep, say, a list of Shape(s). Those can be
>> Triangle(s), Circle(s) etc, which are all derived from class
>> Shape. How do you store this list? An array of Shape'Class is out of
>> question because of the different allocation requirements for the
>> descendants of Shape(s).
>
> Why should this worry you (and me)? It should Randy and Robert! (:-))
>
> The language does not require array implementation to be contiguous. Randy
> told once that Janus Ada is practically ready for
>
>    type X is array (...) of Y'Class;

OK, then. But non-contiguous representation of arrays will really stress
memory management and fragment the heap (we can't do that on the stack,
AFAICS). And what about mapping C arrays to Ada arrays (or is that
not possible anyway? I admit I'd have to read up on that).

>>> What about maintainability and reasoning?
>> 
>> What about it? It's easy with value-oriented languages (i.e. languages
>> that just produce new values from old ones in a non-destructive
>> fashion). Functional languages do this therefore reasoning is a well
>> developed art there. But the representations of all those values
>> (trees, lists, ...) (a) rely heavily on representation sharing and (b)
>> use references because of that. They need and use GC.

> You are mixing by-value vs. by-reference semantics with no-identity vs.
> has-identity. 

No. Values have no "identity". "Object identity" smacks heavily of
storage and memory chunks.

I fail to see how "identity" comes into the question of
"value-oriented" languages (your term, BTW) and representation
sharing. GC is about references (to memory), and representation sharing
works with references to parts of another "value" (see Lisp
lists). Representation sharing needs either reference counting
(inefficient, and itself just a kind of instant GC) or GC.
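The sharing point can be made concrete with a Lisp-style cons list sketched in Python (tuples as cons cells; illustrative only):

```python
def cons(head, tail):
    # a cons cell, immutable like a Lisp list built with cons
    return (head, tail)

base = cons(2, cons(3, None))  # the list (2 3)
a = cons(1, base)              # (1 2 3)
b = cons(9, base)              # (9 2 3)

# a and b share the single representation of (2 3):
shared = a[1] is b[1]
```

Neither a nor b "owns" base, so neither can safely free it; deciding when the shared cells are dead is exactly the job of reference counting or a tracing GC.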

> These are two semantically different things. One is about
> implementation another is about properties of the domain. "Premature
> optimization" is you know what... If identity is in question, then objects
> should be made limited and could then be accessed through referential
> objects. But that alone does not require GC. I might also require no
> reference object if self-recursive types supported.

I admit I cannot follow you into your rather foreign and, I have to
say, closed and self-referential world of programming language
design. But that doesn't matter much: I have also forgotten why we're
discussing this. "Maintainability and reasoning" was somewhere in
there. But did you answer my answer (quoted above), or did you just
declare it invalid because you have a different terminology?
It seems we've actually stopped communicating ...

So ... -- I summarize: In your opinion GC is bad, and I don't
understand your reasoning; in mine it's indispensable in a world where
program sizes have grown and we desire to program at different levels
of abstraction than we did 10 or 20 years ago.

We should leave it at that.

Regards -- Markus





* Re: AW: How come Ada isn't more popular?
  2007-01-31  7:59                               ` AW: " Grein, Christoph (Fa. ESG)
@ 2007-02-03 16:33                                 ` Martin Krischik
  0 siblings, 0 replies; 397+ messages in thread
From: Martin Krischik @ 2007-02-03 16:33 UTC (permalink / raw)


Grein, Christoph (Fa. ESG) wrote:

> 
>>> Ada, but they act as if they are ashamed of having anything to
>>> do with it.
>> I agree the IBM site is not the easiest to navigate. But really it
> does
>> not seem that hidden.
>> www.ibm.com, click products, software by category, Software
> Development,
>> traditional programming languages & compilers.
>> Rational Ada Developer is then on the bottom of the page (In
>> alphabetical order) and still above the XC compiler.
> 
> But if you're looking for Ada, you wouldn't expect it under R, would
> you?

More interesting: Click on "How to Buy"!

No price quoted
No online order
No selling to the general public

That is my impression. I might have missed something, I might be mistaken,
but for answering "How come Ada isn't more popular?", impression is all that
counts.

Martin
-- 
mailto://krischik@users.sourceforge.net
Ada programming at: http://ada.krischik.com




* Re: How come Ada isn't more popular?
  2007-02-01 23:40                                 ` Markus E Leypold
@ 2007-02-03 16:54                                   ` Georg Bauhaus
  2007-02-03 18:39                                     ` Dmitry A. Kazakov
  2007-02-03 20:06                                     ` Markus E Leypold
  0 siblings, 2 replies; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-03 16:54 UTC (permalink / raw)


On Fri, 2007-02-02 at 00:40 +0100, Markus E Leypold wrote:
> 
> 
> Georg Bauhaus <bauhaus@futureapps.de> writes:
> 
>   I think that Ada *and* Haskell will make an interesting
> > combination on .NET.
> 
> I wonder why one wouldn't just use Monads in most cases?

You wouldn't just use Haskell and monads for at least two reasons:

- used in a natural way, Haskell equations (incl. monads) still
  turn out to be comparatively slow. See Darcs (I am a Darcs user).

- control over what is going to happen at what time is easier using
  a systems programming language.


> I've written and deployed programs in Ada, C, C++,
> Perl and OCaml (among others). And I've played around with Haskell. I
> think I'm more productive with FP than with C and Ada.

It will be most interesting to learn the specifics of what makes
you more productive using OCaml instead of Ada etc, language-wise.


> >  A more important reason not to ignore functional programming
> > is [... ] Recursion

> Recursion is the least of those reasons. What is IMHO more important
> is, that you start not to think about mutating variables but producing
> new values from give ones (a slight shift in perspective, but with
> real impact at the long run).

Yes, you replace thinking about updates to a variable by
thinking about values passed, and their relations.
Now unless you write a straight line program, this won't
work without recursion :-)

Thinking about function values and their combinations
like map + filter is another good thing to learn (and use).

OTOH, FP style is sometimes just assignment in disguise.
It hides the variable as a parameter (and relies on tail-call
elimination) in a sense. I don't think this is always easier to follow:

  function Sum(L: Cursor; Variable: Integer := Initial_Value)
      return Integer is
  begin
      if Has_Element(L) then
          return Sum(Next(L), Variable + Element(L));
      else
          return Variable;
      end if;
  end Sum;

  function Sum(L: Cursor) return Integer is
      Variable: Integer := Initial_Value;
  begin
      while Has_Element(L) loop
          Variable := Variable + Element(L);
          Next(L);
      end loop;
      return Variable;
  end Sum;

(Yes, a true FP Sum function is composed, taking a binop as a parameter
for folding and you make it a base-case-step-macro in Lisp ;-)
I still think
that it is just a bit more difficult to follow the first Sum. If you
want to know how Sum is going to arrive at a result then you have to
follow the history of Variable through a series of function
calls with a series of changing arguments (the induction step).
In the second Sum you see the history of Variable without following
the mechanics of a function call and it is still just one place
where Variable is modified. It's less hidden behind the "function
call screen".
  Still I much prefer the recursive, nested narrowing
function in my Binary_Search -- even though the loop
based variant is faster with some Ada compilers :-)
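For comparison, the recursive narrowing style reads like this (a generic sketch, not Georg's actual Binary_Search):

```python
def binary_search(xs, key, lo=0, hi=None):
    """Return an index of key in the sorted sequence xs, or None."""
    if hi is None:
        hi = len(xs)
    if lo >= hi:               # empty range: not found
        return None
    mid = (lo + hi) // 2
    if xs[mid] < key:
        return binary_search(xs, key, mid + 1, hi)  # narrow to the right half
    if key < xs[mid]:
        return binary_search(xs, key, lo, mid)      # narrow to the left half
    return mid
```

Each call narrows the range, and the two recursive calls are in tail position: exactly the places an optimizing compiler can turn back into the loop variant.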

It's certainly easier to see that the first Sum matches
a primitive recursive function if and when this is important.


> >>  I refuse
> >> to discuss merits of languages ot a micro level like:
> >
> > The "micro" level is the level of production, change, and correction.
> 
>  As you have noticed, FP appeals to
> mathematicians.

Yes, FP has a mathematical appeal. However, time and space
are not usually intrinsic parts of mathematical functions. This is
what I am trying to point out: a computer program operates in time
and space, it is not a mapping between two sets of values,
even though you can interpret the operating program this way.
  A "functional solution" is still very helpful as a reference,
just like you said. For example, a functional solution helps me stay
sane maintaining one of my longer Perl programs that takes about 35 min
on average to compute a number of rankings from some larger database
tables. :) But sometimes a functional solution is only a starting
point, a necessary precondition. It provides guidance when writing
the real thing.

At other times, a functional program is just fine, provided
the author follows the usual conventions: Don't leave out that
much, it won't hurt if you say what you mean.


> I'Ve mathematical background [...]. FP just came naturally to me:
> It mirrors the way I
> think about a problem and about the solution to problems.

For example, a Haskell program can be an executable specification.
Unfortunately, this may not be good enough because what you call the
"rest" is *not* optimization.
It is meeting the requirements, even when these requirements are
"Start no sooner than 22:00h, be ready before 23:00h."
Time and space are intrinsic parts of a solution. If the problem
specification includes time and space, a complete solution must
in effect provide time and space values as well. (Make self-referential
considerations WRT time and space when computing, so to speak.)

The meaning of the word "optimization" is, I think,
to improve existing solutions so they run faster, consume
less resources, are easier to understand, etc.
Optimization does *not* mean to turn something that isn't a solution
into a solution.

This is what mathematicians refuse to see, as far as I can tell.
They stigmatize time and space as being just "optimization" issues.
They are not. In programming, a function is more than an equation
over some field that excludes time and space.



> I fear your full of preconceptions. It's different from what you do in
> Ada (or whatever your favourite imperative language is), so it must be
> wrong or there must be something missing.

Why would I be writing programs in OCaml, then?

> As an FP advocate, I suggest, that the things not written down, are
> not necessary.

FP error messages get better in the presence of explicit types.
Redundant semicolons can help the functional compilers a lot
in figuring out where there is an error in the program text.
Not necessary?
Or are FP programmers of the compiler writer type who hardly
need a robust language?


>  So those "savings" you address as a defect are actually
> a chance. 
> 
> But YMMV. As I said: I'm not in the business of convincing people WRT
> FP. You already indicated that you would not accept the typical
> reasoning of an FP protagonist. I can't offer you something different:
> It's shorter, I can see the problem clearer, I can leave out redundant
> information and so on.

That's the point: it's you who sees clearly, you leave out what seems
redundant to you, etc.. But those other guys, trying to understand
your program, will have to repeat your arrival at a clear sight, 
they will have to iterate the learning process that makes things
seem redundant to you, etc..
The usual answer I got when asking about this is, well, I need
educated colleagues with skills at about the same level as mine.
Sometimes that seemed just true, sometimes that's snobbish,
in some cases it has seemed to express pride. It has also been
an attempt of a programmer to make himself irreplaceable.


> Listening to you justifying that every, every variable must be
> declared with type and all, one wonders how mathematics itself ever
> could live without type declarations.

That's an exaggeration. Mathematicians use explanatory sentences and
bibliographic references as a replacement for declarations. So they do
declare. Only the declarations are semi-formal in many cases, like you show
in your example. After "Let V be a vectorspace", V is declared,
is in scope, and references to V will just name V of type vectorspace.


> The same principle applies in FP. I fear it won't convince you.

Where FP authors follow the conventions of good style, 
they list a function's type, or they add a semi-formal comment,
or both. Why would they do that if these things are redundant
and should therefore be left out?

> FP has set the cut off at a
> different point than Ada. Question: Was that necessarily wrong?

No, not wrong. It just has consequences to leave things out.

>  It
> fits me. Does that make be a worse programmer / designer / software
> engineer / mathematician? I don't think so.

Imagine an engineer writing programs and documenting his
assumptions even though he thinks they are redundant because
they can be inferred from some context. Worse or better? Why?

Imagine a script author who has to get some data laundry job
done. Would he/she be well advised to write a program that
can be reused, with all bells and whistles, good types, OO
encapsulation? Or just use a bunch of lists, tables, and
regular expressions? (What's the probability of a once-program
becoming an n-times-program, anyway, in reality?)



> > needs not be terminated. This leads to undecipherable error
> > messages when you forget to place one missing token to complete
> > an expression that is the last thing in a function definition.
> 
> I fear that hardly happens to me in OCaml.

I think it's really not important what happens to you and me here.
What is important is what happens to the amount of available
money to pay us until we arrive at an error-free solution.
How much time is spent by the programmers when correcting
mistakes, and how do OCaml and Ada compare in this case in
typical settings?

The error messages of the Erlang system can be even
more obscure. Some of them are printed at run time.
Sometimes there isn't a message at all ... Still, the language
is somewhat minimal and Erlang can help solve some problems
quickly provided the Erlang programmer knows the language, the
libraries, and the system and its quirks.

If you write Erlang programs, you *must* be able to say,
"I fear that hardly happens to me in" Erlang. Otherwise
you will be lost tackling errors you don't understand
because there is comparatively little "redundant" information
in Erlang programs.


> the presence of overloading and tagged
> types the error messages can become quite misleading, at least with
> Gnat.

It can be.


>  But my experience is that it is the beginners that are most
> obsessed with the "broken syntax".

Of course the beginners complain! Just like when someone switches
from permissive C to unforgiving Ada. But it's not the syntax
of Ada that seems to cause difficulties.

>  Related to that are the repeating
> attempts on c.l.f. or c.l.l to suggest a new syntax surface for Lisp
> "without all so parenthesis", implying that would hugely further the
> proliferation of Lisp. There is, I think, already a c.l.l FAQ for
> this.

I know, and it has taken me some time and effort to make
some proponents of Lisp syntax see the problem in the first place.

(Usual suggestions: If you run into "syntax error at end
of file", just fire up your Lisp system's editor, some function
will look "skewed" and point you near to where a ')' is missing.
Well...)



>  Though the attempts to reform ML syntax happen less often, they
> happen and I count them under the same heading as those Lisp reform
> attempts.

You cannot take pride in having mastered a syntax that is no
challenge. :-)


> But really: What would that buy me? Investing
> the same time into understanding more ML / OCaml / Haskell will earn
> me much more.


> Let me quote from the abstract of that paper:
> 
> ...
> So we are talking about somebody intimately acquainted with the
> language and the research on that language, striving for an
> improvement.

That's why I quoted the paper. It does explain why ML is important.
And that too little attention has been given to syntax.


> I suggest you really read the paper you quoted:

I did, I usually read the papers I quote before I argue ;-)

>  He has some nice
> things to say about the necessity of GC and the people who don't like
> the "bizarre syntax" of ML. At the end of that paragraph he says: "But
> in general, don't we have better things to argue about than syntax?".

Syntax structures our communication.
We have important things to achieve, and just ignoring syntax
won't make us more effective. But since we are starting to throw
papers at each other, here is another, more or less empirical one,
talking about programmers discussing(!) irrelevant syntax :)

"Of the pretense (syntax is irrelevant) and the actual reaction
(syntax matters), the one to be believed is the actual reaction.
Not that haggling over parentheses is very productive, but
unsatisfactory syntax usually reflects deeper problems, often
semantic ones: form betrays contents.

"Experienced programmers, so the argument goes,
will never make the error. In fact they make it often.
A recent review of the BSD operating system source..."
-- Bertrand Meyer, Principles of language design and evolution, §8

The last sentence is about '=' in C comparisons. '=' has caused a
problem in ML, too.  Hm, perhaps everything coming out of Bell
Labs must continue Bell Labs traditions. SNOBOL4 has '=', C has
it so C++ has it, Aleph has it, Limbo, too IIRC. So maybe ML has
had to have the '=' trouble maker, too.

> Your approach seems to be more the Olde FORTRAN Programmer's approache:
> I can do it in [...] so why must I use/learn another language.

Not at all. What makes you think so?

> > This costs time and money.
> 
> Well -- every legacy feature does. Tell me, Ada has none :-).

In the case of OCaml, there is at least camlp4. I understand nobody
seems to want the revised syntax? Not cool? Herd instinct?
Fear of change? "No, it's not necessary, we have learn-ed ouR
working in-group syntax."

I only wish someone could win the lottery and have some language
designers work on the formal principles of sound syntax for industry
programming languages.

Instead, legacy syntax is reintroduced again and again, by
the same people who argue it doesn't matter because they have
finally learned to work around the broken syntax.
So why not again use the broken syntax? Professional
programmers get bitten by the syntax bugs again and still
continue to claim this isn't important...

A few weeks ago a colleague explained he always writes

if (expr == false)
{

because a '!' can be hard to see. I told him he could always use

if (!!!expr)
{

...






* Re: How come Ada isn't more popular?
  2007-02-03 16:54                                   ` Georg Bauhaus
@ 2007-02-03 18:39                                     ` Dmitry A. Kazakov
  2007-02-03 20:06                                     ` Markus E Leypold
  1 sibling, 0 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-03 18:39 UTC (permalink / raw)


On Sat, 03 Feb 2007 17:54:53 +0100, Georg Bauhaus wrote:

> A few weeks ago a colleague explained he always writes
> 
> if (expr == false)
> {
> 
> because a '!' can be hard to see. I told him he could always use
> 
> if (!!!expr)
> {

Funny, I am almost always write

   if (0 == expr)
   { 

(and "false", that's not C anyway! (:-))

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de




* Re: How come Ada isn't more popular?
  2007-02-03 16:54                                   ` Georg Bauhaus
  2007-02-03 18:39                                     ` Dmitry A. Kazakov
@ 2007-02-03 20:06                                     ` Markus E Leypold
  2007-02-05  0:06                                       ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-03 20:06 UTC (permalink / raw)



Georg Bauhaus <bauhaus@futureapps.de> writes:

> On Fri, 2007-02-02 at 00:40 +0100, Markus E Leypold wrote:
>> 
>> 
>> Georg Bauhaus <bauhaus@futureapps.de> writes:
>> 
>>   I think that Ada *and* Haskell will make an interesting
>> > combination on .NET.
>> 
>> I wonder why one wouldn't just use Monads in most cases?
>
> You wouldn't just use Haskell and monads for at least two reasons:
>
> - used in a natural way, Haskell equations (incl. monads) still
>   turn out to be comparatively slow. See Darcs (I am a Darcs user).

Darcs is slow because it partly uses algorithms with O(n^k) for big
k, or even exponential run time. It's not an I/O or monad problem, as I
understand it.

> - control over what is going to happen at what time is easier using
>   a systems programming language.

Sigh. I can't convince you here. There is no reason to assume the
necessary level of control cannot be achieved with a language like
Haskell. Given it has to be achieved at all.

But I also notice a loss of context. You were talking about fusions,
and I said "I wonder why one wouldn't just use Monads in most
cases?". Your way of quoting that back at me somehow mangles that
context.

>> I've written and deployed programs in Ada, C, C++,
>> Perl and OCaml (among others). And I've played around with Haskell. I
>> think I'm more productive with FP than with C and Ada.
>
> It will be most interesting to learn the specifics of what makes
> you more productive using OCaml instead of Ada etc, language-wise.

How do I know? I just observe that it is so, and I can of course
speculate why. But what I experience is not a controlled
experiment, and I will be met with responses telling me that this
can't be the reason, or even that I'm not actually more productive,
because <theory of your choice>.

My main points:

  - The ML type system (specifically the OCaml way of integrating
    objects) is safe without forcing you through contortions in the more
    abstract cases, as e.g. Java or Ada do in their way of implementing
    classes. This seems to be an effect of how classes work together and
    see each other, and cannot be seen with examples involving only 1
    or 2 classes.

  - GC really is a boon if objects recursively refer to each other (A
    refers to B and vice versa). Their life cycles are somehow coupled,
    and there seems to be no general (!) way of deciding who needs to
    deallocate whom.

  - What I like very much is type inference, if used right. Parametric
    polymorphism allows me to concentrate on the aspect at hand and
    not have to concern myself with properties of the data that are not
    relevant for the algorithm.

  - It's really easy to refactor functional programs incrementally by
    morphing over a number of correct intermediate stages.

>
>> >  A more important reason not to ignore functional programming
>> > is [... ] Recursion
>
>> Recursion is the least of those reasons. What is IMHO more important
>> is, that you start not to think about mutating variables but producing
>> new values from give ones (a slight shift in perspective, but with
>> real impact at the long run).
>
> Yes, you replace thinking about updates to a variable by
> thinking about values passed, and their relations.

About data flow, to be correct.

> Now unless you write a straight line program, this won't
> work without recursion :-)

Yes, but recursion is not the big eye opener here. You said "A more
important reason not to ignore functional programming is [... ]
Recursion". I say: Recursion is more at the surface here, and hardly
the big difference to imperative languages (which can all recur
:-).

> Thinking about function values and their combinations
> like map + filter is another good thing to learn (and use).

I've started writing a library like this in and for
Ada. Unfortunately I'm missing the time and motivation to continue.

> OTOH, FP style is sometimes just assignment in disguise.  It hides
> the variable as a parameter (and relies on TCElimination) in a
> sense.

Man, it's not assignment that is BAD. It's just that the programmer
shouldn't program it, and should keep to a language in which equational
reasoning is possible. The compiler, on the other side, is allowed to
optimize what it wants.


> I don't think this is always easier to follow:
>
>   function Sum(L: Cursor; Variable: Integer := Initial_value)
>       return Integer is
>   begin
>       if Has_Element(L) then
>           return Sum(Next(L), Variable + Element(L));
>       else
>           return Variable;
>       end if;
>   end Sum;
>
>   function Sum(L: Cursor) return Integer is
>       Variable: Integer := Initial_Value;
>   begin
>       while Has_Element(L) loop
>           Variable := Variable + Element(L);
>           Next(L);
>       end loop;
>       return Variable;
>   end Sum;
>
> (Yes, a true FP Sum function is composed, taking a binop as a parameter
> for folding and you make it a base-case-step-macro in Lisp ;-)

Yes, a really terrible example: pseudo-FP with Ada syntax. I'm not
surprised you find it difficult to follow.

  sum = fold (+) 0

How's that? And that really summarizes what a functional programmer
knows and wants to know about linear iteration with a state over some
linear collection of thingies. Another example: reverse a list.

  rev = fold (fun x xs -> x::xs) []

And now removing all element with certain properties (p) from a list

 remove_rev p = fold (fun x xs -> if (p x) then xs else x::xs) []

 remove p l   = rev (remove_rev p l)

Read fold as "iterate and do the following ... starting with ..." and
you got the essence.
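The same three definitions transcribe almost mechanically into Python with functools.reduce (the accumulator comes first in reduce's function argument, unlike some fold conventions; Python lists stand in for OCaml lists, purely as illustration):

```python
from functools import reduce

def fold(f, init, xs):
    """Left fold: iterate over xs, threading an accumulator through f."""
    return reduce(f, xs, init)

def total(xs):
    # "iterate and add, starting with 0"
    return fold(lambda acc, x: acc + x, 0, xs)

def rev(xs):
    # consing each element onto the accumulator reverses the list
    return fold(lambda acc, x: [x] + acc, [], xs)

def remove(p, xs):
    # drop elements satisfying p (in reverse), then restore the order
    return rev(fold(lambda acc, x: acc if p(x) else [x] + acc, [], xs))
```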
       

> I still think that it is just a bit more difficult to follow the
> first Sum.

Because of the atrocious syntax: Not really suitable for the purpose.

> If you want to know how Sum is going to arrive at a result then you
> have to follow the history of Variable through a series of function
> calls with a series of changing arguments (the induction step).

"How do you know how Sum is going to arrive at a result when you have
 to follow the history of a variable through a series of iterations
 with a series of changing state variables"

To answer your question: Because most definitions of data
transformations are recursive in nature: do a small step, then do the
rest. E.g. I don't see another way to work on trees (like XML
documents) without risking major fuck-ups.

> In the second Sum you see the history of Variable without following
> the mechanics of a function call and it is still just one place
> where Variable is modified. It's less hidden behind the "function
> call screen".
>   Still I much prefer the recursive, nested narrowing
> function in my Binary_Search -- even though the loop
> based variant is faster with some Ada compilers :-)
>
> It's certainly easier to see that the first Sum matches
> a primitive recursive function if and when this is important.

No.


>> >>  I refuse
>> >> to discuss merits of languages ot a micro level like:
>> >
>> > The "micro" level is the level of production, change, and correction.
>> 
>>  As you have noticed, FP appeals to
>> mathematicians.
>
> Yes, FP has a mathematical appeal. However, time and space
> are not usually intrinsic parts of mathematical functions. This is

No? What about State (t) = ...? Are mathematical functions somehow
against nature? Will my machine time-shift or disintegrate when I
write pure functions?

No to all of those questions. I hardly see why I should think about time
and space _all the time_. It's the compiler's job to make something
of it, and it does a good job of this. Data is evaluated when needed
(call by need ...), not when passed and perhaps not needed. I'd even
venture there are enough (not I/O-bound) problems where eager languages
evaluate far too many data items which are, in the end, not needed at
all. Functional is basically about need: when data is needed, it is
evaluated.

And when I need to have interaction with space and time (i.e. the
outside world) and not just live in the data sphere, the IO monad
comes to the rescue.

Forgive me, I'm not one to cry FUD early, but your permanent
assertions that time and space complexity are so unmanageable in FP
are simply FUD, and they reflect the state of the debate some 20 years
ago. I think there is even a paper by Wadler on that.

> what I am trying to point out: a computer program operates in time
> and space, it is not a mapping between two sets of values,
> even though you can interpret the operating program this way.

Yes, it's not even manipulating numbers but voltage levels and
current strengths. I'm not programming with a voltmeter, though --
we stopped doing that in the fifties. This is progress. It's
called abstraction.

>   A "functional solution" is still very helpful as a reference,
> just like you said. 

Not only that, but I won't be able to convince you. I'm no purist,
though. We started with talking about the type system and I'm not
averse to writing imperative procedures in ocaml and using them
together with a functional library.

> For example, a functional solution helps me stay
> sane maintaining one of my longer Perl programs that takes about 35 min
> on average to compute a number of rankings from some larger database
> tables. :) But sometimes a functional solution is only a starting
> point, a necessary precondition. 

Sometimes ... -- But FP is therefore bad all the time because it is
not somehow tied to time and space? I'm starting to become confused.

> It provides guidance when writing the real thing.

Only that the unreal thing works perfectly most of the time. Why
should I rewrite?

> At other times, a functional program is just fine, provided
> the author follows the usual conventions: Don't leave out that
> much, it won't hurt if you say what you mean.
>
>
>> I'Ve mathematical background [...]. FP just came naturally to me:
>> It mirrors the way I
>> think about a problem and about the solution to problems.
>

> For example, a Haskell program can be an executable specification.

A Haskell program is a program, not a spec. How can people confuse that?
A spec often has predicates that simply cannot be computed at all, or
only at forbidding cost.

> Unfortunately, this may not be good enough because what you call the
> "rest" is *not* optimization.

?? rest ??

> It is meeting the requirements, even when these requirements are
> "Start no sooner than 22:00h, be ready before 23:00h."

> Time and space are intrinsic parts of a solution. 

Nonsense. Intrinsic? In the sense that if I don't always think about it, my
functional program will not start when I tell it, but some time
around last Easter? Man, those are not even real problems you're
conjuring up here! Apart from the fact that monads provide the handling
whose existence you're trying to deny.

> If the problem
> specification includes time and space,

That is, if we are talking about a real time problem. Fine. As I
repeatedly said, most of the things I'm talking about are not real
time problems. I simply refuse to throw away a suitable tool and
exchange it against an unsuitable one, simply because some problem
sets cannot (perhaps) be dealt with by the useful tool.

All the arguments you gave can just as well be applied to not using a
compiled language, but assembler: "Sometimes you don't know about the
instructions the compiler generates.  If the problem specification
includes time and space, a complete solution must [...] -- let's all
use assembler, where we have full control, all the time, also in
ordinary application programming".

> a complete solution must
> in effect provide time and space values as well. (Make self-referential
> considerations WRT time and space when computing, so to speak.)
>
> The meaning of the word "optimization" is, I think,
> to improve existing solutions so they run faster, consume
> less resources, are easier to understand, etc.
> Optimization does *not* mean to turn something that isn't a solution
> into a solution.
>
> This is what mathematicians refuse to see, as far as I can tell.

Well - how can I refute that? There is so much work (by
mathematicians) on specifying real time algorithms and giving real
time assurances that your across-the-board accusation strikes me as
simply unjustified. There has also been work on realtime systems and
ML, and furthermore only a small fraction of applications and problems
are real time oriented. Functional computation doesn't happen outside
of space and time, and the space and time behaviour is mostly well
understood.

So I fail to see how FP should be bad because it doesn't refer to
space and time or doesn't specify sequencing of operations outside of
Monads or effect systems. You know, I don't see a time variable in C
or Ada either and as far as sequencing goes: You know that modern
compilers reorder the operations to achieve optimization?

> The stigmatize time and space as being just "optimization" issues.

? You must be dreaming. Perhaps your confusion stems from mixing
up mathematicians, mathematically inclined software engineers and
functional programmers. Personally, I'm quite glad that, say, linear
algebra has no time+space component in it. But what has that to do
with our problem?

Or care to elaborate on which people you refer to when you say
"mathematicians"?

> They are not. In programming, a function is more than an equation
> over some field that excludes time and space.

You mix up functions (a mathematical construct) and procedures (a
specification of an operation or computation to be performed by a
suitable computing machine). 

Functional programming also specifies operations (not mathematical
functions), but the interesting part is that it does so in a more
abstract way: it just says what has to be delivered when needed and
(in the lazy case) leaves the rest to the compiler / runtime system.

>> I fear your full of preconceptions. It's different from what you do in
>> Ada (or whatever your favourite imperative language is), so it must be
>> wrong or there must be something missing.
>
> Why would I be writing programs in OCaml, then?

Yes, I wonder, indeed.

>> As an FP advocate, I suggest, that the things not written down, are
>> not necessary.
>
> FP error messages get better in the presence of explicit types.

At the right places, not at all places. That difference is what
everything is about.

> Redundant semicolons can help the functional compilers a lot
> in figuring out where there is an error in the program text.

No. Redundant semicolons very probably make your ML/OCaml compiler
barf at you.

> Not necessary?

??

> Or are FP programmers of the compiler writer type who hardly
> need a robust language?

FUD. (a) What is a "robust" language? (b) What has it to do with
semicolons? And (c) why don't I have a problem understanding my
compiler's error messages?

I really wonder about your experiences.


>>  So those "savings" you address as a defect are actually
>> a chance. 
>> 
>> But YMMV. As I said: I'm not in the business of convincing people WRT
>> FP. You already indicated that you would not accept the typical
>> reasoning of an FP protagonist. I can't offer you something different:
>> It's shorter, I can see the problem clearer, I can leave out redundant
>> information and so on.


> That's the point: it's you who sees clearly, you leave out what seems
> redundant to you, etc.. But those other guys, trying to understand
> your program, will have to repeat your arrival at a clear sight, 
> they will have to iterate the learning process that makes things
> seem redundant to you, etc..

I wonder why you think that Perl and Ada are readable and FP is not?

> The usual answer I got when asking about this is, well, I need
> educated colleagues with skills at about the same level as mine.
> Sometimes that seemed just true, sometimes that's snobbish,
> in some cases it has seemed to express pride. It has also been
> an attempt of a programmer to make himself irreplaceable.

Bullshit, man. What do you actually suggest? That I, I!, have to
program everything in, e.g., C so that every other gonzo in the world
can understand all the programs I wrote for my own use and
entertainment?

George, I give up on you. You don't understand -> it's my problem and
you don't give me enough clues to get you, well, clued in. You seem to
have had some real bad experiences with mathematicians and functional
programs. I can't help you there. FP, perhaps, is not for you. I'll
always be happy to answer specific questions, though. 

> That's the point: it's you who sees clearly, you leave out what seems
> redundant to you, etc.. But those other guys, trying to understand
> your program, will have to repeat your arrival at a clear sight, 

But I can't answer a diatribe of this kind, except with "I don't have
that problem, nobody I know of has it, I hardly have problems
reading other people's OCaml programs, etc.".



>> Listening to you justifying that every, every variable must be
> that's an exaggeration
>> declared with type and all, one wonders how mathematics itself ever
>> could live without type declarations.

> Mathematicians use explanatory sentences and bibliographic references
> as a replacement for declarations. So they do declare.


Well, we can suppose that someone trying to understand a program in
Haskell or OCaml or even Perl has read parts of the reference manual,
can't we?

> Only the declarations are semi-formal in many cases like you show
> in your example. After "Let V be a vectorspace", V is declared,
> is in scope, and references to V will just name V of type vectorspace.
>
>
>> The same principle applies in FP. I fear it won't convince you.
>
> Where FP authors follow the conventions of good style, 
> they list a function's type, or they add a semi-formal comment,

No, sorry, they don't. They do that for key functions, not for
functions with the status of lemmata. They do it for interfaces. And
you know what: the language requires it in interfaces.

> or both. 

> Why would they do that if these things are redundant
> and should therefore be left out?

This IS childish. Also because it depends on your made-up definition
of good style in FP, which is, incidentally, not true.

You quoted Appel's ML critique -- have you actually read it?


>> FP has set the cut off at a
>> different point than Ada. Question: Was that necessarily wrong?
>
> No, not wrong. It just has consequences to leave things out.

Just some paragraphs ago you said good style doesn't leave those things
out. Necessarily leaving them out is then bad style, so: bad. Seems to
me you did say it was wrong.


>>  It
>> fits me. Does that make me a worse programmer / designer / software
>> engineer / mathematician? I don't think so.
>
> Imagine an engineer writing programs and documenting his
> assumptions even though he thinks they are redundant because
> they can be inferred from some context. Worse or better? Why?

What has that to do with type inference? 


> Imagine a script author who has to get some data laundry job
> done. Would he/she be well advised to write a program that
> can be reused, with all bells and whistles, good types, OO
> encapsulation? 

> Or just use a bunch of lists, tables, and
> regular expressions? 

> (What's the probability of a once-program
> becoming an n-times-program, anyway, in reality?)


What has that to do with type inference? 


>> > needs not be terminated. This leads to undecipherable error
>> > messages when you forget to place one missing token to complete
>> > an expression that is the last thing in a function definition.
>> 
>> I fear that hardly happens to me in OCaml.
>
> I think it's really not important what happens to you and me here.

But your anecdotal evidence is?

> What is important is what happens to the amount of available
> money to pay us until we arrive at an error-free solution.

Please, you use your tools, and I'll use mine.

> How much time is spent by the programmers when correcting
> mistakes, and how do OCaml and Ada compare in this case in
> typical settings?

How do they? 

> The error messages of the Erlang system can be even
> more obscure. Some of them are printed at run time.
> Sometimes there isn't a message at all ... Still, the language
> is somewhat minimal and Erlang can help solve some problems
> quickly provided the Erlang programmer knows the language, the
> libraries, and the system and its quirks.

Good idea. If you need an argument, pull it from another functional
language (anecdotal evidence again) and pose it as typical. So all
functional languages / programming systems have to justify the
weaknesses of any of them. 

> If you write Erlang programs, you *must* be able to say,
> "I fear that hardly happens to me in" Erlang. Otherwise
> you will be lost tackling errors you don't understand
> because there is comparatively little "redundant" information
> in Erlang programs.

So saying this is an invalid argument. Unfortunately that also applies to

   "I find the error messages of the Erlang system to be even more obscure".

Note the "I" I have been adding here. 


>
>> the presence of overloading and tagged
>> types the error messages can become quite misleading, at least with
>> Gnat.
>
> It can be.
>
>
>>  But my experience is that it is the beginners that are most
>> obsessed with the "broken syntax".
>
> Of course the beginners complain! Just like when someone switches
> from permissive C to unforgiving Ada. But it's not the syntax
> of Ada that seems to cause difficulties.

So? So your complaints about ML syntax become a valid and important argument?

>>  Related to that are the repeating
>> attempts on c.l.f. or c.l.l to suggest a new syntax surface for Lisp
>> "without all so parenthesis", implying that would hugely further the
>> proliferation of Lisp. There is, I think, already a c.l.l FAQ for
>> this.
>
> I know, and it has taken me some time and effort to make
> some proponents of Lisp syntax see the problem in the first place.
>
> (Usual suggestions: If you run into "syntax error at end
> of file", just fire up your Lisp system's editor, some function
> will look "skewed" and point you near to where a ')' is missing.
> Well...)

Yes, well? So what? The Lispers seem not to have the problem you
have. What does that tell you? Probably: typical FP advocates, they
just pretend it is not a problem and leave all other people outside
in the cold rain? Tell me: what does that tell you about your approach
to a new language or programming system? I have got the impression
that you try to retrofit your Ada / Perl / C / whatever experience onto
the new system and that you're peeved then if that doesn't work.

I notice we have departed from the discussion of technical
properties of languages and the possible implications for constructing
programs and have strayed into the realm of pure taste - like "I don't
have the problem -- typical, that you would deny it" and bad analogies
("Imagine an engineer, who ..."). 

Unfortunately I don't feel qualified, and I'm not interested in
discussing beliefs and tastes of that kind, especially since we have
reached some kind of deadlock. 

So I suggest we either shift this thread to a functional forum, group
or list, or I'll at least stop participating in this particular sub
thread. It seems to be a waste of time, since I really can't change or
influence your point of view.
>
>>  Though the attempts to reform ML syntax happen less often, they
>> happen and I count them under the same heading as those Lisp reform
>> attempts.
>
> You cannot take pride in having mastered a syntax that is no
> challenge. :-)

What is that supposed to mean?


>> But really: What would that buy me? Investing
>> the same time into understanding more ML / OCaml / Haskell will earn
>> me much more.
>
>
>> Let me quote from the abstract of that paper:
>> 
>> ...
>> So we are talking about somebody intimately acquainted with the
>> language and the research on that language, striving for an
>> improvement.
>
> That's why I quoted the paper. It does explain why ML is important.
> And that too little attention has been given to syntax.

Better reread that paragraph.


>
>> I suggest you really read the paper you quoted:
>
> I did, I usually read the papers I quote before I argue ;-)
>
>>  He has some nice
>> things to say about the necessity of GC and the people who don't like
>> the "bizarre syntax" of ML. At the end of that paragraph he says: "But
>> in general, don't we have better things to argue about than syntax?".
>
> Syntax is structuring our communication.

> We have important things to achieve, and just ignoring syntax
> won't make us more effective. 

So feel free to continue arguing about syntax. I rather agree with Appel
on that subject.

> But since we are starting to throw
> papers at each other, here is another, more or less empirical one,
> talking about programmers discussing(!) irrelevant syntax:)


> "Of the pretense (syntax is irrelevant) and the actual reaction
> (syntax matters), the one to be believed is the actual reaction.
> Not that haggling over parentheses is very productive, but
> unsatisfactory syntax usually reflects deeper problems, often
> semantic ones: form betrays contents.
>
> "Experienced programmers, so the argument goes,
> will never make the error. In fact they make it often.
> A recent review of the BSD operating system source..."
> -- Bertrand Meyer, Principles of language design and evolution, §8
>
> The last sentence is about '=' in C comparisons. '=' has caused a
> problem in ML, too.  Hm, perhaps everything coming out of Bell
> Labs must continue Bell Labs traditions. SNOBOL4 has '=', C has
> it so C++ has it, Aleph has it, Limbo, too IIRC. So maybe ML has
> had to have the '=' trouble maker, too.

So C has a problem with "=" and "==", which of course makes your
complaints about ML syntax valid...? 


>> Your approach seems to be more the Olde FORTRAN Programmer's approach:
>> I can do it in [...] so why must I use/learn another language.
>
> Not at all. What makes you think so?

I think there is a bit of context missing here.

>> > This costs time and money.
>> 
>> Well -- every legacy feature does. Tell me, Ada has none :-).

> In the case of OCaml, there is at least camlp4. 

So?

> I understand nobody
> seems to want the revised syntax? Not cool? Herd instinct?

No. The keyword is maintainability and tool support (this starts with
Emacs modes). Tuareg + the traditional syntax buys you more than the
revised syntax.

> Fear of change? "No, it's not necessary, we have learn-ed ouR
> working in-group syntax."

I find it rather typical that you insinuate an elitist motivation
here. Of course I can't help it if you experience the world like that.

> I only wish someone could win the lottery and have some language
> designers work on the formal principles of sound syntax for industry
> programming languages.


> Instead, legacy syntax is reintroduced again and again, by
> the same people who argue it doesn't matter because they have
> finally learned to work around the broken syntax.

> So why not again use the broken syntax? Professional

Let me sing that again: tool support, existing code, existing know-how
and training.

> programmers get bitten by the syntax bugs again and still
> continue to claim this isn't important...

>
> A few weeks ago a colleague explained he always writes
>
> if (expr == false)

> {
>
> because a '!' can be hard to see. I told him he could always use
>
> if (!!!expr)
> {
>
> ...


Which language is that supposed to be? It ain't C, this much is
sure. And what does that prove?

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-02 21:50                                 ` in defense of GC (was Re: How come Ada isn't more popular?) Gautier
@ 2007-02-04  8:19                                   ` Ray Blaak
  2007-02-04 17:36                                     ` Hyman Rosen
  0 siblings, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-04  8:19 UTC (permalink / raw)


Gautier <gautier@fakeaddress.nil> writes:
> Sorry for my ignorance in this field, but from a real household point of view,
> it seems to me that there is a big difference between
>   (1) "stop using an object"
> and
>   (2) "drop an object on the floor".
> 
> In case of (1), I would hate that my object is taken from the table and thrown
> away by my garbage collecting robot GeeCee. Hey, does he know I really won't
> use my object anymore ?! In the case (2), it's OK that GeeCee takes it away,
> but then there is an action (Drop_on_the_floor) and I can do it in Ada
> (procedure Drop_on_the_floor_and_even_directly_throw_away is new
> Ada.Unchecked_Deallocation).

I was not sure if this was serious or not, so if it was...

Regarding (1), if you are not sure you will use an object anymore, then the GC
will not collect it, *by definition*, since it must be still reachable in some
way in order for it to be even possible to use. GC only collects unreachable
memory.

Regarding (2), doing a matching "drop" for every "new" is the whole trouble,
of course. What if you miss one? Well, with GC, you don't (need to) do an
explicit "drop" at all. And that's the freedom.

Now nothing is completely perfect and free. You can still have memory leaks
with GC, of course, but they are of a different kind. Instead of a lost
pointer that still refers to allocated memory, one has live objects still
being referred to in some way, i.e., "not lost" pointers that one no longer
needs to refer to.

So, the most one needs to do is to clear a pointer/reference occasionally.
Note though, that this is far far simpler than, for example, carefully
deallocating each node of some long lived complex data structure, cycles and
all. Instead one simply clears the reference to the root of entire complex
structure, and it all just goes away.
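Ray's "clear the reference to the root and it all just goes away" can be sketched like this (a Python illustration, with CPython's cycle collector standing in for a general GC; all names are invented for the example):

```python
import gc
import weakref

class Node:
    # A deliberately cyclic structure: each node points back at its peers,
    # so manual per-node deallocation would have to break cycles by hand.
    def __init__(self, name):
        self.name = name
        self.peers = []

# Build a small cyclic graph reachable from a single root.
root = Node("root")
a, b = Node("a"), Node("b")
root.peers = [a, b]
a.peers = [b, root]
b.peers = [a, root]

probe = weakref.ref(a)   # lets us observe collection without keeping 'a' alive
del a, b                 # drop our direct handles; graph still reachable via root

gc.collect()
assert probe() is not None   # still reachable through root: not collected

root = None              # clear the single root reference...
gc.collect()             # ...and the whole cyclic graph is reclaimed
assert probe() is None
```

Note how the first `gc.collect()` refuses to touch the graph, exactly because it is still reachable; clearing one root reference is all the "deallocation" the programmer ever performs.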

In the case of stack-based references (which are the most common kind),
there is nothing to do. New objects are made, maybe returned out to the
caller, maybe not, without regard to destruction, copy construction, etc. It
just is no longer necessary.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-04  8:19                                   ` Ray Blaak
@ 2007-02-04 17:36                                     ` Hyman Rosen
  2007-02-04 21:21                                       ` Ray Blaak
  0 siblings, 1 reply; 397+ messages in thread
From: Hyman Rosen @ 2007-02-04 17:36 UTC (permalink / raw)


Ray Blaak wrote:
> GC only collects unreachable memory.

The proper definition of (ideal) GC is that it collects
memory for reuse from objects that the program will never
access again. Reachability and such are implementation
details.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-03 14:51                               ` Markus E Leypold
@ 2007-02-04 17:55                                 ` Dmitry A. Kazakov
  2007-02-04 20:18                                   ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-04 17:55 UTC (permalink / raw)


On Sat, 03 Feb 2007 15:51:33 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> On Fri, 02 Feb 2007 14:57:17 +0100, Markus E Leypold wrote:
> 
>>> When I talk about all those transitions, I see, that there was no
>>> C->Ada transition, at least no mass movement. So we come back to the
>>> C->initial question: Why not? 
>>
>> How much popular was C that time? I am asking this question because I
> 
> Really popular, at least in Europe and the US, I think. Pascal was a
> real contender on micros in the eighties but it had practically lost
> out in the nineties despite there being still a good number of Delphi
> shops/people/developers around.
> 
>> learned C after Ada. My personal transition was FORTRAN-IV/PL/1 -> Ada 83.
> 
>> I think so. GNAT was a quite poor compiler for too long. Another important
> 
> GNAT is still annoying the hell out of me in fringe areas of the
> language. And the errors are so fundamental, that I begin to think
> that it will take a long time to smoke them out. 

Well, visibility in generics is still a great problem. 

> Furthermore I believe there is simply no incentive for AdaCore (who as
> I understand maintain most of the GNAT code in the GCC repository) to
> establish a good or stable baseline in the publicly accessible
> repository.

Yes, at least a publicly available bug tracking system. I have an
impression that the same bugs come and go over and over again.

>>>> Why? OO is about encapsulation and polymorphism, these don't need
>>>> references everywhere.
>>> 
>>> Yes, but -- you want to keep, say, a list of Shape(s). Those can be
>>> Triangle(s), Circle(s) etc, which are all derived from class
>>> Shape. How do you store this list? An array of Shape'Class is out of
>>> question because of the different allocation requirements for the
>>> descendants of Shape(s).
>>
>> Why should this worry you (and me)? It should Randy and Robert! (:-))
>>
>> The language does not require array implementation to be contiguous. Randy
>> told once that Janus Ada is practically ready for
>>
>>    type X is array (...) of Y'Class;
> 
> OK, then. But non-contiguous representation of arrays will really stress
> memory management and fragment the heap (we can't do that on the stack
> AFAIS).

Maybe, but it is no different from what would happen in any other
implementation. Note that there is a sufficiently different case, when all
Y'Class are of the same shape. For this I propose Tag discriminants:

    type X (Shape : Y'Class'Tag) is array (...) of Y'Class (Shape);

Here the compiler can allocate X in one chunk of memory.

> And what about mapping of C arrays to Ada arrays 

That would not be possible, but it would also be no problem. When you apply
pragma Convention to an array type, the compiler should tell you that you
can't have dynamic bounds and elements. You cannot pass String down to C,
but you could try String (1 .. 20). [If there were no Interfaces.C, of course.]

>>>> What about maintainability and reasoning?
>>> 
>>> What about it? It's easy with value-oriented languages (i.e. languages
>>> that just produce new values from old ones in a non-destructive
>>> fashion). Functional languages do this therefore reasoning is a well
>>> developed art there. But the representations of all those values
>>> (trees, lists, ...) (a) rely heavily on representation sharing and (b)
>>> use references because of that. They need and use GC.
> 
>> You are mixing by-value vs. by-reference semantics with no-identity vs.
>> has-identity. 
> 
> No. Values have no "identity".

It depends. A polymorphic value has an identity. But I understand what you
mean, and that meaning is correct.

> "Object identity" smacks heavily of
> storage and memory chunks. 

No. Identity is just a function id:X->I, where I is some set of comparable
values. No magic. X'Address or X'Access can serve as an identity function.
Or not, if you have relocatable objects, for instance. In general nothing
in objects requires X'Address. There are other kinds of identity. For
example id:X->T, where T is the type of X.

> I fail to see how "identity" comes into the question of
> "value-oriented" languages (your term, BTW) and representation
> sharing. GC is about references (to memory), and representation sharing
> works with references to parts of another "value" (see Lisp
> lists). Representation sharing needs either reference counting
> (inefficient, and really just a kind of instant GC) or GC.
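The representation sharing Markus refers to (Lisp-style lists sharing their tails) can be sketched with tuples as cons cells (a hedged Python illustration, not code from the thread):

```python
# Cons cells as 2-tuples: (head, tail); None plays the role of nil.
# Prepending builds a new list whose tail is SHARED with the old one,
# which is exactly why such structures want reference counting or GC:
# no single owner of the shared tail exists.

def cons(head, tail):
    return (head, tail)

base = cons(2, cons(3, None))   # the list [2, 3]
xs = cons(1, base)              # [1, 2, 3]
ys = cons(0, base)              # [0, 2, 3]

# Both xs and ys share the very same tail object: no copying happened.
assert xs[1] is ys[1] is base

def to_list(cell):
    # Walk the cons chain into an ordinary Python list, for display.
    out = []
    while cell is not None:
        out.append(cell[0])
        cell = cell[1]
    return out

print(to_list(xs))   # [1, 2, 3]
print(to_list(ys))   # [0, 2, 3]
```

Freeing `xs` by hand would have to know that `base` is still in use by `ys`; with GC, the shared tail simply survives as long as anything reaches it.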

The example you mentioned was a tree. Do you want to share subtrees between
other trees? When a subtree C of a tree A changes, should it ("it" is an
identity (:-)) in the tree B be changed as well? You cannot answer these
questions in terms of tree values. For all possible answers there exist
corresponding values of A, B, C, C in A, C in B, A/\B etc. It is a semantic
problem which has nothing to do with representation, be it
immutable-functional or shared. Identity is a way to express this
semantics. Value vs. reference is a semantically irrelevant implementation
detail.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-03 14:28                                       ` Markus E Leypold
@ 2007-02-04 18:38                                         ` Dmitry A. Kazakov
  2007-02-04 20:24                                           ` Markus E Leypold
  2007-02-05  0:23                                         ` Robert A Duff
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-04 18:38 UTC (permalink / raw)


On Sat, 03 Feb 2007 15:28:53 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> On Fri, 02 Feb 2007 13:44:42 +0100, Markus E Leypold wrote:
>>
>>> Closure don't "make things global". They do it as much as returning an
>>> Integer, makes, GOD FORBID!, a value global (don't return Integers
>>> people, that makes a value global and we all know, global is bad --
>>> how nonsensical is that?).
>>
>> You return a value of the type (Integer) whose scope encloses the
>> subprogram returning that value.
> 
> Same applies to closures when they are returned. Where is the problem?

The problem is this:

function Foo return ?What? is
   type Bar is ...;
   X : Bar;
begin
   return X;
end Foo;

>> I am a long proponent of procedural types for Ada. Now consider. A
>> procedure is a limited object. Limited objects can be returned in Ada 2005.
>> What else you need? [Note, this is still not an upward closure, I am
>> opposing.]
> 
> I do need that
> 
>   function Make_a_Stepper (K:Integer) return ... is
> 
>       N : Integer := Complicated_Function(K);
> 
>       function Stepper(F:Foo) return Foo is
>     
>         begin
>           
>           return F + N;
> 
>         end;
> 
>     begin
>       return Stepper;
>     end;
> 
> 
> would work. And no nitpicking, please, if I made a syntax error here: The
> intention should be clear.
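The intent of Make_a_Stepper can be transcribed into a language that does allow returning closures; here is a hypothetical Python rendering, with complicated_function as a made-up stand-in for the unspecified helper:

```python
# A Python transcription of the Make_a_Stepper sketch above.
# 'complicated_function' is an invented placeholder computation.

def complicated_function(k):
    return k * 10

def make_a_stepper(k):
    n = complicated_function(k)   # computed once, captured by the closure
    def stepper(f):
        return f + n              # refers to n from the enclosing scope
    return stepper                # the closure outlives make_a_stepper's frame

step = make_a_stepper(3)
print(step(1))   # 31
print(step(2))   # 32
```

The point of contention is precisely the last line of make_a_stepper: the returned function still refers to the local `n` after the enclosing call has finished, which is what an upward closure requires.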

I suppose it is. Here I propose a closure-free solution:

package Something_Very_Functional is
   type function Stepper (F : Foo) return Foo;
      -- The type of a function we want to pass somewhere
   procedure Step_It (Visitor : Stepper'Class);
   ...
end Something_Very_Functional;

package Stepper_Factory is
   function Create (K : Integer) return Stepper'Class;
      -- This is a constructor for the stepper provided here. We
      -- don't want to expose any details about it, so the
      -- Stepper type is made private.
private
   type Stepper_With_A_Parameter (N : Integer) is new Stepper;
      -- This is a derived type to have a discriminant, which could serve
      -- as a carrier for the locally computed value.
end Stepper_Factory;

package body Stepper_Factory is
   function Create (K : Integer) return Stepper'Class is
   begin
      return
         Stepper_With_A_Parameter'
         (  N => Complicated_Function (K),
         with
            begin
               return F + N;
            end;
         );
   end Create;
end Stepper_Factory;

Some notes. Within the "procedural record extension" the visibility rules
are such that local variables of Create are invisible. The package data are
visible and accessibility checks are made to ensure that the scope of
Stepper_With_A_Parameter is not statically deeper than one of Stepper.

-----------------
BTW, I prefer an OO design with Stepper being an abstract primitive
operation of some object:

type Data is abstract ...;
function Stepper (State : in out Data; F : Foo) return Foo is abstract;
procedure Step_It (Data : in out Data'Class);
-------------------

> If it works, we have closures. If it doesn't I fail to see what you
> mean by 'returning procedures'.

Hmm, it must be obvious. How could it be so that we have
access-to-procedure types, but no procedure types?

>>> I can't believe I even have to spell it out. On the other side -- why
>>> do I argue? You're even opposing generics.
>>
>> Yes I do! (:-))
> 
> Yes I notice. That does make you avant-garde, certainly. Even the Java
> people learned that there is no life without generics / functors.

As if Java weren't bad enough for them... (:-))

>>>> This is A) mess, B) sloppy programming, C) impossible model in our
>>>> networking distributed relativist world.
>>> 
>>> There are BTW, even distributed garbage collection algorithms. This
>>> probably will enrage you no end? :-).
>>
>> Yes I know, they are usually called trojans...  (:-))
> 
> ?.

They arrive at your computer and collect garbage while you don't notice
that... (:-))

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-03 14:16                                   ` Markus E Leypold
@ 2007-02-04 19:33                                     ` Dmitry A. Kazakov
  2007-02-04 20:44                                       ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-04 19:33 UTC (permalink / raw)


On Sat, 03 Feb 2007 15:16:21 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> On Fri, 02 Feb 2007 13:34:21 +0100, Markus E Leypold wrote:
>>
>>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>>> 
>>>> Sure you can:
>>>>
>>>> type Additive is abstract;  -- Should be enough for an interface
>>>> function "+" (A, B : Additive) return Additive is abstract;
>>>>
>>>> Here I am not sure about "increment" variable, because it is not
>>>> initialized. Anyway:
>>>>
>>>> function Increment return Additive is abstract;
>>>>    -- Better be getter/setter pair. Unfortunately Ada does not have
>>>>    -- "abstract variable interface" it should have
>>>>
>>>> function Inc (Arg : Additive'Class) return Additive'Class is
>>>> begin
>>>>    return Increment + Arg;
>>>> end Inc;
>>>>
>>>> There is a very simple rule for all this:
>>>>
>>>> 1. Formal generic subroutines -> abstract primitive operations
>>>> 2. Subroutines in the body -> class-wide operations.
>>> 
>>> So you derive from Additive to get a specific implementation?
>>> 
>>> Like
>>> 
>>>   type Vector   is Additive with ...;
>>>   type CountVal is Additive with ...;
>>> 
>>> Right? But then, you should note that with
>>> 
>>>   C: Countval;
>>> 
>>>   Inc(C).
>>> 
>>> returns an 'Additive', not a 'CountVal'.
>>
>> No, it does Additive'Class! Thus, no problem.
> 
> I've been expressing myself sloppily. It should return a CountVal, not
> an Additive'Class. CountVal(s) could not be added to Vectors -- there
> is a difference.

When Inc returns CountVal, the result no longer acts as a member of
Additive. So you cannot reuse the result with other members of Additive.

>> (note that the goal was polymorphic Inc, if you wanted it covariant that
> 
> No. The goal was to have mechanism to write the algorithms once and
> being able to pull various actual operators/procedures from that
> without replicating the specification of the algorithm.

If you want to reuse Inc's algorithm in various specific Inc, then you
would use wrappers:

function Common_Inc (Arg : Additive'Class) return Additive'Class is
begin
   return Increment + Arg;
end Common_Inc;
function Inc (Arg : Additive) return Additive is abstract;

function Inc (Arg : CountVal) return CountVal is
begin
   return CountVal (Common_Inc (Arg));
end Inc;

However, I don't understand the reason why.
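The wrapper scheme above can be sketched as follows (Python purely for illustration; all names invented). One class-wide algorithm is written once; a thin per-type wrapper narrows the result back to the specific type. In a dynamically typed language the wrapper is redundant, which is exactly the covariance question under dispute:

```python
# One shared ("class-wide") algorithm plus a thin per-type wrapper.

class Additive:
    def __add__(self, other):        # abstract "+" of the interface
        raise NotImplementedError
    @classmethod
    def increment(cls):              # abstract Increment
        raise NotImplementedError

def common_inc(arg):
    # class-wide algorithm, written once for all members of Additive
    return type(arg).increment() + arg

class CountVal(Additive):
    def __init__(self, n):
        self.n = n
    def __add__(self, other):
        return CountVal(self.n + other.n)
    @classmethod
    def increment(cls):
        return CountVal(1)
    def inc(self):
        # covariant wrapper: the result is again a CountVal
        return common_inc(self)

c = CountVal(41).inc()
print(type(c).__name__, c.n)         # CountVal 42
```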

> Let me suggest 1 other example for the usefulness of generics and than
> 1 example where generics (Ada-style) help, but parametric polymorphism
> (what I have been talking about as parametrized types) would actually
> bring an advantage.
> 
> Example 1:
> 
>   An abstract vector space is defined as a set of vectors u,v,w,...
>   and scalars a,b,c, ...  with a number of operations:
>    
>     - scalars form a field, i.e. have addition and multiplication with certain properties
>     - there is a multiplication between scalars and vectors: a*v
>     - there is an addition between vectors.
> 
>   A normalized vector space introduces the idea of distance between
>   vectors. Again certain laws apply. For the programmer: there is a
>   dist function which takes 2 vectors as parameters.
> 
>   You can do a really huge amount of mathematics with these
>   axioms. Examples of vector spaces with those properties would be
>   certain subsets of functions, finite vectors (i.e. arrays of N
>   components), polynomials, etc.
> 
>   One application would be, given a function on the vector space f : V
>   -> Real_Numbers, to find minima of this function.
> 
>   There is an algorithm that is not very efficient but works in a vast
>   number of cases without having to use specific knowledge of the
>   underlying vector space.
> 
> With Ada generics I'd define the vector space interface as a generic and
> instantiate accordingly. The algorithm to find minima would be
> implemented as a generic and take a vector space package as generic
> parameter.
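A sketch of what Example 1 describes, with invented names (Python purely as illustration): a minimizer written once against the abstract operations, then reused for two different "vector spaces". The algorithm is a crude pattern search, not meant to be efficient:

```python
# Illustrative sketch: a minimizer written once against abstract
# vector-space operations (add, scale), reused for two spaces.

def minimize(f, x, directions, scale, add, step=1.0, tol=1e-6):
    # crude pattern search: probe each direction, shrink the step
    while step > tol:
        improved = False
        for d in directions:
            for s in (step, -step):
                y = add(x, scale(s, d))
                if f(y) < f(x):
                    x, improved = y, True
        if not improved:
            step /= 2.0
    return x

# Space 1: plain 2-component vectors, represented as tuples
add2 = lambda u, v: (u[0] + v[0], u[1] + v[1])
scale2 = lambda a, v: (a * v[0], a * v[1])
f2 = lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2
xmin = minimize(f2, (0.0, 0.0), [(1.0, 0.0), (0.0, 1.0)], scale2, add2)
print(xmin)  # near (1.0, -2.0)

# Space 2: degree-1 polynomials a + b*t over the same representation,
# with the objective defined through the polynomial's values
f_poly = lambda p: (p[0] + p[1]) ** 2 + (p[0] - 3.0) ** 2
pmin = minimize(f_poly, (0.0, 0.0), [(1.0, 0.0), (0.0, 1.0)], scale2, add2)
print(f_poly(pmin) < 1e-3)  # True: minimized without space-specific code
```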

I am deeply unsatisfied with generics here, because this example is almost
exactly what I am doing. I have:

Over a real domain I have:

1. Scalars
2. Intervals
3. Fuzzy numbers with a partly linear membership function
4. Fuzzy numbers with membership function of a nested intervals
    + algebra of 1, 2, 3, 4 and between them

But that is not all:
5. Sets of 3
   + lattice over it

Not yet:

6. Dimensioned scalars
7. Dimensioned intervals
    + algebra
...
n. Sets of dimensioned numbers
   + lattice over it
   + algebra between dimensioned things and scalars
...
n+1 String I/O for all of them
   + names for the members of the sets
...
n+m+1 GTK widgets for them
...

Should I show you generic specifications of this mess? Can you imagine the
instantiation chain from Float down to GTK+ widget (provided your GNAT
would be able to eat it (:-()? I can give you a link...

I DON'T want it. I need a clean way to describe an algebra and a lattice
and then mix and bend them as I want. I want to be able to derive
everything from a scalar. There is always much talk about how mathematical
generics and FP are; it is amazing to me how readily people believe in
that.

> Example 2
> 
>   Write a quicksort algorithm on arrays which can be reused for arrays
>   of almost arbitrary elements, if a suitable order relationship is
>   defined. Note that I might want to sort _the same_ elements with
>   different orders (like insurance number or alphabetically ...).
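Example 2 can be sketched directly (Python purely for illustration; names invented): one quicksort parameterized by the order relation, applied to the same records under two different orders:

```python
# One quicksort, written once, taking the order relation as a parameter.

def quick_sort(items, less):
    if len(items) <= 1:
        return list(items)
    pivot, rest = items[0], items[1:]
    return (quick_sort([x for x in rest if less(x, pivot)], less)
            + [pivot]
            + quick_sort([x for x in rest if not less(x, pivot)], less))

# The same elements sorted two different ways: by name, then by number.
people = [("Meier", 17), ("Adams", 99), ("Zorn", 5)]
by_name   = quick_sort(people, lambda a, b: a[0] < b[0])
by_number = quick_sort(people, lambda a, b: a[1] < b[1])
print(by_name)    # Adams, Meier, Zorn
print(by_number)  # Zorn, Meier, Adams
```

Here the comparison is an ordinary run-time value, which is the point Dmitry raises below: a generic formal "<" is fixed at instantiation, while a function parameter can vary per call.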

No problem [with some extensions to Ada's type system]. You need two
interfaces: Array and Weakly_Ordered. You could also sort the same elements
of the same container object using different orders. [However, usually in
such cases one better uses a sorted view rather than physically sorting.]

> Yes. But Georg used generics, so it worked. You said you don't want
> generics, but then it doesn't work. I fail to see how Georg could have
> selected an example that makes your misguided hypothesis -- that
> generics are not needed -- hold up. Quite the opposite.

generic
   type Element is private;
   with function "<" (Left, Right : Element)  return Boolean;
   with function "=" (Left, Right : Element)  return Boolean;
   type Container ...
procedure Quick_Sort (...);

How can this sort the same container? This is a clear case where
generics don't work. You need "<" and "=" to be passed as proper functions
rather than as formal generic functions. [It is not static polymorphism
anymore.]

> As an software engineer I'd always prefer
> generics, since they avoid polymorphism where no polymorphism is
> required or intended.

? Generics = static polymorphism.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-04 17:55                                 ` Dmitry A. Kazakov
@ 2007-02-04 20:18                                   ` Markus E Leypold
  2007-02-04 21:29                                     ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-04 20:18 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:


>>>>> What about maintainability and reasoning?
>>>> 
>>>> What about it? It's easy with value-oriented languages (i.e. languages
>>>> that just produce new values from old ones in a non-destructive
>>>> fashion). Functional languages do this therefore reasoning is a well
>>>> developed art there. But the representations of all those values
>>>> (trees, lists, ...) (a) rely heavily on representation sharing and (b)
>>>> use references because of that. They need and use GC.
>> 
>>> You are mixing by-value vs. by-reference semantics with no-identity vs.
>>> has-identity. 
>> 
>> No. Values have no "identity".
>
> It depends. Polymorphic value has an identity. But I understand what you
> mean, and that meaning is correct.
>
>> "Object identity" smacks heavily of
>> storage and memory chunks. 
>
> No. Identity is just a function id:X->I, where I is some set of comparable
> values. No magic. X'Address or X'Access can serve as an identity function.
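Dmitry's definition of identity as a map id : X -> I into some comparable set can be made concrete; in Python, for instance, the built-in id() plays roughly the role of X'Address, separating value equality from identity (illustrative only):

```python
# Value equality vs. identity: two equal values, two distinct identities.

a = [1, 2, 3]
b = [1, 2, 3]
c = a

print(a == b)           # True:  equal as values
print(id(a) == id(b))   # False: distinct identities
print(id(a) == id(c))   # True:  one object under two names
```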

OK. In the context of your comment "You are mixing by-value
vs. by-reference semantics with no-identity vs.  has-identity" I had
the impression you were talking about the identities of objects. I
still do not understand your comment then: I'm certainly not
mixing up anything here.

> Or not, if you have relocatable objects, for instance. In general nothing
> in objects requires X'Address. There are other kinds of identity. For
> example id:X->T, where T is the type of X.
>
>> I fail to see how "identity" comes into the question of
>> "value-oriented" languages (your term, BTW) and representation
>> sharing. GC is about references (to memory) and representation sharing
>> works with references to parts of another "value" (see Lisp
>> lists). Representation sharing needs either reference counting
>> (inefficient, is also just some kind of instant GC) or GC.

> The example you mentioned was a tree. Do you want to share subtrees between
> other trees? 

That's the way it is usually done in the implementation of a
functional language.

> When a subtree C of a tree A changes, should it ("it" is an
> identity (:-)) in the tree B be changed as well? 

Change? There is no "change" in a functional world, only the
production of new values (i.e. new elements from a specific set, in
your case the set of all trees).
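What "production of new values" looks like in practice can be sketched with immutable tuples as tree nodes (Python, illustrative only): the "updated" tree is a new value, the old one is untouched, and every unaffected subtree is physically shared between the two, which is exactly the representation sharing that calls for GC:

```python
# Functional insert into a binary search tree: nodes are immutable tuples
# (value, left, right); every untouched branch is shared, not copied.

def insert(t, v):
    if t is None:
        return (v, None, None)
    val, left, right = t
    if v < val:
        return (val, insert(left, v), right)    # right subtree is shared
    else:
        return (val, left, insert(right, v))    # left subtree is shared

t1 = insert(insert(insert(None, 5), 2), 8)
t2 = insert(t1, 9)           # a new value; t1 is unchanged

print(t1[2])                 # (8, None, None)
print(t2[2])                 # (8, None, (9, None, None))
print(t2[1] is t1[1])        # True: the left subtree is physically shared
```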

> You cannot answer these questions in terms of tree values.

It doesn't make sense in terms of tree values, that is, in a purely
functional world.

> For all possible answers there exist
> corresponding values of A, B, C, C in A, C in B, A/\B etc. 

> It is a semantic
> problem which has nothing to do with representation, be it
> immutable-functional or shared. 

> Identity is a way to express this semantics.

?? Given the context we started at, I'm not sure I understand what
you're talking about.

> Value vs. reference is a semantically irrelevant implementation
> detail.

Semantically, yes. If we are talking about a functional language. Not
if we talk about some imperative language.

But we were talking about useful implementations of a functional
view: there sharing is unavoidable, thus GC. (Of course if you don't
care for efficiency ...).

Regards -- Markus





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-04 18:38                                         ` Dmitry A. Kazakov
@ 2007-02-04 20:24                                           ` Markus E Leypold
  2007-02-04 21:57                                             ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-04 20:24 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Sat, 03 Feb 2007 15:28:53 +0100, Markus E Leypold wrote:
>
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> 
>>> On Fri, 02 Feb 2007 13:44:42 +0100, Markus E Leypold wrote:
>>>
>>>> Closure don't "make things global". They do it as much as returning an
>>>> Integer, makes, GOD FORBID!, a value global (don't return Integers
>>>> people, that makes a value global and we all know, global is bad --
>>>> how nonsensical is that?).
>>>
>>> You return a value of the type (Integer) which scope encloses the
>>> subprogram returning that value.
>> 
>> Same applies to closures when they are returned. Where is the problem?
>
> The problem is this:
>
> function Foo return ?What? is
>    type Bar is ...;
>    X : Bar;
> begin
>    return X;
> end Foo;
>
>>> I am a long proponent of procedural types for Ada. Now consider. A
>>> procedure is a limited object. Limited objects can be returned in Ada 2005.
>>> What else you need? [Note, this is still not an upward closure, I am
>>> opposing.]
>> 
>> I do need that
>> 
>>   function Make_a_Stepper (K:Integer) return ... is
>> 
>>       N : Integer := Complicated_Function(K);
>> 
>>       function Stepper(F:Foo) return Foo is
>>     
>>         begin
>>           
>>           return Foo + N;
>> 
>>         end;
>> 
>>     begin
>>       return Stepper;
>>     end;
>> 
>> 
>> would work. And no nitpicking, please, if I made syntax error here: The
>> intention should be clear.
>
> I suppose it is. Here I propose a closure-free solution:
>
> package Something_Very_Functional is
>    type function Stepper (F : Foo) return Foo;
>       -- The type of a function we want to pass somewhere
>    procedure Step_It (Visitor : Stepper'Class);
>    ...
> end Something_Very_Functional;
>
> package Stepper_Factory is
>    function Create (K : Integer) return Stepper'Class;
>       -- This is a constructor for the stepper provided here. We
>       -- don't want to expose any details about it, so the
>       -- Stepper type is made private.
> private
>    type Stepper_With_A_Parameter (N : Integer) is new Stepper;
>       -- This is a derived type to have a discriminant, which could serve
>       -- as a carrier for the locally computed value.
> end Stepper_Factory;
>
> package body Stepper_Factory is
>    function Create (K : Integer) return Stepper'Class is
>    begin
>       return
>          Stepper_With_A_Parameter'
>          (  N => Complicated_Function (K),
>          with
>             begin
>                return F + N;
>             end;
>          );
> end Stepper_Factory;
>
> Some notes. Within the "procedural record extension" the visibility rules
> are such that local variables of Create are invisible. The package data are
> visible and accessibility checks are made to ensure that the scope of
> Stepper_With_A_Parameter is not statically deeper than one of Stepper.
>
> -----------------
> BTW, I prefer an OO design with Stepper being an abstract primitive
> operation of some object:
>
> type Data is abstract ...;
> function Stepper (State : in out Data; F : Foo) return Foo is abstract;
> procedure Step_It (Data : in out Data'Class);
> -------------------

Yes. That is exactly what I referred to in earlier posts as clumsy --
to replace closure by this kind of dance.


>
>> If it works, we have closures. If it doesn't I fail to see what you
>> mean by 'returning procedures'.
>
> Hmm, it must be obvious. How could it be so that we have
> access-to-procedure types, but no procedure types?

?? I still do not understand. If memory serves me right, I cannot
return an access to a local procedure. So?


>>>> I can't believe I even have to spell it out. On the other side -- why
>>>> do I argue? You're even opposing generics.
>>>
>>> Yes I do! (:-))
>> 
>> Yes I notice. That does make you avantgarde, certainly. Even the Java
>> people learned that there is no life without generics / functors.
>
> As if Java weren't bad enough for them... (:-))

Well -- actually that was one of the things they did right for once,
and it wasn't even tacked on ad hoc, but carefully designed by people
who do know what they are talking about (i.e. Philip Wadler of FP
fame).

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-04 19:33                                     ` Dmitry A. Kazakov
@ 2007-02-04 20:44                                       ` Markus E Leypold
  2007-02-04 23:00                                         ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-04 20:44 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Sat, 03 Feb 2007 15:16:21 +0100, Markus E Leypold wrote:
>
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> 
>>> On Fri, 02 Feb 2007 13:34:21 +0100, Markus E Leypold wrote:
>>>
>>>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>>>> 
>>>>> Sure you can:
>>>>>
>>>>> type Additive is abstract;  -- Should be enough for an interface
>>>>> function "+" (A, B : Additive) return Additive is abstract;
>>>>>
>>>>> Here I am not sure about "increment" variable, because it is not
>>>>> initialized. Anyway:
>>>>>
>>>>> function Increment return Additive is abstract;
>>>>>    -- Better be getter/setter pair. Unfortunately Ada does not have
>>>>>    -- "abstract variable interface" it should have
>>>>>
>>>>> function Inc (Arg : Additive'Class) return Additive'Class is
>>>>> begin
>>>>>    return Increment + Arg;
>>>>> end Inc;
>>>>>
>>>>> There is a very simple rule for all this:
>>>>>
>>>>> 1. Formal generic subroutines -> abstract primitive operations
>>>>> 2. Subroutines in the body -> class-wide operations.
>>>> 
>>>> So you derive from Additive to get a specific implementation?
>>>> 
>>>> Like
>>>> 
>>>>   type Vector   is Additive with ...;
>>>>   type CountVal is Additive with ...;
>>>> 
>>>> Right? But then, you should note that with
>>>> 
>>>>   C: Countval;
>>>> 
>>>>   Inc(C).
>>>> 
>>>> returns an 'Additive', not a 'CountVal'.
>>>
>>> No, it does Additive'Class! Thus, no problem.
>> 
>> I've been expressing myself sloppily. It should return a CountVal, not
>> an Additive'Class. CountVal(s) could not be added to Vectors -- there
>> is a difference.
>
> When Inc returns CountVal, the result no longer acts as a member of
> Additive. So you cannot reuse the result with other members of Additive.

Exactly. So it should not return Additive'Class (which can be added to
other members of Additive'Class), but must return CountVal to prevent
this. But since you said

>>> No, it does Additive'Class! Thus, no problem.

that means your approach fails. Do you see the different type
structure the approach using generics produces as compared to your
class inheritance approach?


>>> (note that the goal was polymorphic Inc, if you wanted it covariant that
>> 
>> No. The goal was to have mechanism to write the algorithms once and
>> being able to pull various actual operators/procedures from that
>> without replicating the specification of the algorithm.
>
> If you want to reuse Inc's algorithm in various specific Inc, then you
> would use wrappers:

No I wouldn't. At most I'd want to instantiate one generic. See:
we are not talking about one procedure. In general we are talking
about whole libraries (e.g. B-tree manipulation and search) which
contain -- in extremis -- 10 to a hundred procedures, and I certainly do
NOT want to wrap all of them for every new, differently typed
instance of the problem.


> function Common_Inc (Arg : Additive'Class) return Additive'Class is
> begin
>    return Increment + Arg;
> end Common_Inc;
> function Inc (Arg : Additive) return Additive is abstract;
>
> function Inc (Arg : CountVal) return CountVal is
> begin
>    return CountVal (Common_Inc (Arg));
> end Inc;
>
> However I don't understand the reason, why.

Look at the vector space example.


>> Let me suggest 1 other example for the usefulness of generics and than
>> 1 example where generics (Ada-style) help, but parametric polymorphism
>> (what I have been talking about as parametrized types) would actually
>> bring an advantage.
>> 
>> Example 1:
>> 
>>   An abstract vector space is defined as a set of vectors u,v,w,...
>>   and scalars a,b,c, ...  with a number of operations:
>>    
>>     - scalars form a field, i.e. have addition and multiplication with certain properties
>>     - there is a multiplication between scalars and vectors: a*v
>>     - there is an addition between vectors.
>> 
>>   A normalized vector space introduces the idea of distance between
>>   vectors. Again certain laws apply. For the programmer: there is a
>>   dist function which takes 2 vectors as parameters.
>> 
>>   You can do a really huge amount of mathematics with these
>>   axioms. Examples of vector spaces with those properties would be
>>   certain subsets of functions, finite vectors (i.e. arrays of N
>>   components), polynomials, etc.
>> 
>>   One application would be, given a function on the vector space f : V
>>   -> Real_Numbers, to find minima of this function.
>> 
>>   There is an algorithm that is not very efficient but works in a vast
>>   number of cases without having to use specific knowledge of the
>>   underlying vector space.
>> 
>> With Ada generics I'd define the vector space interface as a generic and
>> instantiate accordingly. The algorithm to find minima would be
>> implemented as a generic and take a vector space package as generic
>> parameter.


> I am deeply unsatisfied with generics here, because this example is almost
> exactly what I am doing. I have:
>
> Over a real domain I have:
>
> 1. Scalars
> 2. Intervals
> 3. Fuzzy numbers with a partly linear membership function
> 4. Fuzzy numbers with membership function of a nested intervals
>     + algebra of 1, 2, 3, 4 and between them
>
> But that is not all:
> 5. Sets of 3
>    + lattice over it
>
> Not yet:
>
> 6. Dimensioned scalars
> 7. Dimensioned intervals
>     + algebra
> ...
> n. Sets of dimensioned numbers
>    + lattice over it
>    + algebra between dimensioned things and scalars
> ...
> n+1 String I/O for all of them
>    + names for the members of the sets
> ...
> n+m+1 GTK widgets for them
> ...

> Should I show you generic specifications of this mess? 

And does it become better with OO? I doubt that very much. Indeed the
trick is not to have n+m+1 instantiations, but to package instances of
generics into libraries and instantiate a whole "world" in one
stroke. Have a look at the Booch components as an example.

> Can you imagine the instantiation chain from Float down to GTK+
> widget (provided your GNAT would be able to eat it (:-()? I can give
> you a link...

I have solved similar "problems". So, thanks, no.

> I DON'T want it. I need a clean way to describe an algebra and a lattice
> and then mix and bend them as I want. 

I doubt OO is the way. And whereas the Ada generics are somewhat
cumbersome (as compared with parametric polymorphism + functors + type
inference, which George doesn't like), they are your best way out of
this mess if you don't want to use an ML-like type system. They are,
indeed, good. 

> I want to be able to derive everything from a scalar. There is
> always much talk about how mathematical generics and FP are; it is
> amazing to me how readily people believe in that.

So they are wrong? :-)) Perhaps a big conspiracy, like the big C/C++
conspiracy and the big Java conspiracy.

People, you make me laugh. Isn't it avantgarde enough just to use Ada?
Must everyone else in this world be a misguided moron or an evil
propagandist?



>> Example 2
>> 
>>   Write a quicksort algorithm on arrays which can be reused for arrays
>>   of almost arbitrary elements, if a suitable order relationship is
>>   defined. Note that I might want to sort _the same_ elements with
>>   different orders (like insurance number or alphabetically ...).
>

> No problem [with some extensions to Ada's type system]. 

Ah yes, certainly. Those extensions are already here. Their name is
-- da da da daaamm! -- generics!!!

> You need two interfaces: Array and Weakly_Ordered. 

Which forces me to use tagged types all day long and all the way, where
simple traditional array access would suffice. Let's have a look at
verifiability: I'm sure generics can be verified better and more easily. I
heard something about tagged types being really out in that sector,
and by the way, I agree. The dispatch-over-13-subclasses mess that
"patterns" introduce is just spaghetti programming with a big
name. (Not to disparage patterns, but making something OO doesn't make
it modular or structured per se).

> You could also sort the same elements of the same container object
> using different orders. [However, usually in such cases one better
> uses a sorted view rather than physically sorting.]

That you can do, but I could still just sort accesses to the
elements. Without the OO mess.


>> Yes. But Georg used generics, so it worked. You said you don't want
>> generics, but then it doesn't work. I fail to see how Georg could have
>> selected an example that makes your misguided hypothesis -- that
>> generics are not needed -- hold up. Quite the opposite.


> generic
>    type Element is private;
>    with function "<" (Left, Right : Element)  return Boolean;
>    with function "=" (Left, Right : Element)  return Boolean;
>    type Container ...
> procedure Quick_Sort (...);

> How can this sort the same container?

Which "same container"?

> This is a clear case where
> generics don't work. 

Have you recently had a look at Ada.Containers? There is a generic array sort. So?
And BTW, you don't need "=".

> You need "<" and "=" to be passed as proper functions
> rather than as formal generic functions. [It is not static polymorphism
> anymore.]

And what is the problem with that?

>> As an software engineer I'd always prefer
>> generics, since they avoid polymorphism where no polymorphism is
>> required or intended.
>
> ? Generics = static polymorphism.

No. Query negative response indicated. READY> [].

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-04 17:36                                     ` Hyman Rosen
@ 2007-02-04 21:21                                       ` Ray Blaak
  0 siblings, 0 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-04 21:21 UTC (permalink / raw)


Hyman Rosen <hyrosen@mail.com> writes:
> Ray Blaak wrote:
> > GC only collects unreachable memory.
> 
> The proper definition of (ideal) GC is that it collects
> memory for reuse from objects that the program will never
> access again. Reachability and such are implementation
> details.

Fair enough. In practice, is any other measure besides reachability ever used?
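The gap between the ideal definition (memory the program will never access again) and the practical one (reachability) can be observed directly; a Python sketch, with an invented cache that keeps a dead object alive:

```python
# A reachability-based collector must retain anything reachable, even if
# the program never touches it again.

import gc
import weakref

class Blob:
    pass

cache = {}
cache["old"] = Blob()             # never read again, but still reachable
probe = weakref.ref(cache["old"])

gc.collect()
was_alive = probe() is not None   # True: reachable, hence not collected

del cache["old"]                  # now unreachable as well
gc.collect()
print(was_alive, probe() is None) # True True
```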

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-04 20:18                                   ` Markus E Leypold
@ 2007-02-04 21:29                                     ` Dmitry A. Kazakov
  2007-02-04 22:33                                       ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-04 21:29 UTC (permalink / raw)


On Sun, 04 Feb 2007 21:18:55 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> Or not, if you have relocatable objects, for instance. In general nothing
>> in objects requires X'Address. There are other kinds of identity. For
>> example id:X->T, where T is the type of X.
>>
>>> I fail to see how "identity" comes into the question of
>>> "value-oriented" languages (your term, BTW) and representation
>>> sharing. GC is about references (to memory) and representation sharing
>>> works with references to parts of another "value" (see Lisp
>>> lists). Representation sharing needs either reference counting
>>> (inefficient, is also just some kind of instant GC) or GC.
> 
>> The example you mentioned was a tree. Do you want to share subtrees between
>> other trees? 
> 
> That's the way it is usually done in the implementation of a
> functional language.

They don't share subtrees, they share memory (an implementation detail),
which is incidental to the program semantics.

>> When a subtree C of a tree A changes, should it ("it" is an
>> identity (:-)) in the tree B be changed as well? 
> 
> Change? There is no "change" in a functional world, only the
> production of new values (i.e. new elements from a specific set, in
> your case the set of all trees).

(New values? What is the difference between new and old values? Do values
carry RFID chips with timestamps? (:-))

No, you are wrong. In mathematics there is a notion of dependent and
independent variables. Mathematics is extremely "non-functional". Consider
a mapping between nodes of two trees which has to be preserved. Now a
function of the first tree is evaluated, what happens with the function of
the second tree?
       g
 A  ------>  B
 |           |
f|           | id = nothing happens
 |           |
 V     g     V
f(A) ------> B (broken invariant)

>> You cannot answer these questions in terms of tree values.
> 
> It doesn't make sense in terms of tree values, that is, in a purely
> functional world.

Of course it does. Consider a graph and its complement. Is it one value or
two? [To prevent a possible discussion, consider an incomputable g.]

>> Value vs. reference is a semantically irrelevant implementation
>> detail.
> 
> Semantically, yes. If we are talking about a functional language. Not
> if we talk about some imperative language.

How so? Ada is imperative, yet Foo (A) should not depend on whether the
compiler passes A by value or by reference.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-04 20:24                                           ` Markus E Leypold
@ 2007-02-04 21:57                                             ` Dmitry A. Kazakov
  2007-02-04 22:47                                               ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-04 21:57 UTC (permalink / raw)


On Sun, 04 Feb 2007 21:24:01 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> On Sat, 03 Feb 2007 15:28:53 +0100, Markus E Leypold wrote:
>>
>>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>>> 
>>>> On Fri, 02 Feb 2007 13:44:42 +0100, Markus E Leypold wrote:
>>>>
>>>>> Closure don't "make things global". They do it as much as returning an
>>>>> Integer, makes, GOD FORBID!, a value global (don't return Integers
>>>>> people, that makes a value global and we all know, global is bad --
>>>>> how nonsensical is that?).
>>>>
>>>> You return a value of the type (Integer) which scope encloses the
>>>> subprogram returning that value.
>>> 
>>> Same applies to closures when they are returned. Where is the problem?
>>
>> The problem is this:
>>
>> function Foo return ?What? is
>>    type Bar is ...;
>>    X : Bar;
>> begin
>>    return X;
>> end Foo;
>>
>>>> I am a long proponent of procedural types for Ada. Now consider. A
>>>> procedure is a limited object. Limited objects can be returned in Ada 2005.
>>>> What else you need? [Note, this is still not an upward closure, I am
>>>> opposing.]
>>> 
>>> I do need that
>>> 
>>>   function Make_a_Stepper (K:Integer) return ... is
>>> 
>>>       N : Integer := Complicated_Function(K);
>>> 
>>>       function Stepper(F:Foo) return Foo is
>>>     
>>>         begin
>>>           
>>>           return Foo + N;
>>> 
>>>         end;
>>> 
>>>     begin
>>>       return Stepper;
>>>     end;
>>> 
>>> would work. And no nitpicking, please, if I made syntax error here: The
>>> intention should be clear.
>>
>> I suppose it is. Here I propose a closure-free solution:
>>
>> package Something_Very_Functional is
>>    type function Stepper (F : Foo) return Foo;
>>       -- The type of a function we want to pass somewhere
>>    procedure Step_It (Visitor : Stepper'Class);
>>    ...
>> end Something_Very_Functional;
>>
>> package Stepper_Factory is
>>    function Create (K : Integer) return Stepper'Class;
>>       -- This is a constructor for the stepper provided here. We
>>       -- don't want to expose any details about it, so the
>>       -- Stepper type is made private.
>> private
>>    type Stepper_With_A_Parameter (N : Integer) is new Stepper;
>>       -- This is a derived type to have a discriminant, which could serve
>>       -- as a carrier for the locally computed value.
>> end Stepper_Factory;
>>
>> package body Stepper_Factory is
>>    function Create (K : Integer) return Stepper'Class is
>>    begin
>>       return
>>          Stepper_With_A_Parameter'
>>          (  N => Complicated_Function (K),
>>          with
>>             begin
>>                return F + N;
>>             end;
>>          );
>> end Stepper_Factory;
>>
>> Some notes. Within the "procedural record extension" the visibility rules
>> are such that local variables of Create are invisible. The package data are
>> visible and accessibility checks are made to ensure that the scope of
>> Stepper_With_A_Parameter is not statically deeper than one of Stepper.
>>
>> -----------------
>> BTW, I prefer an OO design with Stepper being an abstract primitive
>> operation of some object:
>>
>> type Data is abstract ...;
>> function Stepper (State : in out Data; F : Foo) return Foo is abstract;
>> procedure Step_It (Data : in out Data'Class);
>> -------------------
> 
> Yes. That is exactly what I referred to in earlier posts as clumsy --
> to replace closure by this kind of dance.

Do you refer to the first variant or to the second?

Doesn't it solve the problem? Note that in your solution you were unable to
describe what function Make_a_Stepper actually returns. Consider also this:

protected type Bar is
   function Make_a_Mess return ...;
private
   Baz : Integer;
      -- To be put into the closure together with barriers of the entries!

how do you return a closure from a protected object or a task rendezvous?
Will it be able to assign Baz?

I certainly prefer the OO solution because it explicitly maintains the context
of the operation. It is easier for the reader to understand and it is easier
to maintain.
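For illustration, the closure style and the explicit-context OO style can be sketched in Python (the names are invented stand-ins; neither is Ada, of course):

```python
def complicated_function(k):      # stand-in for the expensive computation
    return k * 2

# Closure style: the context (n) is captured implicitly.
def make_stepper(k):
    n = complicated_function(k)
    def stepper(f):
        return f + n
    return stepper

# OO style: the context is an explicit, named field of the object,
# which is what makes it visible to the reader and the maintainer.
class Stepper:
    def __init__(self, k):
        self.n = complicated_function(k)
    def __call__(self, f):
        return f + self.n

assert make_stepper(3)(10) == 16
assert Stepper(3)(10) == 16       # same behaviour, explicit state
```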

>>> If it works, we have closures. If it doesn't I fail to see what you
>>> mean by 'returning procedures'.
>>
>> Hmm, it must be obvious. How could it be so that we have
>> access-to-procedure types, but no procedure types?
> 
> ?? I still do not understand. If memory serves me right, I cannot
> return an access to a local procedure. So?

I merely explained why I consider procedural types necessary. Accessibility
checks are needed both for them and for access-to-procedure types, though for
procedural values they will be much weaker. Downward closures would require
no checks at all.
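A downward closure -- the case that needs no checks -- is the safe direction: the local procedure is only called, never stored or returned, so it cannot outlive its frame. A sketch in Python (illustrative names only):

```python
def count_matching(items, predicate):
    # 'predicate' is only called here, never stored or returned,
    # so it cannot outlive the caller's frame: a downward closure.
    return sum(1 for x in items if predicate(x))

def how_many_near(items, center, radius):
    def near(x):                  # captures center and radius locally
        return abs(x - center) <= radius
    return count_matching(items, near)   # passed downward only

assert how_many_near([1, 5, 9, 11], 10, 2) == 2   # 9 and 11 qualify
```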

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-04 21:29                                     ` Dmitry A. Kazakov
@ 2007-02-04 22:33                                       ` Markus E Leypold
  2007-02-05  9:20                                         ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-04 22:33 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Sun, 04 Feb 2007 21:18:55 +0100, Markus E Leypold wrote:
>
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> 
>>> Or not, if you have relocatable objects, for instance. In general nothing
>>> in objects requires X'Address. There are other kinds of identity. For
>>> example id:X->T, where T is the type of X.
>>>
>>>> I fail to see how "identity" comes into the question of
>>>> "value-oriented" languages (your term, BTW) and represenation
>>>> sharing. GC is about references (to memory) and representation sharing
>>>> works with references to parts of another "value" (see lisp
>>>>> lists). Representation sharing needs either reference counting
>>>>> (inefficient, and also just some kind of instant GC) or GC.
>> 
>>> The example you mentioned was a tree. Do you want to share subtrees between
>>> other trees? 
>> 
>> That's the way it is usually done in the implementation of a
>> functional language.
>
> They don't share subtrees, they share memory (implementation), which is
> incidental to the program semantics.

They share representation in the underlying system / VM / whatever you
call it. That is what I (expressing myself in shorthand) mean by
saying they share subtrees. They (the trees) don't share memory.

>
>>> When a subtree C of a tree A changes, should it ("it" is an
>>> identity (:-)) in the tree B be changed as well? 
>> 
>> Change? There is no "change" in a functional world, only the
>> production of new values (i.e. new elements from a specific set, in
>> your case the set of all trees).
>
> (New values? What is the difference between new and old values? Do values
> carry RFID chips with timestamps? (:-))

Don't play dumb. There is certainly an order among the values occurring
in a (functional) program.

> No, you are wrong. In mathematics there is a notion of dependent and
> independent variables. 

No. That must be physics you're talking about. In mathematics there
are only elements and names for elements. If it "depends", what people
mean is a function.

> Mathematics is extremely "non-functional". 

How is that?

> Consider a mapping between nodes of two trees which has to be
> preserved. Now a function of the first tree is evaluated, what
> happens with the function of the second tree?

>        g
>  A  ------>  B
>  |           |
> f|           | id = nothing happens
>  |           |
>  V     g     V
> f(A) ------> B (broken invariant)


The function of the second tree? What are you talking about? Let me
repeat: In pure functional languages nothing "changes".


>>> You cannot answer these questions in terms of tree values.
>> 
>> It doesn't make sense in terms of tree values, that is, in a purely
>> functional world.

(BTW: The quoting sucks a bit -- one cannot readily see that "what
makes sense" here, is the question "When a subtree C of a tree A
changes, should it ("it" is an identity (:-)) in the tree B be changed
as well?" -- and BTW2: "It" is not an identity. Not in any natural
language I know of. Linguists certainly have a better word for it, but
it is a context-dependent reference.)

> Of course it does. Consider a graph and its complement. Is it one value or
> two? [To prevent a possible discussion, consider an incomputable g.]

They are 2 values: two different elements (except when, by accident, they
are the same ...). I didn't say whether you can "compute" it
(i.e. find it).

>>> Value vs. reference is a semantically irrelevant implementation
>>> detail.
>> 
>> Sementically, yes. If we are talking about a functional language. Not,
>> if we talk about some imperative language.


> How so? Ada is imperative, yet Foo (A) should not depend on whether the
> compiler passes A by value or by reference.

Arrgs.  You're just not listening, are you?  But look: It makes a
difference if you pass an access to a variable instead of passing the
variable as an in/out parameter, and there is a difference if you
pass it as an in parameter. These are differences in the respective
semantics of the statements, and passing an access is what is usually
called passing a reference. It's a completely different thing if
you're talking about references in the implementation the compiler
produces or about references in the language itself.

We are talking firstly about languages here, and there are, let me
repeat, NO REFERENCES IN PURE FUNCTIONAL LANGUAGES (value-oriented
languages, as you said). In their semantics, if you like.

But we are talking secondly about the usefulness of GC and when it is
needed in language design.  And there it is the case that functional
languages can only be _implemented_ with representation sharing between
values (simple example: the way lists work in an FP language), because
otherwise a lot of copying would have to take place. But representation
sharing is _implemented_ by pointers/references, and since sharing takes
place, multiple pointers often point to one shared "sub-representation".
And since we want to pass values freely around (there is no scope for
values, see the other sub-thread ...) we hardly ever know statically
how long a particular sub-representation will live.

So a garbage collection mechanism is required. This might be reference
counting (which is indeed just an ad hoc kind of GC) or some more
advanced algorithm. As it turns out, suitably tuned advanced
algorithms, such as a generational garbage collector, are much more
efficient, since they avoid the overhead of maintaining the reference
counter every time a reference is duplicated (passed up and down the
call stack) and instead do all the work every now and then.
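The lifetime problem can be shown in a few lines (Python, purely as illustration -- its runtime happens to combine reference counting with a backup tracing collector):

```python
# Consing onto a persistent list never copies the tail; it creates one
# new cell that references the existing list.
class Cons:
    def __init__(self, head, tail):
        self.head, self.tail = head, tail

base = Cons(2, Cons(3, None))   # the list [2, 3]
a = Cons(1, base)               # [1, 2, 3] -- shares base
b = Cons(0, base)               # [0, 2, 3] -- shares the same base
assert a.tail is b.tail

# How long must the shared cells live?  Until the last of a, b (and
# whatever they get stored into) is dropped -- in general unknowable
# statically.  Deciding that at run time is exactly the collector's job.
del a
assert b.tail.head == 2         # base must still be alive for b's sake
```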

We can, BTW, learn something from this discussion: It might be true
that it is possible to implement some kind of functional structures in
Ada or any other imperative language of the Algol family (i.e.
structures you don't mutate to do data manipulation but instead
produce new structures from those you already have), but without a GC
already built in, it will always be inefficient (you either have to
copy representations gratuitously or you'll have to do reference
counting, which is also not very efficient).

But look: I'm just tired of this discussion. It's been some time
since I last read up on the relevant vocabulary. But somehow I
doubt it is my inability to make myself understood that keeps the
discussion with you from ever coming to the point. Partly it's because we
seem to miss a goal in that discussion (what was the thing you wanted to
convince me of?); partly I suggest you read all that up in a book on
language design and implementation.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-04 21:57                                             ` Dmitry A. Kazakov
@ 2007-02-04 22:47                                               ` Markus E Leypold
  2007-02-04 23:08                                                 ` Markus E Leypold
  2007-02-05  8:47                                                 ` Dmitry A. Kazakov
  0 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-04 22:47 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

>> 
>> Yes. That is exactly what I referred to in earlier posts as clumsy --
>> to replace closure by this kind of dance.
>
> Do you refer to the first variant or to the second?

The "closure-free solution" (I'm not quoting it again -- it's long
enough as it is).

> Doesn't it solve the problem? 

Perhaps. This (and some other stuff)

>>    type function Stepper (F : Foo) return Foo;
>>       -- The type of a function we want to pass somewhere
>>    procedure Step_It (Visitor : Stepper'Class);
>>    ...

doesn't look like Ada to me. So I assume you're still proposing
language extensions, with semantics that are unclear to me. And perhaps
they allow you to do what you want: but so complicated, well ...

You can do that in FORTRAN too: with even less comprehensibility.

> Note that in your solution you were unable to
> describe what function Make_a_Stepper actually returns. 

Since there is no closure type, presently. But if you like, add a
language extension:

  type stepper_t is procedure ( I : Integer ) return Integer;

and then fill in this type.



> Consider also this:
>
> protected type Bar is
>    function Make_a_Mess return ...;
> private
>    Baz : Integer;
>       -- To be put into the closure together with barriers of the entries!
>
> how do you return a closure from a protected object or a task rendezvous?
> Will it be able to assign Baz?

Interesting thought. Either it must inherit the barriers, somehow, or
you simply can't.

> I certainly prefer OO solution because it explicitly maintains the context
> of the operation. It is easier to understand for the reader and it easier
> to maintain.

My mileage varies. 


>>>> If it works, we have closures. If it doesn't I fail to see what you
>>>> mean by 'returning procedures'.
>>>
>>> Hmm, it must be obvious. How could it be so that we have
>>> access-to-procedure types, but no procedure types?
>> 
>> ?? I still do not understand. If memory serves me right, I cannot
>> return an access to a local procedure. So?

> I merely explained why I consider procedural types necessary. 

Well -- I think your addition is not worth the effort, but YMMV. I
still do not see how you would manage to pass procedures upwards which
use variables that are defined in their environment.

> Accessibility checks are needed for both them and access to
> procedure. Though for procedural values they will be much
> weaker. Downward closures would require no checks at all.

Just write an addendum to the ARM. Perhaps we'll find it in Ada
2020. But somehow I doubt it.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-04 20:44                                       ` Markus E Leypold
@ 2007-02-04 23:00                                         ` Dmitry A. Kazakov
  2007-02-04 23:21                                           ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-04 23:00 UTC (permalink / raw)


On Sun, 04 Feb 2007 21:44:17 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> On Sat, 03 Feb 2007 15:16:21 +0100, Markus E Leypold wrote:
>>
>>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>>> 
>>>> On Fri, 02 Feb 2007 13:34:21 +0100, Markus E Leypold wrote:
>>>>
>>> I've been expressing myself sloppily. It should return a CountVal, not
>>> an Additive'Class. CountVal(s) could not be added to Vectors -- there
>>> is a difference.
>>
>> When Inc returns CountVal then it is not a member of Additive in the
>> result. So you cannot reuse the result for other members of Additive.
> 
> Exactly. So it should not return Additive'Class (which can be added to
> other members of Additive'Class), but must return CountVal to prevent
> this.

If that was the only reason, then that worked perfectly in Ada 83, before
Ada 95 broke it for tagged types [there were some reasons, though]:

type Isolated_CountVal is new CountVal; -- Done

>>>> No, it does Additive'Class! Thus, no problem.
> 
> that means your approach fails. Do you see the different type
> structure the approach using generics produces as compared to your
> class inheritance approach?

First, what was the problem? A moving target is a difficult thing to track.

>>>> (note that the goal was polymorphic Inc, if you wanted it covariant that
>>> 
>>> No. The goal was to have mechanism to write the algorithms once and
>>> being able to pull various actual operators/procedures from that
>>> without replicating the specification of the algorithm.
>>
>> If you want to reuse Inc's algorithm in various specific Inc, then you
>> would use wrappers:
> 
> No I wouldn't. At the most I'd want to instantiate one generic. See:
> We are not talking about one procedure. In general we are talking
> about whole libraries (i.e. BTree manipulation and search) which
> contain -- in extremis -- 10 to a hundred procedures, and I certainly do
> NOT want to wrap all of them for every new, differently typed
> instance of the problem.

That indeed looks like type X is new Y; compare ONE line vs. a hundred
instantiations!

BTW, there are no generic libraries in any sane sense of the word. Only
instantiations of generic libraries are libraries. (:-))
 
>> Should I show you generic specifications of this mess? 
> 
> And it does become better with OO? I doubt that very much.

Nobody tried. If the effort spent on fixing Ada generics had been invested
in the type system, who knows.

> Indeed the
> trick is, not to have n+m+1 instances but package instances of
> generics into libraries and instantiate whole "world" in one
> stroke.

It does not work, because it is not one world; it is a multiverse of
interconnected worlds of different types.

The great idea of Ada numeric types with their ranges and accuracy gets lost,
because for any of them you have to instantiate all this pathetic mess. It
ends up with the C-ish "everything is double".

Generics is not Ada!

>> I DON"T want it. I need a clean way to describe an algebra and a lattice
>> and then mix and bend them as I want. 
> 
> I doubt OO is the way.

Nobody seriously tried it. Ironically, C++, which started as "C with
classes", suddenly turned into #define++!

>> I want to be able to derive everything from a scalar. There is
>> always much talk about how mathematical are generics and FP, it is
>> amazing to me how readily people believe in that.
> 
> So they are wrong? :-)) Perhaps a big conspiracy, like the big C/C++
> conspiracy and the big Java conspiracy.

An argument from popularity? (:-))

> People, you make me laugh.

I don't speak for the Ada community. I bet the silent majority is rather with
you than with me.

> Isn't it avantgarde enough just to use Ada?
> Must everyone else in this world be a misguided moron or an evil
> propagandist?

Huh, c.l.a is a quite tolerant group, IMO.

>> You need two interfaces: Array and Weakly_Ordered. 
> 
> Which forced me to use tagged type all day long and all the way, where
> simply traditional array access would suffice.

All types shall be tagged, including Boolean.

> Let's have a look at
> verifiability: I'm sure, generics can be verified better and easier. I
> heard something about tagged types being really out in that sector,
> and by the way, I agree. The dispatch over 13 subclasses mess
> "patterns" introduce is just spagetthi programming with a big
> name. (Not to disparage pattern, but making something OO doesn't make
> it modular or structured per se).

What about 13 generic packages?

>> You could also sort same elements of same container object using
>> different orders. [However usually in such cases one better uses a
>> sorted view, rather than physically sorts.
> 
> That you can do, but I still could just sort accesses to the
> elements. Without the OO mess.

1. It is not a generic solution.
2. Elements must be aliased, which changes the container type.
3. You can do that with OO.

Class-wide (OO), generic, functional are orthogonal approaches which can be
mixed in different proportions. In a proper type system they all will
converge because both generics and procedures will become proper types and
values. I fail to see why this irritates you so.

>>> Yes. But Georg used generics, so it worked. You said, you don't want
>>> generics, but it doesn't work. I fail to see how Georg has to
>>> select an example that makes your misguided hypothesis -- that
>>> generics are not needed -- hold up. Quite the opposite.
> 
> 
>> generic
>>    type Element is private;
>>    with function "<" (Left, Right : Element)  return Boolean;
>>    with function "=" (Left, Right : Element)  return Boolean;
>>    type Container ...
>> procedure Quick_Sort (...);
> 
>> How can this can sort the same container?
> 
> Which "same container"?

Of the same type and the same object.

>> This is a clear case where generics don't work. 
> 
>  Have you recently had a look to Ada.Containers? There is a generic array sort. So? 

What is generic there? How is polymorphism of comparison achieved:
statically, dynamically, ad hoc?

>> You need "<" and "=" be passed as proper functions
>> rather than formal generic functions. [It is not static polymorphism
>> anymore.]
> 
> And, what is the problem with that.

That it is not generic. You cannot call a car solar just because you wear
a wrist-watch with solar cells while driving it...

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-04 22:47                                               ` Markus E Leypold
@ 2007-02-04 23:08                                                 ` Markus E Leypold
  2007-02-05 15:57                                                   ` Markus E Leypold
  2007-02-05  8:47                                                 ` Dmitry A. Kazakov
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-04 23:08 UTC (permalink / raw)



Hi Dmitry, 

I've just been reconsidering this thread, which, in my news reader, has
become quite unmanageable. I've written (or at least tried to hint)
in another response that I somehow doubt the sense in continuing this
discussion. I'd like to reinforce that one last time.

E.g. you write some hardly understandable statement that somehow mixes
up memory lifetimes with visibility and scoping (which already made
me doubt you have any idea what "encapsulation" really means) ...,


> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> > The second issue is less and less relevant as Randy pointed out. The first
> > issue is always relevant. It is a good design to consider where an object
> > exists. GC [and upward closures] is an attitude of making everything
> > potentially global. In fact it is worse than just global. It is "I don't
> > know where I need that damn thing."

but what is worse, you get refuted by Ray Blaak, ...

> GC does not affect visibility or scoping. That is an orthogonal issue, and
> still quite properly under the control of the programmer. Upward closures do
> not make an object more global unless the closure explicitly exposes the
> object for external use.

.. and in more words, but not as well formulated, by me. But do we get
any feedback on that? No, instead we just continue the discussion with
some other quirky notion of yours which until then was a side issue.

>> I am a long proponent of procedural types for Ada. Now consider. A
>> procedure is a limited object. Limited objects can be returned in Ada 2005.
>> What else you need? [Note, this is still not an upward closure, I am
>> opposing.] 

>> I certainly prefer OO solution because it explicitly maintains the context
>> of the operation. It is easier to understand for the reader and it easier
>> to maintain.

And similar things happen in all sub-threads spawned by you or in
which you participate.

Does it make sense under the circumstances to continue the discussion?
I don't think so. 

And I fear becoming a crank myself by just arguing OT (!!) notions in c.l.a.
(and that includes my excursions into GC, FP and type systems).

So I'm calling quits to this discussion (at least as far as my
participation goes).

Regards -- Markus







^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-04 23:00                                         ` Dmitry A. Kazakov
@ 2007-02-04 23:21                                           ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-04 23:21 UTC (permalink / raw)




"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Sun, 04 Feb 2007 21:44:17 +0100, Markus E Leypold wrote:
>

> I don't speak for Ada community. I bet the silent majority is rather with
> you than with me.

I guess the silent majority is slightly surprised about this recent
flurry of off-topic mails but bears it with good grace. I've noticed
this: they are slow in c.l.a. to quench OT threads (a good thing, I
think, but one I now want to stop abusing -- if anybody wants to
continue this discussion (with me) it needs a hypothesis / subject
first, and another group).

>> Isn't it avantgarde enough just to use Ada?
>> Must everyone else in this world be a misguided moron or an evil
>> propagandist?
>
> Huh, c.l.a is a quite tolerant group, IMO.

Indeed. There are, though, people who have at least implied that the
use of C is somewhere between sheer incompetence and crime. OK,
perhaps I read that wrong. I only had the impression that few are
interested in the reasons why -- well, C and C++ (e.g.) are more often
used than Ada -- and that there is at least a sub-group which is more
interested in reassuring themselves that this is due to the injustice
of the world -- rather than trying to understand the factors
involved. I admit -- that certainly is not the majority here, which
kept their own counsel :-).

Never mind: I usually like the level of competence in c.l.a., but I
certainly get some strange vibrations when some issues are
touched. The paragraph quoted above was unjust, though.

>>> You need two interfaces: Array and Weakly_Ordered. 
>> 
>> Which forced me to use tagged type all day long and all the way, where
>> simply traditional array access would suffice.
>
> All types shall be tagged, including Boolean.

I'm already out of the discussion, but my brain just started to curdle.

> converge because both generics and procedures will become proper types and
> values. I fail to see the reason why this so irritates you.

Because your approach does not work? I'm actually less irritated than
flabbergasted.

<more incomprehensible stuff snipped :->

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-03 20:06                                     ` Markus E Leypold
@ 2007-02-05  0:06                                       ` Markus E Leypold
  2007-02-05 13:58                                         ` Georg Bauhaus
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-05  0:06 UTC (permalink / raw)




Hi Georg,

First I'd like to sort of apologize for my long last message and its
sudden end (my refusal to continue the discussion). I'd like to hold up
the statement that I don't have time to continue the discussion,
though. But I'd like you to understand that you're putting me in some
kind of double bind here:

First you state your dissatisfaction with type inference and the
typical functional style of programming in highly subjective terms.

Then you assert that advocates of functional programming are all
elitist (they either pretend not to have the problems you have, or
they really don't have them, perhaps being only a minority in that, but
they find it cool that they have mastered something other people can't
and have nothing better to offer you than to assert that your
problems are not real problems, etc.).

I rather don't see a way to help you in this situation. I could

 - say you're right. But what does that buy you, if I think you aren't?
   It certainly doesn't solve your problem with FP.

 - assert that I don't have the problem, while still trying to
   understand yours and to change you; but then I would fall under
   the second point (I'm an FP smart-ass) or under the first point (I
   just pretend against better knowledge that those problems don't
   exist).

In my youth I discussed extreme religious beliefs with a
number of people, and one thing we noticed is that a lot of belief
systems have a kind of self-referentiality built in, which is called
self-immunization: it's often a core axiom that somebody who denies God
or the specific belief thereby serves the devil (or whatever evil
principle) and therefore must have been incited by the devil to do it. Of
course arguments by people influenced by the devil need not be
considered, since the devil lies and says anything just to unsettle
your belief.

I see a similar (though less radical) self-immunization strategy here:
whatever I say is just typical FP advocacy, and if what I say is not
expressible in Ada parlance it cannot be good, since Ada comprises
everything and does everything right (I'm perhaps a bit fuzzy and
unjust on the last point; my apologies, if so).

But something different and new cannot be mapped to the already known
if it is really new. As far as I can see, a 'jump' of some sort always has
to be taken in learning something really new -- there is no sure way
to bootstrap into a new "world of knowledge" or paradigm.

In my opinion this sudden insight is the basic learning function of the
human brain, one which has not been well understood so far. The
sudden insight probably has something to do with experiencing new
things in action, so it's probably impossible to make somebody
enter a new paradigm if he wants to be convinced beforehand
that it would be worth it. It's a characteristic of a new paradigm
that you cannot understand its explanations until you have entered it,
so nobody can convince you of the paradigm's value.

I know this sounds a bit religious, and indeed the same argument can be
made (and has been made) for any esoteric pseudo-science or belief
system. Obviously the choice to want to understand something really
new is motivated by something other than understanding the new
thing and learning its advantages beforehand. I think there is a
pre-rational selection mechanism (e.g. I refuse to even discuss
miracle healers or review the "proofs", even while I admit the
principal positivist restriction of the natural sciences and "that we
cannot know everything"). So in a sense: some things are just not for
some people.

I don't know whether functional programming (and the dratted type
inference) is really for you -- which, in the light of the elaborations
above, I wouldn't necessarily see as a negative judgement. I
don't really know, and the final person to decide that is you.

My intention was to give some insight into modern type systems and things
that newer languages have that perhaps make them popular. I do not
want to (and by what I said above simply cannot) convince anyone of the
usefulness of those features or of the desirability of FP (which is OT
here, but that's where we ended up).

I will be glad to be of assistance if you need to understand certain
_technical_ facets of FP, always assuming that I understand them
myself. I'm open to discussion of alternative (more elegant, better
performing, ...) implementations in functional languages for given
pieces of code. But I've begun to find general value judgements a bit
difficult, as well as attempts to convince you of the general
value of things you seem not to want to look into.

Yes, I have been guilty of indulging in them myself. Mea culpa :-). But
now I have to mend my sinful ways. :-)

Discussion of FP now, please, strictly off c.l.a. And shorter, if possible.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-03 14:28                                       ` Markus E Leypold
  2007-02-04 18:38                                         ` Dmitry A. Kazakov
@ 2007-02-05  0:23                                         ` Robert A Duff
  2007-02-05  0:55                                           ` Markus E Leypold
  2007-02-05  1:00                                           ` Ray Blaak
  1 sibling, 2 replies; 397+ messages in thread
From: Robert A Duff @ 2007-02-05  0:23 UTC (permalink / raw)


Markus E Leypold
<development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> I do need that
>
>   function Make_a_Stepper (K:Integer) return ... is
>
>       N : Integer := Complicated_Function(K);
>
>       function Stepper(F:Foo) return Foo is
>     
>         begin
>           
>           return Foo + N;
>
>         end;
>
>     begin
>       return Stepper;
>     end;
>
> would work. And no nitpicking, please, if I made syntax error here: The
> intention should be clear.

I don't think this is nitpicking:

I'm not sure what you mean by the above example.  Is N intended to be
constant or variable (in Ada terms -- that's immutable/mutable in FP
terms)?

Do you mean this:

  function Make_a_Stepper (K:Integer) return ... is
      N : constant Integer := Complicated_Function(K); -- I added "constant" here! ****************
      function Stepper(F:Foo) return Foo is
        begin
          return Foo + N;
        end;
    begin
      return Stepper;
    end;

Or this:

  function Make_a_Stepper (K:Integer) return ... is
      N : Integer := Complicated_Function(K);
      function Stepper(F:Foo) return Foo is
        begin
          N := N + 1; -- I added a modification of N here! ****************
          return Foo + N;
        end;
    begin
      return Stepper;
    end;

?

These two cases of so-called "upward closure" seem very different to me.
(Both are of course illegal in Ada, but we're discussing what "ought" to be.)

In the former case, we're returning a function that looks at some local
value (the value of N).  In this case, your comment about returning
integer values is quite correct:

>> Closure don't "make things global". They do it as much as returning an
>> Integer, makes, GOD FORBID!, a value global (don't return Integers
>> people, that makes a value global and we all know, global is bad --
>> how nonsensical is that?).

Right.  What's the big deal?  Returning a function that looks at integer
values is no worse than returning integer values.  Integer values live
forever, and the value of N is certainly not tied to the lifetime of N.

But in the latter case, the return of Stepper is causing the lifetime
of an integer VARIABLE (i.e. mutable) to be longer than one might
suspect for a local variable of Make_A_Stepper.  That seems like a
problem to me.

You (Markus) seem to be an advocate of (possibly-upward) closures.
And also an advocate of function programming.  So how would you like a
rule that says the former example is OK, but the latter is illegal?
(That is, upward closures are allowed only when you're doing functional
(no side-effects) programming).
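
For comparison, the two cases can be sketched in Python, which allows
both (this is an illustration only; `complicated` stands in for
Complicated_Function and is a made-up placeholder):

```python
def complicated(k):
    return k * 10           # placeholder for Complicated_Function

def make_stepper_const(k):
    n = complicated(k)      # captured once, never reassigned
    def stepper(f):
        return f + n        # pure: only reads the captured value
    return stepper

def make_stepper_mutable(k):
    n = complicated(k)
    def stepper(f):
        nonlocal n          # the closure mutates its captured variable
        n += 1
        return f + n        # successive calls give different results
    return stepper
```

The first effectively returns a value; the second returns a hidden,
long-lived variable.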

Note that the latter is not functional -- a call to the function that
Make_A_Stepper returns modifies N, which is the key difference!

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-01 17:36                             ` Markus E Leypold
  2007-02-01 20:53                               ` Georg Bauhaus
@ 2007-02-05  0:39                               ` Robert A Duff
  2007-02-05  1:00                                 ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Robert A Duff @ 2007-02-05  0:39 UTC (permalink / raw)


Markus E Leypold
<development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> ...As I wrote elsewhere: All common programming languages are Turing
> complet,

OK.

>... so equivalent.

Equivalent in some sense.

>... There is nothing that can be done in one that
> could not be done in the other -- in principle.

No, I won't agree with the "nothing in one than the other" idea.
For example, you can read the clock in standard Ada,
but you cannot in standard Pascal.
That's "something that can be done" in one language but
not the other.

Another example: one cannot write the garbage collector of a Java
implementation in Java.

I'm sure you know this -- I'm just quibbling with your wording:
things that can/cannot be done, versus functions that can/cannot be
computed (by Turing machines).

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-05  0:23                                         ` Robert A Duff
@ 2007-02-05  0:55                                           ` Markus E Leypold
  2007-02-06  0:01                                             ` Robert A Duff
  2007-02-05  1:00                                           ` Ray Blaak
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-05  0:55 UTC (permalink / raw)



Robert A Duff <bobduff@shell01.TheWorld.com> writes:

> Markus E Leypold
> <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>
>> I do need that
>>
>>   function Make_a_Stepper (K:Integer) return ... is
>>
>>       N : Integer := Complicated_Function(K);
>>
>>       function Stepper(F:Foo) return Foo is
>>     
>>         begin
>>           
>>           return Foo + N;
>>
>>         end;
>>
>>     begin
>>       return Stepper;
>>     end;
>>
>> would work. And no nitpicking, please, if I made a syntax error here: The
>> intention should be clear.
>
> I don't think this is nitpicking:
>
> I'm not sure what you mean by the above example.  Is N intended to be
> constant or variable (in Ada terms -- that's immutable/mutable in FP
> terms)?

Ah, you see, that is the point with closures. If we were talking about
closures, N would be different for every invocation of Make_a_Stepper,
but every Stepper returned would refer to a constant N during its
lifetime. This is why they are called closures: they enclose their
environment at the instant of their instantiation, which in this case
would be (in a sense) between the invocation of Make_a_Stepper and
before the body of Make_a_Stepper is executed.


>
> Do you mean this:
>
>   function Make_a_Stepper (K:Integer) return ... is
>       N : constant Integer := Complicated_Function(K); -- I added "constant" here! ****************
>       function Stepper(F:Foo) return Foo is
>         begin
>           return Foo + N;
>         end;
>     begin
>       return Stepper;
>     end;
>
> Or this:
>
>   function Make_a_Stepper (K:Integer) return ... is
>       N : Integer := Complicated_Function(K);
>       function Stepper(F:Foo) return Foo is
>         begin
>           N := N + 1; -- I added a modification of N here! ****************
>           return Foo + N;
>         end;
>     begin
>       return Stepper;
>     end;
>
> ?
>
> These two cases of so-called "upward closure" seem very different to me.

Ah, yes, I see how that comes about, since we are not talking about
names and their bindings to values here, but about variables in Ada.
Actually I'd say both. The last case is not pure functional programming
(just for the record) and in OCaml would be expressed like this:

  let make_stepper k = 
      let n = ref (complicated k) 
      in
        (fun f -> n := !n + 1; f + !n)

and in the first case (just to show the difference)

  let make_stepper k = 
      let n = (complicated k) 
      in
        (fun f -> f + n)

So in (impure) FP both cases are possible. And if we go back to Ada
syntax from that: why not use the constant keyword to distinguish the
cases?

> (Both are of course illegal in Ada, but we're discussing what "ought" to be.)

Ehm. And I'm thankful for your rather insightful input :-).

> In the former case, we're returning a function that looks at some local
> value (the value of N).  In this case, your comment about returning
> integer values is quite correct:
>
>>> Closure don't "make things global". They do it as much as returning an
>>> Integer, makes, GOD FORBID!, a value global (don't return Integers
>>> people, that makes a value global and we all know, global is bad --
>>> how nonsensical is that?).
>
> Right.  What's the big deal?  Returning a function that looks at integer
> values is no worse than returning integer values.  Integer values live
> forever, and the value of N is certainly not tied to the lifetime of N.


> But in the latter case, the return of Stepper is causing the lifetime
> of an integer VARIABLE (i.e. mutable) to be longer than one might
> suspect for a local variable of Make_A_Stepper.  That seems like a
> problem to me.

It's still only encapsulated _state_; the visibility of identifiers
(scope) is not extended by that. Outsiders can't look into the
closure. And we already have encapsulated state like this, in tagged
objects. Closures differ from objects in only 2 points: (1) they have
only one method (apply) and (2) they can (implicitly) take values
from their whole environment at the place of instantiation.
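
That correspondence can be sketched in Python (an illustration only,
with made-up names): the closure and the one-method object encapsulate
exactly the same state.

```python
def make_counter(start):
    n = start
    def step():             # the closure: one implicit "apply"
        nonlocal n
        n += 1
        return n
    return step

class Counter:
    """The equivalent tagged-record-style object: the captured
    environment becomes an explicit field."""
    def __init__(self, start):
        self.n = start
    def __call__(self):     # the single "apply" method
        self.n += 1
        return self.n
```

Both give you a fresh, independent piece of mutable state per
instantiation; only the mechanism of capture differs.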

Indeed, the more I think about it: a bit of compiler support (which
knows which parts of the environment are actually used in the closure's
body), a new keyword ("closure", to distinguish them from ordinary
procedures) and all the mechanisms already present in tagged types
should be quite enough to implement closures in Ada. (Well, OK, there
are problems with sharing N in the non-constant case with other
closures and other procedures which can see N.) But in the constant
case (which is perhaps, in the presence of objects, the more important
one), a constant N can just be copied into an implementation object of
the type

   tagged record
     N : Integer;
   end record;



> You (Markus) seem to be an advocate of (possibly-upward) closures.

Certainly. As I said in another post of the recent article storm, I
_think_ (think, mind you, no hard data) that if too many things
(narrowing of types w/o RTTI during method invocation, the scope of
access types, the passing of procedural parameters) can only be "done
downwards", some desirable structures become impossible.

> And also an advocate of function programming.  

'functional'. :-).

> So how would you like a rule that says the former example is OK, but
> the latter is illegal?

I'd weep silently :-).

> (That is, upward closures are allowed only when you're doing functional
> (no side-effects) programming).


> Note that the latter is not functional -- a call to the function that
> Make_A_Stepper returns modifies N, which is the key difference!

Yes. That is so. But Ada is not functional anyway, and impure closures
have their value. Nonetheless: since the encapsulation of mutable
state can be done in tagged types, forbidding the latter would probably
not hurt as much as in languages w/o OO. Still, that requires further
analysis of the most frequent use cases.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-05  0:23                                         ` Robert A Duff
  2007-02-05  0:55                                           ` Markus E Leypold
@ 2007-02-05  1:00                                           ` Ray Blaak
  2007-02-05  1:19                                             ` Markus E Leypold
  2007-02-06  0:18                                             ` Robert A Duff
  1 sibling, 2 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-05  1:00 UTC (permalink / raw)


Robert A Duff <bobduff@shell01.TheWorld.com> writes:
> But in the latter case, the return of Stepper is causing the lifetime
> of an integer VARIABLE (i.e. mutable) to be longer than one might
> suspect for a local variable of Make_A_Stepper.  That seems like a
> problem to me.

Yes, the lifetime is longer (as specified by the programmer) but the global
visibility is not affected -- N still cannot be directly manipulated in a
global sense.

> You (Markus) seem to be an advocate of (possibly-upward) closures.
> And also an advocate of function programming.  So how would you like a
> rule that says the former example is OK, but the latter is illegal?
> (That is, upward closures are allowed only when you're doing functional
> (no side-effects) programming).

I like general closures akin to the style of Lisp. It is interesting that
JavaScript, for all its faults, at least has fully general closures.

C# delegates are now essentially the same thing, and have full GC support.

Java's are busted due to implementation laziness.

So why not have general closures in Ada? They are a very powerful
concept, and one still keeps control when needed, simply by not using
them where appropriate.

But to do closures properly, one really does need GC.
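
The reason shows up in even the smallest example: the environment of
the maker must outlive the call that created it, so it cannot live in a
stack frame that gets popped, and something has to decide when it
finally dies. A Python sketch (illustrative only):

```python
def make_adder(n):
    # n must survive after make_adder returns: the frame holding it
    # cannot simply be popped off the stack.  It stays alive as long
    # as some closure still references it -- exactly what GC tracks.
    return lambda x: x + n

add5 = make_adder(5)    # the environment is now reachable only
                        # through the returned closure
```

Without GC (or refcounting of frames), there is no safe point at which
the implementation could reclaim that environment.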

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-05  0:39                               ` Robert A Duff
@ 2007-02-05  1:00                                 ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-05  1:00 UTC (permalink / raw)




Robert A Duff <bobduff@shell01.TheWorld.com> writes:

> Markus E Leypold
> <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>
>> ...As I wrote elsewhere: All common programming languages are Turing
>> complet,
>
> OK.
>
>>... so equivalent.
>
> Equivalent in some sense.
>
>>... There is nothing that can be done in one that
>> could not be done in the other -- in principle.
>
> No, I won't agree with the "nothing in one than the other" idea.
> For example, you can read the clock in standard Ada,
> but you cannot in standard Pascal.
> That's "something that can be done" in one language but
> not the other.
>
> Another example: one cannot write the garbage collector of a Java
> implementation in Java.
>
> I'm sure you know this -- I'm just quibbling with your wording:
> things that can/cannot be done, versus functions that can/cannot be
> computed (by Turing machines).

You're also missing my point somewhat :-): That equivalence "in
principle" that every program in L1 has a program in L2 which does the
same computation (and I/O if you like) is not the point. The point is,
in practical programming, _how_ things are done. Not their result, but
how much pain is inflicted and how comprehensible the end result
is. How long in absolute terms the way from the problem mind set to
the implementation of a solution is.

But I take your point. It's less the language, though, than the
standard library (which in Pascal is not separable, I know).

Regards -- Markus.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-02  1:37                               ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
                                                   ` (2 preceding siblings ...)
  2007-02-02 21:50                                 ` in defense of GC (was Re: How come Ada isn't more popular?) Gautier
@ 2007-02-05  1:12                                 ` Robert A Duff
  2007-02-05  9:06                                   ` Ray Blaak
  3 siblings, 1 reply; 397+ messages in thread
From: Robert A Duff @ 2007-02-05  1:12 UTC (permalink / raw)


Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:

> GC is one of the modern advances of computing science, period, akin to the use
                   ^^^^^^
> of high level languages vs assembly, lexical scoping vs dynamic scoping,
> strong typing vs no typing, etc.

Modern?  GC is many years old.

> It should be used by default and turned off only in unusual situations.

Agreed that it should be "on" by default.

> Of course, in this group, those circumstances are probably the usual
> case, what with the use of Ada for realtime and embedded programming.
>
> Still, given recent realtime GC algorithms, I would consider the use of GC for
> any system I was responsible for, and would give it up reluctantly, only if
> timing and space constraints were too tight for a given application.

I've read many papers about so-called real-time GC, but I remain
skeptical.

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-05  1:00                                           ` Ray Blaak
@ 2007-02-05  1:19                                             ` Markus E Leypold
  2007-02-06  8:32                                               ` Ray Blaak
  2007-02-06  0:18                                             ` Robert A Duff
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-05  1:19 UTC (permalink / raw)



Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:

> But to do closures properly, one really does need GC.

My words. Or to be truthful: They could have been. :-)

BTW, you've been discussing (with ... I don't remember) the possibility
of switching GC off. My idea (even with ML-like languages) was always
to define a subset of the language which the compiler can implement
with a stack only. One could use this subset to embed real-time-capable
procedures (as prioritized tasks or as interrupt handlers) in a system
which as a whole (user interface) provides only "soft realtime"
(i.e. +/- garbage collection times). This should cover even most cases
with hard real time requirements.

Suitable subsets again could be compiled to pure C code which works
without GC (compile-time garbage collection) or without a heap. It's
the compiler's job (maybe with the help of annotations) to
discover/prove that the code can be translated like that.

Seems to me a more useful approach than explicit pointer management.

One advantage of that would be that one could start out with code that
requires GC but is "obviously" correct, and then transform it, in
correctness-preserving steps, into code that doesn't require GC. Both
implementations could be tested against each other, at least as far as
their computational function goes, if not the real-time capabilities
that are perhaps only possessed by the 2nd variant.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-04 22:47                                               ` Markus E Leypold
  2007-02-04 23:08                                                 ` Markus E Leypold
@ 2007-02-05  8:47                                                 ` Dmitry A. Kazakov
  2007-02-05 14:03                                                   ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-05  8:47 UTC (permalink / raw)


On Sun, 04 Feb 2007 23:47:29 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> Consider also this:
>>
>> protected type Bar is
>>    function Make_a_Mess return ...;
>> private
>>    Baz : Integer;
>>       -- To be put into the closure together with barriers of the entries!
>>
>> how do you return a closure from a protected object or a task rendezvous?
>> Will it be able to assign Baz?
> 
> Interesting thought. Either it must inherit the barriers, somehow, or
> you simply can't.

But proposing upward closures you have to be prepared to answer such
questions...
 
>  I
> still do not see how you would manage to pass procedures upwards which
> use variables that are defined in their environment.

As I said before, I don't want this feature in Ada; it is your wish, not
mine. For one, I don't know what "environment" is. It is up to you to
define it before it could be reasonably discussed. Further, I don't see
why the environment cannot be expressed as an object of a manifested
type (= the OO solution) and why it should be passed outwards.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC (was Re: How come Ada isn't more popular?)
  2007-02-05  1:12                                 ` Robert A Duff
@ 2007-02-05  9:06                                   ` Ray Blaak
  2007-02-06  0:28                                     ` in defense of GC Robert A Duff
  0 siblings, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-05  9:06 UTC (permalink / raw)


Robert A Duff <bobduff@shell01.TheWorld.com> writes:
> > GC is one of the modern advances of computing science, period, akin to the use
>                    ^^^^^^
> > of high level languages vs assembly, lexical scoping vs dynamic scoping,
> > strong typing vs no typing, etc.
> 
> Modern?  GC is many years old.

Well, sure. But you do realize that mainstream computing is slowly and
inexorably reimplementing good old fashioned Lisp, right? :-)

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-04 22:33                                       ` Markus E Leypold
@ 2007-02-05  9:20                                         ` Dmitry A. Kazakov
  2007-02-05 12:16                                           ` Harald Korneliussen
  2007-02-05 13:53                                           ` Markus E Leypold
  0 siblings, 2 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-05  9:20 UTC (permalink / raw)


On Sun, 04 Feb 2007 23:33:40 +0100, Markus E Leypold wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> They don't share subtrees they share memory (implementation), which is
>> casual to the program semantics.
> 
> They share representation in the underlying system / VM / whatever you
> call it.

Whatever. They are not shared in the sense of aliasing, when you have more
than one name for semantically same thing. 1+1 and 3-1 "share" 2. And this
is irrelevant to where 2 might be allocated or how it were encoded.

>>>> When a subtree C of a tree A changes, should it ("it" is an
>>>> identity (:-)) in the tree B be changed as well? 
>>> 
>>> Change? There is no "change" in a functional world, only the
>>> production of new values (i.e. new elements from a specific set, in
>>> your case the set of all trees).
>>
>> (New values? What is the difference between new and old values? Do values
>> carry RFID chips with timestamps? (:-))
> 
Don't play dumb. There is certainly an order between the values
occurring in a (functional) program.

Values "occur"? [ You cannot fix the sentence above, it is semantically
broken. ]

>> Consider a mapping between nodes of two trees which has to be
>> preserved. Now a function of the first tree is evaluated, what
>> happens with the function of the second tree?
> 
>>        g
>>  A  ------>  B
>>  |           |
>> f|           | id = nothing happens
>>  |           |
>>  V     g     V
>> f(A) ------> B (broken invariant)
> 
> The function of the second tree? What are you talking about? Let me
> repeat: In pure functional languages nothing "changes".

Consider A = tree, B = its maximal path. When f(A) "occur," [a node is
removed] what happens with B? 

Answer: nothing happens, the program is broken. The programmer has to
manually evaluate Max_Path(f(A)). "Maximal path of" is non-functional.

> It makes a
> difference if you pass an access to a variable instead of passing the
> variable as an in/out parameter and there is a difference if you
> pass it as an in parameter.

It does because access T and T are different types! T is not passed, access
T is passed (by reference or by value).

You could save much time if you just said that you wanted referential
objects (pointers). Period.

Now my question again, what prevents you from collecting objects accessed
via controlled referential objects?

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-02 13:57                           ` Markus E Leypold
  2007-02-03  9:44                             ` Dmitry A. Kazakov
@ 2007-02-05  9:59                             ` Maciej Sobczak
  2007-02-05 13:43                               ` Markus E Leypold
  2007-02-05 19:05                               ` Ray Blaak
  1 sibling, 2 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-05  9:59 UTC (permalink / raw)


Markus E Leypold wrote:

[I agree with what you say on historical perspective on language 
transitions and the probabilistic effects that languages have on 
newbies, so this part was cut.]

> If you just do f(s.c_str()) and f _is_ properly behaved, that is, only
> reads from the pointer or does a strdup(), everything is fine, but, I
> note, not thread safe. I wouldn't exclude the possibility that the
> resulting race condition is hidden in a nice number of C++ programs
> out there.

If you have a race condition because some thread is modifying a
string object *while* some other thread is using it, then you have a
heavy design problem. This is absolutely not related to interfacing with C.

> Your solution is thread safe, if the strings package is (which it
> wasn't in the past).

A strings package cannot make it any better, because the granularity of
thread-safety results from the program logic, not from the package
interface. String is too low-level (it's a general utility) to be
thread-safe in any useful sense. That's why: a) it should not be
thread-safe on its own, b) you still have a design problem.

Interestingly, Ada doesn't make it any better. Neither does Java. You
always need to coordinate threads/tasks/whatever on some higher
conceptual level than primitive string operations.


[about closures]

>> You can have it by refcounting function frames (and preserving some
>> determinism of destructors). GC is not needed for full closures, as
>> far as I perceive it (with all my misconceptions behind ;-) ).
> 
> Yes, one could do it like that. Ref-counting is rumoured to be
> inefficient

Which relates to cascading destructors, not to function frames.

> but if you don't have too many closure that might just
> work.

If you have too many closures, then well, you have too many closures. :-)

We've been talking not only about performance, but also about 
readability and maintenance. ;-)

>>> Furthermore I've been convinced that manual memory management hinders
>>> modularity.
> 
>> Whereas I say that I don't care about manual memory management in my
>> programs. You can have modularity without GC.
> 
> Certainly. But you can have more with GC.

In a strictly technical sense of the word, yes. But then there comes a
question about possible losses in other areas, like program structure or
clarity.

Being able to just drop things on the floor is a nice feature when 
considered in isolation, but not necessarily compatible with other 
objectives that must be met at the same time.

> People who don't have GC often say that they can do anything with
> manual memory management.

And I say that this is a misconception. I don't have/use GC and I don't
bother with *manual* memory management either. That's the point. In Ada
this point is spelled [Limited_]Controlled (it's a complete mess, but 
that's not the fault of the concept) and in C++ it's spelled automatic 
storage duration.
Today manual memory management is a low-level thingy that you don't have 
to care about, unless you *really* want to (and then it's really good 
that you can get it). And as I've already pointed out, in my regular 
programming manual memory management is a rarity.

On the other hand, most languages with GC get it wrong by relying *only* 
on GC, everywhere, whereas it is useful (if at all) only for memory. The 
problem is that few programs rely on only memory and in a typical case 
there are lots of resources that are not memory oriented and they have 
to be managed, somehow. When GC is a shiny center of the language, those 
other kinds of resources suffer from not having appropriate support. In 
practical terms, you don't have manual management of memory, but you 
have *instead* manual management of *everything else* and the result is 
either code bloat or more bugs (or both, typically).

Languages like Ada or C++ provide more general solution, which is 
conceptually not related to any kind of resource and can be therefore 
applied to every one. The result is clean, short and uniform code, which 
is even immune to extensions in the implementation of any class. Think 
about adding a non-memory resource to a class that was up to now only 
memory oriented - if it requires any modification on the client side, 
like adding tons of finally blocks and calls to 
close/dispose/dismiss/etc. methods *everywhere*, then in such a language 
the term "encapsulation" is a joke.
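
Even GC'd languages end up retrofitting exactly this scope-bound idiom
for non-memory resources. A Python sketch (names are illustrative, not
any real API) of cleanup tied to scope rather than to the collector:

```python
class Handle:
    """Illustrative non-memory resource: release must happen at a
    known point, not whenever a collector gets around to it."""
    def __init__(self):
        self.open = True          # acquire on construction

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.open = False         # release deterministically at scope exit
        return False

with Handle() as h:
    during = h.open               # True while the scope is active
after = h.open                    # False: released at end of scope
```

The `with` block plays the role of Ada's Controlled scope or C++'s
automatic storage duration: release is tied to program structure, not
to memory pressure.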

An ideal solution seems to be a mix of both (GC and automatic objects), 
but I think that the industry needs a few generations of failed attempts 
to get this mix right. We're not yet there.


>> OO is about encapsulation and polymorphism, these don't need
>> references everywhere.
> 
> Yes, but -- you want to keep, say, a list of Shape(s). Those can be
> Triangle(s), Circle(s) etc, which are all derived from class
> Shape. How do you store this list? An array of Shape'Class is out of
> question because of the different allocation requirements for the
> descendants of Shape(s).

Why should I bother?

Note also that I didn't say that references/pointers should be dropped. 
I say that you don't need them everywhere. That's a difference.


> I've decided, if I want to deliver any interesting functionality to
> the end user, my resources (developer time) are limited, I have to
> leave everything I can to automation (i.e. compilers, garbage
> collectors, even libraries), to be able to reach my lofty goals.

I also leave everything I can to automation. It's spelled 
[Limited_]Controlled in Ada and automatic storage duration in C++.
I cannot imagine reaching my lofty goals otherwise. ;-)

> The point is to know when to optimise, not to do it
> always.

I didn't even mention the word "optimization". I'm talking about structure.

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-05  9:20                                         ` Dmitry A. Kazakov
@ 2007-02-05 12:16                                           ` Harald Korneliussen
  2007-02-05 14:06                                             ` Dmitry A. Kazakov
  2007-02-05 13:53                                           ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Harald Korneliussen @ 2007-02-05 12:16 UTC (permalink / raw)



> On Sun, 04 Feb 2007 23:33:40 +0100, Markus E Leypold wrote:
> > The function of the second tree? What are you talking about? Let me
> > repeat: In pure functional languages nothing "changes".

On 5 Feb, 10:20, "Dmitry A. Kazakov" <mail...@dmitry-kazakov.de>
wrote:
> Consider A = tree, B = its maximal path. When f(A) "occur," [a node is
> removed] what happens with B?
>
> Answer: nothing happens, the program is broken.

With respect, Dmitry, this is nonsense. There is no way of removing a
node from A at all (this is a feature of pure functional languages,
like Haskell and Clean), and so you are sure that B will always be the
maximal path of A. A can not be redefined. What you can do is define a
new tree, which is a function of the old tree. Naturally, this new
tree may have another maximal path. You will not find that path in B,
as B is the path of A, not the path of f(A)! That A and f(A) may share
structure in the implementation is irrelevant for the correctness of
the program. It is relevant for memory use, certainly, but since the
structures are immutable, it does not affect correctness, and the
program is absolutely not "broken".

> The programmer has to manually evaluate Max_Path(f(A)). "Maximal path of" is non-functional.

"Maximal path of" is a function. It takes a tree, and returns the
maximal path of that tree. It does not take a reference to a mutable
data structure or something. If it did, it would not be a function,
because it could give different results when passed the same
reference. But yes, once you have a new value, this does not update
your old values in any way, so perhaps you have to "manually" derive
some further values from this.

(It's also quite reasonable to speak of "newer" and "older" values in
functional languages, as in mathematics. If value B is derived from
value A, of course value A must be computed first, and A is the "old"
value, B is the "new". A is still very much around, however!)
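
The sharing in question can be made concrete with a tiny persistent
tree (a Python sketch, illustrative only): "inserting" into A yields a
new tree that shares the untouched subtrees, while A itself never
changes, so anything derived from A stays valid.

```python
from collections import namedtuple

Node = namedtuple("Node", "value left right")   # immutable tree node

def insert(t, v):
    """Return a NEW tree containing v; t is left untouched.
    Unchanged subtrees are shared between the old and new tree."""
    if t is None:
        return Node(v, None, None)
    if v < t.value:
        return Node(t.value, insert(t.left, v), t.right)
    return Node(t.value, t.left, insert(t.right, v))

a = insert(insert(insert(None, 5), 3), 8)   # the tree A
b = insert(a, 9)                            # f(A): a new value
shared = b.left is a.left                   # untouched subtree is shared
a_intact = a.right.right is None            # A itself was never modified
```

The sharing is purely an implementation matter: no operation can
observe it through mutation, because there is none.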




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-05  9:59                             ` Maciej Sobczak
@ 2007-02-05 13:43                               ` Markus E Leypold
  2007-02-06  9:15                                 ` Maciej Sobczak
  2007-02-05 19:05                               ` Ray Blaak
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-05 13:43 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> Markus E Leypold wrote:
>
> [I agree with what you say on historical perspective on language
> transitions and the probabilistic effects that languages have on
> newbies, so this part was cut.]
>
>> If you just do f(s.c_str()) and f _is_ properly behaved, that is, only
>> reads from the pointer or does a strdup(), everything is fine, but, I
>> note, not thread safe. I wouldn't exclude the possibility that the
>> resulting race condition is hidden in a nice number of C++ programs
>> out there.
>
> If you have a race condition because some thread is modifying a
> string object *while* some other thread is using it, then you have a
> heavy design problem. This is absolutely not related to interfacing
> with C.

Yes, I realize that. Still, providing a pointer to the inner state
of an object, one that only stays valid until I next touch the
object, is not only not thread safe (which I realize is not the
problem here) but also not type safe: an error in programming leads
to erroneous execution, i.e. reading and writing invalid memory. That
is worse.

I think there was talk once about a thread safe string library, but at
the moment I fail to see how that relates to the problem in question.
>
>> Your solution is thread safe, if the strings package is (which it
>> wasn't in the past).
>
> Strings package cannot make it any better, because the granularity of
> thread-safety results from the program logic, not from the package
> interface. String is too low-level (it's a general utility) to be
> thread-safe in any useful sense. That's why: a) it should not be
> thread-safe on its own, b) you still have a design problem.

Yes. I realize that. Don't know what made me write that :-).

> Interestingly, Ada doesn't make it any better. Neither does Java. You
> always need to coordinate threads/tasks/whatever on some higher
> conceptual level than primitive string operations.

So forgive me. Let's ditch the thread safety aspect and instead:
Giving pointers to internal state of objects violates (a)
encapsulation (it fixes a specific implementation) and (b) is not type
safe. I'm sure we can hang c_str() on account of this charge alone and
can drop the thread-unsafety allegation.

> [about closures]
>
>>> You can have it by refcounting function frames (and preserving some
>>> determinism of destructors). GC is not needed for full closures, as
>>> far as I perceive it (with all my misconceptions behind ;-) ).
>> Yes, one could do it like that. Ref-counting is rumoured to be
>> inefficient
>
> Which relates to cascading destructors, not to function frames.

My impression was that it relates to both, especially since the two
are interlocked: in a world with closures, objects (from OO) and
variables can refer to closures (function frames) and vice versa.

>> but if you don't have too many closure that might just
>> work.
>
> If you have too many closures, then well, you have too many closures. :-)

Yes :-). Only in a ref counted implementation even too many might not
be enough.

> We've been talking not only about performance, but also about
> readability and maintenance. ;-)

Of this thread? :-)

>
>>>> Furthermore I've been convinced that manual memory management hinders
>>>> modularity.
>>
>>> Whereas I say that I don't care about manual memory management in my
>>> programs. You can have modularity without GC.
>> Certainly. But you can have more with GC.
>
> In a strictly technical sense of the word, yes. But then there comes a
> question about possible loses in other areas, like program structure
> or clarity.

I think the absence of manual memory management code actually furthers
clarity.

>
> Being able to just drop things on the floor is a nice feature when
> considered in isolation, but not necessarily compatible with other
> objectives that must be met at the same time.

Which?


>> People who don't have GC often say that they can do anything with
>> manual memory management.
>
> And I say that this is misconception. I don't have/use GC and I don't
> bother with *manual* memory management neither. That's the point. In
> Ada this point is spelled [Limited_]Controlled (it's a complete mess,
> but that's not the fault of the concept) and in C++ it's spelled
> automatic storage duration.

My impression was that Ada Controlled storage is actually quite a
clean concept compared to C++ storage duration.

But both tie allocation to program scope, synchronous with a stack. I
insist that is not always desirable: it rules out some architectures,
especially those where OO abounds.

The problem with Controlled, BTW, is that it seems to interact with
the rest of the language in such a way that GNAT didn't get it right
even after ~10 years of development. Perhaps difficult w/o a formal
semantics.

> Today manual memory management is a low-level thingy that you don't
> have to care about, unless you *really* want to (and then it's really
> good that you can get it). And as I've already pointed out, in my
> regular programming manual memory management is a rarity.

> On the other hand, most languages with GC get it wrong by relying
> *only* on GC, everywhere, whereas it is useful (if at all) only for
> memory. 

I've heard that complaint repeatedly, but still do not understand it.

> The problem is that few programs rely on only memory and in a
> typical case there are lots of resources that are not memory oriented
> and they have to be managed, somehow. 

> When GC is a shiny center of the
> language, those other kinds of resources suffer from not having
> appropriate support. In practical terms, you don't have manual
> management of memory, but you have *instead* manual management of
> *everything else* and the result is either code bloat or more bugs (or
> both, typically).

Now, now. Having GC doesn't preclude you from managing resources
unrelated to memory in a manual fashion. Apart from that, languages
with GC often provide nice tricks to tie external resources to their
memory proxy and ditch them when the memory proxy is unreachable
(i.e. the program definitely won't use the external resource any
longer). Examples: IO channels (only sometimes useful), temporary
files, file locks, shared memory allocations. Even if you manage
resources manually, GC still limits the impact of leaks. And BTW - in
functional languages you can do more against resource leaks, since
you can "wrap" functions:


  (with_file "output" (with_file "out_put" copy_data))

It's not always done, but a useful micro pattern.
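The "wrap" micro-pattern, and the trick of tying an external resource to its memory proxy, can both be sketched like this (Python for illustration; `with_file` and `TempResource` are made-up names, not from any library):

```python
import os
import tempfile
import weakref

def with_file(path, mode, action):
    # The wrap pattern: open, run the action, and GUARANTEE the close,
    # so no caller can leak the handle.
    f = open(path, mode)
    try:
        return action(f)
    finally:
        f.close()

class TempResource:
    # Tying an external resource (a temp file) to its memory proxy:
    # when the proxy becomes unreachable, the finalizer removes the file.
    def __init__(self):
        fd, self.path = tempfile.mkstemp()
        os.close(fd)
        self._finalizer = weakref.finalize(self, os.remove, self.path)

fd, path = tempfile.mkstemp()
os.close(fd)
with_file(path, "w", lambda f: f.write("data"))
assert with_file(path, "r", lambda f: f.read()) == "data"
os.remove(path)

r = TempResource()
p = r.path
del r                    # proxy unreachable; in CPython the finalizer runs at once
assert not os.path.exists(p)
```

Note that the `del r` timing relies on CPython's reference counting; a tracing collector would reclaim the proxy, and thus the file, only eventually.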

> Languages like Ada or C++ provide more general solution, which is
> conceptually not related to any kind of resource and can be
> therefore applied to every one.

Since you're solving a problem here which I deny exists, I
can't follow you here. But I notice that

 "Languages like C provide a more general solution (with regard to
  accessing memory), which is conceptually not related to any kind of
  fixed type system and can therefore implement any type and data model"

would become a valid argument if I agreed with you. It's the
generality we are getting rid of during the evolution of programming
languages. Assembler is the "most general" solution, but we are
getting structured programming, type systems and finally garbage
collection.


> The result is clean, short and uniform code,
> which is even immune to extensions in the implementation of any
> class. Think about adding a non-memory resource to a class that was
> up to now only memory oriented - if it requires any modification on
> the client side, like adding tons of finally blocks and calls to
> close/dispose/dismiss/etc. methods *everywhere*, then in such a
> language the term "encapsulation" is a joke.

Well, you think Ada here. In an FP I write (usually) something like:

     with_lock "/var/foo/some.lck" (fun () -> do_something1 (); do_something2 param;  ...).

The fact that Ada and C++ don't have curried functions and cannot
construct unnamed functions or procedures is really limiting in this
case and probably causal to your misconception that it would be
necessary to add tons of exception handling on the client side.

And BTW: In Ada I would encapsulate the resource in a Controlled
object (a resource proxy or handle) and get the same effect (tying it
to a scope). Indeed I have already done so, to make a program which
uses quite a number of locks remove its locks when it terminates or
crashes. Works nicely.
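The same proxy-tied-to-a-scope idea can be sketched outside Ada as well (Python here; `LockFile` is an illustrative name, and `__enter__`/`__exit__` play the role of Controlled's Initialize/Finalize):

```python
import os
import tempfile

class LockFile:
    # A resource proxy whose cleanup is tied to a scope: the lock file
    # disappears on normal exit AND on an exceptional one.
    def __init__(self, path):
        self.path = path
    def __enter__(self):
        fd = os.open(self.path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return self
    def __exit__(self, exc_type, exc, tb):
        os.remove(self.path)   # scope exit: release the lock
        return False           # do not swallow the exception

lock_path = os.path.join(tempfile.mkdtemp(), "some.lck")
try:
    with LockFile(lock_path):
        assert os.path.exists(lock_path)   # lock held inside the scope
        raise RuntimeError("simulated crash")
except RuntimeError:
    pass
assert not os.path.exists(lock_path)       # lock removed despite the "crash"
```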

> An ideal solution seems to be a mix of both (GC and automatic
> objects), but I think that the industry needs a few generations of
> failed attempts to get this mix right. We're not yet there.


>>> OO is about encapsulation and polymorphism, these don't need
>>> references everywhere.
>> Yes, but -- you want to keep, say, a list of Shape(s). Those can be
>> Triangle(s), Circle(s) etc, which are all derived from class
>> Shape. How do you store this list? An array of Shape'Class is out of
>> question because of the different allocation requirements for the
>> descendants of Shape(s).

> Why should I bother?
>
> Note also that I didn't say that references/pointers should be
> dropped. I say that you don't need them everywhere. That's a
> difference.

OK, so you need them _almost_ everywhere :-). I take your point.


>> I've decided, if I want to deliver any interesting functionality to
>> the end user, my resources (developer time) are limited, I have to
>> leave everything I can to automation (i.e. compilers, garbage
>> collectors, even libraries), to be able to reach my lofty goals.
>
> I also leave everything I can to automation. It's spelled
> [Limited_]Controlled in Ada and automatic storage duration in C++.
> I cannot imagine reaching my lofty goals otherwise. ;-)

Good. 'Controlled' buys you a lot in Ada, but there are 2 problems

 (a) AFAIS (that is still my hypothesis), binding storage to scope is
     not always possible (esp. when doing GUIs, MVC and the
     like). I cannot prove it, but from what I have experienced I'm
     rather convinced of it.

 (b) AFAIR there are restrictions on _where_ I can define controlled
     types. AFAIR that was a PITA.


>> The point is to know when to optimise, not to do it
>> always.

> I didn't even mention the word "optimization". I'm talking about structure.

OK. But how does a program become less structured by removing the
manual memory management? The GC is not magically transforming the
program into spaghetti code ...

Regards -- Markus





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-05  9:20                                         ` Dmitry A. Kazakov
  2007-02-05 12:16                                           ` Harald Korneliussen
@ 2007-02-05 13:53                                           ` Markus E Leypold
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-05 13:53 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Sun, 04 Feb 2007 23:33:40 +0100, Markus E Leypold wrote:
>
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> 
>>> They don't share subtrees they share memory (implementation), which is
>>> casual to the program semantics.
>> 
>> They share representation in the underlying system / VM / whatever you
>> call it.
>
> Whatever. They are not shared in the sense of aliasing, when you have more
> than one name for semantically same thing. 1+1 and 3-1 "share" 2. And this
> is irrelevant to where 2 might be allocated or how it were encoded.
>
>>>>> When a subtree C of a tree A changes, should it ("it" is an
>>>>> identity (:-)) in the tree B be changed as well? 
>>>> 
>>>> Change? There is no "change" in a functional world, only the
>>>> production of new values (i.e. new elements from a specific set, in
>>>> your case the set of all trees).
>>>
>>> (New values? What is the difference between new and old values? Do values
>>> carry RFID chips with timestamps? (:-))
>> 
>> Don't play dumb. There is certainly an order between values occuring in
>> a (functional) program.

If we let x = f y, I suggest that the parlance "x occurs after y" in the
sense of "order in which values are produced during evaluation" is
certainly legal.

>
> Values "occur"? [ You cannot fix the sentence above, it is semantically
> broken. ]

You're playing dumb. Can't help you there.

>>> Consider a mapping between nodes of two trees which has to be
>>> preserved. Now a function of the first tree is evaluated, what
>>> happens with the function of the second tree?
>> 
>>>        g
>>>  A  ------>  B
>>>  |           |
>>> f|           | id = nothing happens
>>>  |           |
>>>  V     g     V
>>> f(A) ------> B (broken invariant)
>> 
>> The function of the second tree? What are you talking about? Let me
>> repeat: In pure functional languages nothing "changes".
>
> Consider A = tree, B = its maximal path. When f(A) "occur," [a node is
> removed] what happens with B? 

'occur' -- your way to read what I said is certainly semantically
broken.


> Answer: nothing happens, the program is broken. The programmer has to
> manually evaluate Max_Path(f(A)). "Maximal path of" is non-functional.

LOL.

>
>> It makes a
>> difference if you pass an access to a variable instead of passing the
>> variable as an in/out parameter and there is a difference if you
>> pass it as an in parameter.
>
> It does because access T and T are different types! T is not passed, access
> T is passed (by reference or by value).
>
> You could save much time if you just said that you wanted referential
> objects (pointers). Period.
>
> Now my question again, what prevents you from collecting objects accessed
> via controlled referential objects?

I do not even know any more what you're talking about. You introduced
the "references don't matter, it's a question of implementation, not
semantics". Since you also fail to understand how representation
sharing is implemented in (most) functional languages, you are bound to
fail to understand how that makes GC a necessity in FP. That AFAIR was
my last train of clear thought in this sub-thread. What that has to do
with "controlled referential objects" I don't know, since I certainly
didn't talk about them.

I now have to go to my brain engineer and let him overhaul some areas
of my brain. Talking to you has, sometimes, a rather destructive
effect.


Regards -- Markus






^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-05  0:06                                       ` Markus E Leypold
@ 2007-02-05 13:58                                         ` Georg Bauhaus
  2007-02-05 14:23                                           ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-05 13:58 UTC (permalink / raw)


On Mon, 2007-02-05 at 01:06 +0100, Markus E Leypold wrote:
> 
> Hi George,

> 
> First you state your dissatisfaction with type inference and the
> typical functional style of programming in highly subjective terms.

I can only speculate on how you can transform what I have written
into what you say I have written. I must try harder to choose my words
carefully. I will provide some simple facts and a frame of reference
for a reasonable discussion of why and when "explicitly referring" (Ada
style) is better than "inferred from context" (FP style). It will take
some time to sort out the arguments in more approachable terms.

The arguments won't stop me from using FP languages. They won't stop
me from arguing about the costly effects of syntax either.

For now:
Since Ada was born out of an attempt to provide a programming language
specifically addressing the needs of production in a large organization,
paying attention to syntax vis-a-vis available industry-standard
programmers has been an important design goal. Obviously.
I think it still is.

It is therefore necessary to always refer to how a language is
apparently used. Not to how you can use a language.






^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-05  8:47                                                 ` Dmitry A. Kazakov
@ 2007-02-05 14:03                                                   ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-05 14:03 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Sun, 04 Feb 2007 23:47:29 +0100, Markus E Leypold wrote:
>
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>> 
>>> Consider also this:
>>>
>>> protected type Bar is
>>>    function Make_a_Mess return ...;
>>> private
>>>    Baz : Integer;
>>>       -- To be put into the closure together with barriers of the entries!
>>>
>>> how do you return a closure from a protected object or a task rendezvous?
>>> Will it be able to assign Baz?
>> 
>> Interesting thought. Either it must inherit the barriers, somehow, or
>> you simply can't.
>
> But proposing upward closures you have to be prepared to answer such
> questions...

Since I'm not on the standards committee and don't get paid for it, I
can afford such sloppiness. But just to set the record straight: I've
been talking about the general usefulness of closures (in languages),
in the context of general considerations of what other languages have
that might make them more popular than our beloved Ada, rather than
about the definition of closure (after your slip-up concerning "but
this makes it global" this was necessary).

I've never tried to suggest that it would be easy to fit closures back
into Ada, nor did I suggest to do so in the short term. After all, the
next standard probably doesn't happen until 2020 or so ... :-).


>>  I
>> still do not see how you would manage to pass procedures upwards which
>> use variables that are defined in their environment.
>
> As I said before, I don't want this feature in Ada, it is your wish, not

It was your suggestion that the code you proposed was some substitute
for closures.

> mine. 

> For all I don't know what is "environment." 

Exactly.

> It is up to you to define it, before it could be reasonably
> discussed. Further I don't see why environment cannot be expressed
> as an object of manifested type (=OO solution) and why it should be
> passed outwards.

Well, you see, we (that is: you, in this case :-) are missing some
groundwork on programming language design and implementation. So
necessarily we talk at cross purposes (closures cannot be grasped in
Ada terms). And I don't want to start a beginner's tutorial on FP here
on c.l.a. of all things.

So I'll have to leave it at that, but -- as always -- I'm available
for targeted questions.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-05 12:16                                           ` Harald Korneliussen
@ 2007-02-05 14:06                                             ` Dmitry A. Kazakov
  0 siblings, 0 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-05 14:06 UTC (permalink / raw)


On 5 Feb 2007 04:16:05 -0800, Harald Korneliussen wrote:

>> On Sun, 04 Feb 2007 23:33:40 +0100, Markus E Leypold wrote:
>>> The function of the second tree? What are you talking about? Let me
>>> repeat: In pure functional languages nothing "changes".
> 
> On 5 Feb, 10:20, "Dmitry A. Kazakov" <mail...@dmitry-kazakov.de>
> wrote:
>> Consider A = tree, B = its maximal path. When f(A) "occur," [a node is
>> removed] what happens with B?
>>
>> Answer: nothing happens, the program is broken.
> 
> With respect, Dmitry, this is nonsense. There is no way of removing a
> node from A at all (this is a feature of pure functional languages,

Yes, it is broken per feature of FP. (:-))

[...]

Note there is nothing wrong with that. No language keeps all invariants all
the time. The point is that what you describe as sharing etc. is not
automatically the program semantics. It is a language feature given to you to
express the semantics, like the relation above.

A and B above are functions:

A : S -> T
B : S -> T

Here S is the set of execution states. T is the set of trees. "B is max
path of A" is a constraint:

forall s in S
   B(s) = Max_Path (A(s))

>> The programmer has to manually evaluate Max_Path(f(A)). "Maximal path of" is non-functional.
> 
> "Maximal path of" is a function.

In this context it is a relation between values of two objects. The
constraint is that this relation should hold in all observable execution
states.

> (It's also quite reasonable to speak of "newer" and "older" values in
> functional languages, as in mathematics. If value B is derived from
> value A, of course value A must be computed first, and A is the "old"
> value, B is the "new". A is still very much around, however!)

Really? What is newer, e or pi? (:-))

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-05 13:58                                         ` Georg Bauhaus
@ 2007-02-05 14:23                                           ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-05 14:23 UTC (permalink / raw)



Georg Bauhaus <bauhaus@arcor.de> writes:

> On Mon, 2007-02-05 at 01:06 +0100, Markus E Leypold wrote:
>> 
>> Hi George,
>
>> 
>> First you state your dissatisfaction with type inference and the
>> typical functional style of programming in highly subjective terms.
>
> I can only speculate on how you can transform what I have written
> into what you say I have written. I must try harder to choose my words
> carefully. 

The way you say this, it becomes a rather one-sided imposition of
interpretation on my part. Actually I only wrote my last response not
to just impolitely drop the discussion w/o further comment, since I
was, am, absolutely dissatisfied with it. I also tried honestly to
understand what went wrong ... but now I fear that just became a new
reproach in your eyes. Well -- we can't be perfect.

Let me state it like this: if you want to continue the discussion, I
must have some open "corridor" for arguments left, not just the
alternatives of shutting up or being the "typical FP advocate".

> I will provide some simple facts and a frame of reference
> for a reasonable discussion of why and when "explicitly referring" (Ada
> style) is better than "inferred from context" (FP style). It will take
> some time to sort out the arguments in more approachable terms.

> The arguments won't stop me from using FP languages. 

I don't want to stop you. I only had the impression you don't want
them.

> They won't stop me from arguing about the costly effects of syntax
> either.

Well -- it will stop me. I'm not interested in syntax discussions, since
I don't experience detrimental effects. Neither from ML syntax nor
from Ada syntax. I'm adaptable and as Appel consider most if not all
syntax discussion longer than a paragrap a complete waste of time.

> For now: Since Ada was born out of an attempt to provide a
> programming language specifically addressing the needs of production
> in a large organization, paying attention to syntax vis-a-vis
> available industry standard programmers has been an important design
> goal. Obviously.

My suggestion (at the beginning of this thread) was that Ada in this
respect reflects the state of the art 20 years ago. Not necessarily
bad, but (a) application areas have changed and (b) the competitors
have changed. So, my suggestion was, perhaps the ecological niche in
which Ada lives has become more narrow: Those who want low level use C
(either misguided or very selectively) and those who want high level
use languages with more possibilities for abstraction, other type
systems and GC.

> I think it still is.

Is born :-)? 

> It is therefore necessary to always refer to how a language is
> apparently used. Not to how you can use a language.

OK. Ada is used in embedded programming. So the OP's question must be
answered with "Ada is not more popular because embedded programming is
not more popular". I regret to say that this doesn't answer the
question, obviously. The OP's question was about "could have beens".

But I was serious when I said, I'm tired of this discussion, so I
_will_ stop here. I can't afford the time and see (momentarily) no
useful goal served either.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-04 23:08                                                 ` Markus E Leypold
@ 2007-02-05 15:57                                                   ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-05 15:57 UTC (permalink / raw)



Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> Hi Dmitry, 
>
> I've just been reconsidering this thread, which, in my news reader,
> has become quite unmanageable. I've written (or at least tried to hint
> at) in another response that I somehow doubt the sense in continuing this
> discussion. I'd like to reinforce that for a last time.
>
> E.g. you write some hardly understandable statement that somehow mixes
> up memory lifetimes with visibility and scoping (which already made
> me doubt you have any idea what "encapsulation" really means) ..., 
>
>
>> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
>
>> > The second issue is less and less relevant as Randy pointed out. The first
>> > issue is always relevant. It is a good design to consider where an object
>> > exists. GC [and upward closures] is an attitude of making everything
>> > potentially global. In fact it is worse than just global. It is "I don't
>> > know where I need that damn thing."
>
> but what is worse, you get refuted by Ray Blaak, ...
>
>> GC does not affect visibility or scoping. That is an orthogonal issue, and
>> still quite properly under the control of the programmer. Upward closures do
>> not make an object more global unless the closure explicitly exposes the
>> object for external use.
>
> .. and in more word, but not as well formulated, by me. But do we get
> any feedback on that? No, instead we just continue the discussion with
> some other quirky notion if yours which until then was a side issue.

=> of yours.

Sorry.

>
>>> I am a long proponent of procedural types for Ada. Now consider. A
>>> procedure is a limited object. Limited objects can be returned in Ada 2005.
>>> What else you need? [Note, this is still not an upward closure, I am
>>> opposing.] 
>
>>> I certainly prefer OO solution because it explicitly maintains the context
>>> of the operation. It is easier to understand for the reader and it easier
>>> to maintain.
>
> And similar things happen in all sub-threads spawned by you or in
> which you participate.
>
> Does it make sense under the circumstances to continue the discussion?
> I don't think so. 
>
> And I fear to become a crank myself by just arguing OT (!!) notions in c.l.a.
> (and that includes my excursions into GC, FP and typesystems).
>
> So I'm calling quits to this discussion (at least as far as my
> participation goes).
>
> Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-05  9:59                             ` Maciej Sobczak
  2007-02-05 13:43                               ` Markus E Leypold
@ 2007-02-05 19:05                               ` Ray Blaak
  1 sibling, 0 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-05 19:05 UTC (permalink / raw)


Maciej Sobczak <no.spam@no.spam.com> writes:
> On the other hand, most languages with GC get it wrong by relying *only* on
> GC, everywhere, whereas it is useful (if at all) only for memory. The problem
> is that few programs rely on only memory and in a typical case there are lots
> of resources that are not memory oriented and they have to be managed,
> somehow. When GC is a shiny center of the language, those other kinds of
> resources suffer from not having appropriate support. In practical terms, you
> don't have manual management of memory, but you have *instead* manual
> management of *everything else* and the result is either code bloat or more
> bugs (or both, typically).

This is a fair criticism, actually. I am not sure what the state of the art is
with regard to expanding the concept of a garbage collector to a more general
resource manager, of which memory is only a particular kind of resource.

> Think about adding a non-memory resource to a class that was up to now only
> memory oriented - if it requires any modification on the client side, like
> adding tons of finally blocks and calls to close/dispose/dismiss/etc.
> methods *everywhere*, then in such a language the term "encapsulation" is a
> joke.

I do a lot of coding, and I simply do not find this to be such a prevalent
problem in practice. The most common case is to do with the closing of open
files in a finally block, but one only does the open in a few places and the
majority of the code is not concerned with cleanup at all.

The other common case is getting access to system drawing contexts that must
be explicitly disposed of. Again, one tends to get it only in a few places,
and the occurrences of explicit clean up are few.

Still, when I use C++ after using C# or Java, I find myself immediately taking
advantage of the scope-based destructors for these situations.

> An ideal solution seems to be a mix of both (GC and automatic objects), but I
> think that the industry needs a few generations of failed attempts to get this
> mix right. We're not yet there.

Yes, when you need scope based cleanup, proper controlled types are useful.
We will see how it goes.

Still, with the existence of proper closures in a language, one can get scope
clean up already. In the C# example below, I can define a file open/close
helper that lets the grunt-work cleanup be specified only once, such that I
pass an "action" closure to perform the actual work on an open file (think of
"delegate" as closure/lambda/procedure, etc.):

  string path = "C:/temp/someFile.xml";
  string xmlData = null;
  WithOpenFile
    (path, 
     delegate (TextReader reader) 
     {
       xmlData = reader.ReadToEnd(); // note we can modify the local variable
     });

where WithOpenFile is defined as:

  public delegate void FileReader(TextReader reader);

  public void WithOpenFile(string path, FileReader fileReader)
  {
    TextReader reader = new StreamReader(path); // TextReader is abstract; StreamReader opens the file
    try
    {
       fileReader(reader);
    }
    finally
    {
       reader.Close();
    }
  }
-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-05  0:55                                           ` Markus E Leypold
@ 2007-02-06  0:01                                             ` Robert A Duff
  2007-02-06  1:06                                               ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Robert A Duff @ 2007-02-06  0:01 UTC (permalink / raw)


Markus E Leypold
<development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:

> Robert A Duff <bobduff@shell01.TheWorld.com> writes:
>
>> Markus E Leypold
>> <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>>
>>> I do need that
>>>
>>>   function Make_a_Stepper (K:Integer) return ... is
>>>
>>>       N : Integer := Complicated_Function(K);
>>>
>>>       function Stepper(F:Foo) return Foo is
>>>     
>>>         begin
>>>           
>>>           return Foo + N;
>>>
>>>         end;
>>>
>>>     begin
>>>       return Stepper;
>>>     end;
>>>
>>> would work. And no nitpicking, please, if I made syntax error here: The
>>> intention should be clear.
>>
>> I don't think this is nitpicking:
>>
>> I'm not sure what you mean by the above example.  Is N intended to be
>> constant or variable (in Ada terms -- that's immutable/mutable in FP
>> terms)?
>
> Ah, you see, that is the point with closures. If we were talking about
> closures, N would be different for every invocation of Make_Stepper,
> but every Stepper returned would refer to a constant N during its
> lifetime.

The VALUE of N is different for every....

>... This is why they are called closures (they enclose their
> environment at the instant of their instantiation, which in this case
> would be (in a sense) between the invocation of Make_Stepper and
> before the body of Make_Stepper is executed).

Agreed.

>> Do you mean this:
>>
>>   function Make_a_Stepper (K:Integer) return ... is
>>       N : constant Integer := Complicated_Function(K); -- I added "constant" here! ****************
>>       function Stepper(F:Foo) return Foo is
>>         begin
>>           return Foo + N;
>>         end;
>>     begin
>>       return Stepper;
>>     end;
>>
>> Or this:
>>
>>   function Make_a_Stepper (K:Integer) return ... is
>>       N : Integer := Complicated_Function(K);
>>       function Stepper(F:Foo) return Foo is
>>         begin
>>           N := N + 1; -- I added a modification of N here! ****************
>>           return Foo + N;
>>         end;
>>     begin
>>       return Stepper;
>>     end;
>>
>> ?
>>
>> These two cases of so-called "upward closure" seem very different to me.
>
> Ah, yes, I see how that comes about, since we are not talking names
> and their bindings to values here, but variables in Ada. Actually I'd
> say both. The last case is not pure functional programming (just for
> the record) and in OCaml would be expressed like this:
>
>   let make_stepper k = 
>       let n = ref (complicated k) 
>       in
>         (fun f -> n := !n + 1; f + !n)
>
> and in the first case (just to show the difference)
>
>   let make_stepper k = 
>       let n = (complicated k) 
>       in
>         (fun f -> f + n)
>
> So in (impure) FP both cases are possible and if we go back to Ada
> syntax from that: Why not use the constant keyword to distinguish the
> cases?

Because in Ada I want local variables (local in visibility and local in
lifetime).  I think the Ada equivalent of OCaml's non-functional
"let n = ref (complicated k)" would be:

    type Ref_Integer is access Integer;
    N: constant Ref_Integer := new Integer'(Complicated_Function(K));

That makes it clear that the integer variable we're talking about is an
object allocated on the heap and therefore has who-knows-what lifetime.
I have no problem with forming a closure that can see N (a constant)
and passing that closure outward.

My point is: I want to understand the lifetime of a variable (Ada term)
by looking at its declaration.  I don't care about the lifetimes of
constants/values -- they can be copied willy-nilly anyway.
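
For concreteness, the two cases look like this in any language with full
closures; here is a Python sketch (complicated is a hypothetical stand-in
for Complicated_Function):

```python
def complicated(k):
    # hypothetical stand-in for Complicated_Function
    return k * 10

def make_stepper_const(k):
    n = complicated(k)      # only the VALUE of n matters; it is never mutated
    def stepper(f):
        return f + n
    return stepper

def make_stepper_mut(k):
    n = complicated(k)      # the VARIABLE n outlives the call: each stepper call mutates it
    def stepper(f):
        nonlocal n
        n += 1
        return f + n
    return stepper
```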

>> (Both are of course illegal in Ada, but we're discussing what "ought" to be.)
>
> Ehm. And I'm thankful for your rather insightful input :-).

Thanks for saying so.  And I'm thankful to you for helping me understand
these conceptual issues.  :-)

>> In the former case, we're returning a function that looks at some local
>> value (the value of N).  In this case, your comment about returning
>> integer values is quite correct:
>>
>>>> Closures don't "make things global". They do it as much as returning an
>>>> Integer makes, GOD FORBID!, a value global (don't return Integers,
>>>> people; that makes a value global, and we all know global is bad --
>>>> how nonsensical is that?).
>>
>> Right.  What's the big deal?  Returning a function that looks at integer
>> values is no worse than returning integer values.  Integer values live
>> forever, and the value of N is certainly not tied to the lifetime of N.
>
>
>> But in the latter case, the return of Stepper is causing the lifetime
>> of an integer VARIABLE (i.e. mutable) to be longer than one might
>> suspect for a local variable of Make_A_Stepper.  That seems like a
>> problem to me.
>
> It's still only encapsulated _state_; the visibility of identifiers
> (scope) is not extended by that.

Right.  We're talking about lifetimes.  Some folks in this thread may
have said "scope" when they meant "lifetime".

>...Outsiders can't look into the
> closure. And we already encapsulate state like this, in tagged
> objects. Closures differ from objects in only 2 points: (1) they have
> only one method (apply) and (2) they can take (implicitly) values
> from their whole environment at the place of instantiation.
>
> Indeed the more I think about it, a bit of compiler support (which knows
> which parts of the environment are actually used in the closure's
> body), a new keyword (closure, to distinguish them from ordinary
> procedures) and all the mechanisms already present in tagged types
> should be quite enough to implement closures in Ada. (Well, OK, there are
> problems with sharing N in the non-constant case with other closures
> and other procedures which can see N, but in the constant case (which
> is perhaps, in the presence of objects, the more important one), 'constant
> N' can be just copied into an implementation object of the type
>
>    tagged record
>      N : Integer;
>    end record;
>
>
>
>> You (Markus) seem to be an advocate of (possibly-upward) closures.
>
> Certainly. As I said in another post of the recent article storm, I
> _think_ (think, mind you, no hard data) that if too many things
> (narrowing of type w/o RTTI during method invocation, scope of access
> types, and passing of procedural parameters) can only be "done downwards",
> some desirable structures become impossible.
>
>> And also an advocate of function programming.  
>
> 'functional'. :-).

Sorry, that was a typo.  I meant "functional".

>> So how would you like a rule that says the former example is OK, but
>> the latter is illegal?
>
> I'd weep silently :-).
>
>> (That is, upward closures are allowed only when you're doing functional
>> (no side-effects) programming).
>
>
>> Note that the latter is not functional -- a call to the function that
>> Make_A_Stepper returns modifies N, which is the key difference!
>
> Yes. That is so. But Ada is not functional any way and impure closures
> have their value.

This is the key point I'm interested in: why do impure outward closures
have value?  We can emulate that via tagged types, as you say:

>...Nonetheless: Since the encapsulation of mutable
> state can be done in tagged types, forbidding the latter would probably
> not hurt as much as in languages w/o OO. Still, that requires further
> analysis of the most frequent use cases. 

And it seems to me there's some value in knowing that a local variable
of a procedure has local lifetime.  And if it doesn't, the extra
syntactic baggage of tagged types or access types or whatever seems like
a benefit (i.e. warning: this thing lives longer).

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-05  1:00                                           ` Ray Blaak
  2007-02-05  1:19                                             ` Markus E Leypold
@ 2007-02-06  0:18                                             ` Robert A Duff
  2007-02-06  0:59                                               ` Ray Blaak
  2007-02-06  1:07                                               ` Markus E Leypold
  1 sibling, 2 replies; 397+ messages in thread
From: Robert A Duff @ 2007-02-06  0:18 UTC (permalink / raw)


Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:

> Robert A Duff <bobduff@shell01.TheWorld.com> writes:
>> But in the latter case, the return of Stepper is causing the lifetime
>> of an integer VARIABLE (i.e. mutable) to be longer than one might
>> suspect for a local variable of Make_A_Stepper.  That seems like a
>> problem to me.
>
> Yes, the lifetime is longer (as specified by the programmer) but the global
> visibility is not affected -- N still cannot be directly manipulated in a
> global sense.

My complaint is that it's not specified by the programmer on the
declaration of that variable (mutable object).  Some nested procedure
happens to mention N and that procedure gets returned to a more-global
place, dragging N along with it.  (Yes of course I understand that
"N still cannot be directly manipulated in a global sense.")

I've got no problem with passing functions (closures) outward in a
functional (no mutable variables) context.

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-05  9:06                                   ` Ray Blaak
@ 2007-02-06  0:28                                     ` Robert A Duff
  2007-02-06  8:24                                       ` Ray Blaak
  0 siblings, 1 reply; 397+ messages in thread
From: Robert A Duff @ 2007-02-06  0:28 UTC (permalink / raw)


Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:

> Robert A Duff <bobduff@shell01.TheWorld.com> writes:
>> > GC is one of the modern advances of computing science, period, akin to the use
>>                    ^^^^^^
>> > of high level languages vs assembly, lexical scoping vs dynamic scoping,
>> > strong typing vs no typing, etc.
>> 
>> Modern?  GC is many years old.
>
> Well, sure. But you do realize that mainstream computing is slowly and
> inexorably reimplementing good old fashioned Lisp, right? :-)

Yeah.  ;-)

Did you mean "reimplementing... Lisp, right?"
or "reimplementing... Lisp right."?

;-)

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-06  0:18                                             ` Robert A Duff
@ 2007-02-06  0:59                                               ` Ray Blaak
  2007-02-06  1:07                                               ` Markus E Leypold
  1 sibling, 0 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-06  0:59 UTC (permalink / raw)


Robert A Duff <bobduff@shell01.TheWorld.com> writes:
> My complaint is that it's not specified by the programmer on the
> declaration of that variable (mutable object).  Some nested procedure
> happens to mention N and that procedure gets returned to a more-global
> place, dragging N along with it.

I am wondering if your complaint is caused by the strangeness of it all,
rather than its actual detrimental effects.

Consider a local pointer variable. You cannot tell from its declaration
whether what it designates will outlive the pointer's scope. Possibly it
will, but some return might pass it out, some might not. Some nested
procedure might do so, some might not.

You instead have to look for the actual step that extends it past its
scope to know.

I suppose that the fact that the pointer's declaration explicitly indicates
heap allocation makes the essential difference to you. As a programmer you
simply feel more in control of what is precisely happening.

In languages with full closures, *any* local variable can outlive its
scope. That you cannot tell this from the variable itself does not bother me
at all. Why should it?

Local variables can only be captured by the lexical scoping of a nested
procedure/closure. "Some nested procedure" should be very much obviously
present and visible to the programmer. How big do methods on typical Ada
projects get anyway? I tend to assume that a method is always under the
control of a single programmer at a time.

I would think that any fears you have about a local variable having an
unexpected lifetime should also occur with the heap allocated values as
well. After all they must be more carefully managed. It is just that with heap
values you are naturally "on guard".

Well, the presence of closure declarations provides a similar indication.

Of course, without GC, things are quite a bit more dangerous, since it becomes
quite a bit more difficult to reason when something can be safely freed. That
might also explain the instinct to mistrust general closures.

My question in the context of this example is how can a mutating local integer
variable that outlives its scope be possibly dangerous? Only the closure is
operating on it and nothing else. There are no heap issues to worry
about. What is the possible harm?
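
To make the question concrete, a Python sketch: each call creates a fresh n
that only its own closure can touch, so there is nothing for anyone else to
corrupt:

```python
def make_counter():
    n = 0                  # local variable captured by the closure below
    def bump():
        nonlocal n         # mutate the captured variable
        n += 1
        return n
    return bump            # n outlives this call, but only bump can reach it
```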

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-06  0:01                                             ` Robert A Duff
@ 2007-02-06  1:06                                               ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-06  1:06 UTC (permalink / raw)



Robert A Duff <bobduff@shell01.TheWorld.com> writes:

>> So in (impure) FP both cases are possible and if we go back to Ada
>> syntax from that: Why not use the constant keyword to distinguish the
>> cases?
>
> Because in Ada I want local variables (local in visibility and local and
> lifetime).  I think the Ada equivalent of OCaml's non-functional
> "let n = ref (complicated k)" would be:
>
>     type Ref_Integer is access Integer;
>     N: constant Ref_Integer := new Integer'(Complicated_Function(K));
>
> That makes it clear that the integer variable we're talking about is an
> object allocated on the heap and therefore has who-knows-what lifetime.
> I have no problem with forming a closure that can see N (a constant)
> and passing that closure outward.
>
> My point is: I want to understand the lifetime of a variable (Ada term)
> by looking at its declaration.  I don't care about the lifetimes of
> constants/values -- they can be copied willy-nilly anyway.

I agree with the last part of the last sentence, but your handling of
mutable data strikes me as complicated: Variables are already names
for mutable storage (that is different in ML, and why we need an
explicit ref there). So N : Integer vs. N : constant Integer should be
quite enough.


>>> In the former case, we're returning a function that looks at some local
>>> value (the value of N).  In this case, your comment about returning
>>> integer values is quite correct:
>>>
>>>>> Closures don't "make things global". They do it as much as returning an
>>>>> Integer makes, GOD FORBID!, a value global (don't return Integers,
>>>>> people; that makes a value global, and we all know global is bad --
>>>>> how nonsensical is that?).
>>>
>>> Right.  What's the big deal?  Returning a function that looks at integer
>>> values is no worse than returning integer values.  Integer values live
>>> forever, and the value of N is certainly not tied to the lifetime of N.
>>
>>
>>> But in the latter case, the return of Stepper is causing the lifetime
>>> of an integer VARIABLE (i.e. mutable) to be longer than one might
>>> suspect for a local variable of Make_A_Stepper.  That seems like a
>>> problem to me.
>>
>> It's still only encapsulated _state_; the visibility of identifiers
>> (scope) is not extended by that.
>
> Right.  We're talking about lifetimes.  Some folks in this thread may
> have said "scope" when they meant "lifetime".

Yes, that was the underlying reason for some mixups. But as I said:
we already have indefinitely long-lived storage by the fact that we
have 'new' and can return objects (which contain state).

>
>>...Outsiders can't look into the
>> closure. And we already encapsulate state like this, in tagged
>> objects. Closures differ from objects in only 2 points: (1) they have
>> only one method (apply) and (2) they can take (implicitly) values
>> from their whole environment at the place of instantiation.
>>
>> Indeed the more I think about it, a bit of compiler support (which knows
>> which parts of the environment are actually used in the closure's
>> body), a new keyword (closure, to distinguish them from ordinary
>> procedures) and all the mechanisms already present in tagged types
>> should be quite enough to implement closures in Ada. (Well, OK, there are
>> problems with sharing N in the non-constant case with other closures
>> and other procedures which can see N, but in the constant case (which
>> is perhaps, in the presence of objects, the more important one), 'constant
>> N' can be just copied into an implementation object of the type
>>
>>    tagged record
>>      N : Integer;
>>    end record;


BTW, I take that back. It's not as easy as that to tack really useful
closures on top of a stack-oriented language implementation.

>>
>>> You (Markus) seem to be an advocate of (possibly-upward) closures.
>>
>> Certainly. As I said in another post of the recent article storm, I
>> _think_ (think, mind you, no hard data) that if too many things
>> (narrowing of type w/o RTTI during method invocation, scope of access
>> types, and passing of procedural parameters) can only be "done downwards",
>> some desirable structures become impossible.
>>
>>> And also an advocate of function programming.  
>>
>> 'functional'. :-).
>
> Sorry, that was a typo.  I meant "functional".

I know. Therefore the ":-)".

>
>>> So how would you like a rule that says the former example is OK, but
>>> the latter is illegal?
>>
>> I'd weep silently :-).
>>
>>> (That is, upward closures are allowed only when you're doing functional
>>> (no side-effects) programming).
>>
>>
>>> Note that the latter is not functional -- a call to the function that
>>> Make_A_Stepper returns modifies N, which is the key difference!
>>
>> Yes. That is so. But Ada is not functional any way and impure closures
>> have their value.
>
> This is the key point I'm interested in: why do impure outward closures
> have value?  

> We can emulate that via tagged types, as you say:

Because ideally they can capture their whole environment e.g. when
returning functions that return functions that return functions.

Of course, for any specific case that can be cumbersomely emulated by
using tagged objects, but that requires very specialized solutions in
every single case. It feels -- wrong?

Why encapsulate mutable state? Well -- e.g. to produce time stampers
and similar functional objects.

I admit, perhaps, just copying to the heap all the "external values" a
"closure" refers to at instantiation time will have the desired effect
(mostly). This still leaves the possibility to encapsulate mutable
state by having a heap reference (as you did above). Perhaps that will
work. I do not remember any important cases where state sharing
between different closures really was an important
consideration. "closure" in the following code fragment will thus just
be syntactic sugar for deriving a class and instantiating some object
from it:


  type Stepper_T is closure ( I : Integer ) return Integer;    -- in reality a cloaked tagged type


  function Make_Stepper ( K : Integer ) return Stepper_T

    N : Integer := Step_Width(K);

    closure Stepper ( I : Integer ) return Integer is begin   -- derive a new tagged type; the body becomes a method,
                                                              -- instantiated once
      return I + N;
    end;

  is begin
     return Stepper;
  end;


I try to sketch what the underlying implementation would be:


  type Stepper_T is abstract tagged null record;

  function Apply( S : Stepper_T; I : Integer ) return Integer is abstract;

  function Make_Stepper ( K : Integer ) return Stepper_T

    N : Integer := Step_Width(K);

    type Concrete_Stepper is new Stepper_T with record

         N : Integer;   -- This identifier is produced by the compiler from N in Stepper body
                        -- because it refers to a definition outside the body of Stepper

    end record;

    function Apply( S : Concrete_Stepper; I : Integer ) return Integer is
    begin
      return I + S.N; -- N as a reference outside the Stepper body is rewritten to S.N.
    end;

    Stepper : Concrete_Stepper := ( N => N );
    
  is begin
     return Stepper;
  end;
   
 
Having a shortcut syntax as follows would be also nice:

  type Stepper_T is abstract tagged null record;

  function Make_Stepper ( K : Integer ) return Stepper_T
    N : Integer := Step_Width(K);
  is begin 
    return closure ( I : integer ) return Integer is return N+I;
  end;

(This looks a bit strange, but what I want to say is that the closure
name is actually not important, and perhaps the begin/end block is a
bit noisy in some cases. But YMMV.)

Even shorter:

  type Stepper_T is abstract tagged null record;

  function Make_Stepper ( K : Integer ) return Stepper_T

    N : Integer := Step_Width(K);

  is return closure ( I : integer ) return Integer is return N+I;
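
The desugaring sketched above is the standard closure-as-object encoding; a
Python analogue, just to illustrate the equivalence (step_width is a
hypothetical stand-in for Step_Width):

```python
def step_width(k):
    # hypothetical stand-in for Step_Width
    return k + 1

# closure form: the compiler captures n implicitly
def make_stepper(k):
    n = step_width(k)
    return lambda i: i + n

# explicit object form: the captured environment (n) becomes a field
class ConcreteStepper:
    def __init__(self, n):
        self.n = n             # copy of n, as in the Concrete_Stepper record
    def apply(self, i):
        return i + self.n

def make_stepper_obj(k):
    return ConcreteStepper(step_width(k))
```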



>>...Nonetheless: Since the encapsulation of mutable
>> state can be done in tagged types forbidding the latter would probably
>> not hurt as much as in languages w/o OO. Still, that requires further
>> analysis of the most frequent use cases. 
>
> And it seems to me there's some value in knowing that a local variable
> of a procedure has local lifetime.  

It has a local lifetime: the next call into the procedure would create a
new one, which of course could be captured by another closure. It would not be
like "static" variables in C!

> And if it doesn't, the extra syntactic baggage of tagged types or
> access types or whatever seems like a benefit (i.e. warning: this
> thing lives longer).

I understand your approach. It would go well with a "copy all relevant
data to the closure" approach (as sketched above), so basically, yes
(I hope I really understood what I'm agreeing to).

Regards -- Markus







^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-06  0:18                                             ` Robert A Duff
  2007-02-06  0:59                                               ` Ray Blaak
@ 2007-02-06  1:07                                               ` Markus E Leypold
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-06  1:07 UTC (permalink / raw)




Robert A Duff <bobduff@shell01.TheWorld.com> writes:

> Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:
>
>> Robert A Duff <bobduff@shell01.TheWorld.com> writes:
>>> But in the latter case, the return of Stepper is causing the lifetime
>>> of an integer VARIABLE (i.e. mutable) to be longer than one might
>>> suspect for a local variable of Make_A_Stepper.  That seems like a
>>> problem to me.
>>
>> Yes, the lifetime is longer (as specified by the programmer) but the global
>> visibility is not affected -- N still cannot be directly manipulated in a
>> global sense.
>
> My complaint is that it's not specified by the programmer on the
> declaration of that variable (mutable object).  Some nested procedure
> happens to mention N and that procedure gets returned to a more-global
> place, dragging N along with it.  (Yes of course I understand that
> "N still cannot be directly manipulated in a global sense.")

But exactly the same happens with record fields. They are also mutable
(when a record is returned). Why is it a problem here and not there?


Regards -- Markus

>
> I've got no problem with passing functions (closures) outward in a
> functional (no mutable variables) context.
>
> - Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-06  0:28                                     ` in defense of GC Robert A Duff
@ 2007-02-06  8:24                                       ` Ray Blaak
  2007-02-06 11:50                                         ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-06  8:24 UTC (permalink / raw)


Robert A Duff <bobduff@shell01.TheWorld.com> writes:
> Did you mean "reimplementing... Lisp, right?"
> or "reimplementing... Lisp right."?

Well, both I suppose, although I imagine I would do it differently from you.

I used to be a shiny eyed Ada true believer. These days I find myself seduced
by the dynamic side of things.

The main thing I would do with Lisp is make it have optional type declarations
(Common Lisp sort of has them; Dylan has them, but what's a Dylan? And if only
it could make up it's mind about looking like a lisp vs a pascal).

I really do like C# these days, actually. You can see the Ada influence on
it. It's a pity about the politics of Microsoft, though.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-05  1:19                                             ` Markus E Leypold
@ 2007-02-06  8:32                                               ` Ray Blaak
  2007-02-06 11:07                                                 ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-06  8:32 UTC (permalink / raw)


Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
> BTW, you've been discussing (with ... I don't remember) the possibility
> of switching GC off. My idea (even with ML-like languages) was always
> to define a subset of the language which the compiler can implement
> with a stack only. 

Does that automatically preclude any access types, unbounded strings, etc.,
then? That would be quite a limited language.

That would simplify GC right off the bat. Maybe it wouldn't be needed at all,
right there.

> One advantage of that would be that one could start out with code that
> requires GC, but is "obviously" correct and then transform in
> correctness preserving steps it into code that doesn't require
> GC. Both implementations could be tested against each other, at least
> as far as their computational function goes, not the real time
> capabilities that are perhaps only possessed by the 2nd variant.

My reaction is to not trust testing alone to gain the measure of confidence
needed. You must be able to prove the equivalence well enough.

Also, if the non-GC version was realtime, why wouldn't the first one be? Where
is the difference in behaviour? If you have no heap, then wouldn't execution
be equivalently predictable in either case?

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-05 13:43                               ` Markus E Leypold
@ 2007-02-06  9:15                                 ` Maciej Sobczak
  2007-02-06 11:45                                   ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-06  9:15 UTC (permalink / raw)


Markus E Leypold wrote:

> Let's ditch the thread safety aspect and instead:
> Giving pointers to internal state of objects violates (a)
> encapsulation (it fixes a specific implementation) and (b) is not type
> safe.

That's right. This is a result of the fact that C-style strings are not 
encapsulated at all and interfacing to them means stepping down to the 
common ground.

>> We've been talking not only about performance, but also about
>> readability and maintenance. ;-)
> 
> Of this thread? :-)

:-)

>>>>> Furthermore I've been convinced that manual memory management hinders
>>>>> modularity.
>>>> Whereas I say that I don't care about manual memory management in my
>>>> programs. You can have modularity without GC.
>>> Certainly. But you can have more with GC.
>> In a strictly technical sense of the word, yes. But then there comes a
>> question about possible loses in other areas, like program structure
>> or clarity.
> 
> I think the absence of manual memory management code actually furthers
> clarity.

I believe so. And I stress again - GC is not the only alternative to 
manual memory management.

>> Being able to just drop things on the floor is a nice feature when
>> considered in isolation, but not necessarily compatible with other
>> objectives that must be met at the same time.
> 
> Which?

Determinism in both timing and resource consumption?

>>> People who don't have GC often say that they can do anything with
>>> manual memory management.
>> And I say that this is misconception. I don't have/use GC and I don't
>> bother with *manual* memory management neither. That's the point. In
>> Ada this point is spelled [Limited_]Controlled (it's a complete mess,
>> but that's not the fault of the concept) and in C++ it's spelled
>> automatic storage duration.
> 
> My impression was that Ada Controlled storage is actually quite a
> clean concept compared to C++ storage duration.

Clean? It adds a tag to the type, which then becomes a controlling type in 
every primitive operation. I got bitten by this recently.
Adding a destructor to a C++ class never has any side effects like this.

Apart from this, the bare existence of *two* base types Controlled and 
Limited_Controlled means that the concepts of controlled and limited are 
not really orthogonal in the sense that adding one of these 
meta-properties affects the interface that is "shared" by the other aspect.

It's a mess. Actually, it prevents me from thinking clearly about what I 
want to achieve.

> But both tie allocation to program scope, synchronous with a stack. I
> insist that is not always desirable: It rules out some architecture,
> especially those where OO abounds.

What architecture?

> The problem with Controlled, BTW, is that it seems to interact with
> the rest of the language in such a way that GNAT didn't get it right
> even after ~10 years of development. Perhaps difficult w/o a formal
> semantics.

You see.

>> On the other hand, most languages with GC get it wrong by relying
>> *only* on GC, everywhere, whereas it is useful (if at all) only for
>> memory. 

> Now, now. Having GC doesn't preclude you from managing ressources
> unrelated to memory in a manual fashion.

Of course. No, thank you. I prefer a language which enables me to use 
the same logic for all resources, so I *don't have to* manage *anything* 
manually.
In other words, it's very nice that GC doesn't preclude me from doing 
some stuff manually, but that's not enough.

> Apart from that languages
> with GC often provide nice tricks to tie external ressources to their
> memory proxy and ditch them when the memory proxy is unreachable

These "nice tricks" are not so nice. Most of all, they provide no 
guarantee whatsoever, even that they will be invoked at all.
A friend of mine spent long evenings recently hunting for database 
connection leaks in a big Java application. That's telling something.

> And BTW - in
> functional languages you can do more against resource leaks, since
> you can "wrap" functions:
> 
>   (with_file "output" (with_file "out_put" copy_data))
> 
> It's not always done, but it's a useful micro-pattern.

Yes, it basically emulates something that is just natural in those 
languages that provide scope-based lifetime out of the box.

>> Languages like Ada or C++ provide more general solution, which is
>> conceptually not related to any kind of resource and can be
>> therefore applied to every one.
> 
> Since you're solving a problem here, which I deny exists

You might wish to tell this to my friend - the one hunting database 
connection leaks. :-)

> But I notice, that
> 
>  "Languages like C provide a more general solution (with regard to
>   accessing memory), which is conceptually not related to any kind of
>   fixed type system and can therefore implement any type and data model"
> 
> would become a valid argument if I agreed with you.

Except that it's not the point I'm making.

> In an FP I write (usually) something like:
> 
>      with_lock "/var/foo/some.lck" (fun () -> do_something1 (); do_something2 param;  ...).
> 
> The fact that Ada and C++ don't have curried functions and cannot
> construct unnamed functions or procedures is really limiting in this
> case and probably causal to your misconception that it would be
> necessary to add tons of exception handling at the client side.

Tons of exception handling (and not only - every way to leave a scope 
needs to be guarded, not only by exception) are necessary in those 
languages that rely on GC without providing the above possibility at the 
same time. The other possibility is to rely on scoped lifetime in the 
first place, where neither GC nor the above tricks are necessary to 
achieve proper cleanup.

> And BTW: In Ada I would encapsulate the resource in a Controlled
> object (a resource proxy or handle) and get the same effect (tying it
> to a scope).

Yes.

> Indeed I have already done so, to make a program which
> uses quite a number of locks, to remove locks when it terminated or
> crashes. Works nicely.

Of course. That's my point.
(except, maybe, the crashing part, when likely there is nobody to handle 
the cleanup)

>> Note also that I didn't say that references/pointers should be
>> dropped. I say that you don't need them everywhere. That's a
>> difference.
> 
> OK, so you need them _almost_ everywhere :-). I take your point.

No, you don't. I agree for references/pointers in polymorphic 
collections. That's not even close to "almost everywhere" for me, but 
your application domain may differ.

> 'Controlled' buys you a lot in Ada, but there are 2 problems
> 
>  (a) AFAIS (that is still my hypothesis), binding storage to scope is
>      not always possible (esp. when doing GUIs, MVC and the
>      like). I cannot prove it, but from what I experienced I'm rather
>      convinced of it.

I don't follow this.

>  (b) AFAIR there are restrictions on _where_ I can define controlled
>      types. AFAIR that was a PITA.

That's a mess. I'm sorry to repeat that.

> But how does a program become less structured by removing the
> manual memory management? The GC is not magically transforming the
> program into spaghetti code ...

You get spaghetti once you start adding finalizers - the spaghetti is 
then formed in both time (when something is invoked) and space (where 
the code is).

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24  8:07       ` Ludovic Brenta
  2007-01-24 12:12         ` Markus E Leypold
  2007-01-24 16:25         ` Adam Beneschan
@ 2007-02-06  9:54         ` Dave Thompson
  2007-02-06 11:01           ` Ludovic Brenta
  2 siblings, 1 reply; 397+ messages in thread
From: Dave Thompson @ 2007-02-06  9:54 UTC (permalink / raw)


On Wed, 24 Jan 2007 09:07:15 +0100, Ludovic Brenta
<ludovic@ludovic-brenta.org> wrote:
<snip>
> On the contrary, I think that Ada is the most expressive language
> around.  Consider:
> 
> procedure Set_Bit_In_Register (At_Address : in System.Address) is
>    type Register is array (1 .. 32) of Boolean;
>    pragma Pack (Register);
>    for Register'Bit_Order use System.High_Order_First;
>    pragma Volatile (Register);
>    
>    R : Register;
>    for R'Address use At_Address;
> begin
>    R (4) := True;
> end;
> 
Even if this combination is/was actually supported (and MSB means
highest-subscript, which I don't find all that obvious) ...

> versus
> 
> void set_bit_in_register (volatile unsigned long * at_address)
> {
>    *at_address |= 2 << 3;
> }
> 
You presumably meant 1 << 3 (zero-origin numbering) to match subscript
4 (one-origin).

But to be safe in the (more) general case you need 1UL << B, since
promotion in C is strictly bottom-up (no expected-type context) and
the literal defaults to type 'int' which _might_ be only 16 bits while
'long' must be at least 32 bits. And you might even want an
  assert( B < 32 /* or other M */ );
(or environment/project-specific equivalent).

> The Ada version makes many more things explicit, that are assumed and
> implicit in C; for example, the size of the register, the fact that

Right, although C99 -- so far not that much more widely adopted and
available than Ada05 -- adds explicitly-sized integers at least for
the common cases of 8, 16, 32, 64. (Other values of N are up to the
implementation's option, and will probably be rare on common hw.)

> the parameter is an address and not a pointer (*), the endianness, and

See below.

> which bit is being set.  As 64-bit architectures become prevalent, the
> hidden assumption that C's "unsigned long" is 32 bits wide is more and
> more likely to be incorrect.
> 
> (*) consider that when we increment the address by one, it then
> references the next byte; whereas if we increment the pointer by one,
> it points to the next "unsigned long", i.e. 2, 4 or 8 bytes and not 1
> byte further.  C makes no distinction between addresses and pointers,
> lacking expressiveness in a crucial area.
> 
Wrong. C, even before C89, does know about pointer targets (strides).
Only _very_ early, Bell-Labs-only, pre-K&R1 C that was still in
transition from B had pointer === integer. 

> When calling the subprogram, we get:
> 
> Set_Bit_In_Register (At_Address => To_Address (16#DEADBEEF#));
> 
(optionally)

> versus
> 
> set_bit_in_register (0xDEADBEEF);
> 
The type mismatch is a constraint violation (which must be diagnosed)
if the prototype declaration (or definition) is visible, and undefined
behavior (~ Ada erroneous, but occurring in many more places, hence
not necessarily diagnosed) if not. You _must_ cast:
  set_bit_in_register ( (/*volatile*/unsigned long *) 0xDEADBEEF );
or (better?) ... ( (my_reg_addr_typedef) 0xDEADBEEF );

(assuming ulong aligned 3 mod 4 even works on your platform <G>)

(You don't actually need the volatile, since the call silently adds
it, but it is clearer -- i.e. more expressive -- to have it.)

Lack of parameter labelling is a shortcoming, although you can (and
people do) smush some of it into the routine name. And tools and/or
conventions can be used to find the description or doc when needed.

> Again, at the call site, the Ada version gives more information to the
> human programmer, i.e. is more expressive.
> 
> Expressiveness is not to be confused with conciseness.

Agree there. C does have _some_ expressiveness, where it happened to
be necessary or convenient, but not because it was actively sought.

- David.Thompson1 at worldnet.att.net



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06  9:54         ` Dave Thompson
@ 2007-02-06 11:01           ` Ludovic Brenta
  2007-02-26  5:47             ` Dave Thompson
  0 siblings, 1 reply; 397+ messages in thread
From: Ludovic Brenta @ 2007-02-06 11:01 UTC (permalink / raw)


On Feb 6, 10:54 am, Dave Thompson wrote:
> On Wed, 24 Jan 2007 09:07:15 +0100, Ludovic Brenta wrote:
[snip]
>> (*) consider that when we increment the address by one, it then
>> references the next byte; whereas if we increment the pointer by one,
>> it points to the next "unsigned long", i.e. 2, 4 or 8 bytes and not 1
>> byte further.  C makes no distinction between addresses and pointers,
>> lacking expressiveness in a crucial area.
>
> Wrong. C, even before C89, does know about pointer targets (strides).
> Only _very_ early, Bell-Labs-only, pre-K&R1 C that was still in
> transition from B had pointer === integer.

Wrong? I think we're in agreement here. I was explaining that a
pointer is not the same thing as an address, since incrementing a
pointer gives a different result (next object) than incrementing an
address (next byte).

In C, address arithmetic is implemented in terms of char* and size_t
only because sizeof(char) == 1; I think that's a hack. In contrast,
address arithmetic in Ada is in terms of System.Address and
Storage_Offset, which is much more explicit.

> > When calling the subprogram, we get:
>
> > Set_Bit_In_Register (At_Address => To_Address (16#DEADBEEF#));
>
> (optionally)
>
> > versus
>
> > set_bit_in_register (0xDEADBEEF);
>
> The type mismatch is a constraint violation (which must be diagnosed)
> if the prototype declaration (or definition) is visible, and undefined
> behavior (~ Ada erroneous, but occurring in many more places, hence
> not necessarily diagnosed) if not. You _must_ cast:
>   set_bit_in_register ( (/*volatile*/unsigned long *) 0xDEADBEEF );
> or (better?) ... ( (my_reg_addr_typedef) 0xDEADBEEF );

So 0xDEADBEEF is an int, and there is no implicit conversion to
unsigned long*. Is that what you're saying? OK, now I see my knowledge
of C is fading away...

> (assuming ulong aligned 3 mod 4 even works on your platform <G>)

yeah, let's ignore that :-)

Thanks for your comments.

--
Ludovic Brenta.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-06  8:32                                               ` Ray Blaak
@ 2007-02-06 11:07                                                 ` Markus E Leypold
  2007-02-06 18:01                                                   ` Ray Blaak
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-06 11:07 UTC (permalink / raw)




Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:

> Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>> BTW, you've been discussing (with ... I don't remember) the possibilty
>> of switching GC off. My idea (even with ML-like languages) was always
>> to define a subset of the language which the compiler can implement
>> with a stack only. 
>
> Does that automatically preclude any access types, unbounded strings, etc.,
> then? That would be quite a limited language.

Not necessarily. It precludes any _handling_ of stuff in a way that
doesn't allow the compiler to de-allocate memory when leaving the
scope. Basically the compiler would look at what happens and instead
of using heap allocated (GC'ed) memory it would use 'Controlled'
memory (in Ada parlance).


And BTW: AFAIK that has already been done once for some obscure subset of ML.

>
> That would simplify GC right off the bat. Maybe it wouldn't be needed at all,
> right there.

:-).


>> One advantage of that would be that one could start out with code that
>> requires GC, but is "obviously" correct and then transform in
>> correctness preserving steps it into code that doesn't require
>> GC. Both implementations could be tested against each other, at least
>> as far as their computational function goes, not the real time
>> capabilities that are perhaps only possessed by the 2nd variant.
>
> My reaction is to not trust testing only to gain the measure of confidence
> needed. You must be able to prove the equivalence well enough.

Right. Therefore (in the case where more trust is needed) the proof of
equivalence by equivalence transformations (for a suitable definition
of equivalence).

> Also, if the non-GC version was realtime, why wouldn't the first one be? 

Because the interfaces change. Only the data processing provided is
equivalent, but the way data is passed and retrieved, changes.

> Where is the difference in behaviour? If you have no heap, then
> wouldn't execution should be equivalently predictable in either
> case?

I'll try to make up a simple, but not necessarily good example. This
is not about runtime GC but about going from GC to manual memory
management, but I hope it will serve. The programs concatenate 2
strings:

With GC (naive implementation)

   let concat1 (s1:string) (s2:string) = s1 ^ s2;;

With manual MM.

  let concat2  (s1:string) (s2:string) (result: string option ref) = 

      result := Some (s1 ^ s2)

I realize this might not be the best example, since I just made it
up. But the idea is that (string option ref) is (by the compiler) translated
into a char* and that result := ... actually becomes a malloc() plus a
copying of memory. some_var := None would compile to explicit
deallocation (pointer is overwritten with null).

Both functions are equivalent (they concatenate strings), but the
second would not be required to work in a GC'ed environment.

Similarly, I think, it might be possible to avoid malloc() if
everything is done in static or local buffers.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06  9:15                                 ` Maciej Sobczak
@ 2007-02-06 11:45                                   ` Markus E Leypold
  2007-02-06 14:16                                     ` Maciej Sobczak
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-06 11:45 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

>> I think the absence of manual memory management code actually
>> furthers
>> clarity.
>
> I believe so. And I stress again - GC is not the only alternative to
> manual memory management.

OK. I accept that for the moment. I'm just not convinced you can do
everything you need with scope bound memory and if you introduce
manual MM again, you'll be back to confusing manual memory
management. I say this just to restate my point of view clearly once
again -- not that I think that either of us can prove his position at
the moment.

>>> Being able to just drop things on the floor is a nice feature when
>>> considered in isolation, but not necessarily compatible with other
>>> objectives that must be met at the same time.

>> Which?
>
> Determinism in both timing and resource consumption?

Which brings me back to what I said repeatedly to other people: 

 (1) That this determinism is very often not a requirement (outside of
     embedded programming)

 (2) The determinism is shot anyway by using a heap, not by using
     GC. Even better: GC can introduce determinism in space
     consumption by compacting the heap, which naive heaps with manual
     MM don't do because of fragmentation.

 (3) What is often needed are upper limits, not determinism, and those
     upper limits can be guaranteed with GC and an appropriate
     collector.

>>>> People who don't have GC often say that they can do anything with
>>>> manual memory management.

>>> And I say that this is misconception. 

Good. :-)

>>> In Ada this point is spelled [Limited_]Controlled (it's a complete mess,
>>> but that's not the fault of the concept) and in C++ it's spelled
>>> automatic storage duration.
>> My impression was that Ada Controlled storage is actually quite a
>> clean concept compared to C++ storage duration.

> Clean? It adds a tag to the type, which then becomes a controlling type
> in every primitive operation. 

> I got bitten by this recently.  Adding a destructor to C++ class
> never has any side effects like this.

I understand. The Ada OO way is peculiar, but not unmanageable.


> Apart from this, the bare existence of *two* base types Controlled and
> Limited_Controlled means that the concepts of controlled and limited
> are not really orthogonal in the sense that adding one of these
> meta-properties affects the interface that is "shared" by the other
> aspect.

Still. Being able to add a Finalize means you need to have a tagged
type. I see no alternative.

> It's a mess. Actually, it prevents me from thinking clearly about what
> I want to achieve.

Wow.

>
>> But both tie allocation to program scope, synchronous with a stack. I
>> insist that is not always desirable: It rules out some architecture,
>> especially those where OO abounds.
>
> What architecture?

I already said in another post: that is difficult to show with a toy
system. It only shows in larger systems where you really can't / don't
want to say in any given subsystem or module how long a certain piece of
data lives. So none of those can be burdened with deallocating it.

>> The problem with Controlled, BTW, is that it seems to interact with
>> the rest of the language in such a way that GNAT didn't get it right
>> even after ~10 years of development. Perhaps difficult w/o a formal
>> semantics.

> You see.

Yes, I see. But GNAT is also a political problem (see the role of
AdaCore, formerly ACT), so (public) GNAT not getting things right
might well not indicate a problem with reading the Ada standard, but
one with the release politics for the public version. My hint: there is no
incentive to release a high-quality public version of GNAT.

>>> On the other hand, most languages with GC get it wrong by relying
>>> *only* on GC, everywhere, whereas it is useful (if at all) only for
>>> memory.
>
>> Now, now. Having GC doesn't preclude you from managing ressources
>> unrelated to memory in a manual fashion.
>
> Of course. No, thank you. I prefer a language which enables me to use
> the same logic for all resources, so I *don't have to* manage
> *anything* manually.

Which, as you said yourself, is difficult to do. And memory is the
resource used most frequently, whereas the temptation to e.g. drop
file descriptors is much smaller.

> In other words, it's very nice that GC doesn't preclude me from doing
> some stuff manually, but that's not enough.

I'm appalled: You don't want GC, but no, it doesn't do enough for you?
Of course YMMV, but when I have it, it works really well for me.

>> Apart from that languages
>> with GC often provide nice tricks to tie external resources to their
>> memory proxy and ditch them when the memory proxy is unreachable
>
> These "nice tricks" are not so nice. Most of all, they provide no
> guarantee whatsoever, even that they will be invoked at all.

That's not quite true. Those tricks are building blocks to implement
resources that are automatically finalized when becoming
unreachable. But it's up to the library author to write a complete
implementation.

> A friend of mine spent long evenings recently hunting for database
> connection leaks in a big Java application. That's telling something.

Well -- so he was naive and should have handled / understood that part
of the system better. A friend of mine spent half a month with finding
problems with manual allocation/deallocation and sneaking heap
corruption. Does that prove anything? I don't think so.

>> And BTW - in
>> functional languages you can do more against resource leaks, since
>> you can "wrap" functions:
>>   (with_file "output" (with_file "out_put" copy_data))
>> It's not always done, but a useful micro pattern.

> Yes, it basically emulates something that is just natural in those
> languages that provide scope-based lifetime out of the box.

This is no emulation, but how FP does "scope based". Without the
necessity to add exception handling at the client side or without
having to introduce tagged types / classes. Isn't THAT nice? :-)

>
>>> Languages like Ada or C++ provide a more general solution, which is
>>> conceptually not related to any kind of resource and can be
>>> therefore applied to every one.

>> Since you're solving a problem here, which I deny exists

> You might wish to tell this to my friend - the one hunting database
> connection leaks. :-)

Yes, I'll hold that up. Your friend got bitten by believing in a
mechanism where he shouldn't, while I deny the necessity to manage
other resources by GC for the general case. It's a nice trick
sometimes, but one doesn't need it.

>> But I notice, that
>>  "Languages like C provide a more general solution (with regard to
>>   accessing memory), which is conceptually not related to any kind of
>>   fixed type system and can therefore implement any type and data model"
>> would become a valid argument if I agreed with you.
>
> Except that it's not the point I'm making.

No, but the structure of the argument is basically the same. The
analogy should help to show why it is (IMHO) invalid.

>
>> In an FP I write (usually) something like:
>>      with_lock "/var/foo/some.lck" (fun () -> do_something1 ();
>> do_something2 param;  ...).
>> The fact that Ada and C++ don't have curried functions and cannot
>> construct unnamed functions or procedures is really limiting in this
>> case and probably causal to your misconception that it would be
>> necessary to add tons of exception handling at the client side.

> Tons of exception handling (and not only - every way to leave a scope
> needs to be guarded, not only by exception) are necessary in those
> languages that rely on GC without providing the above possibility at
> the same time. 

No. I've done the same in Ada w/o controlled objects, but using a
generic procedure.

  procedure mark_data_records is new process_cache_with_lock( Operation => mark_record, ... );

  begin
    mark_data_records(...);
  end;


The client side has no burden with exception handling.


> The other possibility is to rely on scoped lifetime in
> the first place, where neither GC nor the above tricks are necessary
> to achieve proper cleanup.

Always assuming that works as a general approach. Personally I cherish
the additional freedom I get from GC.

>> And BTW: In Ada I would encapsulate the resource in a Controlled
>> object (a resource proxy or handle) and get the same effect (tying it
>> to a scope).
>
> Yes.
>
>> Indeed I have already done so, to make a program which
>> uses quite a number of locks, to remove locks when it terminated or
>> crashes. Works nicely.
>
> Of course. That's my point.
> (except, maybe, the crashing part, when likely there is nobody to
> handle the cleanup)

By "crashes" I mean uncaught exceptions propagating back to the main
procedure. In those cases Finalize() runs.

I've BTW, done the same on OCaml (library for automatically
deallocating locks), so I don't see how GC prevents me from doing so.


>>> Note also that I didn't say that references/pointers should be
>>> dropped. I say that you don't need them everywhere. That's a
>>> difference.

>> OK, so you need them _almost_ everywhere :-). I take your point.

> No, you don't. I agree for references/pointers in polymorphic
> collections. That's not even close to "almost everywhere" for me, but
> your application domain may differ.

Yes, it does, obviously. You might not be aware, but code destined for
mere consumers (as opposed to embedded code and code destined as tools
for other developers) has a large amount of GUI code in it.

>> 'Controlled' buys you a lot in Ada, but there are 2 problems
>>  (a) AFAIS (that is still my hypothesis), binding storage to scope is
>>      not always possible (esp. when doing GUIs, MVC and the
>>      like). I cannot prove it, but from what I experienced I'm rather
>>      convinced of it.
>
> I don't follow this.
>
>>  (b) AFAIR there are restrictions on _where_ I can define controlled
>>      types. AFAIR that was a PITA.
>
> That's a mess. I'm sorry to repeat that.

Yes. But does C++ do it better? The Ada restrictions AFAIK come from
the necessity of separate linking and compilation (you must be able to
relink w/o looking at the body), and C++ trades that against the
ability to add finalizers everywhere.


>> But how does a program become less structured by removing the
>> manual memory management? The GC is not magically transforming the
>> program into spaghetti code ...

> You get spaghetti once you start adding finalizers - the spaghetti is
> then formed in both time (when something is invoked) and space (where
> the code is).

No. Neither GC nor finalizers make code incomprehensible. I can only
assert it again, since we are not discussing proofs here or specific
examples.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-06  8:24                                       ` Ray Blaak
@ 2007-02-06 11:50                                         ` Markus E Leypold
  2007-02-07  7:44                                           ` Ray Blaak
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-06 11:50 UTC (permalink / raw)



Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:

> Robert A Duff <bobduff@shell01.TheWorld.com> writes:
>> Did you mean "reimplementing... Lisp, right?"
>> or "reimplementing... Lisp right."?
>
> Well, both I suppose, although I imagine I would do it differently from you.
>
> I used to be a shiny eyed Ada true believer. These days I find myself seduced
> by the dynamic side of things.
>
> The main thing I would do with Lisp is make it have optional type declarations

That's why you MUST use OCaml :-). No, joking aside, I do not know what
the important thing in Lisp is for you. OCaml has no macros etc. But the
typing makes things vastly more manageable.

> (Common Lisp sort of has them; Dylan has them, but what's a Dylan? And if only
> it could make up its mind about looking like a lisp vs a pascal).

:-)

>
> I really do like C# these days, actually. You can see the Ada influence on
> it. It's a pity about the politics of Microsoft, though.


Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06 11:45                                   ` Markus E Leypold
@ 2007-02-06 14:16                                     ` Maciej Sobczak
  2007-02-06 15:44                                       ` Markus E Leypold
  2007-02-06 17:47                                       ` Ray Blaak
  0 siblings, 2 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-06 14:16 UTC (permalink / raw)


Markus E Leypold wrote:

>> And I stress again - GC is not the only alternative to
>> manual memory management.
> 
> OK. I accept that for the moment. I'm just not convinced you can do
> everything you need with scope bound memory

Sure. But for the sake of discussion completeness, you might wish to 
throw an example of a situation where scoped lifetime will not make it.

>> Determinism in both timing and resource consumption?
> 
> Which brings me back to what I said repeatedly to other people: 
> 
>  (1) That this determinism is very often not a requirement (outside of
>      embedded programming)

A Java programmer wrote a loop where he opened database cursors, released 
in the cursor finalizer. All was working like a charm, until it was put into 
production, where in one case the loop had to spin many more times than 
he ever cared to test. GC did not clean up the abandoned cursor objects 
fast enough and the number of unnecessarily opened cursors hit the 
server limit. That was the end of this application.

The fix was easy: write explicit close/dispose/dismiss/whatever at the 
end of the loop, so that effectively there was never more than one open 
cursor. In fact, this was *manual* resource management.

The above would be avoided altogether with scoped lifetime.

You are right that determinism is very often not a requirement. It is 
just that life very often shows that the initial requirements were 
not complete.

>  (2) The determinism is shot anyway by using a heap, not by using
>      GC. Even better: GC can introduce determinism in space
>      consumption by compacting the heap, which naive heaps with manual
>      MM don't do because of fragmentation.

There is nothing particular in scoped lifetime that would prohibit 
compacting heaps and there is nothing particular in GC that guarantees 
it. It's just the statistics based on popular implementations, not a rule.
I can perfectly imagine compacting heaps managed by scoped lifetime.

>  (3) What is often needed are upper limits, not determinism, and those
>      upper limits can be guaranteed with GC and an appropriate
>      collector.

This refers to memory consumption only, whereas I clearly stated 
deterministic *time* as a second (first, actually) goal.

>>> My impression was that Ada Controlled storage is actually quite a
>>> clean concept compared to C++ storage duration.
> 
>> Clean? It adds a tag to the type, which then becomes a controlling type
>> in every primitive operation. 
> 
>> I got bitten by this recently.  Adding a destructor to C++ class
>> never has any side effects like this.
> 
> I understand. The Ada OO way is peculiar, but not unmanageable.

OK, I accept the word "peculiar". I only oppose "quite a clean concept" 
in your previous post. :-)

>> Apart from this, the bare existence of *two* base types Controlled and
>> Limited_Controlled means that the concepts of controlled and limited
>> are not really orthogonal in the sense that adding one of these
>> meta-properties affects the interface that is "shared" by the other
>> aspect.
> 
> Still. Being able to add a Finalize means you need to have a tagged
> type. I see no alternative.

You might want to take a look at C++.

>>> But both tie allocation to program scope, synchronous with a stack. I
>>> insist that is not always desirable: It rules out some architecture,
>>> especially those where OO abounds.
>> What architecture?
> 
> I already said in another post: that is difficult to show with a toy
> system. It only shows in larger systems where you really can't / don't
> want to say in any given subsystem or module how long a certain piece of
> data lives. So none of those can be burdened with deallocating it.

OK. What about refcounting with smart pointers?

>>> The problem with Controlled, BTW, is that it seems to interact with
>>> the rest of the language in such a way that GNAT didn't get it right
>>> even after ~10 years of development. Perhaps difficult w/o a formal
>>> semantics.
> 
>> You see.
> 
> Yes, I see. But GNAT is also a political problem (see the role of
> AdaCore, formerly ACT), so (public) GNAT not getting things right
> might well not indicate a problem with reading the Ada standard, but
> one with the release politics for the public version. My hint: there is no
> incentive to release a high-quality public version of GNAT.

I get the message. Clear enough.

>> In other words, it's very nice that GC doesn't preclude me from doing
>> some stuff manually, but that's not enough.
> 
> I'm appalled: You don't want GC, but no, it doesn't do enough for you?

Exactly. It's not enough, because it doesn't solve the problem of 
resource management in a general way.

> Of course YMMV, but when I have it, it works really well for me.

I acknowledge that there might be some applications which are strictly 
memory-oriented. They are just not the ones I usually write.

>>> Apart from that languages
>>> with GC often provide nice tricks to tie external resources to their
>>> memory proxy and ditch them when the memory proxy is unreachable
>> These "nice tricks" are not so nice. Most of all, they provide no
>> guarantee whatsoever, even that they will be invoked at all.
> 
> That's not quite true. Those tricks are building blocks to implement
> resources that are automatically finalized when becoming
> unreachable. But it's up to the library author to write a complete
> implementation.

I don't understand. If there is no guarantee that the finalizer will 
*ever* be called, then what kind of building block is it?

>> A friend of mine spent long evenings recently hunting for database
>> connection leaks in a big Java application. That's telling something.
> 
> Well -- so he was naive and should have handled / understood that part
> of the system better.

Sure. In other words, be prepared that with GC you have to 
handle/understand some parts of the system better.

> A friend of mine spent half a month with finding
> problems with manual allocation/deallocation and sneaking heap
> corruption. Does that prove anything? I don't think so.

It does prove that your friend did not benefit from a language that 
provides scoped lifetime.

>>> And BTW - in
>>> functional languages you can do more against resource leaks, since
>>> you can "wrap" functions:
>>>   (with_file "output" (with_file "out_put" copy_data))
>>> It's not always done, but a useful micro pattern.
> 
>> Yes, it basically emulates something that is just natural in those
>> languages that provide scope-based lifetime out of the box.
> 
> This is no emulation, but how FP does "scope based". Without the
> necessity to add exception handling at the client side or without
> having to introduce tagged types / classes. Isn't THAT nice? :-)

Same thing with scoped lifetime, as implemented in C++. No need for 
exception handling (unless handling is actually meaningful), nor for 
changes in the interface. That's nice, I agree.
The difference is that in languages with scoped lifetime the lifetime 
management is a property of the type (and so applies to all instances), 
whereas the "FP-trick" above is a property of the use-side. Which one is 
more robust and less prone to bugs?
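To make the contrast concrete, here is a minimal C++ sketch of lifetime management as a property of the type (the names and the counter are made up for illustration):

```cpp
#include <cassert>

// Made-up resource: acquisition in the constructor, release in the
// destructor, so cleanup happens on *any* scope exit (return, exception).
static int open_count = 0;

struct ScopedHandle {
    ScopedHandle()  { ++open_count; }
    ~ScopedHandle() { --open_count; }
    ScopedHandle(const ScopedHandle&) = delete;
    ScopedHandle& operator=(const ScopedHandle&) = delete;
};

void use_resource() {
    ScopedHandle h;   // every use site gets cleanup for free:
}                     // released here, with no wrapper at the call site
```

Every instance of the type is cleaned up, regardless of what any particular use site does.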

BTW - please show me an example involving 10 objects of different kinds. :-)

>>> But I notice, that
>>>  "Languages like C provide a more general solution (with regard to
>>>   accessing memory), which is conceptually not related to any kind of
>>>   fixed type system and can therefore implement any type and data model"
>>> would become a valid argument if I agreed with you.
>> Except that it's not the point I'm making.
> 
> No, but the structure of the argument is basically the same. The
> analogy should help to show why it is (IMHO) invalid.

Ok, but please elaborate on the above first, so I'm sure that it relates 
to my point.

>> Tons of exception handling (and not only - every way to leave a scope
>> needs to be guarded, not only by exception) are necessary in those
>> languages that rely on GC without providing the above possibility at
>> the same time. 
> 
> No. I've done the same in Ada w/o controlled objects, but using a
> generic procedure.
> 
>   procedure mark_data_records is new process_cache_with_lock( Operation => mark_record, ... );
> 
>   begin
>     mark_data_records(...);
>   end;
> 
> The client side has no burden with exception handling.

Could you explain this example a bit?

>> I agree for references/pointers in polymorphic
>> collections. That's not even close to "almost everywhere" for me, but
>> your application domain may differ.
> 
> Yes, it does, obviously. You might not be aware, but code destined for
> mere consumers (as opposed to embedded code and code destined as tools
> for other developers) has a large amount of GUI code in it.

Yes.

>>>  (b) AFAIR there are restrictions on _where_ I can define controlled
>>>      types. AFAIR that was a PITA.
>> That's a mess. I'm sorry to repeat that.
> 
> Yes. But does C++ do it better? The Ada restrictions AFAIK come from
> the necessity of separate linking and compilation (you must be able to
> relink w/o looking at the body) and C++ trades that against the
> ability to add finalizers everywhere.

I don't understand. Adding a finalizer/destructor to the type that 
didn't have it before means changes in both specs and the body. 
Relinking is not enough.


-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06 14:16                                     ` Maciej Sobczak
@ 2007-02-06 15:44                                       ` Markus E Leypold
  2007-02-06 17:40                                         ` Dmitry A. Kazakov
  2007-02-07  8:55                                         ` Maciej Sobczak
  2007-02-06 17:47                                       ` Ray Blaak
  1 sibling, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-06 15:44 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> Markus E Leypold wrote:
>
>>> And I stress again - GC is not the only solution for
>>> manual memory management.
>> OK. I accept that for the moment. I'm just not convinced you can do
>> everything you need with scope bound memory
>
> Sure. But for the sake of discussion completeness, you might wish to
> throw an example of a situation where scoped lifetime will not make it.

Model-View-Controller in GUIs. Especially trying to adapt that to GTKAda.

Sorry for being so short on this, but detailing this example would be
very long and after all perhaps not very convincing. At every local
view of the situation one could argue that it would just be possible to
... whatever. But on the whole it is really hard to build a reusable
toolkit this way without reference counting or GC. Certainly it is
impossible (I'm convinced) with scope-bound lifetime. Unfortunately
failure (or at least tremendous difficulties) to build something in a
specific fashion w/o a (semi-)formal proof, or at least the
possibility to strip it down to a minimal example, is not very
convincing, since you could always assume that more research would
have found a solution. (In my case I was happy to build a specialized
solution and to note for later reference the suspicion that controlled
objects would be needed for a general one and scope-bound lifetime
wouldn't suffice.)

I always intended to look a bit deeper into this issue but until now
other things were more important.

>>> Determinism in both timing and resource consumption?
>> Which brings me back to what I said repeatedly to other people:  (1)
>> That this determinism is very often not a requirement (outside of
>>      embedded programming)
>
> Java programmer wrote a loop where he opened database cursors,
> released in the cursor finalizer. All was working like a charm, unless
> put into production, when in one case the loop had to spin many more
> times than he ever cared to test.

I'm not surprised. GC'ing resources that are bounded doesn't spare
you knowing about the way GC works. My suggestion would have been to
either close the cursor explicitly (since I know about the problem)
or wrap the production of a new cursor in a module/class which also
looks at the number of already opened cursors and collects before
reaching certain limits (in effect introducing a new, additional,
threshold for GC).
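A rough sketch of that second suggestion (all names are hypothetical; a fake Cursor stands in for the database handle):

```cpp
#include <cassert>
#include <cstddef>
#include <deque>
#include <memory>

// Fake cursor standing in for a real database cursor.
struct Cursor {
    bool open = true;
    void close() { open = false; }
};

// Wrapper around cursor creation: it counts live cursors and eagerly
// closes the oldest ones before a server-side limit is reached, instead
// of waiting for a garbage collector to finalize them.
class CursorFactory {
    std::deque<std::shared_ptr<Cursor>> live_;
    std::size_t limit_;
public:
    explicit CursorFactory(std::size_t limit) : limit_(limit) {}

    std::shared_ptr<Cursor> open_cursor() {
        if (live_.size() >= limit_) {   // near the limit: collect now
            live_.front()->close();
            live_.pop_front();
        }
        auto c = std::make_shared<Cursor>();
        live_.push_back(c);
        return c;
    }

    std::size_t live() const { return live_.size(); }
};
```

The point is that the threshold is enforced by the wrapper, not by when the collector happens to run.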

> GC did not clean up the abandoned
> cursor objects fast enough and the number of unnecessarily opened
> cursors hit the server limit. That was the end of this application.

:-)

>
> The fix was easy: write explicit close/dispose/dismiss/whatever at the
> end of the loop, so that effectively there was never more than one
> open cursor. In fact, this was *manual* resource management.

Yes. As I said: GC can be made into an instrument to manage other
resources, but it has to be done right. Sometimes you're just better
off assisting this mechanism by manually disposing of external
resources at the right places. Your approach of "I want it all, and if
I can't have both (memory management AND general resource collection)
I don't want either" is somewhat counterproductive. 

But you might well continue to believe in your policy here. I,
personally, find that it brings me a big step nearer to salvation if I
can have GC, even if I do only manual MM with it. After all: I don't
have that many other (external) resources to care about, and if I do,
it pays to have a careful look at their structure and then wrap some
abstraction around them.

> The above would be avoided altogether with scoped lifetime.

In this case, yes. See -- I do not deny the advantages of 'scoped
lifetime'. It is a useful pattern; I've myself given some examples in
my last answer. But your approach is that, because somebody had
problems misusing GC in a specific case where scoped lifetime would
have worked fine, GC is therefore useless and scoped lifetime
rules. Personally I prefer to have both approaches at hand, they are
complementary, but I certainly wouldn't want to miss GC in some
languages.

As far as the usability of GC goes, it even helps with controlled
objects: Controlled objects might be highly structured, and the usual
(i.e. Ada) approach is that you hide the details of building and
later deallocating the structure under the hood of the abstraction
barrier. Fine. That works. But with GC I don't even have to write a
tear-it-down procedure for everything a scoped object allocates under
the hood. I just make sure to close (e.g.) the filehandle and leave the
rest to the GC.

> You are right that determinism is very often not a requirement. It is
> just the life that very often shows that the initial requirements were
> not complete.
>
>>  (2) The determinism is shot anyway by using a heap, not by using
>>      GC. Even better: GC can introduce determinism in space
>>      consumption by compacting the heap, which naive heaps with manual
>>      MM don't do because of fragmentation.
>
> There is nothing particular in scoped lifetime that would prohibit
> compacting heaps and there is nothing particular in GC that guarantees

No. But without having 3/4ths of a GC anyway compaction is pretty
pointless.

> it. It's just the statistics based on popular implementations, not a
> rule.

Sorry, that is nonsense. There are garbage collectors that are
designed to be compacting. They are moving objects around. This is
absolutely deterministic and not statistical. Whereas manual
allocation and deallocation as in Ada or C will fragment the heap and
you have NO guarantee (only statistics) about the ratio of allocated
(used) memory to presently unusable holes. None. How's that for
reliability if you can't give space guarantees even if you know about
the memory your algorithms need, since unfortunately you cannot
predict the exact sequence of allocations?

> I can perfectly imagine compacting heaps managed by scoped lifetime.

Yes, you can do that. But since you're then following pointers and
rewriting them, you might as well go the whole way and deallocate
unusable memory while you're at it.

>
>>  (3) What is often needed are upper limits, not determinism, and those
>>      upper limits can be guaranteed with GC or with an appropriate
>>      collector.


> This refers to memory consumption only, whereas I clearly stated
> deterministic *time* as a second (first, actually) goal.

This refers to both; there are real-time-capable GC
algorithms. Development didn't stop in the last 20 years.

>>>> My impression was that Ada Controlled storage is actually quite a
>>>> clean concept compared to C++ storage duration.
>>
>>> Clean? It adds tag to the type, which then becomes a controlling type
>>> in every primitive operation.
>>
>>> I got bitten by this recently.  Adding a destructor to C++ class
>>> never has any side effects like this.
>> I understand. But the Ada OO way is peculiar, but not unmanagable.

> OK, I accept the word "peculiar". I only oppose "quite a clean
> concept" in your previous post. :-)

The Ada OO way. 'Controlled' is just the logical consequence and, on
top of tagged types, quite clean. 

>>> Apart from this, the bare existence of *two* base types Controlled and
>>> Limited_Controlled means that the concepts of controlled and limited
>>> are not really orthogonal in the sense that adding one of these
>>> meta-properties affects the interface that is "shared" by the other
>>> aspect.

>> Still. Being able to add a Finalize means you need to have a tagged
>> type. I see no alternative.
>
> You might want to take a look at C++.

I know C++ rather well. :-)


>>>> But both tie allocation to program scope, synchronous with a stack. I
>>>> insist that is not always desirable: It rules out some architecture,
>>>> especially those where OO abounds.
>>> What architecture?

>> I already said in another post: That is difficult to show with a toy
>> system. It only shows in larger systems where you really can't / don't
>> want to say in any given subsystem module how long a certain piece of
>> data lives. So none of those can be burdened with deallocating it.

> OK. What about refcounting with smart pointers?

(1) It ties lifetime to multiple scopes (instead of one), (2) it's not
efficient, (3) it still doesn't work for the general case, since
there is still a place where you have to decide that you don't need
that pointer copy any more, which is unscoped.
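Point (3) can be sketched with C++ smart pointers (Node is a made-up payload; the live counter only shows when destruction actually happens):

```cpp
#include <cassert>
#include <memory>

static int live_nodes = 0;

struct Node {
    Node()  { ++live_nodes; }
    ~Node() { --live_nodes; }
};

// The object outlives every scope it was created or copied in: only the
// death of the *last* copy, wherever that happens, triggers destruction.
std::shared_ptr<Node> escape() {
    auto a = std::make_shared<Node>();   // refcount 1
    auto b = a;                          // refcount 2
    return b;                            // a and b die here; object survives
}
```

The release point is wherever the last copy happens to be dropped, which is exactly what is not tied to any single scope.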

>>>> The problem with Controlled, BTW, is that it seems to interact with
>>>> the rest of the language in such a way that GNAT didn't get it right
>>>> even after ~10 years of development. Perhaps difficult w/o a formal
>>>> semantics.
>>
>>> You see.

>> Yes, I see. But GNAT is also a political problem (see the role of
>> AdaCore, formerly ACT), so (public) GNAT not getting things right
>> might well not indicate a problem with reading the Ada standard, but
>> in the release politics for public version. My hint: There is no
>> incentive to release a high quality public version GNAT.
>
> I get the message. Clear enough.

:-) Good. Whereas A. mightily profits from all the improvements in the
GCC backend, which IMHO was their prime motivation to support and
actively push reintegration into the GCC tree (otherwise they would
have been stuck with a GCC 2.8-based compiler). There is another theory
that they did it all out of the goodness of their hearts, but I don't
subscribe to that.

>>> In other words, it's very nice that GC doesn't preclude me from doing
>>> some stuff manually, but that's not enough.

>> I'm appalled: You don't want GC, but no, it doesn't do enough for
>> you?

> Exactly. It's not enough, because it doesn't solve the problem of
> resource management in a general way.

Poor misguided friend. :-)


>> Of yourse YMMV. but when I have it, it works really well for me.
>
> I acknowledge that there might be some applications which are strictly
> memory-oriented. They are just not the ones I usually write.

It also works for apps that are not "memory-oriented". I think you're
missing that e.g. filehandles are a really simpler and differently
structured resource than memory. A filehandle does not contain
references to memory or to other filehandles. Memory does. That vastly
simplifies the problem of managing file handles, indeed so much that
I'm convinced that you don't need built-in support for this.


>>>> Apart from that languages
>>>> with GC often provide nice tricks to tie external resources to their
>>>> memory proxy and ditch them when the memory proxy is unreachable
>>> These "nice tricks" are not so nice. Most of all, they provide no
>>> guarantee whatsoever, even that they will be invoked at all.
>> That's not quite true. Those tricks are building blocks to implement
>> resources that are automatically finalized when becoming
>> unreachable. But it's up to the library author to write a complete
>> implementation.
>
> I don't understand. If there is no guarantee that the finalizer will
> *ever* be called, then what kind of building block is it?
>
>>> A friend of mine spent long evenings recently hunting for database
>>> connection leaks in a big Java application. That's telling something.
>> Well -- so he was naive and should have handled / understood that
>> part
>> of the system better.
>
> Sure. In other words, be prepared that with GC you have to
> handle/understand some parts of the system better.

So?

>
>> A friend of mine spent half a month with finding
>> problems with manual allocation/deallocation and sneaking heap
>> corruption. Does that prove anything? I don't think so.

> It does prove that your friend did not benefit from a language that
> provides scoped lifetime.

In that case, yes. But since there is a 'new' operator in Ada, leaking
would have been the same problem.

>>>> And BTW - in
>>>> functional languages you can do more against resource leaks, since
>>>> you can "wrap" functions:
>>>>   (with_file "output" (with_file "out_put" copy_data))
>>>> It's not always done, but a useful micro pattern.
>>
>>> Yes, it basically emulates something that is just natural in those
>>> languages that provide scope-based lifetime out of the box.
>> This is no emulation, but how FP does "scope based". Without the
>> necessity to add exception handling at the client side or without
>> having to introduce tagged types / classes. Isn't THAT nice? :-)
>
> Same thing with scoped lifetime, as implemented in C++. No need for
> exception handling (unless handling is actually meaningful), nor for
> changes in the interface. That's nice, I agree.
> The difference is that in languages with scoped lifetime the lifetime
> management is a property of the type (and so applies to all
> instances), whereas the "FP-trick" above is a property of the
> use-side. Which one is more robust and less prone to bugs?

This is, forgive me, nonsense. I might want to use a file handle in a
scoped way here and in a free-floating way there. It is still a file
handle. And no -- the FP way is not "more prone to bugs", and as with
George Bauhaus I simply refuse this kind of discussion (FUD and
ContraFUD).
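For what it's worth, the quoted with_file micro pattern can be sketched even in C++ as a higher-order wrapper (the open-file counter is a made-up stand-in for real handles):

```cpp
#include <cassert>

static int open_files = 0;   // stands in for real file handles

// Use-side wrapper: acquire, run the body, release on any exit path.
template <typename F>
void with_file(const char* /*name*/, F body) {
    ++open_files;                 // "open" the (fake) file
    try {
        body();
    } catch (...) {
        --open_files;             // "close" on exceptional exit
        throw;
    }
    --open_files;                 // "close" on normal exit
}
```

Nesting two calls reproduces the quoted `(with_file "output" (with_file "out_put" copy_data))` shape, with the release guaranteed by the wrapper rather than by the callee's type.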

> BTW - please show me an example involving 10 objects of different kinds. :-)

All at the same time? Well -- bad programming. You don't do everything
at the same time in FP (and in Ada ...) and I hardly ever have
functions involving 10 parameters. 


>>>> But I notice, that
>>>>  "Languages like C provide a more general solution (with regard to
>>>>   accessing memory), which is conceptually not related to any kind of
>>>>   fixed type system and can therefore implement any type and data model"
>>>> would become a valid argument if I agreed with you.
>>> Except that it's not the point I'm making.

>> No, but the structure of the argument is basically the same. The
>> analogy should help to show why it is (IMHO) invalid.
>
> Ok, but please elaborate on the above first, so I'm sure that it
> relates to my point.

You refuse more automation and abstraction on the pretext of generality
and better control. That is exactly the kind of point that has been made
against compiled languages, structured programming, type systems,
modularization, OO, etc. -- name any advance you want, it has been
opposed with arguments of exactly that kind. What is missing from them,
though, is some kind of argument that the "loss of control" or "the
loss of generality" actually is bad, or better: costs more than it
pays for. Your argument, I admit, might be permissible, but it needs
more groundwork.

>>> Tons of exception handling (and not only - every way to leave a scope
>>> needs to be guarded, not only by exception) are necessary in those
>>> languages that rely on GC without providing the above possibility at
>>> the same time.
>> No. I've done the same in Ada w/o controlled objects, but using a
>> generic procedure.
>>   procedure mark_data_records is new process_cache_with_lock(
>> Operation => mark_record, ... );
>>   begin
>>     mark_data_records(...);
>>   end;
>> The client side has no burden with exception handling.
>
> Could you explain this example a bit?

Later. Don't hesitate to ask again. I'll just cut+paste the complete
code too, but it takes some time (which I don't have now).

>>> I agree for references/pointers in polymorphic
>>> collections. That's not even close to "almost everywhere" for me, but
>>> your application domain may differ.

>> Yes, it does, obviously. You might not be aware, but code destined
>> for
>> mere consumers (as opposed to embedded code and code destined as tools
>> for other developers) has a large amount of GUI code in it.
>
> Yes.
>
>>>>  (b) AFAIR there are restrictions on _where_ I can define controlled
>>>>      types. AFAIR that was a PITA.
>>> That's a mess. I'm sorry to repeat that.

>> Yes. But does C++ do it better? The Ada restrictions AFAIK come from
>> the necessity of separate linking and compilation (you must be able to
>> relink w/o looking at the body) and C++ trades that against the
>> ability to add finalizers everywhere.


> I don't understand. Adding a finalizer/destructor to the type that
> didn't have it before means changes in both specs and the
> body. Relinking is not enough.

I thought you talked about the restriction on where a Controlled type
can be defined.


Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06 15:44                                       ` Markus E Leypold
@ 2007-02-06 17:40                                         ` Dmitry A. Kazakov
  2007-02-07  8:55                                         ` Maciej Sobczak
  1 sibling, 0 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-06 17:40 UTC (permalink / raw)


On Tue, 06 Feb 2007 16:44:17 +0100, Markus E Leypold wrote:

> Maciej Sobczak <no.spam@no.spam.com> writes:
> 
>> Sure. But for the sake of discussion completeness, you might wish to
>> throw an example of a situation where scoped lifetime will not make it.
> 
> Model-View-Controller in GUIs. Especially trying to adapt that to GTKAda.

I am not sure what you mean, but

1. GTK+ uses reference counting. GtkAda just follows it in that respect. So
all GtkAda interface objects are accesses to class-wide records. These objects
are scoped. Because GtkAda is a thin binding, its objects are not controlled
(neither are they in GTK+). It would be quite easy to make them controlled
and perform Unref upon Finalization.

2. MVC tree view is not different in that respect and does not impose any
additional difficulties except usual awkward interfacing with C.

You can find custom tree view models for GtkAda here:

   http://www.dmitry-kazakov.de/ada/gtkada_contributions.htm#2.1

and custom cell renderers here:

   http://www.dmitry-kazakov.de/ada/gtkada_contributions.htm#2.3

>> OK, I accept the word "peculiar". I only oppose "quite a clean
>> concept" in your previous post. :-)
> 
> Tha Ada OO way. 'Controlled' is just the logical consequence and on
> top of tagged types quite clean. 

Controlled is a hack. It is the logical consequence of an apparent desire
to leave this issue open for future language extensions.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06 14:16                                     ` Maciej Sobczak
  2007-02-06 15:44                                       ` Markus E Leypold
@ 2007-02-06 17:47                                       ` Ray Blaak
  2007-02-06 18:05                                         ` Dmitry A. Kazakov
  1 sibling, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-06 17:47 UTC (permalink / raw)


Maciej Sobczak <no.spam@no.spam.com> writes:
> OK. What about refcounting with smart pointers?

This form of GC actually works against your determinism goal. It simply is a
poor form of GC, can't handle cycles, and has unpredictable execution times.

A "real" GC smoothes out the collection costs.

But your main points of needing scope-based resource control I indeed agree
with. My thing is convenience: sometimes that is simple wrappers with
destructors to cleanup after any possible scope exit, sometimes that is the
"WithResource" wrapper function pattern.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-06 11:07                                                 ` Markus E Leypold
@ 2007-02-06 18:01                                                   ` Ray Blaak
  2007-02-06 18:25                                                     ` Markus E Leypold
  2007-02-06 19:42                                                     ` Ray Blaak
  0 siblings, 2 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-06 18:01 UTC (permalink / raw)


Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
> I realize this might not be the best example, since I just made it
> up. But the idea is, that (string ref) is (by the compiler) translated
> into a char* and that result := ... actually becomes a malloc() plus a
> copying of memory. some_var := None would compile to explicit
> deallocation (pointer is overwritten with null).

I am not convinced you can avoid GC. Can the compiler insert the appropriate
deallocation calls? How would it know the string is not in use somewhere else?

So I see only that some sort of reference counting scheme can be implemented
in the restricted case, but that is just GC all over again, and a poor one at
that.

> Similarily, I think, it might be possible to avoid malloc() if
> everything is done in static or local buffers.

I don't think so. Many algorithms have unbounded and unpredictable memory
usage patterns. These could only be handled by viewing the (large?) static
buffers as preallocated memory, and then one has the equivalent of malloc
and/or GC all over again.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06 17:47                                       ` Ray Blaak
@ 2007-02-06 18:05                                         ` Dmitry A. Kazakov
  2007-02-06 18:28                                           ` Markus E Leypold
  2007-02-07  7:54                                           ` Maciej Sobczak
  0 siblings, 2 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-06 18:05 UTC (permalink / raw)


On Tue, 06 Feb 2007 17:47:25 GMT, Ray Blaak wrote:

> Maciej Sobczak <no.spam@no.spam.com> writes:
>> OK. What about refcounting with smart pointers?
> 
> This form of GC actually works against your determinism goal. It simply is a
> poor form of GC, can't handle cycles, and has unpredictable execution times.

Determinism /= time bounded.

For handling cycles there are weak pointers.
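A minimal sketch of the idea in C++ terms (names are made up; the counter only demonstrates that nothing leaks): the owning edge is a shared_ptr, the back edge a weak_ptr, so the pair is reclaimed once the last external reference dies:

```cpp
#include <cassert>
#include <memory>

static int live_objects = 0;

struct Parent;   // forward declaration for the weak back-edge

struct Child {
    std::weak_ptr<Parent> parent;   // non-owning: adds no refcount
    Child()  { ++live_objects; }
    ~Child() { --live_objects; }
};

struct Parent {
    std::shared_ptr<Child> child;   // owning edge
    Parent()  { ++live_objects; }
    ~Parent() { --live_objects; }
};
```

Had the back edge been a shared_ptr as well, the pair would keep each other alive and refcounting alone would never reclaim them.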

> A "real" GC smoothes out the collection costs.

Yes, garbage men won't come when you throw the bag out of the door. They
will come tomorrow, or the day after tomorrow, or maybe never. But they
will send you the bill anyway... (:-))

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-06 18:01                                                   ` Ray Blaak
@ 2007-02-06 18:25                                                     ` Markus E Leypold
  2007-02-06 19:42                                                     ` Ray Blaak
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-06 18:25 UTC (permalink / raw)



Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:

> Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>> I realize this might not be the beste example, since I just made it
>> up. But the idea is, that (string ref) is (by the compiler) translated
>> into a char* and that result := ... actually becomes a malloc() plus a
>> copying of memory. some_var := None would compile to explicit
>> deallocation (pointer is overwritten with null).
>
> I am not convinced you can avoid GC. Can the compiler insert the appropriate
> deallocation calls? How would it know the string is not in use somewhere else?

As I said -- not the best example perhaps. GC cannot be avoided for
the whole language, but it can for a subset. Not a syntactic subset, a
semantic one, if you like (in that respect it's similar to
eliminating if branches and loops). 

I admit my example is not good, or at least not complete. The whole
discipline is, AFAIK, called "compile-time garbage collection" and
usually seems to require some support from the type system or an
annotation language which embeds proofs, or hints to proofs, that the
values are used in a certain way.

I'm not pushing that as something to come tomorrow. But I believe it
might be the long-term future just to specialize single parts
of a program written in the full language (which has it all, including
GC) instead of embedding low-level languages or writing -- god forbid
-- the whole program in a low-level language only because a tiny part
of it must be optimized or provide some guarantees.

> So I see only that some sort of reference counting scheme can be implemented
> in the restricted case, but that is just GC all over again, and a poor one at
> that.

In my example, yes -- if I don't say about the string reference that it
is 'unique'. I hear the Mercury type system provides such constraints,
which help the compiler to do compile-time GC.

>> Similarily, I think, it might be possible to avoid malloc() if
>> everything is done in static or local buffers.
>
> I don't think so. Many algorithms have unbounded and unpredictable memory
> usage patterns. 

No, no, no! Not in the general case. You'll have to write the
algorithm accordingly, and then the compiler (nudged by a compile-time
directive) will try hard to implement your source w/o using GC.

The point being: if (and only if) you can do it in Ada / C w/o GC, you can
always write an ML+1 (let's call it that) program that the
compiler can translate into an object file that doesn't need the GC.

Have I now expressed my vision better? This is all S-F AFAIK, but
there was at least one experimental ML translation system that had
something like this, and the whole discipline is called compile-time
GC. Don't hold me responsible for my vagueness -- this is not my
current work. Except if you have access to a research grant; then
we'll be forthcoming with more detailed plans :-)). (Joking, but only
half).

> These could only be handled by viewing the (large?) static
> buffers as preallocated memory, and then one has the equivalent of malloc
> and/or GC all over again.

Or simply reserving them during OS load time, as is indeed done in a
lot of embedded software. My vision after all was to make FP possible
in embedded systems, or better: to provide a seamless FP environment
in which an embeddable sublanguage exists. Unified testing frameworks
and tools and all this would be the ROI.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06 18:05                                         ` Dmitry A. Kazakov
@ 2007-02-06 18:28                                           ` Markus E Leypold
  2007-02-07  7:54                                           ` Maciej Sobczak
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-06 18:28 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Tue, 06 Feb 2007 17:47:25 GMT, Ray Blaak wrote:
>
>> Maciej Sobczak <no.spam@no.spam.com> writes:
>>> OK. What about refcounting with smart pointers?
>> 
>> This form of GC actually works against your determinism goal. It simply is a
>> poor form of GC, can't handle cycles, and has unpredictable execution times.
>
> Determinism /= time bounded.
>
> For handling cycles there are weak pointers.

No. For handling cycles there are GC algorithms that can handle
cycles. Weak pointers are for implementing caching of costly (in terms
of memory) but recomputable objects/structures.

You're again talking from a parallel universe.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-06 18:01                                                   ` Ray Blaak
  2007-02-06 18:25                                                     ` Markus E Leypold
@ 2007-02-06 19:42                                                     ` Ray Blaak
  1 sibling, 0 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-06 19:42 UTC (permalink / raw)


Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:
> Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
> > Similarily, I think, it might be possible to avoid malloc() if
> > everything is done in static or local buffers.
> 
> I don't think so. Many algorithms have unbounded and unpredictable memory
> usage patterns. These could only be handled by viewing the (large?) static
> buffers as preallocated memory, and then one has the equivalent of malloc
> and/or GC all over again.

To clarify: by viewing the static buffers as a preallocated memory *pool*.
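A minimal sketch of that view (Python for concreteness; the block size and integer-handle scheme are invented): the pool behaves like malloc over one preallocated buffer, with exhaustion as a hard, bounded failure instead of unbounded growth.

```python
class Pool:
    """Fixed-size block pool carved from one preallocated buffer.
    A sketch only; real code would hand out raw pointers, not indices."""
    def __init__(self, block_size, nblocks):
        self.block_size = block_size
        self.buffer = bytearray(block_size * nblocks)   # the "static buffer"
        self.free_list = list(range(nblocks))           # free block indices

    def alloc(self):
        if not self.free_list:
            raise MemoryError("pool exhausted")         # bounded by design
        return self.free_list.pop()                     # handle of a block

    def view(self, handle):
        off = handle * self.block_size
        return memoryview(self.buffer)[off:off + self.block_size]

    def free(self, handle):
        self.free_list.append(handle)
```

Which is exactly the point above: once the buffer is treated as a pool with alloc/free, one has re-invented malloc in miniature.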

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-06 11:50                                         ` Markus E Leypold
@ 2007-02-07  7:44                                           ` Ray Blaak
  2007-02-07  8:54                                             ` Georg Bauhaus
  2007-02-07 11:17                                             ` Markus E Leypold
  0 siblings, 2 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-07  7:44 UTC (permalink / raw)


Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
> That's why you MUST use OCaml :-). No, joking, I do not know what the
> important thing in Lisp is for you. OCaml has no macros etc. But the
> typing makes things vastly more manageable.

I think what I prefer about Lisp (and Scheme) vs the ML style languages is
that Lisp is a little more laid back. The syntax is cleaner, and the whole FP
obsession thing is optional.

The whole obsession in the syntax with just how functions map from inputs to
outputs just seems a little too serious, but maybe I am confusing things with
ML. I prefer to relax with the currying in declarations and just show simple
function signatures instead. Of course in Lisp, one just passes the lambdas
around and lets the runtime worry about the mismatches.

The funny thing is that the whole dynamic thing just doesn't seem to fail as
badly as the static typing purists would have us believe. Now I do want my
strong static typing, especially for parameter mismatches on function calls,
but I find it interesting that significant software can get done just fine in
Lisp.

I have always had OCaml on my list to dig into more. Maybe I should actually
get around to it.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06 18:05                                         ` Dmitry A. Kazakov
  2007-02-06 18:28                                           ` Markus E Leypold
@ 2007-02-07  7:54                                           ` Maciej Sobczak
  2007-02-07  9:42                                             ` Markus E Leypold
  2007-02-08 18:14                                             ` Dmitry A. Kazakov
  1 sibling, 2 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-07  7:54 UTC (permalink / raw)


Dmitry A. Kazakov wrote:

>>> OK. What about refcounting with smart pointers?
>> This form of GC actually works against your determinism goal. It simply is a
>> poor form of GC, can't handle cycles, and has unpredictable execution times.
> 
> Determinism /= time bounded.

Bingo.

> For handling cycles there are weak pointers.

Not only. If you have cycles, then you'd better rethink the design.
The difference is between a) graph treated as a mesh (or mess) of nodes 
which "own" each other and b) graph treated as a collection of nodes.
The former might have ownership cycles between nodes, but not the 
latter, where ownership is an acyclic relation between graph and nodes.

I agree that this kind of restructuring is not always possible, but for 
me it is conceptually cleaner and worth trying from the beginning.

>> A "real" GC smoothes out the collection costs.
> 
> Yes, garbage men won't come when you throw the bag out of the doors. They
> will come tomorrow, or the day after tomorrow, or maybe never. But they
> will send you the bill anyway... (:-))

:-)

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-07  7:44                                           ` Ray Blaak
@ 2007-02-07  8:54                                             ` Georg Bauhaus
  2007-02-07 11:19                                               ` Markus E Leypold
  2007-02-07 19:01                                               ` Ray Blaak
  2007-02-07 11:17                                             ` Markus E Leypold
  1 sibling, 2 replies; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-07  8:54 UTC (permalink / raw)


On Wed, 2007-02-07 at 07:44 +0000, Ray Blaak wrote:

> The funny thing is that the whole dynamic thing just doesn't seem to fail as
> badly as the static typing purists would have us believe. Now I do want my
> strong static typing, especially for parameter mismatches on function calls,
> but I find it interesting that significant software can get done just fine in
> Lisp.
> 
> I have always had OCaml on my list to dig into more. Maybe I should actually
> get around to it.

If you do this, perhaps you can find the time to use a, uh, stop-watch
to measure the time it takes to produce, read, and change the programs?
I.e., to technically manage significant software. The times will be
important input for deciding whether or not static typing does indeed
help in production.
That's the only measure that relates to economic efficiency, if you like :-)


 -- Georg 




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06 15:44                                       ` Markus E Leypold
  2007-02-06 17:40                                         ` Dmitry A. Kazakov
@ 2007-02-07  8:55                                         ` Maciej Sobczak
  2007-02-07  9:30                                           ` GC in Ada Martin Krischik
                                                             ` (2 more replies)
  1 sibling, 3 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-07  8:55 UTC (permalink / raw)


Markus E Leypold wrote:

>> Sure. But for the sake of discussion completeness, you might wish to
>> throw an example of a situation where scoped lifetime will not make it.
> 
> Model-View-Controller in GUIs. Especially trying to adapt that to GTKAda.

I will not repeat Dmitry's arguments here.

>> Java programmer wrote a loop where he opened database cursors,
[...]

> I'm not surprised. GC'ing resources that are bounded doesn't spare
> you knowing about the way GC works.

Exactly. That's why I say that the solution is incomplete. If you have 
to think about the mechanics of some solution, then that solution is not 
entirely/properly automated.

> Your approach "I want it all, and if
> I can't have both (memory management AND general resource collection)
> I don't want either" is somewhat counterproductive. 

I find it counterproductive to apply different management strategies 
with regard to *implementation details* of different types.
I prefer solutions which enable me to hide implementation details from 
clients (software engineering?). If clients have to treat different types 
differently just because their implementation details differ, then it 
means that these implementation details leak out in the form of distinct 
handling methods. I want to treat String and DatabaseCursor in the same 
way - that's the prerequisite for being productive for me.

> But you might well continue to believe in your policy here.

Thanks. :-)

> I,
> personally, find that it brings me a big step nearer to salvation if I
> can have GC, even if I do only manual MM with it. After all: I don't
> have this much other (external) ressources to care about and if I do,
> it pays to have a careful look at their structure and then wrap some
> abstraction around them.

OK, I understand it. We just agree that GC is a valid solution for 
*some* class of computing problems.
So why people claim that GC-oriented languages are general-purpose?

> But your approach is, since somebody had problems
> misusing GC in a specific case

No, that's not the point. The point is that languages which are built 
around GC tend to drop proper support for other types of resources 
altogether. It's not the particular programmer who misused GC in a 
specific case - it's the language designers who closed themselves in the 
GC cage and cranked out a language that fails to provide good support 
for a wider class of problems.
As I've already said, the ideal would be to have both GC and scoped 
lifetime. The problem is that there is no reliable industry experience 
with such a mix, unless we treat Boehm's GC as one.

> As far as the usability of GC goes, it even helps with controlled
> objects
[...]
> I just make sure to close (e.g.) the filehandle and let the
> rest to the GC.

Of course. But then it's up to the designer of the type to decide how to 
treat each component of that type - it should be implementation detail. 
This decision should not be put on the shoulders of the final user, 
which is now the case in mainstream GC-oriented languages. This is what 
is broken.

>> There is nothing particular in scoped lifetime that would prohibit
>> compacting heaps and there is nothing particular in GC that guarantees
> 
> No. But without having 3/4ths of a GC anyway compaction is pretty
> pointless.

Why? If the goal of compaction is to avoid fragmentation, then what is 
pointless in having compacted heaps managed by scoped lifetime?

>> it. It's just the statistics based on popular implementations, not a
>> rule.
> 
> Sorry, that is nonsense. There are garbage collectors that are
> designed to be compacting.

So what? This is exactly the statistics I'm talking about, that does not 
prove that GC guarantees compacting or that the lack of GC prevents it.

> They are moving objects around. This is
> absolutely deterministic and not statistical.

By statistics I mean the number of language implementations on the 
market that choose to use compacting GC vs. the number of languages that 
use non-compacting heaps. :-)

 > Whereas manual
> allocation and deallocation as in Ada or C will fragment the heap and
> you have NO guarantee (only statistics) about the ratio of allocated
> (used) memory and presently unusable holes.

If that bothers you, then use non-fragmenting allocators.

> How's that about
> reliability if you can't give space guarantees even if you know about
> the memory your algorithms need, since unfortunately you cannot
> predict the exact sequence of allocations?

I use a non-fragmenting allocator and I get my guarantees.

>> I can perfectly imagine compacting heaps managed by scoped lifetime.
> 
> Yes you can do that. Since you're following pointers then and rewriting
> them, you might as well go the whole way and deallocate unusable
> memory while you're at it.

Yes. Note that scoped lifetime does not preclude GC on some lower level.
Scoped lifetime provides a hook for deterministic "good bye" action - 
there is nothing more to it. Even if that "good bye" action calls 
free/delete/whatever on some memory block, there is nothing that forces 
the runtime to return the given block of memory right back to the 
operating system. Actually, none of the self-respecting allocators do 
this systematically - instead they keep the memory around for a while in 
anticipation of future allocations. I have nothing against GC at this 
level, really (and I've seen such implementations - in fact, a fully 
standard-compliant implementation of the C language could provide 
*empty* free function and GC underneath; and fully conformant C++ 
implementation could just call destructors as a result of delete and 
leave the raw memory to GC).

What I'm against is a GC "paradigm" that prevents me from having 
deterministic "good bye" hooks for scoped lifetime. The problem is that 
most GC-oriented languages I'm aware of do have this "issue".

In other words, for me GC is acceptable as an implementation detail of 
the dynamic memory allocator. I don't care *how* the allocator deals 
with memory that I free in the same sense that I don't care *how* the 
operating system deals with files that I remove from the filesystem. 
What I care about are hooks and scoped lifetime is an obvious answer for 
this.

>>>  (3) What is often needed are upper limits, not determinism, and those
>>>      upper limits can be guaranteed with GC or with an appropriate
>>>      collector.
> 
>> This refers to memory consumption only, whereas I clearly stated
>> deterministic *time* as a second (first, actually) goal.
> 
> This refers to both, there are real time compatible GC
> algorithms.

I'm interested in what is their target audience. I would expect any 
decent RT system to *refrain* from using dynamic memory except in the 
initialization phase (so that the "mission phase" is performed with 
constant set of objects), in which case RT GC would be just an answer to 
the question that nobody asked.
Experts might wish to correct me and elaborate on this.

>> OK. What about refcounting with smart pointers?
> 
> (1) It ties lifetime to multiple scopes (instead of one)

With GC tracing pointers you have the same, just the tracing is hidden.

> (2) it's not
> efficient

Why?

> (3) It still doesn't work for the general case

Neither does GC, as seen in examples. :-)
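The cycle limitation mentioned earlier in the thread is easy to make concrete. A toy hand-rolled refcount (Python here; the counts are explicit, nothing to do with CPython's own): a two-cell cycle never drops to zero, while the acyclic case cascades fine.

```python
class Ref:
    """Hand-rolled refcounted cell, mimicking a smart pointer (illustrative)."""
    def __init__(self):
        self.count = 1            # the creating reference
        self.other = None         # at most one outgoing reference here

    def point_to(self, cell):
        self.other = cell
        cell.count += 1           # the pointee gains a reference

    def release(self):
        self.count -= 1
        if self.count == 0 and self.other is not None:
            self.other.release()  # cascade, as smart pointers do

a, b = Ref(), Ref()
a.point_to(b)
b.point_to(a)      # cycle: each keeps the other's count above zero
a.release()
b.release()        # both outside references dropped...
```

...and yet both counts sit at one forever: the cycle keeps itself alive, which is exactly what a tracing collector would notice and plain refcounting cannot.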

>> I acknowledge that there might be some applications which are strictly
>> memory-oriented. They are just not the ones I usually write.
> 
> It also works for apps that are not "memory-oriented". I think you're
> missing that e.g. filehandles are a much simpler and differently
> structured resource than memory. A filehandle does not contain
> references to memory or other filehandles. Memory does. That vastly
> simplifies the problem of managing file handles, indeed so much that
> I'm convinced that you don't need builtin support for this.

Somehow this idea didn't work for database cursors, as already described.

>> Sure. In other words, be prepared that with GC you have to
>> handle/understand some parts of the system better.
> 
> So?

So the implementation details of *some* types leak out in the sense that 
they force me to understand their internal mechanics. I don't want to.
I want to say this:

declare
     Sql : Session := Open_Session("some parameters");
     Str : String := "Hello";
begin
     -- ...
end;

instead of this:

declare
     Sql : Session := Open_Session("some parameters");
     Str : String := "Hello";
begin
     -- ...
     -- damn, I have to do *something* with *some* stuff here
end;
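The first, desired shape, a deterministic "good bye" at end of scope with no per-use cleanup code, can be sketched with a context manager (Python for concreteness; Session and Open_Session are hypothetical stand-ins for the types above):

```python
from contextlib import contextmanager

closed = []                          # records close calls, for demonstration

class Session:
    """Hypothetical stand-in for the Session type above."""
    def __init__(self, params):
        self.params = params
    def close(self):
        closed.append(self.params)   # the deterministic "good bye" action

@contextmanager
def open_session(params):
    sql = Session(params)
    try:
        yield sql                    # the body of the "declare" block
    finally:
        sql.close()                  # runs at end of scope, even on exceptions

with open_session("some parameters") as sql:
    s = "Hello"
    # ... use sql and s; no explicit cleanup code needed here
```

The cleanup logic lives with the type's wrapper, not at every use site, which is precisely the complaint about the second form above.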

[about FP]
>> The difference is that in languages with scoped lifetime the lifetime
>> management is a property of the type (and so applies to all
>> instances), whereas the "FP-trick" above is a property of the
>> use-side. Which one is more robust and less prone to bugs?
> 
> This is, forgive me, nonsense. I might want to use a file handle in a
> scoped way here and in a free floating way there.

What about readability and maintainability of such code?

> And no -- the FP way is not "more prone to bugs"

Unless you use a handle in a free floating way and find later that in 
production your code was called in a loop causing handles to pile up?
I have the practical example (already described) that this way of 
thinking can lead to failures. The programmer wanted to use a database 
cursor in a free floating way. That was fine. Later his code was used in 
a loop. Ah, yes - his code was used in a loop written by another 
programmer, so his judgement about whether it's OK to use anything in a 
free floating way was misguided from the very beginning.

> and as with
>> Georg Bauhaus I simply refuse this kind of discussion (FUD and
> ContraFUD).

OK. We will just stay unconvinced. :-)

>> BTW - please show me an example involving 10 objects of different kinds. :-)
> 
> All at the same time?

Yes.

> Well -- bad programming.

I knew you would answer this. :-)


-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* GC in Ada
  2007-02-07  8:55                                         ` Maciej Sobczak
@ 2007-02-07  9:30                                           ` Martin Krischik
  2007-02-07 11:08                                             ` Markus E Leypold
  2007-02-07 11:15                                             ` Maciej Sobczak
  2007-02-07 10:10                                           ` How come Ada isn't more popular? Georg Bauhaus
  2007-02-07 10:56                                           ` Markus E Leypold
  2 siblings, 2 replies; 397+ messages in thread
From: Martin Krischik @ 2007-02-07  9:30 UTC (permalink / raw)


Maciej Sobczak schrieb:

> Yes. Note that scoped lifetime does not preclude GC on some lower level.
> Scoped lifetime provides a hook for deterministic "good bye" action - 
> there is nothing more to it. Even if that "good bye" action calls 
> free/delete/whatever on some memory block, there is nothing that forces 
> the runtime to return the given block of memory right back to the 
> operating system. Actually, none of the self-respecting allocators do 
> this systematically - instead they keep the memory around for a while in 
> anticipation of future allocations.

I believe in most systems memory is never returned.

If I understood Unix memory management right, only memory at the end of the 
heap can be returned. Without compaction that's a no-go.

And on Windows I know that only the full block allocated with MemAlloc 
can be returned. Blocks are always page-sized (multiples of 4 kB). A smart 
memory manager might reserve full blocks for large allocations, but all 
those tiny 20-byte allocations will never be returned to the OS.

> What I'm against is a GC "paradigm" that prevents me from having 
> deterministic "good bye" hooks for scoped lifetime. The problem is that 
> most GC-oriented languages I'm aware of do have this "issue".

But isn't that exactly what "Unchecked_Deallocation" and "pragma 
Controlled" are all about? Has Ada - by your rationale - not got GC right?

Martin



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-07  7:54                                           ` Maciej Sobczak
@ 2007-02-07  9:42                                             ` Markus E Leypold
  2007-02-08  8:10                                               ` Maciej Sobczak
  2007-02-08 18:14                                             ` Dmitry A. Kazakov
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-07  9:42 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> Dmitry A. Kazakov wrote:

>
>> For handling cycles there are weak pointers.
>
> Not only. 

"Not at all", you wanted to say.

> If you have cycles, then you'd better rethink the design.

let rec fac n = if n = 0 then 1 else n * (fac (n-1));;

a) closure
b) cycle

:-). 

A lot of OO models with callback also require cycles. E.g. A knows B
and B registers a callback with A in certain situations.
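A small demonstration of such a callback cycle, and why a tracing collector (unlike plain refcounting) reclaims it (Python for concreteness; Button/Controller are invented names):

```python
import gc

class Button:
    """Toy A: the object that accepts callbacks."""
    def __init__(self):
        self.callbacks = []
    def on_click(self, fn):
        self.callbacks.append(fn)

class Controller:
    """Toy B: knows A and registers a callback with it."""
    def __init__(self, button):
        self.button = button           # B -> A
        button.on_click(self.handle)   # A -> bound method -> B: a cycle
    def handle(self):
        return "clicked"
```

Once the outside references are dropped, pure reference counting can never free such a pair (each keeps the other's count above zero); a tracing collection finds the cycle unreachable and reclaims it.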

> The difference is between a) graph treated as a mesh (or mess) of
> nodes which "own" each other and b) graph treated as a collection of
> nodes.

> The former might have ownership cycles between nodes, but not the
> latter, where ownership is an acyclic relation between graph and nodes.

> I agree that this kind of restructuring is not always possible, but
                                            ^^^^^^^^^^^^^^^^^^^^^

!!.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-07  8:55                                         ` Maciej Sobczak
  2007-02-07  9:30                                           ` GC in Ada Martin Krischik
@ 2007-02-07 10:10                                           ` Georg Bauhaus
  2007-02-07 10:56                                           ` Markus E Leypold
  2 siblings, 0 replies; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-07 10:10 UTC (permalink / raw)


On Wed, 2007-02-07 at 09:55 +0100, Maciej Sobczak wrote:

> So the implementation details of *some* types leak out in the sense that 
> they force me to understand their internal mechanics. I don't want to.
> I want to say this:
> 
> declare
>      Sql : Session := Open_Session("some parameters");
>      Str : String := "Hello";
> begin
>      -- ...
> end;

Another data point: with JDBC, the recommendation is to leave database
cursor/statement handling to the implementation and have it finish for
you.
A notable exception to the rule is when the RDBMS is Oracle.
Then you have to:

> declare
>      Sql : Session := Open_Session("some parameters");
>      Str : String := "Hello";
> begin
>      -- ...
>      -- damn, I have to do *something* with *some* stuff here
> end;





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-07  8:55                                         ` Maciej Sobczak
  2007-02-07  9:30                                           ` GC in Ada Martin Krischik
  2007-02-07 10:10                                           ` How come Ada isn't more popular? Georg Bauhaus
@ 2007-02-07 10:56                                           ` Markus E Leypold
  2007-02-07 22:58                                             ` Georg Bauhaus
  2007-02-08  9:04                                             ` Maciej Sobczak
  2 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-07 10:56 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> Markus E Leypold wrote:
>
>>> Sure. But for the sake of discussion completeness, you might wish to
>>> throw an example of a situation where scoped lifetime will not make it.
>> Model-View-Controller in GUIs. Especially trying to adapt that to
>> GTKAda.
>
> I will not repeat Dmitry's arguments here.


Which was

 1) I am not sure what you mean,

and then

 2) "I'm now talking about reference counting". 

Which latter, you might notice, is a specialized garbage collection
scheme and not objects with scoped lifetime. Seems to support my view.

>
>>> Java programmer wrote a loop where he opened database cursors,
> [...]
>
>> I'm not surprised. GC'ing resources that are bounded doesn't spare
>> you knowing about the way GC works.
>
> Exactly. That's why I say that the solution is incomplete. If you have
> to think about the mechanics of some solution, then that solution is
> not entirely/properly automated.

I'd call that nonsense. Let me explain by analogy and then bury that
argument, since I obviously have another approach to that and we've
rehashed the respective arguments enough and seem not to be able to
agree.

Analogy: I don't think all the time about the way the compiler
generates code, but if I want to optimize, e.g., a loop, I have to think
about the mechanics of compilation. Certainly: a super-duper compiler
with artificial intelligence would have seen right away the way I
optimize my loop now and would have done it without my interference
(actually many compilers today do quite a good job in this respect),
but that they didn't in the past didn't keep people from using
compilers, and

I still experience using compilers AND using GC as tools as a vast
simplification of my work w/o many downsides. You can't deny _my_
experience, but YMMV. And since you insist you cannot see my point or
do not want to use GC we're stuck :-).


>> Your approach "I want it all, and if
>> I can't have both (memory management AND general resource collection)
>> I don't want either" is somewhat counterproductive.
>
> I find it counterproductive to apply different management strategies
> with regard to *implementation details* of different types.

Unfortunately external resources are not types in the type-theoretic
sense, since they are not implemented in memory only. And whereas
I produce and drop data with every function call, I find myself
opening and closing files or creating temporary files much less
often. So I profit from GC as it is w/o feeling much of the mental pain
you seem to experience from the concept.

> I prefer solutions which enable me to hide implementation details from
> clients (software engineering?). If clients have to treat different

I fail to see how using no GC helps you in that, whereas using GC as
it is today hinders you.

> types differently just because their implementation details differ,
> then it means that these implementation details leak out in the form
> of distinct handling methods. I want to treat String and
> DatabaseCursor in the same way - that's the prerequisite for being
> productive for me.

If purity in any sense is your prerequisite for being productive, you
should take up Haskell.

>> But you might well continue to believe in your policy here.
>
> Thanks. :-)
>
>> I,
>> personally, find that it brings me a big step nearer to salvation if I
>> can have GC, even if I do only manual MM with it. After all: I don't
>> have this much other (external) ressources to care about and if I do,
>> it pays to have a careful look at their structure and then wrap some
>> abstraction around them.
>
> OK, I understand it. We just agree that GC is a valid solution for
> *some* class of computing problems.

For some really large class. I know, e.g., it wouldn't solve the
halting problem.

> So why people claim that GC-oriented languages are general-purpose?

Because they can write all kinds of programs with them? I'm not trying
to sell a GC-oriented language to you. I'm convinced it fits many of
my problems, as it does for other people, and I'm content to leave the
rest to evolution of a sort: if I'm wrong I'll sooner or later meet
"the unsolvable problem", whereas if you're wrong you'll never write
programs on a level of beauty and simplicity as I do.

Note that I'm a multi-tool person of sorts. I've been developing
seriously in quite a number of languages, so I expect to be able to
fall back on the tool I think is best for the job at
hand. You, on the other side, insist on excluding a tool because it
does not SEEM perfect to you (you never worked with a GC'ed language,
I gather?). Forgive me the word, but it _seems_ narrow-minded to
me. But of course it's your decision.

>
>> But your approach is, since somebody had problems
>> misusing GC in a specific case
>
> No, that's not the point. The point is that languages which are built
> around GC tend to drop proper support for other types of resources
> altogether. 

As if manual management could be called proper support in this respect ...

> It's not the particular programmer who misused GC in a specific case
> - it's language designers who closed themselves in the GC cage and
> cranked a language that fails to provide good support for wider
> class of problems.

Big words, but given the choice between

 1) Manual resource management
 2) GC with (according to you) imperfect support for other kinds of resources
 3) The hypothetical system you propose should exist

I'll always prefer to go with (2) and bear the intellectual friction.

> As I've already said, the ideal would be to have both GC and scoped
> lifetime. The problem is that there is no reliable industry experience
> with such a mix, unless we treat Boehm's GC as one.

You know, the (functional) with_resource wrappers have been around
for some time -- dating even back to Lisp -- and since Lisp has been
used quite extensively "in the industry" for some time, I'd say there
is reliable industry experience. You're just inventing problems where
none exist.

>
>> As far as the usability of GC goes, it even helps with controlled
>> objects
> [...]
>> I just make sure to close (e.g.) the filehandle and let the
>> rest to the GC.
>
> Of course. But then it's up to the designer of the type to decide how
> to treat each component of that type - it should be implementation
> detail. This decision should not be put on the shoulders of the final
> user, which is now the case in mainstream GC-oriented languages. This

"Treat each component of the type?" We're not talking about
components, but about closing file handles, that is, switching over to
scoped resource management in cases where you know the automatic
collection won't be up to it, because your consumption will hit a
limit before the next collection is triggered and there is no
trigger tied to this particular resource limit.

> is what is broken.

And it will stay broken, since there are enough cases where we don't
want to trigger a major GC cycle every 64 file open()s only because
the programmer is dropping file descriptors on the floor in a
loop. Awareness of resource restrictions cannot be given up altogether,
since the naive approach of first opening 500 files and storing their
handles in an array is also possible for a programmer who isn't
aware of the restriction -- and GC won't be able to fix that problem
anyway.

>
>>> There is nothing particular in scoped lifetime that would prohibit
>>> compacting heaps and there is nothing particular in GC that guarantees
>> No. But without having 3/4ths of a GC anyway compaction is pretty
>> pointless.
>
> Why? If the goal of compaction is to avoid fragmentation, then what is
> pointless in having compacted heaps managed by scoped lifetime?

Heap(s) -- I even resent the plural here -- are not about scoped but
about indeterminate lifetime. And a "compacted" heap has all the
unpredictability (in timing) of a garbage collector and would
provide garbage collection almost for free.

So you want the downside -- perhaps losing real-time capability (with
some algorithms) -- pay for it in moving memory objects (makes the FFI
more complicated), and then you don't want to have the advantage of
freeing unused memory?

Strange...


>
>>> it. It's just the statistics based on popular implementations, not a
>>> rule.
>> Sorry, that is nonsense. There are garbage collectors that are
>> designed to be compacting.
>
> So what? This is exactly the statistics I'm talking about, that does

No no no no no.

> not prove that GC guarantees compacting or that the lack of GC
> prevents it.

2-space garbage collectors are compacting. Full stop.
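That claim is visible in a toy Cheney-style sketch (Python for concreteness; list indices stand in for heap addresses, "fwd" is the forwarding pointer, and this is a model, not a real allocator): survivors are copied one after another into to-space, so compaction falls out of the copy itself.

```python
class Semispace:
    """Toy Cheney two-space collector: records live in a Python list
    standing in for the heap; fields hold heap indices or None."""
    def __init__(self):
        self.heap = []                     # from-space

    def alloc(self, nfields):
        self.heap.append({"fwd": None, "fields": [None] * nfields})
        return len(self.heap) - 1          # "address" of the new record

    def collect(self, roots):
        to = []                            # to-space

        def copy(addr):
            rec = self.heap[addr]
            if rec["fwd"] is None:         # not evacuated yet
                to.append({"fwd": None, "fields": list(rec["fields"])})
                rec["fwd"] = len(to) - 1   # leave a forwarding address
            return rec["fwd"]

        new_roots = [copy(r) for r in roots]
        scan = 0                           # Cheney scan pointer
        while scan < len(to):              # breadth-first evacuation
            fields = to[scan]["fields"]
            for i, a in enumerate(fields):
                if a is not None:
                    fields[i] = copy(a)
            scan += 1
        self.heap = to                     # flip: survivors are contiguous
        return new_roots
```

Note also that, being tracing rather than counting, it handles cycles among live records without any special casing.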

Forgive me, I know you're only uninformed, but what you're spouting is
exactly the kind of FUD that has hindered the widespread adoption of
GC into the mainstream for years. People don't know how GC works,
don't know what it does, but they are somehow convinced that (a) it
disenfranchises them of control of the computer, (b) it cannot cope
with real-time requirements (as if they had any) and (c) it is statistical
and unpredictable.

I'm sure I've forgotten some points on the list, but you get my
drift. FUD. And it's not even that you're selling a system w/o GC for
money.

>> They are moving objects around. This is
>> absolutely deterministic and not statistical.
>
> By statistics I mean the number of language implementations on the
> market that choose to use compacting GC vs. the number of languages
> that use non-compacting heaps. :-)

And that proves what?

>
>  > Whereas manual
>> allocation and deallocation as in Ada or C will fragment the heap and
>> you have NO guarantee (only statistics) about the ratio of allocated
>> (used) memory and presently unusable hole.
>
> If that bothers you, then use non-fragmenting allocators.


There are, as far as I can see, no non-fragmenting (heap) allocators
for unpredictable allocation patterns.
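For what it's worth, the standard counter-example is a fixed-size pool
(slab) allocator, which cannot fragment within its one size class
precisely because it refuses the general problem. A hypothetical sketch:

```python
class FixedPool:
    """Non-fragmenting allocator for ONE block size: any free slot fits
    any request, so a freed hole is always reusable. This guarantee
    evaporates as soon as requests come in unpredictable sizes."""
    def __init__(self, nblocks):
        self.free = list(range(nblocks))   # free-list of slot indices

    def alloc(self):
        if not self.free:
            raise MemoryError("pool exhausted")
        return self.free.pop()

    def dealloc(self, slot):
        self.free.append(slot)             # slot is immediately reusable

pool = FixedPool(4)
a, b = pool.alloc(), pool.alloc()
pool.dealloc(a)
c = pool.alloc()        # reuses a's slot: no hole is ever wasted
```

Which is exactly the point of the paragraph above: this only works for
predictable (single-size) allocation patterns.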

>> Hows that about
>> reliability if you can't give space guarantees even if you know about
>> the memory your algorithms need, since unfortunately you cannot
>> perdict the exact sequence of allocations?
>
> I use non-fragmenting allocator and I get my guarantees.

See above. Apart from the fact that there are no non-fragmenting
allocators shipped with, e.g., Ada. So do it yourself. Wow: avoid
compacting GC, get more work, do it by hand. You can see why that
doesn't attract me.

>>> I can perfectly imagine compacting heaps managed by scoped lifetime.
>> Yes you can do that. Since you're following pointers then and rewrite
>> them, you might as well go the whole way and deallocate unusable
>> memory while you're at it.
>
> Yes. Note that scoped lifetime does not preclude GC on some lower level.

You admit it, finally?

> Scoped lifetime provides a hook for deterministic "good bye" action -
> there is nothing more to it. Even if that "good bye" action calls
> free/delete/whatever on some memory block, there is nothing that
> forces the runtime to return the given block of memory right back to
> the operating system. Actually, none of the self-respecting allocators
> do this systematically - instead they keep the memory around for a
> while in anticipation of future allocations. I have nothing against GC
> at this level, really (and I've seen such implementations - in fact, a
> fully standard-compliant implementation of the C language could
> provide *empty* free function and GC underneath; and fully conformant
> C++ implementation could just call destructors as a result of delete
> and leave the raw memory to GC).

> What I'm against is a GC "paradigm" that prevents me from having
> deterministic "good bye" hooks for scoped lifetime. The problem is

There is no such GC paradigm. I wonder what we were talking about the
whole time.

> that most GC-oriented languages I'm aware of do have this "issue".
>
> In other words, for me GC is acceptable as an implementation detail of
> the dynamic memory allocator. I don't care *how* the allocator deals

Unfortunately GC is no implementation detail, since you can see whether
there are free() or dispose() calls in the source.
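For what it's worth, Python (a GC-ed language) shows both halves of the
quoted position at once: a context manager provides the deterministic
"good bye" hook at scope exit, while reclamation of the memory itself
remains the collector's business. A sketch with a hypothetical Session
type (the name is taken from the example discussed in this thread, not a
real library):

```python
class Session:
    """Hypothetical resource: deterministic close, GC-managed memory."""
    def __init__(self, params):
        self.params = params
        self.open = True

    def close(self):
        self.open = False            # the "good bye" action

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.close()                 # runs at scope exit, exception or not
        return False                 # don't swallow exceptions

with Session("some parameters") as sql:
    assert sql.open
# Here the handle is already closed, deterministically; the object's
# memory is reclaimed whenever the garbage collector gets to it.
assert not sql.open
```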

> with memory that I free in the same sense that I don't care *how* the
> operating system deals with files that I remove from the
> filesystem. What I care about are hooks and scoped lifetime is an
> obvious answer for this.
>
>>>>  (3) What is often needed are upper limits not determinism and thos
>>>>      upper limits can be guaranteed with GC or with an appropriate
>>>>      collector.
>>
>>> This refers to memory consumption only, whereas I clearly stated
>>> deterministic *time* as a second (first, actually) goal.

>> This refers to both, there are real time compatible GC
>> algorithms.

> I'm interested in what is their target audience. I would expect any
> decent RT system to *refrain* from using dynamic memory except in the
> initialization phase (so that the "mission phase" is performed with
> constant set of objects), in which case RT GC would be just an answer
> to the question that nobody asked.

Yes, that is the old answer.

> Experts might wish to correct me and elaborate on this.

Ask Ray Blaak.

>>> OK. What about refcounting with smart pointers?
>> (1) It ties lifetime to multiple scopes (instead of one)
>
> With GC tracing pointers you have the same, just the tracing is hidden.

Yeah, but -- you had problems with that, you wanted true and pure
scoped lifetime, I don't. And if I don't want that I don't use smart
pointers as a hidden GC scheme: I just use GC and do away with
reference counting.
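The classic weakness of refcounting-as-hidden-GC is easy to make
concrete. A toy manual refcount (not Python's own, which has a separate
cycle detector on top) never frees a two-object cycle:

```python
class Ref:
    """Toy refcounted cell; real smart pointers do this in destructors."""
    def __init__(self):
        self.count = 1        # one external reference at creation
        self.other = None

def link(a, b):
    a.other = b
    b.count += 1              # b gains a reference from a

def drop(obj):
    obj.count -= 1
    if obj.count == 0 and obj.other is not None:
        drop(obj.other)       # releasing obj releases what it points to
        obj.other = None

a, b = Ref(), Ref()
link(a, b)
link(b, a)                    # cycle: a -> b -> a
drop(a)                       # external references gone...
drop(b)
# ...yet both counts are still 1: the cycle keeps itself alive. Leak.
```

A tracing collector starts from the roots instead, so an unreachable
cycle is simply never copied/marked and gets reclaimed.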

>
>> (2) its not
>> efficient
>
> Why?

See other posts in this thread. 

>> (3) It stille doesn't work for the general case
>
> Neither does GC, as seen in examples. :-)

Yes, but smart pointers were YOUR better answer to GC. You were
dissatisfied with GC. GC doesn't work for you, since it's not general
enough etc. etc. Then you come up with smart pointers and ref counting
as an alternative -- which doesn't work either. Oops. Why bother at
all, then?

>
>>> I acknowledge that there might be some applications which are strictly
>>> memory-oriented. They are just not the ones I usually write.
>> It also works for apps that are not "memory-oriented". I think you're
>> missing that e.g. filehandles are a really simpler and differently
>> structured resource from memory. A filehandle does not contain
>> references to memory or other filehandles. Memory does. That vastly
>> simplifies the problem of managing file handles -- indeed so much that
>> I'm convinced that you don't need builtin support for this.

> Somehow this idea didn't work for database cursors, as already described.

Somehow ... you missed my point. It did work. File handles are, in my
view, not supposed to be handled by GC since they are structurally and
semantically different from memory. They should be closed explicitly.


>>> Sure. In other words, be prepared that with GC you have to
>>> handle/understand some parts of the sytem better.

>> So?

> So the implementation details of *some* types leak out in the sense
> that they force me to understand their internal mechanics. I don't
> want to.

No. You just need to stick to the rules. Close your ***** filehandles
manually. All the time.

If you want to be smart, though, it pays to think about the
interaction with the underlying system (the implementation). Same as
with the loop optimization.



> I want to say this:
>
> declare
>      Sql : Session := Open_Session("some parameters");
>      Str : String := "Hello";
> begin
>      -- ...
> end;
>
> instead of this:
>
> declare
>      Sql : Session := Open_Session("some parameters");
>      Str : String := "Hello";
> begin
>      -- ...
>      -- damn, I have to do *something* with *some* stuff here
> end;
>
> [about FP]


>>> The difference is that in languages with scoped lifetime the lifetime
>>> management is a property of the type (and so applies to all
>>> instances), whereas the "FP-trick" above is a property of the
>>> use-side. Which one is more robust and less prone to bugs?
>> This is, forgive me, nonsense. I might want to use a file handle in a
>> scoped way here and in a free floating way there.
>
> What about readability and maintainability of such code?

Nothing. It's OK. 

  with_file "foo" ( fun fd -> ... );;

or

  let blaba = ... 
  and fd    = file_open "..."
  in
    yadda ();
    blubb ();
    let oops = ... 
    in ...
       file_close fd;
       oops             (* that's the return value for non-ML programmers *)

So?

>
>> And no -- the FP way is not "more prone to bugs"
>

> Unless you use a handle in a free floating way and find later that in
> production your code was called in a loop causing handles to pile up?

I didn't do that, so I can't defend against it. Perhaps your friend
simply is not the Java software engineer he wants to be. I never did
have that problem ... -- so can I now stop defending my approach to
programming against errors other people have committed in another
language because they didn't read the JDBC reference manual?

> I have the practical example (already described) that this way of
> thinking can lead to failures.

If you think you never need to use a file handle in a free-floating
way, you _can_ wrap it either in a with_file_do wrapper or in a scoped
type. But given e.g. the question of how to build other primitives like
open_server_connection() from file handles, I doubt it's a winning
situation to do that from the beginning.

And BTW: excluding human error is only possible to a certain extent. I
see Controlled and scoped lifetimes only as a tool to structure programs
in an understandable way, but on no account as a way to enforce
programming _style_. Quality is better served by (a) reviews and (b)
coaching structures within larger teams. Both probably would have
caught your friend's mistake.

That said, it probably would be a good idea to flag objects as
containing external resources with additional resource limits and
generate a compiler warning if the user doesn't deinitialize them
within the scope or doesn't return them.


> The programmer wanted to use a database
> cursor in a free floating way. That was fine. Later his code was used
> in a loop. 

Reusing code in another context without reviewing it was what cost
ESA that famous Ariane 5 launch.

> Ah, yes - his code was used in a loop written by another
> programmer, so his judgement about whether it's OK to use anything in
> a free floating way was misguided from the very beginning.

Reviews, reviews, reviews. Understand what you use!

>> and as with
>> George Bauhaus I simply refuse this kind of discussion (FUD and
>> ContraFUD).
>
> OK. We will just stay unconvinced. :-)

:-) I think so. We just don't have a common basis to slug it out and
come to a rational decision. Not surprising, given that Ada and C++ are
still around AND the GCed languages are also living.

>
>>> BTW - please show me an example involving 10 objects of different kinds. :-)
>> All at the same time?
>
> Yes.
>
>> Well -- bad programming.
>
> I knew you would answer this. :-)

let foozle a b c = with_thingy1 a (frobnicate b c);;

let foobar x y   = with_thingy2 y x (defrobnicate (foozle x y));;

... 


let do_it_now = with_thingy10 "/etc/passwd"
                   (bla "thing1" 12 13)
                   (baz "thing2" (troon "thing3" 123 123) unfroth)
;;


I think you get the drift: It depends on the structure of the problem ...

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-07  9:30                                           ` GC in Ada Martin Krischik
@ 2007-02-07 11:08                                             ` Markus E Leypold
  2007-02-07 11:15                                             ` Maciej Sobczak
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-07 11:08 UTC (permalink / raw)



Martin Krischik <krischik@users.sourceforge.net> writes:

> Maciej Sobczak schrieb:
>
>> Yes. Note that scoped lifetime does not preclude GC on some lower level.
>> Scoped lifetime provides a hook for deterministic "good bye" action
>> -
>> there is nothing more to it. Even if that "good bye" action calls
>> free/delete/whatever on some memory block, there is nothing that
>> forces the runtime to return the given block of memory right back to
>> the operating system. Actually, none of the self-respecting
>> allocators do this systematically - instead they keep the memory
>> around for a while in anticipation of future allocations.
>
> I believe in most systems memory is never returned.
>
> If I understood Unix file management right only memory at the end of
> the heap can be returned. Without compaction a no go.

You're partly right. It depends on the heap implementation. If done
using sbrk() you're right. If done using mmap() you'd have --
theoretically -- the possibility to return sufficiently large holes to
the OS. 
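A rough illustration using Python's mmap wrapper: anonymous mappings are
independent regions, so any one of them can be handed back to the OS on
close regardless of its position, whereas an sbrk()-style heap can only
ever shrink from the top.

```python
import mmap

# Three independent anonymous mappings (each one page or more).
regions = [mmap.mmap(-1, 4096) for _ in range(3)]

# With mmap-backed allocation the *middle* region can be returned to
# the OS immediately; an sbrk() heap could only release the topmost one.
regions[1].close()
assert regions[1].closed
assert not regions[0].closed and not regions[2].closed

for r in regions:          # tidy up the rest
    if not r.closed:
        r.close()
```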

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-07  9:30                                           ` GC in Ada Martin Krischik
  2007-02-07 11:08                                             ` Markus E Leypold
@ 2007-02-07 11:15                                             ` Maciej Sobczak
  2007-02-07 11:53                                               ` Martin Krischik
  2007-02-07 12:19                                               ` Markus E Leypold
  1 sibling, 2 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-07 11:15 UTC (permalink / raw)


Martin Krischik wrote:

>> What I'm against is a GC "paradigm" that prevents me from having 
>> deterministic "good bye" hooks for scoped lifetime. The problem is 
>> that most GC-oriented languages I'm aware of do have this "issue".
> 
> But isn't that exactly what "Unchecked_Deallocation" and "pragma 
> Controlled" is all about? Has Ada - by your rationale - not got GC right?

By my rationale Ada and C++ got it perfectly right ([Limited_]Controlled 
mess aside).

The only difference between them in this regard is that Ada explicitly 
allows GC on the low level without requiring it (so that implementations 
can ignore the whole idea) and that C++ is traditionally silent about 
the concept altogether (so that implementations can provide it). ;-)

(Note that GC will likely be formalized in the upcoming C++ standard.)

My criticism is targeted at those languages which bring GC to the top 
level obstructing the visible part of the object model.

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-07  7:44                                           ` Ray Blaak
  2007-02-07  8:54                                             ` Georg Bauhaus
@ 2007-02-07 11:17                                             ` Markus E Leypold
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-07 11:17 UTC (permalink / raw)


Ray Blaak <rAYblaaK@STRIPCAPStelus.net> writes:

> Markus E Leypold <development-2006-8ecbb5cc8aREMOVETHIS@ANDTHATm-e-leypold.de> writes:
>> That's why you MUST use Ocaml :-). No, joke, I do not know what is the
>> important thing to you in Lisp. OCaml has no macros etc. But the
>> typing makes things vastly more manageable.
>
> I think what I prefer about Lisp (and Scheme) vs the ML style languages is
> that Lisp is a little more laid back. The syntax is cleaner, and the whole FP
> obsession thing is optional.
>
> The whole obsession in the syntax with just how functions map from inputs to
> outputs just seem a little too serious, but maybe I am confusing things with
> ML. I prefer to relax with the currying in declarations and just show simple

Ah, I understand. In OCaml 'let f a b c = ...' defines a curried
function, so the "function ... -> function ... -> function" noise is
greatly diminished even if you want curried functions.
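For readers without an ML background: what OCaml's 'let f a b c = ...'
hides can be spelled out with nested closures. A Python sketch of the
same currying:

```python
# OCaml's  let f a b c = a + b + c  is sugar for nested one-argument
# functions; Python has to write the nesting explicitly.
def f(a):
    return lambda b: lambda c: a + b + c

add_10 = f(10)            # partial application falls out for free
assert f(1)(2)(3) == 6
assert add_10(20)(30) == 60
```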

> function signatures instead. Of course in Lisp, one just passes the lambas
> around and lets the runtime worry about the mismatches.

> The funny thing is that the whole dynamic thing just doesn't seem to fail as
> badly as the static typing purists would have us believe. Now I do want my

No, certainly not. I've even been heard to say "Python is not such a bad
language" sometimes. It's basically a testing thing, and if you build
slowly enough (not too many new _library_ components) or do really
good code reviews and documentation, it doesn't seem to make too much
difference for systems that are not too large (I don't know about really
large systems since I'm missing the opportunity to compare -- one really
large system is written in Lisp: Emacs :-).

> strong static typing, espcially for parameter mismatches on function calls,
> but I find it interesting that significant software can get done just fine in
> Lisp.

Lisp is not such a bad language. And the List support is excellent :-).

> I have always had OCaml on my list to dig into more. Maybe I should actually
> get around to it.
 
Do it! The infrastructure around it (bytecode + native compilers,
ability to build interactive top levels, good library bindings etc.)
is also something not to be missed. It has some downsides for the
purist or if you're looking for specific things: strings are always
mutable (this has been criticized but will not/never be changed), which
is a source of some errors, and there are no subrange types.

But overall: Recommended.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-07  8:54                                             ` Georg Bauhaus
@ 2007-02-07 11:19                                               ` Markus E Leypold
  2007-02-07 23:32                                                 ` Georg Bauhaus
  2007-02-07 19:01                                               ` Ray Blaak
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-07 11:19 UTC (permalink / raw)



Georg Bauhaus <bauhaus@futureapps.de> writes:

> On Wed, 2007-02-07 at 07:44 +0000, Ray Blaak wrote:
>
>> The funny thing is that the whole dynamic thing just doesn't seem to fail as
>> badly as the static typing purists would have us believe. Now I do want my
>> strong static typing, espcially for parameter mismatches on function calls,
>> but I find it interesting that significant software can get done just fine in
>> Lisp.
>> 
>> I have always had OCaml on my list to dig into more. Maybe I should actually
>> get around to it.
>

> If you do this, perhaps you can find the time to use a, uh, stop-watch
> to measure the time it takes to produce, read, and change the programs?
> I.e., to technically manage significant software. The times will be
> important input for deciding whether or not static typing does indeed
> help in production.
> The only thing that relates to economic efficiency, if you want it :-)

So how long a beginner in a specific language takes to do
something is "important input for deciding ..." if compared to what,
exactly?

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-07 11:15                                             ` Maciej Sobczak
@ 2007-02-07 11:53                                               ` Martin Krischik
  2007-02-07 12:22                                                 ` Markus E Leypold
                                                                   ` (2 more replies)
  2007-02-07 12:19                                               ` Markus E Leypold
  1 sibling, 3 replies; 397+ messages in thread
From: Martin Krischik @ 2007-02-07 11:53 UTC (permalink / raw)


Maciej Sobczak schrieb:
> Martin Krischik wrote:
> 
>>> What I'm against is a GC "paradigm" that prevents me from having 
>>> deterministic "good bye" hooks for scoped lifetime. The problem is 
>>> that most GC-oriented languages I'm aware of do have this "issue".
>>
>> But isn't that exactly what "Unchecked_Deallocation" and "pragma 
>> Controlled" is all about? Has Ada - by your rationale - not got GC right?
> 
> By my rationale Ada and C++ got it perfectly right ([Limited_]Controlled 
> mess aside).
> 
> The only difference between them in this regard is that Ada explicitly 
> allows GC on the low level without requiring it (so that implementations 
> can ignore the whole idea) and that C++ is traditionally silent about 
> the concept altogether (so that implementations can provide it). ;-)

Only that C++ does not have pragma Controlled to switch the collector 
off. And Unchecked_Deallocation should deallocate even when a collector 
is present.

> (Note that GC will likely be formalized in the upcoming C++ standard.)

Which could solve the above.

> My criticism is targeted at those languages which bring GC to the top 
> level obstructing the visible part of the object model.

On my Weblogic course I could not stop shaking my head about all the 
problems which brings the "all is pointer" concept of Java.

Martin



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-07 11:15                                             ` Maciej Sobczak
  2007-02-07 11:53                                               ` Martin Krischik
@ 2007-02-07 12:19                                               ` Markus E Leypold
  2007-02-08  7:54                                                 ` Maciej Sobczak
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-07 12:19 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> Martin Krischik wrote:
>
>>> What I'm against is a GC "paradigm" that prevents me from having
>>> deterministic "good bye" hooks for scoped lifetime. The problem is
>>> that most GC-oriented languages I'm aware of do have this "issue".
>> But isn't that exactly what "Unchecked_Deallocation" and "pragma
>> Controlled" is all about? Has Ada - by your rationale - not got GC
>> right?
>
> By my rationale Ada and C++ got it perfectly right
> ([Limited_]Controlled mess aside).
>
> The only difference between them in this regard is that Ada explicitly
> allows GC on the low level without requiring it (so that
> implementations can ignore the whole idea) and that C++ is
> traditionally silent about the concept altogether (so that
> implementations can provide it). ;-)
>
> (Note that GC will likely be formalized in the upcoming C++ standard.)
>
> My criticism is targeted at those languages which bring GC to the top
> level obstructing the visible part of the object model.

You mean like Smalltalk -- a language which carries the label OO
wrongly, because its GC obstructs the object model.

Hm.

Actually I think real OO (as opposed to tagged types) needs GC and
unbounded lifetime of objects. That's indeed all OO is about.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-07 11:53                                               ` Martin Krischik
@ 2007-02-07 12:22                                                 ` Markus E Leypold
  2007-02-08  7:26                                                   ` Martin Krischik
  2007-02-08  7:48                                                 ` Maciej Sobczak
  2007-02-08 18:38                                                 ` Dmitry A. Kazakov
  2 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-07 12:22 UTC (permalink / raw)



Martin Krischik <krischik@users.sourceforge.net> writes:

> Maciej Sobczak schrieb:
>> Martin Krischik wrote:
>>
>>>> What I'm against is a GC "paradigm" that prevents me from having
>>>> deterministic "good bye" hooks for scoped lifetime. The problem is
>>>> that most GC-oriented languages I'm aware of do have this "issue".
>>>
>>> But isn't that exactly what "Unchecked_Deallocation" and "pragma
>>> Controlled" is all about? Has Ada - by your rationale - not got GC
>>> right?
>> By my rationale Ada and C++ got it perfectly right
>> ([Limited_]Controlled mess aside).
>> The only difference between them in this regard is that Ada
>> explicitly allows GC on the low level without requiring it (so that
>> implementations can ignore the whole idea) and that C++ is
>> traditionally silent about the concept altogether (so that
>> implementations can provide it). ;-)
>
> Only that C++ does not have pragma Controlled to switch the collector
> off. And Unchecked_Deallocation should deallocate even when a
> collector is present.
>
>> (Note that GC will likely be formalized in the upcoming C++ standard.)
>
> Which could solve the above.
>
>> My criticism is targeted at those languages which bring GC to the
>> top level obstructing the visible part of the object model.
>
> On my Weblogic course I could not stop shaking my head about all the
> problems which brings the "all is pointer" concept of Java.

Fortunately you can ignore this "all is pointer" by exposing only a
read-only interface to the client and leaving the rest to the GC. That
feels exactly like passing records around with in, out and in/out.  I
don't see the problem. Not to doubt your experience on this, but just
because I'm curious: Can you provide a hint or example what the
problems are?

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-07  8:54                                             ` Georg Bauhaus
  2007-02-07 11:19                                               ` Markus E Leypold
@ 2007-02-07 19:01                                               ` Ray Blaak
  1 sibling, 0 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-07 19:01 UTC (permalink / raw)


Georg Bauhaus <bauhaus@futureapps.de> writes:
> If you do this, perhaps you can find the time to use a, uh, stop-watch
> to measure the time it takes to produce, read, and change the programs?
> I.e., to technically manage significant software. The times will be
> important input for deciding whether or not static typing does indeed
> help in production.

No, I am a strong believer in strong static typing and I have no interest in
abandoning it.

C/C++ failures are clear, pretty much because they can easily have "no typing"
and low level memory corruption.

No, my point was merely the observation that strongly dynamically typed
systems like Lisp don't seem to fail that badly, due to the robustness
of the VM, most likely, and the flexibilities they are allowed to bring
to bear.

It was merely out loud pondering, which always seems to get me in trouble.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-07 10:56                                           ` Markus E Leypold
@ 2007-02-07 22:58                                             ` Georg Bauhaus
  2007-02-08  9:04                                             ` Maciej Sobczak
  1 sibling, 0 replies; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-07 22:58 UTC (permalink / raw)


On Wed, 2007-02-07 at 11:56 +0100, Markus E Leypold wrote:
> Maciej Sobczak <no.spam@no.spam.com> writes:

> >> and as with
> >> George Bauhaus I simply refuse this kind of discussion (FUD and
> >> ContraFUD).
> >
> > OK. We will just stay unconvinced. :-)
> 
> :-) I think so. We just don't have a common basis to slug it out and
> come to a rational decision. Not surprising, give that Ada and C++ are
> still around AND the GCed languages are also living

My favorite Ada compiler produces GCing programs.


  -- Georg 






^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-07 11:19                                               ` Markus E Leypold
@ 2007-02-07 23:32                                                 ` Georg Bauhaus
  2007-02-08  8:49                                                   ` Markus E Leypold
  2007-02-08  9:24                                                   ` Markus E Leypold
  0 siblings, 2 replies; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-07 23:32 UTC (permalink / raw)


On Wed, 2007-02-07 at 12:19 +0100, Markus E Leypold wrote:
> Georg Bauhaus <bauhaus@futureapps.de> writes:
> 

> > If you do this, perhaps you can find the time to use a, uh, stop-watch
> > to measure the time it takes to produce, read, and change the programs?
> > I.e., to technically manage significant software. The times will be
> > important input for deciding whether or not static typing does indeed
> > help in production.
> > The only thing that relates to economic efficiency, if you want it :-)
> 
> So the how long a beginner in a specific language takes to do
> something is "important input for deciding ..." if compared to what,
> exactly?

I don't think Ray is a beginner in functional programming.

Now before I try singing old wisdom from the neighbourhood of
The Mythical Man Month ...
How will a piece of source code, your own or someone else's, and
a language with tools fit these and the market as we might think
it is now?

project size,
project duration,
forced language,
availability of programmers (need to hire?),
job changes,
programmer skills,
multiple projects at the same time,
language switching e.g. between projects every other week,
time needed for education,
documentation needs,
future modification done by others,
future modification to be done by the author,
...


A rich set of opportunities for collecting some input, if only
in order to convince ourselves that some things might have changed
in software production (due to language changes?) while others
haven't changed --for example we might get carried away by a
subjective estimate of the usability of our preferred language.
In accord with the findings of Leon Festinger (1957).


 -- Georg 





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-07 12:22                                                 ` Markus E Leypold
@ 2007-02-08  7:26                                                   ` Martin Krischik
  2007-02-08  9:33                                                     ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Martin Krischik @ 2007-02-08  7:26 UTC (permalink / raw)


Markus E Leypold schrieb:

> Martin Krischik <krischik@users.sourceforge.net> writes:

>> Maciej Sobczak schrieb:

>> On my Weblogic course I could not stop shaking my head about all the
>> problems which brings the "all is pointer" concept of Java.

> Fortunately you can ignore this "all is pointer" by exposing only a
> read-only interface to the client and leaving the rest to the GC

Only, Java has no const keyword. It is supposed to get one, but how
long until it is actually used?

> That
> feels exactly like passing records around with in, out and in/out.  I
> don't see the problem. Not to doubt your experience on this, but just
> because I'm curious: Can you provide a hint or example what the
> problems are?

I can give you the solution, which is not used all that widely because 
of the performance impact:

class X
   {
   // java.util.Date has no copy constructor, so copy via the epoch millis.
   Date date = new Date ();

   Date
   get_Date ()
      {
      return new Date (date.getTime ());
      }

   void
   set_Date (Date new_Date)
      {
      date = new Date (new_Date.getTime ());
      }
   }

Got it? If get_Date just returned date, it would return a modifiable 
pointer - which could be used and - well - modified almost everywhere.

As said, the solution above is not used all that often, and so the 
interesting part in the WebLogic course was that the hard-core Java 
programmers had to learn that for remote calls those objects might 
be copied behind the scenes and not passed by (non-const) reference. And 
so modifications to those objects might be lost!

This was the moment where a whole horror scenario unfolded before me: 
programmers who actually modified objects returned by a getter function 
instead of using the appropriate setter function!
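The same aliasing trap is reproducible in any language with pointer
semantics for objects. A hypothetical Python sketch of both the getter
leak and the defensive-copy fix (class and field names are mine):

```python
class Leaky:
    def __init__(self):
        self._date = [2007, 2, 8]       # stand-in for a mutable Date

    def get_date(self):
        return self._date               # leaks a modifiable reference!

class Defensive:
    def __init__(self):
        self._date = [2007, 2, 8]

    def get_date(self):
        return list(self._date)         # defensive copy on the way out

    def set_date(self, new_date):
        self._date = list(new_date)     # ...and on the way in

leaky = Leaky()
leaky.get_date()[0] = 1999              # silently corrupts internal state
assert leaky._date[0] == 1999

safe = Defensive()
safe.get_date()[0] = 1999               # mutates only the copy
assert safe._date[0] == 2007
```

The performance cost of the copies is exactly the trade-off described
above.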

Martin



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-07 11:53                                               ` Martin Krischik
  2007-02-07 12:22                                                 ` Markus E Leypold
@ 2007-02-08  7:48                                                 ` Maciej Sobczak
  2007-02-08  8:20                                                   ` Martin Krischik
                                                                     ` (2 more replies)
  2007-02-08 18:38                                                 ` Dmitry A. Kazakov
  2 siblings, 3 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-08  7:48 UTC (permalink / raw)


Martin Krischik wrote:

> And Unchecked_Deallocation should deallocate even when a collector 
> is present.

I don't understand. There is no legal way for the program to verify that 
anything was indeed deallocated, so it doesn't make much sense to say 
that this behaviour is required.

As far as I understand it, Unchecked_Deallocation is allowed to do 
nothing. That wouldn't be a very competitive language implementation, 
but AARM does not require it either. :-)

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-07 12:19                                               ` Markus E Leypold
@ 2007-02-08  7:54                                                 ` Maciej Sobczak
  2007-02-08  9:49                                                   ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-08  7:54 UTC (permalink / raw)


Markus E Leypold wrote:

> Actually I think, real OO (as opposed to tagged types) need GC and
> unbounded life time of objects. That's indeed all OO is about.

That is quite a novel definition of OO. Any references?

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-07  9:42                                             ` Markus E Leypold
@ 2007-02-08  8:10                                               ` Maciej Sobczak
  0 siblings, 0 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-08  8:10 UTC (permalink / raw)


Markus E Leypold wrote:

>> If you have cycles, then you'd better rethink the design.

> A lot of OO models with callback also require cycles. E.g. A knows B
> and B registers a callback with A in certain situations.

That's not an ownership cycle.
I have no problems with callbacks. Interestingly, "a lot of OO models" 
(in particular those which depend on callbacks) don't even require 
dynamic allocation, so I don't see what this argument has to do with GC.

>> I agree that this kind of restructuring is not always possible, but
>                                             ^^^^^^^^^^^^^^^^^^^^^
> 
> !!.

??

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08  7:48                                                 ` Maciej Sobczak
@ 2007-02-08  8:20                                                   ` Martin Krischik
  2007-02-08  8:43                                                   ` Markus E Leypold
  2007-02-08 18:24                                                   ` Jeffrey R. Carter
  2 siblings, 0 replies; 397+ messages in thread
From: Martin Krischik @ 2007-02-08  8:20 UTC (permalink / raw)


Maciej Sobczak wrote:

> Martin Krischik wrote:

>> And Unchecked_Deallocation should deallocate even when a collector is 
>> present.

> I don't understand. There is no legal way for the program to verify that 
> anything was indeed deallocated, so it doesn't make much sense to say 
> that this behaviour is required.

should /= must

> As far as I understand it, Unchecked_Deallocation is allowed to do 
> nothing. That wouldn't be a very competitive language implementation, 
> but AARM does not require it either. :-)

I just looked it up [1], and Unchecked_Deallocation may not be a no-op - 
you forgot finalization. Apart from that:

---------------------
Implementation Advice

For a standard storage pool, Free should actually reclaim the storage.
---------------------

Again, "should /= must" and - you are right - there is no statement as 
to when the reclamation should happen.

Martin

[1] http://www.adaic.com/standards/05rm/html/RM-13-11-2.html



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08  7:48                                                 ` Maciej Sobczak
  2007-02-08  8:20                                                   ` Martin Krischik
@ 2007-02-08  8:43                                                   ` Markus E Leypold
  2007-02-09 14:20                                                     ` Maciej Sobczak
  2007-02-08 18:24                                                   ` Jeffrey R. Carter
  2 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-08  8:43 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> Martin Krischik wrote:
>
>> And Unchecked_Deallocation should deallocate even when a collector
>> is present.
>
> I don't understand. There is no legal way for the program to verify
> that anything was indeed deallocated, so it doesn't make much sense to
> say that this behaviour is required.

Oh yes. Deallocating immediately and deallocating later make a
difference in time and space behaviour -- which IS measurable outside
the program (BTW: exactly what you've been harping on in your
opposition to GC :-))

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-07 23:32                                                 ` Georg Bauhaus
@ 2007-02-08  8:49                                                   ` Markus E Leypold
  2007-02-09 14:09                                                     ` Georg Bauhaus
  2007-02-08  9:24                                                   ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-08  8:49 UTC (permalink / raw)



Georg Bauhaus <bauhaus@arcor.de> writes:

> On Wed, 2007-02-07 at 12:19 +0100, Markus E Leypold wrote:
>> Georg Bauhaus <bauhaus@futureapps.de> writes:
>> 
>
>> > If you do this, perhaps you can find the time to use a, uh, stop-watch
>> > to measure the time it takes to produce, read, and change the programs?
>> > I.e., to technically manage significant software. The times will be
>> > important input for deciding whether or not static typing does indeed
>> > help in production.
>> > The only thing that relates to economic efficiency, if you want it :-)
>> 
>> So the how long a beginner in a specific language takes to do
>> something is "important input for deciding ..." if compared to what,
>> exactly?
>
> I don't think Ray is a beginner in functional programming.

>> So the how long a beginner in a specific language takes to do
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

I should think I have been quite clear on that. 

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-07 10:56                                           ` Markus E Leypold
  2007-02-07 22:58                                             ` Georg Bauhaus
@ 2007-02-08  9:04                                             ` Maciej Sobczak
  2007-02-08 10:01                                               ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-08  9:04 UTC (permalink / raw)


Markus E Leypold wrote:

I will cut ad-persona ramblings and your suggestions about my 
competence, and try to restate major points.

>> I prefer solutions which enable me to hide implementation details from
>> clients (sotware engineering?). If clients have to treat different
> 
> I fail to see how using no GC helps you in that, whereas using GC as
> it is today hinders you.

You probably didn't read my previous posts in detail.
I claimed that mainstream GCed languages put so much faith in the GC 
concept that they drop scoped lifetime altogether. *This* hinders me 
from solving my problems in a uniform way.
I'm not happier with "no GC" and having GC is irrelevant to my problems.
What is relevant to my problems is scoped lifetime and this is missing 
from mainstream GCed languages. Got the point?

>> So why people claim that GC-oriented languages are general-purpose?
> 
> Because they can write all kinds of programs with them?

Assembly language has this property as well. Actually, computational 
completeness can be achieved with two instructions only, but I don't 
consider it as a valid argument for being general-purpose.

>> The point is that languages which are build
>> around GC tend to drop the proper support for other types of resources
>> altogether. 
> 
> As if manual managment could be called proper support in this respect ...

Then I will restate it again: the world is not partitioned between GC 
and manual management. As I've already pointed out, you will not find 
manual management of anything in my regular code, and since I'm also not 
using GC, your view must be incomplete.

// warning: this is not Ada
{
     Session sql(oracle, "some-params");
     string name;
     sql << "select name from persons where id=7", into(name);
     cout << name << '\n';
}

Do you see *any* manual management of anything in the above? I stare at 
this code and I don't. Do you see GC here? I don't.
Then please stop building your arguments on the presumption that only 
these two options are possible.

Do you require from the user the understanding of implementation details 
of any of the above constructs? I don't.
We (software engineers) call it encapsulation.

> Big words, but given the choice between
> 
>  1) Manual ressource management
>  2) GC with (according to you) imperfect support for other kinds of ressources
>  3) The hypothetical system you propose should exist

The point is that you are not given only these choices.


> Heap(s) -- I even resent the plural here -- are not about scoped but
> about indeterminate lifetime. And a "compacted" heap has all the
> unpredictability (in timing) as that of a garbage collector and would
> provide garbage collection almost for free. 

Yes. My point is that compacting abilities are not restricted to GC.

> People don't know how GC works,
> don't know what it does

It does not matter. What does matter is that not having scoped lifetime 
doesn't work. If you want to sell GC without scoped lifetime, I won't 
buy it.

>>> Whereas manual
>>> allocation and deallocation as in Ada or C will fragment the heap and
>>> you have NO guarantee (only statistics) about the ratio of allocated
>>> (used) memory and presently unusable hole.
>> If that bothers you, then use non-fragmenting allocators.
> 
> There are, as I can see, no non-fragmenting (heap) allocators for
> unpredictable allocation patterns.

There are. Write on a piece of paper all the reasons that in conjunction 
lead to heap fragmentation and think how to get rid of at least one of 
them. That's actually easy.

> Apart from the fact that there are no nonfragmenting
> allocators being shipped with, e.g. Ada. So do it yourself. Wow: Avoid
> compacting GC, get more work, do it by hand. You can see why that
> doesn't attract me.

You see, the point is that in Ada or C++ I *can* and *know how* to 
improve anything I want based on the foundations that these languages 
give me (even if that means work), whereas in languages that have no 
support for scoped lifetime there is simply no foundation on which any 
improvements might be built on. Wow: Prevent people from achieving their 
goals, so they have less to do. You can see why that doesn't attract me.

>> Note that scoped lifetime does not preclude GC on some lower level.
> 
> You admit it, finally?

Admit what? Scoped lifetime does not preclude GC on some lower level. 
The lack of scoped lifetime sucks, no matter whether GC is provided or not.

> Unfortunately GC is no implementation detail, since you see wether
> there a free() or dispose() calls in the source.

You admit it, finally?

Then look again at the snippet I provided above. Do you see *any* free() 
or dispose() calls in the source? I don't. We call it encapsulation.

> Yes, but smart pointers where YOUR better answer to GC.

No. My answer is scoped lifetime. Look again at the snippet above; there 
are no smart pointers in sight.

Smart pointers are useful as a solution for resource management built on 
top of scoped lifetime in cases where the resource is not tied to a 
single scope. I use them, but *very* rarely.

> You were
> dissatisfied with GC.

I'm dissatisfied with languages that provide GC *instead of* scoped 
lifetime. The focus should be put on "in addition to", not "instead of".

> File handles are in my view not
> supposed to be handled by GC since they are structurally and
> semantically different from memory. They should be closed explicitly.

Bingo. But do you explicitly operate on all file handles that your 
program manages? Then you must be using some extremely low-level language.
Consider a class ConfigurationData (or whatever). Do you have to 
explicitly dispose of it? Hm... That depends on what ConfigurationData 
has *inside*. If it's bound to some file handle (say, using a simple 
persistency scheme based on the filesystem), then yes.
This means that your reasoning has to be transitively closed over the 
relations between the implementation details of all dependent entities.
Do you call it encapsulation?
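A minimal Java sketch of that transitivity problem (ConfigurationData 
is the name from the discussion; everything else here is a hypothetical 
assumption): whether the client must dispose of the object depends 
entirely on what the class holds inside, so a defensive client ends up 
writing cleanup code "just in case".

```java
// Version 1.0: in-memory backend - nothing to release, close() is a no-op.
// Version 2.0: file backend - now the client suddenly must call close(),
// although the published interface never changed.
class ConfigurationData implements AutoCloseable {
    private final java.util.Map<String, String> cache = new java.util.HashMap<>();
    private java.io.Reader backingFile;  // null in the in-memory version

    String get(String key) { return cache.get(key); }

    @Override
    public void close() throws java.io.IOException {
        if (backingFile != null) {
            backingFile.close();  // only needed once a file sneaked in
        }
    }
}

public class Demo {
    public static void main(String[] args) throws Exception {
        // A defensive client has to write this even for the in-memory
        // version, "just in case" some release adds a file handle:
        try (ConfigurationData cfg = new ConfigurationData()) {
            System.out.println(cfg.get("missing") == null);  // true
        }
    }
}
```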

Consider another complication. A team that is responsible for 
ConfigurationData ships the 1.0 version of the package with in-memory 
database as a backend. They specify the interface.
Some time later they ship 2.0 with files as backends. The specs didn't 
change (wow! that's what encapsulation is supposed to look like, no?), 
so according to software engineering practices you are not obliged to 
change anything in your code. Oops. Where's encapsulation?

> No. You just need to stick to the rules. Close your ***** filehandles
> manually. All the time.

Thank you for clearly describing this paradigm. No, I'm not convinced.

> If you want to be smart, though, it pays to think about the
> interaction with underlying system (the implementation).

I don't want to be smart. I want to be productive and have less bugs in 
my code. I noticed that scoped lifetime allows me to achieve this more 
effectively than without it.

> I didn't do that, so I can't defend against it. Perhaps your friend
> simply is not the Java software engineer he wants to be. I never did
> have that problem ... -- so can I stop now defending my approach to
> programming against errors other people have committed in another
> language because they didn't read the JDBC reference manual?

Should the user of ConfigurationData read the JDBC manual? Should he 
ever need to be aware that JDBC is used *somewhere there* deep in the 
bowels of ConfigurationData?

> Reusing code in another context without re-viewing it, was what cost
> the ESA that famous Ariane 5 launch.

Good point, but where's encapsulation here? What about decoupling 
implementation from specification?

> Reviews, reviews, reviews. Understand what you use!

Then maybe we should drop this decoupling altogether?

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-07 23:32                                                 ` Georg Bauhaus
  2007-02-08  8:49                                                   ` Markus E Leypold
@ 2007-02-08  9:24                                                   ` Markus E Leypold
  2007-02-09 15:08                                                     ` Georg Bauhaus
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-08  9:24 UTC (permalink / raw)



Georg Bauhaus <bauhaus@arcor.de> writes:

> On Wed, 2007-02-07 at 12:19 +0100, Markus E Leypold wrote:
>> Georg Bauhaus <bauhaus@futureapps.de> writes:
>> 
>
>> > If you do this, perhaps you can find the time to use a, uh, stop-watch
>> > to measure the time it takes to produce, read, and change the programs?
>> > I.e., to technically manage significant software. The times will be
>> > important input for deciding whether or not static typing does indeed
>> > help in production.
>> > The only thing that relates to economic efficiency, if you want it :-)
>> 
>> So the how long a beginner in a specific language takes to do
>> something is "important input for deciding ..." if compared to what,
>> exactly?

<snipped>

> Now before I try singing old wisdom from the neighbourhood of
> The Mythical Man Month ...
> How will a piece of source code, your own or someone else's, and
> a language with tools fit these and the market as we might think
> it is now?

I find myself a bit confused on what you're talking about now --
exactly against what do you propose that I defend myself? Since Ray
Blaak expressed past interest in OCaml I had the cheek to encourage
him to really go and have a deeper look into OCaml when he finds
time. 

That was it. I certainly miss, how all that here comes in:

> project size,
> project duration,
> forced language,
> availability of programmers (need to hire?),
> job changes,
> programmer skills,
> multiple projects at the same time,
> language switching e.g. between projects every other week,
> time needed for education,
> documentation needs,
> future modification done by others,
> future modification to be done by the author,
> ...

But let me give a real short answer to your request as it stands:

> How will a piece of source code, your own or someone else's, and
> a language with tools fit these and the market as we might think
> it is now?

It will depend on my ability, or someone else's, to code in this
unnamed language with arbitrary tools. Note that I doubt that "we
think" the same about the market and what counts -- finally -- is how
the market is, not how we think about it, which might be a different
thing altogether.

Does that answer satisfy you? No? I thought so. You have to ask better
questions then.

> A rich set of opportunities for collecting some input, if only

Questions as those rather wide ranging ones you just posed above are
not "opportunities for collecting some input". They just stay
questions. I fail to see the opportunity. Even if R.B. used the
suggested "stop watch" (and we all know how little the time alone will
tell you), I hope you don't try to draw conclusions on all those topics:


> project size,
> project duration,
> forced language,
> availability of programmers (need to hire?),
> job changes,
> programmer skills,
> multiple projects at the same time,
> language switching e.g. between projects every other week,
> time needed for education,
> documentation needs,
> future modification done by others,
> future modification to be done by the author,
> ...

> in order to convince ourselves that some things might have changed
> in software production (due to language changes?) while others

Well, things have changed, but you're probably missing how.

> haven't changed --for example we might get carried away by a

Care not to use that all-inclusive sick-nurse style "we" here?  If you
get carried away: Your problem. If you insist I did, then say so.

And note that I didn't give a specific factor of improvement of
productivity. Brooks excludes "magical" improvements for large
projects in orders of magnitude, not

  (a) for small projects (in large projects it's not the
      implementation that dominates)

  (b) for reasonable factors.

In small day-to-day development I don't think a factor of

> subjective estimate of the usability of our preferred language.

> In accord with the findings of Leon Festinger (1957).

Good. That might be. How is it then that C is rumoured to be so much
less productive than Ada?

Bah!

Georg, you're not trying to find out things, but to win. At, I might
say, any price. 

As I already said: Just ignore what I say and we'll be fine. I'm not
conducting paid scientific research here on language usability and
I've no interest in making converts. Just say that what I say cannot
be proven, you refuse to believe anything of it and let it be
done. _I_ do not see any way to help you here, esp. since above you
just sketched a many man year research program as the things you need
to get answered. How do you propose I answer

> How will a piece of source code, your own or someone else's, and
> a language with tools fit these and the market as we might think
> it is now? (...)

Say "They fit well"? Would I be done then? Or would you call me
deluded (not with that word of course) and we'd argue over what I say
until I bring sworn affidavits? You know, I'm already convinced
(though I doubt you exactly grasped of what), so I don't need to
conduct any experiments or research in the directions _you_
require. You're not convinced and IMO unconvincable, but as I said:
YMMV and I'll leave you your point of view and practice of
programming. Since I don't work for or with you I don't have to find a
common style with you. And for the rest: Well -- I just don't have the
time. As I already said: I'm not even sure I know what you want.

I reserve the right, though, to give some perhaps not well received
or well liked answers to questions like "Why is Ada not more popular",
or to oppose some views on GC or type inference. Oppose, mind you. That
means some statement of dissenting opinion and experience (which
anybody can ignore at their will), but that doesn't mean I have to slug
it out. That is what scientific studies are for and they usually take
a bit more time than 30 minutes on usenet. I know a number of people
who'd happily accept some research grants on these topics, so if you
just happen to have some spare money and the need to answer

> project size,
> project duration,
> forced language,
> availability of programmers (need to hire?),
> job changes,
> programmer skills,
> multiple projects at the same time,
> language switching e.g. between projects every other week,
> time needed for education,
> documentation needs,
> future modification done by others,
> future modification to be done by the author,
> ...

is just so urgent and important to you, you might consider sponsoring them.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08  7:26                                                   ` Martin Krischik
@ 2007-02-08  9:33                                                     ` Markus E Leypold
  2007-02-09 13:37                                                       ` Martin Krischik
  2007-02-09 13:47                                                       ` Georg Bauhaus
  0 siblings, 2 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-08  9:33 UTC (permalink / raw)




Martin Krischik <krischik@users.sourceforge.net> writes:

> Markus E Leypold wrote:
>
>> Martin Krischik <krischik@users.sourceforge.net> writes:
>
>>> Maciej Sobczak schrieb:
>
>>> On my Weblogic course I could not stop shaking my head about all the
>>> problems which brings the "all is pointer" concept of Java.
>
>> Fortunately you can ignore this "all is pointer" by exposing only a
>> read-only interface to the client and leaving the rest to the GC

> Only Java has no const keyword. It is supposed to get one but how long
> until it is actually used?


Excuse me, but ... "a read only interface" means:

   (a) make all fields of objects private 
   (b) allow access only through methods
   (c) provide only Get_*-methods, no Set_*-methods

(a+b) is actually standard good practice in software engineering,
since it hides the representation and makes it possible to maintain
cached attributes/data (which direct access to fields won't allow).
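A minimal Java sketch of (a)-(c), with hypothetical names; the cached 
norm shows the kind of cached attribute that direct field access would 
rule out.

```java
// (a) all fields private, (b) access only through methods,
// (c) only getter-style accessors, no setters.
final class Point {
    private final int x;  // (a) private fields
    private final int y;
    private double cachedNorm = -1.0;  // lazily computed, hidden from clients

    Point(int x, int y) { this.x = x; this.y = y; }

    int getX() { return x; }  // (b, c) read access only
    int getY() { return y; }

    // Caching derived data stays possible behind the interface,
    // which direct field access by clients would rule out.
    double norm() {
        if (cachedNorm < 0) {
            cachedNorm = Math.sqrt((double) x * x + (double) y * y);
        }
        return cachedNorm;
    }
}

public class ReadOnlyDemo {
    public static void main(String[] args) {
        Point p = new Point(3, 4);
        // No setter exists; clients can observe p but not modify it.
        System.out.println(p.getX() + " " + p.getY() + " " + p.norm());  // 3 4 5.0
    }
}
```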

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08  7:54                                                 ` Maciej Sobczak
@ 2007-02-08  9:49                                                   ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-08  9:49 UTC (permalink / raw)



Maciej Sobczak <no.spam@no.spam.com> writes:

> Markus E Leypold wrote:

>> Actually I think, real OO (as opposed to tagged types) need GC and
>> unbounded life time of objects. That's indeed all OO is about.

> That is quite novel definition of OO. Any references?

Ian Graham: "Object-Oriented Methods: Principles & Practice". 

Footnotes and general comments there make that quite clear, I
think. (But I can't quote chapter and verse here, since I DON'T intend
to spend the rest of the morning researching literature.) I
remember that this was not the first source which noted that a delete
operator is quite contrary to the spirit of stateful OOP (as opposed
to the functional OO models proposed by Abadi and others (Cardelli?)).

And then of course there is me, myself, as a source :-). The reasoning
for the assertion above is basically that OO is about having
(projected) views of a total system. In no view can you say that an
object leaves the view -- you can only say that it becomes unimportant
for the view (goes into the kernel of the projection). And that is so
for all views -- whose sum only gives the complete system. Only when
viewing the whole system can you decide where to delete. Object
deletion is a global property/function. Rooting it in any subsystem
will destroy modularity in the OO sense (which is different from
modularity in the Modula-2 or Ada sense, which is derived from the
ideas of structured programming).

But note that I'm not really interested in discussing these
propositions. Either you profit from them or you don't -- I don't think
that it is easy to see what I mean, and it was hard for me to gain those
insights at the beginning (since there is really lots of bad
literature on OO (mostly the ad-hoc approach) and few usable formal or
semi-formal approaches). Discussing those on a serious level would take
lots of time. So I just have to assert my guru status here for lack of
more time :-).

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-08  9:04                                             ` Maciej Sobczak
@ 2007-02-08 10:01                                               ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-08 10:01 UTC (permalink / raw)




Maciej Sobczak <no.spam@no.spam.com> writes:

> Markus E Leypold wrote:
>
> I will cut ad-persona ramblings and your suggestions about my
> competence, and try to restate major points.

I actually cannot find those "ad-persona ramblings" in my post. I can
only imagine that you mean any of those 2 things:

 (a) You repeated popular misconceptions on GC (only statistical, does
     not compact, etc) and I called that FUD.

 (b) I asserted, that "We just don't have a common basis to slug it out and
     come to a rational decision."

Both are not very ad-hominem (indeed they aren't, since it's not an
insult to not have a common basis with me -- I might be a total crank
after all).

But since we are slowly drifting now into that meta-level of "you're
not arguing right, I accuse you of misbehaviour" (and yes, the
"ad-persona ramblings" remark annoys me somewhat), we may as well stop
the discussion here to prevent more damage.

After all: We just don't have a common basis to slug it out and this
is all totally OT in c.l.a. Finally I don't think that repetition will
help to make anything clearer any more.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-07  7:54                                           ` Maciej Sobczak
  2007-02-07  9:42                                             ` Markus E Leypold
@ 2007-02-08 18:14                                             ` Dmitry A. Kazakov
  2007-02-09  8:17                                               ` Maciej Sobczak
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-08 18:14 UTC (permalink / raw)


On Wed, 07 Feb 2007 08:54:58 +0100, Maciej Sobczak wrote:

> Dmitry A. Kazakov wrote:
> 
>> For handling cycles there are weak pointers.
> 
> Not only. If you have cycles, then you'd better rethink the design.
> The difference is between a) graph treated as a mesh (or mess) of nodes 
> which "own" each other and b) graph treated as a collection of nodes.
> The former might have ownership cycles between nodes, but not the 
> latter, where ownership is an acyclic relation between graph and nodes.
> I agree that this kind of restructuring is not always possible, but for 
> me it is conceptually cleaner and worth trying from the beginning.

I would say that in all such cases the references within the structure
should be non-controlled. The argumentation could go as follows:

A controlled reference (subject to GC) expresses the life-time relation
"not before me." [*] This relation is obviously transitive (due to the
nature of time). Ergo, there cannot be any cycles, by definition.

From this standpoint I would claim that a cycle of controlled references
manifests either a bug or a design problem. That a buggy program continues
to function as if there were no bug should hardly be counted as an
advantage of GC systems.

-----------------
*  One cannot say that the relation is "before or together with me,"
because nothing can happen simultaneously. Even if two [collection] actions A
and B could happen concurrently, then A and B were independent, thus there
could be no time constraint between them. 

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08  7:48                                                 ` Maciej Sobczak
  2007-02-08  8:20                                                   ` Martin Krischik
  2007-02-08  8:43                                                   ` Markus E Leypold
@ 2007-02-08 18:24                                                   ` Jeffrey R. Carter
  2007-02-09  8:57                                                     ` Jean-Pierre Rosen
  2 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-02-08 18:24 UTC (permalink / raw)


Maciej Sobczak wrote:
> 
> As far as I understand it, Unchecked_Deallocation is allowed to do 
> nothing. That wouldn't be a very competitive language implementation, 
> but AARM does not require it either. :-)

Unchecked_Deallocation has to set its parameter to null.

The Rolm-Data General compiler, the 1st validated Ada-83 compiler, did 
only that. The argument was that every program had a 4 GB virtual memory 
space, so there was no need to actually reclaim memory.

In reality, I think it was skipped to save time so they could have the 
1st validated compiler.

-- 
Jeff Carter
"What I wouldn't give for a large sock with horse manure in it."
Annie Hall
42



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-07 11:53                                               ` Martin Krischik
  2007-02-07 12:22                                                 ` Markus E Leypold
  2007-02-08  7:48                                                 ` Maciej Sobczak
@ 2007-02-08 18:38                                                 ` Dmitry A. Kazakov
  2007-02-09  7:58                                                   ` Maciej Sobczak
  2007-02-09 10:07                                                   ` Martin Krischik
  2 siblings, 2 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-08 18:38 UTC (permalink / raw)


On Wed, 07 Feb 2007 12:53:27 +0100, Martin Krischik wrote:

> On my Weblogic course I could not stop shaking my head about all the 
> problems which brings the "all is pointer" concept of Java.

Which is the same sort of rubbish as "all is object."

Clearly it is impossible to make all types referents. I don't mean here
small fundamental types like Boolean etc., but the pointers themselves. To
incorporate them one would need to introduce pointers-to-pointers, then
pointers-to-pointers-to-pointers ad infinitum, which is impossible. The
"concept" leaks. 

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08 18:38                                                 ` Dmitry A. Kazakov
@ 2007-02-09  7:58                                                   ` Maciej Sobczak
  2007-02-09 10:07                                                   ` Martin Krischik
  1 sibling, 0 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-09  7:58 UTC (permalink / raw)


Dmitry A. Kazakov wrote:

>> On my Weblogic course I could not stop shaking my head at all the 
>> problems that the "all is pointer" concept of Java brings.
> 
> Which is the same sort of rubbish as "all is object."
> 
> Clearly it is impossible to have all types as referents. I don't mean here
> small fundamental types like Boolean etc., but the pointers themselves. To
> incorporate them one needs to introduce pointers-to-pointers, then
> pointers-to-pointers-to-pointers, ad infinitum. The "concept" leaks. 

Not only does it leak, but combined with other deficiencies it often 
leads to a "consciousness split".

Consider: int, Integer, IntHolder.
That gives three types for the same logical domain.

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-02  8:42                         ` Maciej Sobczak
  2007-02-02  9:32                           ` Alex R. Mosteo
  2007-02-02 13:57                           ` Markus E Leypold
@ 2007-02-09  8:01                           ` adaworks
  2007-02-09  9:07                             ` Jean-Pierre Rosen
  2007-02-09  9:21                             ` Markus E Leypold
  2 siblings, 2 replies; 397+ messages in thread
From: adaworks @ 2007-02-09  8:01 UTC (permalink / raw)



"Maciej Sobczak" <no.spam@no.spam.com> wrote in message 
news:eputhf$p8u$1@cernne03.cern.ch...
> Markus E Leypold wrote:
>
> Ada 95 *is* terrible. It doesn't have containers nor unbounded strings and it 
> cannot even return limited types from functions. Yuck! ;-)
>
Both statements are untrue.

There are numerous container libraries available for Ada 95.
However, there is no standard container library as part of
the language standard.

Three string libraries are available: Fixed, Bounded and Unbounded.

While the inability to return limited types from a function is a bit of
an inconvenience, it does not prevent one from writing subprograms
that support the equivalent kind of thing using an in out mode parameter
in a procedure.   Further, one is not required to use limited types when
they are inconvenient.  That being said, the recent change allowing
limited types in return statements is, potentially, an improvement.
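As a sketch of the in out workaround described above (package and subprogram names are illustrative, not taken from any real library):

```ada
package Counters is
   type Counter is limited private;
   --  Ada 95 style: initialization through a procedure with an
   --  in out parameter instead of a constructing function.
   procedure Initialize (C : in out Counter; Start : Integer);
   function Value (C : Counter) return Integer;
private
   type Counter is limited record
      Value : Integer := 0;
   end record;
end Counters;

package body Counters is
   procedure Initialize (C : in out Counter; Start : Integer) is
   begin
      C.Value := Start;
   end Initialize;

   function Value (C : Counter) return Integer is
   begin
      return C.Value;
   end Value;
end Counters;

with Counters;
procedure Use_Counter is
   C : Counters.Counter;          --  legal: declared without a value
begin
   Counters.Initialize (C, 10);   --  nothing forces this call to happen
end Use_Counter;
```

Note that, unlike a constructor, nothing obliges the client to call Initialize before using the object.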

Richard Riehle 





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-08 18:14                                             ` Dmitry A. Kazakov
@ 2007-02-09  8:17                                               ` Maciej Sobczak
  2007-02-09 14:02                                                 ` Dmitry A. Kazakov
  2007-02-09 18:03                                                 ` Ray Blaak
  0 siblings, 2 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-09  8:17 UTC (permalink / raw)


Dmitry A. Kazakov wrote:

>>> For handling cycles there are weak pointers.
>> Not only. If you have cycles, then you'd better rethink the design.
>> The difference is between a) graph treated as a mesh (or mess) of nodes 
>> which "own" each other and b) graph treated as a collection of nodes.
>> The former might have ownership cycles between nodes, but not the 
>> latter, where ownership is an acyclic relation between graph and nodes.
>> I agree that this kind of restructuring is not always possible, but for 
>> me it is conceptually cleaner and worth trying from the beginning.
> 
> I would say that in all cases all references within it should be
> non-controlled. The argumentation could go as follows:
> 
> A controlled reference (subject of GC) expresses the life-time relation
> "not before me." [*] This relation is obviously transitive (due to nature
> of time). Ergo, there cannot be any cycles, per definition.
> 
> From this stand point I would claim that a cycle of controlled references
> manifests either a bug or a design problem. That a buggy program continues
> to function as-if there were no bug, barely should be attributed as an
> advantage of GC systems.

Amen!

Still, there is a strong argument that for some class of algorithms 
it might be beneficial to be able to "drop on the floor" a bigger part 
of the graph altogether. Consider a situation where an algorithm breaks 
a connection between two nodes in a graph and simply loses interest 
in the part that was on the other side of the broken connection. It might 
be a single node, but it might just as well be a million, with a mesh (mess) 
of connections between them. Reclaiming that abandoned part might require 
implementing the same tracking logic that GC already provides out of the 
box, and therefore the argument goes that the use of off-the-shelf GC can 
be beneficial for the memory-management aspect of such an algorithm. 
(Any thoughts on this?)

Personally, I accept this reasoning and I'm happy that I *can* plug for 
example the Boehm collector if I find it useful - but at the same time I 
don't find this class of algorithms to be so widespread as to justify GC 
as a general "paradigm", worth exposure as a driving language feature.


-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08 18:24                                                   ` Jeffrey R. Carter
@ 2007-02-09  8:57                                                     ` Jean-Pierre Rosen
  2007-02-09 12:57                                                       ` Robert A Duff
  2007-02-09 18:35                                                       ` Jeffrey R. Carter
  0 siblings, 2 replies; 397+ messages in thread
From: Jean-Pierre Rosen @ 2007-02-09  8:57 UTC (permalink / raw)


Jeffrey R. Carter a écrit :

> The Rolm-Data General compiler, the 1st validated Ada-83 compiler
Although DG claimed that, their certificate shows #2.
#1 was Ada-ED, from NYU, as everybody should know.


-- 
---------------------------------------------------------
            J-P. Rosen (rosen@adalog.fr)
Visit Adalog's web site at http://www.adalog.fr



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09  8:01                           ` adaworks
@ 2007-02-09  9:07                             ` Jean-Pierre Rosen
  2007-02-09 10:36                               ` Maciej Sobczak
  2007-02-09  9:21                             ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Jean-Pierre Rosen @ 2007-02-09  9:07 UTC (permalink / raw)


adaworks@sbcglobal.net a écrit :
> While the inability to return limited types from a function is a bit of
> an inconvenience, ...
And you can always return a *pointer* to a limited type, which is what 
you would do in most other languages.
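A minimal sketch of that approach (names are illustrative): the function returns an access value, so the limited object itself never has to be returned.

```ada
package Sessions is
   type Session is limited private;
   type Session_Access is access Session;
   --  Constructing function that returns a pointer to a limited type.
   function Open return Session_Access;
private
   type Session is limited record
      Open_Count : Natural := 0;
   end record;
end Sessions;

package body Sessions is
   function Open return Session_Access is
   begin
      return new Session;   --  allocate; the caller gets a pointer, not the object
   end Open;
end Sessions;
```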

-- 
---------------------------------------------------------
            J-P. Rosen (rosen@adalog.fr)
Visit Adalog's web site at http://www.adalog.fr



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09  8:01                           ` adaworks
  2007-02-09  9:07                             ` Jean-Pierre Rosen
@ 2007-02-09  9:21                             ` Markus E Leypold
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-09  9:21 UTC (permalink / raw)



<adaworks@sbcglobal.net> writes:

> "Maciej Sobczak" <no.spam@no.spam.com> wrote in message 
> news:eputhf$p8u$1@cernne03.cern.ch...
>> Markus E Leypold wrote:
>>
>> Ada 95 *is* terrible. It doesn't have containers nor unbounded strings and it 
>> cannot even return limited types from functions. Yuck! ;-)
>>

Hey! Again, I was not the one who wrote that ... I find that irritating -- 
I've been arguing enough controversial positions without having other 
people's statements attributed to me as well.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08 18:38                                                 ` Dmitry A. Kazakov
  2007-02-09  7:58                                                   ` Maciej Sobczak
@ 2007-02-09 10:07                                                   ` Martin Krischik
  2007-02-09 14:10                                                     ` Dmitry A. Kazakov
  1 sibling, 1 reply; 397+ messages in thread
From: Martin Krischik @ 2007-02-09 10:07 UTC (permalink / raw)


Dmitry A. Kazakov schrieb:
> On Wed, 07 Feb 2007 12:53:27 +0100, Martin Krischik wrote:
> 
>> On my Weblogic course I could not stop shaking my head at all the 
>> problems that the "all is pointer" concept of Java brings.
> 
> Which is the same sort of rubbish as "all is object."
> 
> Clearly it is impossible to have all types as referents. I don't mean here
> small fundamental types like Boolean etc., but the pointers themselves. To
> incorporate them one needs to introduce pointers-to-pointers, then
> pointers-to-pointers-to-pointers, ad infinitum. The "concept" leaks. 

Well, really it is "all objects and arrays handled by reference". But 
still a silly concept.

Martin




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09  9:07                             ` Jean-Pierre Rosen
@ 2007-02-09 10:36                               ` Maciej Sobczak
  2007-02-09 12:50                                 ` Robert A Duff
  0 siblings, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-09 10:36 UTC (permalink / raw)


Jean-Pierre Rosen wrote:

>> While the inability to return limited types from a function is a bit of
>> an inconvenience, ...
> And you can always return a *pointer* to a limited type, which is what 
> you would do in most other languages.

Which is entirely missing the point.

The purpose of returning limited types from a function is to implement 
the constructor functionality for those types that are considered 
critical and cannot be left uninitialized.

Initialization by procedure is prone to bugs that cannot be caught at 
compile time (forgetting or bypassing the initialization), and the same 
holds for initialization via pointer. Both "solutions" fail to meet the 
initial goal, which is to enforce at compile time the proper 
initialization of selected important types.
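Both failure modes can be made concrete with a small sketch (names are illustrative); neither mistake below is rejected by the compiler:

```ada
with Ada.Text_IO;
procedure Init_Pitfalls is
   type Config is limited record
      Ready : Boolean := False;
   end record;
   type Config_Access is access Config;

   procedure Initialize (C : in out Config) is
   begin
      C.Ready := True;
   end Initialize;

   C : Config;           --  compiles even though Initialize is never called
   P : Config_Access;    --  defaults to null; also compiles
begin
   Ada.Text_IO.Put_Line (Boolean'Image (C.Ready));  --  prints FALSE: forgotten init
   Ada.Text_IO.Put_Line (Boolean'Image (P.Ready));  --  raises Constraint_Error
end Init_Pitfalls;
```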

The ideal solution is called "constructor", everything else is just 
patchwork. Returning limited types is a compromise that can be 
satisfying in most cases - that's why it's good that Ada 2005 provides 
at least this feature.

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 10:36                               ` Maciej Sobczak
@ 2007-02-09 12:50                                 ` Robert A Duff
  2007-02-09 14:02                                   ` Dmitry A. Kazakov
  2007-02-09 14:12                                   ` Maciej Sobczak
  0 siblings, 2 replies; 397+ messages in thread
From: Robert A Duff @ 2007-02-09 12:50 UTC (permalink / raw)


Maciej Sobczak <no.spam@no.spam.com> writes:

> The ideal solution is called "constructor", everything else is just
> patchwork. Returning limited types is a compromise that can be
> satisfying in most cases - that's why it's good that Ada 2005 provides
> at least this feature.

I presume by "constructor", you mean the C++ style feature, right?

Why is that superior to just using functions as constructors?
Ada has:

    type T(<>) is ...

to indicate that clients MUST call some constructor function when
creating objects.  And it now allows constructor functions (and
aggregates) for limited types.
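For instance (a sketch with made-up names), a type with unknown discriminants cannot be declared without calling some constructor function:

```ada
package Accounts is
   type Account (<>) is limited private;   --  unknown discriminants
   function Create (Initial : Integer) return Account;
private
   type Account is limited record
      Balance : Integer;
   end record;
end Accounts;

package body Accounts is
   function Create (Initial : Integer) return Account is
   begin
      return (Balance => Initial);   --  Ada 2005: limited aggregate, built in place
   end Create;
end Accounts;

with Accounts;
procedure Demo is
   A : Accounts.Account := Accounts.Create (100);   --  must go through Create
   --  B : Accounts.Account;   --  illegal: no default initialization possible
begin
   null;
end Demo;
```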

I'm talking about Ada 2005, of course.  I agree that limited types were
rather painful in Ada 95.

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09  8:57                                                     ` Jean-Pierre Rosen
@ 2007-02-09 12:57                                                       ` Robert A Duff
  2007-02-09 14:44                                                         ` Jean-Pierre Rosen
  2007-02-09 18:35                                                       ` Jeffrey R. Carter
  1 sibling, 1 reply; 397+ messages in thread
From: Robert A Duff @ 2007-02-09 12:57 UTC (permalink / raw)


Jean-Pierre Rosen <rosen@adalog.fr> writes:

> Jeffrey R. Carter a écrit :
>
>> The Rolm-Data General compiler, the 1st validated Ada-83 compiler
> Although DG claimed that, their certificate shows #2.
> #1 was Ada-ED, from NYU, as everybody should know.

I think it's fair to say that Ada-ED was the first validated
implementation of Ada 83, and that the Rolm-Data General implementation
was the first validated Ada compiler for Ada 83.

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08  9:33                                                     ` Markus E Leypold
@ 2007-02-09 13:37                                                       ` Martin Krischik
  2007-02-09 13:47                                                       ` Georg Bauhaus
  1 sibling, 0 replies; 397+ messages in thread
From: Martin Krischik @ 2007-02-09 13:37 UTC (permalink / raw)


Markus E Leypold schrieb:

>> Only Java has no const keyword. It is supposed to get one but how long
>> until it is actually used?
> 
> Excuse me, but ... "a read only interface" means:
> 
>    (a) make all field of objects private 
>    (b) allow access only my methods
>    (c) provide only Get_*-methods, no Set_*-methods

you mean as in:

java.util.Date get_Date ()
   {
   return this.date;  // hands back a reference to the internal, mutable Date
   }

True enough, with a Get_*-method like this you never need a Set_*-method 
;-).

Martin



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08  9:33                                                     ` Markus E Leypold
  2007-02-09 13:37                                                       ` Martin Krischik
@ 2007-02-09 13:47                                                       ` Georg Bauhaus
  2007-02-09 15:29                                                         ` Maciej Sobczak
  1 sibling, 1 reply; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-09 13:47 UTC (permalink / raw)


On Thu, 2007-02-08 at 10:33 +0100, Markus E Leypold wrote:
> 
> Martin Krischik <krischik@users.sourceforge.net> writes:

> > Only Java has no const keyword. It is supposed to get one but how long
> > until it is actually used?
> 
> 
> Excuse me, but ... "a read only interface" means:
> 
>    (a) make all field of objects private 
>    (b) allow access only my methods

What Eiffel does by design.

>    (c) provide only Get_*-methods, no Set_*-methods

This lets programmers design type interfaces and wrappers so that
objects are effectively read-only.
Wouldn't it be nice to just export a constant view where
needed? Like C++'s const& or Ada's access-to-constant?
Or to have C++ const view and Ada in mode parameters
that extend to the referred object?

procedure a is

    type J is
        record
            x: Integer;
        end record;

    procedure nope(this: in J; new_x: Integer) is
    begin
        this.x := new_x; -- compile time error: "this" is an in parameter
    end nope;

    v: J := (x => 0);

begin
    nope(v, 42);
end a;


struct J
{
  int x;
};

int main()
{
  J wj;
  const J rj = wj;

  wj.x = 42;
  rj.x = 42;  // compile time error
  return 0;
}





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09  8:17                                               ` Maciej Sobczak
@ 2007-02-09 14:02                                                 ` Dmitry A. Kazakov
  2007-02-09 18:08                                                   ` Ray Blaak
  2007-02-09 18:03                                                 ` Ray Blaak
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-09 14:02 UTC (permalink / raw)


On Fri, 09 Feb 2007 09:17:47 +0100, Maciej Sobczak wrote:

> Still, there is a strong argument that for some class of algorithms 
> it might be beneficial to be able to "drop on the floor" a bigger part 
> of the graph altogether. Consider a situation where an algorithm breaks 
> a connection between two nodes in a graph and simply loses interest 
> in the part that was on the other side of the broken connection. It might 
> be a single node, but it might just as well be a million, with a mesh 
> (mess) of connections between them.

It is still broken, because when you remove a connection (non-directed)
between two nodes you don't know which of them should/may/can fall on the
floor. This information is locally missing. It could be deduced from some
transitive client-master/reference-target relation, but we have demolished
that relation just before by introducing cycles.

In essence, there are two graphs. G is the mesh-mess. R is the graph
induced by the reference-target relation. The problem is that these two
graphs are sufficiently different. R (=>GC) cannot work as a complete model
of G. It could, if it were a part of a larger model of G.

> Reclaiming that abandoned part might require 
> implementing the same tracking logic that GC already provides out of the 
> box and therefore the argument goes that the use of off-the-shelf GC can 
> be beneficial for the memory-management aspect of such an algorithm. 
> (Any thoughts on this?)
> 
> Personally, I accept this reasoning and I'm happy that I *can* plug for 
> example the Boehm collector if I find it useful - but at the same time I 
> don't find this class of algorithms to be so widespread as to justify GC 
> as a general "paradigm", worth exposure as a driving language feature.

I think that your initial point is perfectly valid here: GC is fine when it
is an implementation detail. GC as a "paradigm" could turn bad, because it
is too narrow to grasp all cases. Therefore any language which restricts
itself to GC is weakened by design. What is worse, it forces the
programmer to use inherently wrong abstractions.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 12:50                                 ` Robert A Duff
@ 2007-02-09 14:02                                   ` Dmitry A. Kazakov
  2007-02-10 18:21                                     ` adaworks
  2007-02-09 14:12                                   ` Maciej Sobczak
  1 sibling, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-09 14:02 UTC (permalink / raw)


On Fri, 09 Feb 2007 07:50:43 -0500, Robert A Duff wrote:

> Maciej Sobczak <no.spam@no.spam.com> writes:
> 
>> The ideal solution is called "constructor", everything else is just
>> patchwork. Returning limited types is a compromise that can be
>> satisfying in most cases - that's why it's good that Ada 2005 provides
>> at least this feature.
> 
> I presume by "constructor", you mean the C++ style feature, right?
> 
> Why is that superior to just using functions as constructors?
> Ada has:
> 
>     type T(<>) is ...
> 
> to indicate that clients MUST call some constructor function when
> creating objects.  And it now allows constructor functions (and
> aggregates) for limited types.

Because it does not compose under inheritance and aggregation.

When inherited, the constructing function needs to be overridden, even if
a null record was the only thing added. Further, one can barely publicly
derive anything from T, which breaks encapsulation.

When aggregated:

   type Outer is record
      Inner : T := Constructor_T (...);
   end record;

there is no way to evaluate the parameters of T's constructing function
from those of Outer.

IMO, the constructor concept is cleaner and more general than constructing
functions, which are as little functions as ":=" in X : T := 123; is an
assignment.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-08  8:49                                                   ` Markus E Leypold
@ 2007-02-09 14:09                                                     ` Georg Bauhaus
  2007-02-09 16:17                                                       ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-09 14:09 UTC (permalink / raw)


On Thu, 2007-02-08 at 09:49 +0100, Markus E Leypold wrote:
> Georg Bauhaus <bauhaus@arcor.de> writes:
> 
> > On Wed, 2007-02-07 at 12:19 +0100, Markus E Leypold wrote:
> >> Georg Bauhaus <bauhaus@futureapps.de> writes:
> >> 
> >
> >> > If you do this, perhaps you can find the time to use a, uh, stop-watch
> >> > to measure the time it takes to produce, read, and change the programs?
> >> > I.e., to technically manage significant software. The times will be
> >> > important input for deciding whether or not static typing does indeed
> >> > help in production.
> >> > The only thing that relates to economic efficiency, if you want it :-)
> >> 
> >> So the how long a beginner in a specific language takes to do
> >> something is "important input for deciding ..." if compared to what,
> >> exactly?
> >
> > I don't think Ray is a beginner in functional programming.
> 
> >> So the how long a beginner in a specific language takes to do
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> 
> I should think I have been quite clear on that. 

If your question is how much time a beginner will need to start coding
productively in either a dynamically typed or a statically typed language,
then this is a somewhat related question.

As for an answer, we have been shown an interesting article about
experiences with Ada vs C in teaching model train control. 

* Equate the expressiveness of Ada base types in source code to explicit
and static typing of OCaml on the one hand.

* Equate the typical use of int or char* in C to the typical absence
of static type expression in Lisp.

Then the article shows that explicit static typing helps beginners
find a working solution faster.

(I know the above equations are not technically correct. But my
viewpoint here is on beginning programmers who are mostly guided
by what they see written, not by their knowledge of type system
effects etc.)





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09 10:07                                                   ` Martin Krischik
@ 2007-02-09 14:10                                                     ` Dmitry A. Kazakov
  0 siblings, 0 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-09 14:10 UTC (permalink / raw)


On Fri, 09 Feb 2007 11:07:44 +0100, Martin Krischik wrote:

> Dmitry A. Kazakov schrieb:
>> On Wed, 07 Feb 2007 12:53:27 +0100, Martin Krischik wrote:
>> 
>>> On my Weblogic course I could not stop shaking my head at all the 
>>> problems that the "all is pointer" concept of Java brings.
>> 
>> Which is the same sort of rubbish as "all is object."
>> 
>> Clearly it is impossible to have all types as referents. I don't mean here
>> small fundamental types like Boolean etc., but the pointers themselves. To
>> incorporate them one needs to introduce pointers-to-pointers, then
>> pointers-to-pointers-to-pointers, ad infinitum. The "concept" leaks. 
> 
> Well, really it is "all objects and arrays handled by reference". But 
> still a silly concept.

But what is so special about Boolean that makes it a non-object? Such
irregularities are always language design faults.

[ As for Ada there should be Boolean'Class. ]

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 12:50                                 ` Robert A Duff
  2007-02-09 14:02                                   ` Dmitry A. Kazakov
@ 2007-02-09 14:12                                   ` Maciej Sobczak
  2007-02-09 19:41                                     ` Randy Brukardt
  1 sibling, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-09 14:12 UTC (permalink / raw)


Robert A Duff wrote:

> I presume by "constructor", you mean the C++ style feature, right?

The feature in general. C++ is just one example from many languages that 
got it right.

> Why is that superior to just using functions as constructors?

The constructor, as a language feature, more clearly describes its own 
purpose. It is also entirely independent of other properties of the type.

> Ada has:
> 
>     type T(<>) is ...
> 
> to indicate that clients MUST call some constructor function when
> creating objects.

This indicates that the type has unknown discriminants.
The concept of "discriminant" is completely irrelevant to the intent 
here, even though in this particular case the combination of this 
concept and other language features leads to the fact that there must be 
some constructor function called.

I treat it as an idiom that finally allows me to achieve the goal, but 
not as direct support for the feature that would be most accurate.

> I'm talking about Ada 2005, of course.  I agree that limited types were
> rather painful in Ada 95.

Yes. And this answers your question why constructors are superior: as a 
direct feature for value initialization, a constructor would be completely 
independent of other properties of the type, such as whether it is limited 
or not. Ada 2005 has patched this, but with direct support for constructors 
there would be nothing to patch - a constructor initializes the value "in 
place", not as a result of returning a value from somewhere and 
special-casing it for limited types.

There is also the question of how to initialize a generic type. Again, the 
question is really about a direct language feature.

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-08  8:43                                                   ` Markus E Leypold
@ 2007-02-09 14:20                                                     ` Maciej Sobczak
  2007-02-09 16:23                                                       ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-09 14:20 UTC (permalink / raw)


Markus E Leypold wrote:

>>> And Unchecked_Deallocation should deallocate even when a collector
>>> is present.
>> I don't understand. There is no legal way for the program to verify
>> that anything was indeed deallocated, so it doesn't make much sense to
>> say that this behaviour is required.
> 
> Oh yes. Deallocating immeditately and deallocating later makes a
> difference in time and space behaviour -- which IS measurable outside
> the program

With the small issue that this possibility is not formalized by the 
language standard (please read my sentence above carefully: "there is no 
*legal* way *for the program*").

And that is why it is *not* measurable: there is no sensible way 
to define at which level of memory management it should be measured.

As was already pointed out in this thread, with some operating systems 
memory reclamation might not be meaningful at all unless the whole 
program is terminated.

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09 12:57                                                       ` Robert A Duff
@ 2007-02-09 14:44                                                         ` Jean-Pierre Rosen
  2007-02-10 13:38                                                           ` Robert A Duff
  0 siblings, 1 reply; 397+ messages in thread
From: Jean-Pierre Rosen @ 2007-02-09 14:44 UTC (permalink / raw)


Robert A Duff a écrit :
> Jean-Pierre Rosen <rosen@adalog.fr> writes:
> I think it's fair to say that Ada-ED was the first validated
> implementation of Ada 83, and that the Rolm-Data General implementation
> was the first validated Ada compiler for Ada 83.
> 
I fail to see why Ada-ED does not deserve the name "compiler". I may 
admit that Rolm was the first /industrial/ compiler, i.e. usable for 
real programs...

-- 
---------------------------------------------------------
            J-P. Rosen (rosen@adalog.fr)
Visit Adalog's web site at http://www.adalog.fr



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-08  9:24                                                   ` Markus E Leypold
@ 2007-02-09 15:08                                                     ` Georg Bauhaus
  0 siblings, 0 replies; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-09 15:08 UTC (permalink / raw)


On Thu, 2007-02-08 at 10:24 +0100, Markus E Leypold wrote:
> Georg Bauhaus <bauhaus@arcor.de> writes:
> 

> > How will a piece of source code, your own or someone else's, and
> > a language with tools fit these and the market as we might think
> > it is now?
> 
> It will depend on the ability of mine or someone else to code in this
> unnamed language with arbitrary tools. Note that I doubt that "we
> think" the same about the market and what counts -- finally -- is how
> the market is, not how we think about it, which might be a different
> thing altogether.
> 
> Does that answer satisfy you? No? I thought so.

Yes, it does!
And more or less uncertainty about causes and effects of PLs
in production use is the only common base we have for making
decisions, right or wrong. Partial knowledge is all we
will ever have, still it is something to build on.

Nevertheless, this is where *Ada* started, AFAICT:
Some of the effects of using more than 400 PLs in one large
organization were pretty well diagnosed. A catalogue of desirable
PL properties was drafted, and reviewed. Looking at
the outcome, my conclusion is that this large research effort
did suggest answers to some of the questions of this co-thread
about redundancy, explicit typing, or robust syntax (To see
highly sensitive syntax typical of the Bell labs tradition ;-),
try

struct s {
   int a;
}

int main()
{
   struct s x;
   return 0;
}



> Well, things have changed, but you're probably missing how.

> George, you're not trying to find out things, but to win. At, I might
> say, every price. 

I'm only trying to draw attention to some findings when
programming is observed as caused by programmers using programming
languages. This includes findings about old and new languages. [*]
 Any findings are important because they can affect decisions.
As a consequence of these, they will potentially affect the amount
of money spent, the technical qualities of solutions and the
time needed to produce. I'm trying to allude to data to again be
placed in a frame of reference for language design decisions.

But I understand you are convinced that OCaml, if you let me name
it as a representative of a few things, answers many of the
questions best. So do I, to some extent.

Yet, if conviction is all there is, then indeed we lose the
common ground for a discussion.


[*] Comparing, say, Ada and Haskell wrt understandability,
neither is intuitive enough for my taste, though not always
for the same reasons.
Some language properties are hidden away, even when they are useful,
or important to know for runtime behavior.

- Like Ada tagged types being aliased therefore you can
have 'Unchecked_Access to function parameters...

- Like the Haskell expressions
 sum [1 .. n] or foldr (+) 0 [1 .. n] should almost certainly
be rewritten. The programmer has to find out why and how. Yet,
the first expression is so nice and straightforward and reflects
almost exactly what Hughes suggested in 1984...





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09 13:47                                                       ` Georg Bauhaus
@ 2007-02-09 15:29                                                         ` Maciej Sobczak
  2007-02-09 20:52                                                           ` Georg Bauhaus
  0 siblings, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-09 15:29 UTC (permalink / raw)


Georg Bauhaus wrote:

> Or to have C++ const view and Ada in mode parameters
> that extend to the referred object?

>   J wj;
>   const J rj = wj;
            ^

const J &rj = wj;
         ^

I guess you wanted to use const reference. This is closer to the concept 
of "view".

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-09 14:09                                                     ` Georg Bauhaus
@ 2007-02-09 16:17                                                       ` Markus E Leypold
  2007-02-09 20:51                                                         ` Georg Bauhaus
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-09 16:17 UTC (permalink / raw)



Georg Bauhaus <bauhaus@futureapps.de> writes:

> On Thu, 2007-02-08 at 09:49 +0100, Markus E Leypold wrote:
>> Georg Bauhaus <bauhaus@arcor.de> writes:
>> 
>> > On Wed, 2007-02-07 at 12:19 +0100, Markus E Leypold wrote:
>> >> Georg Bauhaus <bauhaus@futureapps.de> writes:
>> >> 
>> >
>> >> > If you do this, perhaps you can find the time to use a, uh, stop-watch
>> >> > to measure the time it takes to produce, read, and change the programs?
>> >> > I.e., to technically manage significant software. The times will be
>> >> > important input for deciding whether or not static typing does indeed
>> >> > help in production.
>> >> > The only thing that relates to economic efficiency, if you want it :-)
>> >> 
>> >> So the how long a beginner in a specific language takes to do
>> >> something is "important input for deciding ..." if compared to what,
>> >> exactly?
>> >
>> > I don't think Ray is a beginner in functional programming.
>> 
>> >> So the how long a beginner in a specific language takes to do
>                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>> 
>> I should think I have been quite clear on that. 
>
> If your question is asking how much time a beginner will need

If that is your only question, fine. It wasn't mine.

> to start coding productively in either dynamically typed language
> or statically typed language, then this is a somewhat related question.

In just the same way that a GUI makes it easy for a beginner to start
using applications and "surfing" the internet. Does that give us
reason to suspect that GUIs have a larger "power" in the sense that
you can do more with them? "Somewhat related" does not seem to be
related enough for the questions you had.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09 14:20                                                     ` Maciej Sobczak
@ 2007-02-09 16:23                                                       ` Markus E Leypold
  2007-02-12  8:52                                                         ` Maciej Sobczak
  0 siblings, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-09 16:23 UTC (permalink / raw)




Maciej Sobczak <no.spam@no.spam.com> writes:

>> Oh yes. Deallocating immediately and deallocating later makes a
>> difference in time and space behaviour -- which IS measurable outside
>> the program
>
> With the small issue that this possibility is not formalized by the
> language standard (please read carefully my sentence above: "there is
> no *legal* way *for the program*").
>
> And that is why it is *not* measurable, because there is no sensible
> way to define at which level of memory management it should be
> measured.

You said:

>>> I don't understand. There is no legal way for the program to verify
>>> that anything was indeed deallocated, so it doesn't make much sense to
>>> say that this behaviour is required.

This is a 'non sequitur', since it makes sense to say the behaviour is
required in order to fix certain real-time properties, regardless of
whether it can be detected in the program (and it could, by observing
the wall clock).

> As was already pointed out in this thread, with some operating systems
> memory reclamation might not be meaningful at all unless the whole
> program is terminated.

I don't even ask to be shown such an operating system...

Regards -- Markus






^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09  8:17                                               ` Maciej Sobczak
  2007-02-09 14:02                                                 ` Dmitry A. Kazakov
@ 2007-02-09 18:03                                                 ` Ray Blaak
  2007-02-09 18:47                                                   ` Randy Brukardt
  1 sibling, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-09 18:03 UTC (permalink / raw)


Maciej Sobczak <no.spam@no.spam.com> writes:
> Still, there is a strong argument that for some class of algorithms it
> might be beneficial to be able to "drop on the floor" a bigger part of the
> graph altogether. [...] Reclaiming that abandoned part might require
> implementing the same tracking logic that GC already provides out of the box
> and therefore the argument goes that the use of off-the-shelf GC can be
> beneficial for the memory-management aspect of such an algorithm. (Any
> thoughts on this?)

My thoughts: not "might", but "definitely". Also, the advantage is not
restricted to some class of algorithms -- it is in fact the common case.

You don't need a complex mesh for this kind of advantage to kick in, even
regular tree cleanup is greatly simplified: just let go of the subbranch you
no longer need, and avoid that whole cleanup traversal.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 14:02                                                 ` Dmitry A. Kazakov
@ 2007-02-09 18:08                                                   ` Ray Blaak
  2007-02-09 18:43                                                     ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-09 18:08 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> It is still broken, because when you remove a connection (non-directed)
> between two nodes you don't know which of them should/may/can fall on the
> floor. This information is locally missing. It could be deduced from some
> transitive client-master/reference-target relation, but we have demolished
> that relation just before by introducing cycles.

Not really. You program according to a principle of locality of use. Some code
says "I no longer need this data", and lets go of it, without regard to
whether or not it is still in use somewhere else.

If every bit of code relinquishes its resources according to its local needs,
then the global effect via GC is that the resource is in fact properly
managed. The local effect is easier to prove and code to since at every local
step the "cleanup" work is much simpler in the presence of GC.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09  8:57                                                     ` Jean-Pierre Rosen
  2007-02-09 12:57                                                       ` Robert A Duff
@ 2007-02-09 18:35                                                       ` Jeffrey R. Carter
  2007-02-10 19:01                                                         ` Martin Krischik
  2007-02-11 15:22                                                         ` Pascal Obry
  1 sibling, 2 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-02-09 18:35 UTC (permalink / raw)


Jean-Pierre Rosen wrote:
> 
> Although DG claimed that, their certificate shows #2.
> #1 was Ada-ED, from NYU, as everybody should know.

Ada-ED was an interpreter, not a compiler, as everybody should know.

-- 
Jeff Carter
"Sheriff murdered, crops burned, stores looted,
people stampeded, and cattle raped."
Blazing Saddles
35



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 18:08                                                   ` Ray Blaak
@ 2007-02-09 18:43                                                     ` Dmitry A. Kazakov
  2007-02-09 18:57                                                       ` Ray Blaak
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-09 18:43 UTC (permalink / raw)


On Fri, 09 Feb 2007 18:08:06 GMT, Ray Blaak wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

>> It is still broken, because when you remove a connection (non-directed)
>> between two nodes you don't know which of them should/may/can fall on the
>> floor. This information is locally missing. It could be deduced from some
>> transitive client-master/reference-target relation, but we have demolished
>> that relation just before by introducing cycles.
> 
> Not really. You program according to a principle of locality of use. Some code
> says "I no longer need this data", and lets go of it, without regard to
> whether or not it is still in use somewhere else.

The point is that "I don't need the node A here" /= "There is no link
between A and B."

> If every bit of code relinquishes its resources according to its local needs,
> then the global effect via GC is that the resource is in fact properly
> managed. The local effect is easier to prove and code to since at every local
> step the "cleanup" work is much simpler in the presence of GC.

No, it just does not follow. In question is the very model of graph
nodes as "resources", because of the inequality above.

You are making a jump from resource management to program correctness. An
analogous statement would be: "every computation is correct because the CPU
does not make errors while summing numbers."

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* RE: How come Ada isn't more popular?
  2007-02-09 18:03                                                 ` Ray Blaak
@ 2007-02-09 18:47                                                   ` Randy Brukardt
  2007-02-09 19:02                                                     ` Ray Blaak
  2007-02-09 22:05                                                     ` Markus E Leypold
  0 siblings, 2 replies; 397+ messages in thread
From: Randy Brukardt @ 2007-02-09 18:47 UTC (permalink / raw)
  To: comp.lang.ada

> Maciej Sobczak <no.spam@no.spam.com> writes:
> > Still, there is a strong argument that for some class of algorithms it
> > might be beneficial to be able to "drop on the floor" a bigger part of the
> > graph altogether. [...] Reclaiming that abandoned part might require
> > implementing the same tracking logic that GC already provides out of the
> > box and therefore the argument goes that the use of off-the-shelf GC can
> > be beneficial for the memory-management aspect of such an algorithm. (Any
> > thoughts on this?)
>
> My thoughts: not "might", but "definitely". Also, the advantage is not
> restricted to some class of algorithms -- it is in fact the common case.

YMMV, of course, but it's not my experience. It's rare to destroy only part
of a data structure; it's usually the entire structure that needs to be
discarded.

> You don't need a complex mesh for this kind of advantage to kick in, even
> regular tree cleanup is greatly simplified: just let go of the
> subbranch you no longer need, and avoid that whole cleanup traversal.

I'm primarily interested in destroying the entire structure, and often I
need no destruction at all: the structures exist until the termination of
the program. There's no point of earlier cleanup in such programs, and
surely no point in non-zero overhead to support such cleanup. I'm not aware
of any zero-overhead GC algorithm, because the basic tenet of GC is that you
can find all reachable objects: doing that requires some time and/or space
overhead.

                       Randy.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 18:43                                                     ` Dmitry A. Kazakov
@ 2007-02-09 18:57                                                       ` Ray Blaak
  0 siblings, 0 replies; 397+ messages in thread
From: Ray Blaak @ 2007-02-09 18:57 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> On Fri, 09 Feb 2007 18:08:06 GMT, Ray Blaak wrote:
> 
> > "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
> >> It is still broken, because when you remove a connection (non-directed)
> >> between two nodes you don't know which of them should/may/can fall on the
> >> floor. This information is locally missing. It could be deduced from some
> >> transitive client-master/reference-target relation, but we have demolished
> >> that relation just before by introducing cycles.
> > 
> > Not really. You program according to a principle of locality of use. Some code
> > says "I no longer need this data", and lets go of it, without regard to
> > whether or not it is still in use somewhere else.
> 
> The point is that "I don't need the node A here" /= "There is no link
> between A and B."

In practice it doesn't matter. A can easily "know" it is not using B. That B is
still using A is not an issue, at least in terms of memory management.

> You are making a jump from resource management to program correctness. An
> analogous statement would be: "every computation is correct because the CPU
> does not make errors while summing numbers."

I am not talking "program" correctness, only "resource" correctness.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 18:47                                                   ` Randy Brukardt
@ 2007-02-09 19:02                                                     ` Ray Blaak
  2007-02-09 19:35                                                       ` Randy Brukardt
  2007-02-09 22:05                                                     ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-09 19:02 UTC (permalink / raw)


"Randy Brukardt" <randy@rrsoftware.com> writes:
> I'm primarily interested in destroying the entire structure, and often I
> need no destruction at all: the structures exist until the termination of
> the program. There's no point of earlier cleanup in such programs, and
> surely no point in non-zero overhead to support such cleanup. I'm not aware
> of any zero-overhead GC algorithm, because the basic tenet of GC is that you
> can find all reachable objects: doing that requires some time and/or space
> overhead.

I don't think anyone anywhere claims zero-overhead for GC. The whole point is
reasonable performance (in fact competitive with manual management) and a much
easier job for the programmer, in situations with highly dynamic memory usage
patterns.

At any rate, one uses the tools that fit their needs. If you find yourself in
situations with stable data structures, then it sounds like you don't need GC,
plain and simple, or maybe even any cleanup at all.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* RE: How come Ada isn't more popular?
  2007-02-09 19:02                                                     ` Ray Blaak
@ 2007-02-09 19:35                                                       ` Randy Brukardt
  2007-02-09 19:52                                                         ` Ray Blaak
  2007-02-09 22:11                                                         ` Markus E Leypold
  0 siblings, 2 replies; 397+ messages in thread
From: Randy Brukardt @ 2007-02-09 19:35 UTC (permalink / raw)
  To: comp.lang.ada

Ray Blaak writes:
> "Randy Brukardt" <randy@rrsoftware.com> writes:
> > I'm primarily interested in destroying the entire structure, and often I
> > need no destruction at all: the structures exist until the termination of
> > the program. There's no point of earlier cleanup in such programs, and
> > surely no point in non-zero overhead to support such cleanup. I'm not
> > aware of any zero-overhead GC algorithm, because the basic tenet of GC is
> > that you can find all reachable objects: doing that requires some time
> > and/or space overhead.
>
> I don't think anyone anywhere claims zero-overhead for GC. The whole point
> is reasonable performance (in fact competitive with manual management) and
> a much easier job for the programmer, in situations with highly dynamic
> memory usage patterns.

That's the crux of the disagreement, I think. You claim that "highly
dynamic memory usage" is the normal case; I think it is *not* the usual
case (or, at least, doesn't have to be; languages like Java make it so
unnecessarily).

> At any rate, one uses the tools that fit their needs. If you find yourself
> in situations with stable data structures, then it sounds like you don't
> need GC, plain and simple, or maybe even any cleanup at all.

Right, but then why should it be a required and central part of general
purpose programming languages? Surely, it should be available as an option
for types that need it, but it certainly should not apply to all types.

My programs tend to be plenty dynamic after all, but the dynamism is in the
construction of the data structures. Destruction of them is unusual, and
usually happens in a single operation (not piecemeal). And locals and
temporaries are on the stack - no action at all is required to destroy them
(you keep referring to this as "manual management", but there is nothing
manual about it; the compiler does all of the work).

                               Randy.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* RE: How come Ada isn't more popular?
  2007-02-09 14:12                                   ` Maciej Sobczak
@ 2007-02-09 19:41                                     ` Randy Brukardt
  2007-02-12  9:07                                       ` Maciej Sobczak
  0 siblings, 1 reply; 397+ messages in thread
From: Randy Brukardt @ 2007-02-09 19:41 UTC (permalink / raw)
  To: comp.lang.ada

Maciej Sobczak writes:
...
> I treat it as an idiom that allows me to finally achieve the goal, but
> not as a direct support for the feature that would be most accurate.

Fair enough, but then why are you here? Ada is all about providing building
blocks that allow you to accomplish a purpose, rather than whiz-bang
features that provide the entire purpose in a neat but inflexible package.

There are many examples of that: tagged types, derivation, packages (which
combine to make a "class"), Finalization via controlled types, (<>),
protected types (which can be used to write a lock, but aren't a lock by
itself), etc.

                     Randy.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 19:35                                                       ` Randy Brukardt
@ 2007-02-09 19:52                                                         ` Ray Blaak
  2007-02-12  7:20                                                           ` Harald Korneliussen
  2007-02-09 22:11                                                         ` Markus E Leypold
  1 sibling, 1 reply; 397+ messages in thread
From: Ray Blaak @ 2007-02-09 19:52 UTC (permalink / raw)


"Randy Brukardt" <randy@rrsoftware.com> writes:
> > At any rate, one uses the tools that fit their needs. If you find yourself
> > in situations with stable data structures, then it sounds like you don't
> > need GC, plain and simple, or maybe even any cleanup at all.
> 
> Right, but then why should it be a required and central part of general
> purpose programming languages. Surely, it should be available as an option
> for types that need it, but it certainly should not apply to all types.

GC should be a required and central part of a general purpose language
because it saves the programmer so much time and effort, and does a better
job at memory cleanup, which means fewer bugs.

A simple way out of the problem is to have a language that readily provides
the options to do things either way.

The religious argument then boils down to whether it should be on by default
or not.

Or even better, one can have a language where one has GC for heap based
values, while still providing scope based constructs.

> And locals and temporaries are on the stack - no action at all is required
> to destroy them (you keep refering to this as "manual management", but there
> is nothing manual about it -- the compiler does all of the work).

I must clarify this. I fully understand how stack values work. I have used
them in Ada and in C++ after all.

What I refer to as manual management is for heap values, such that for
every "new" one must remember eventually to call "deallocate".

When I refer to locals and memory management, I am talking about the
allocation of linked data structures assigned to local *pointers* and the
need to clean up those pointers before returning.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
rAYblaaK@STRIPCAPStelus.net                    The Rhythm has my soul.



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-09 16:17                                                       ` Markus E Leypold
@ 2007-02-09 20:51                                                         ` Georg Bauhaus
  2007-02-09 22:19                                                           ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-09 20:51 UTC (permalink / raw)


On Fri, 2007-02-09 at 17:17 +0100, Markus E Leypold wrote:
> Georg Bauhaus <bauhaus@futureapps.de> writes:
> 
> > On Thu, 2007-02-08 at 09:49 +0100, Markus E Leypold wrote:
> >> Georg Bauhaus <bauhaus@arcor.de> writes:
> >> 
> >> > On Wed, 2007-02-07 at 12:19 +0100, Markus E Leypold wrote:
> >> >> Georg Bauhaus <bauhaus@futureapps.de> writes:
> >> >> 
> >> >
> >> >> > If you do this, perhaps you can find the time to use a, uh, stop-watch
> >> >> > to measure the time it takes to produce, read, and change the programs?
> >> >> > I.e., to technically manage significant software. The times will be
> >> >> > important input for deciding whether or not static typing does indeed
> >> >> > help in production.
> >> >> > The only thing that relates to economic efficiency, if you want it :-)
> >> >> 
> >> >> So the how long a beginner in a specific language takes to do
> >> >> something is "important input for deciding ..." if compared to what,
> >> >> exactly?
> >> >
> >> > I don't think Ray is a beginner in functional programming.
> >> 
> >> >> So the how long a beginner in a specific language takes to do
> >                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> >> 
> >> I should think I have been quite clear on that. 
> >
> > If your question is asking how much time a beginner will need
> 
> If that is you only question, fine. It wasn't mine.

OK, I admit I have had some difficulty in parsing the beginner-sentence;
I tried my best, which is never enough. Does your question have to do
with beginners at all?





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09 15:29                                                         ` Maciej Sobczak
@ 2007-02-09 20:52                                                           ` Georg Bauhaus
  0 siblings, 0 replies; 397+ messages in thread
From: Georg Bauhaus @ 2007-02-09 20:52 UTC (permalink / raw)


On Fri, 2007-02-09 at 16:29 +0100, Maciej Sobczak wrote:
> Georg Bauhaus wrote:

> >   J wj;
> >   const J rj = wj;
>             ^
> 
> const J &rj = wj;
>          ^
> 
> I guess you wanted to use const reference. This is closer to the concept 
> of "view".

Yes, thanks for the patch.






^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 18:47                                                   ` Randy Brukardt
  2007-02-09 19:02                                                     ` Ray Blaak
@ 2007-02-09 22:05                                                     ` Markus E Leypold
  2007-02-10  1:31                                                       ` Randy Brukardt
  1 sibling, 1 reply; 397+ messages in thread
From: Markus E Leypold @ 2007-02-09 22:05 UTC (permalink / raw)



"Randy Brukardt" <randy@rrsoftware.com> writes:

>> Maciej Sobczak <no.spam@no.spam.com> writes:
>> > Still, there is a strong argument that for some class of algorithms it
>> > might be beneficial to be able to "drop on the floor" a bigger part of
>> > the graph altogether. [...] Reclaiming that abandoned part might require
>> > implementing the same tracking logic that GC already provides out of the
>> > box and therefore the argument goes that the use of off-the-shelf GC can
>> > be beneficial for the memory-management aspect of such an algorithm. (Any
>> > thoughts on this?)
>>
>> My thoughts: not "might", but "definitely". Also, the advantage is not
>> restricted to some class of algorithms -- it is in fact the common case.
>
> YMMV, of course, but it's not my experience. It's rare to destroy only part
> of a data structure; it's usually the entire structure that needs to be
> discarded.

"it's usually the entire structure" => In a world without
representation sharing. In one with representation sharing (which
makes the production of new trees from existing ones really
efficient), one would have to traverse the whole subtree every time a
tree referencing this subtree is deleted, if only to adjust the
reference counters. In a world with GC you just overwrite one
reference to some subtree. When the GC runs it traverses the tree
once (and then perhaps decides which parts to deallocate). Depending
on the application domain (e.g. if you produce and drop many shared
representations between GC cycles) that might be much more efficient.


>> You don't need a complex mesh for this kind of advantage to kick in, even
>> regular tree cleanup is greatly simplified: just let go of the
>> subbranch you no longer need, and avoid that whole cleanup traversal.
>
> I'm primarily interested in destroying the entire structure, and often I
> need no destruction at all: the structures exist until the termination of
> the program. 

This certainly is a special case? E.g. a chess-playing program (any
game-playing program) would have to create and destroy game trees at a
pretty good rate.

> There's no point of earlier cleanup in such programs, and
> surely no point in non-zero overhead to support such cleanup. I'm not aware
> of any zero-overhead GC algorithm, because the basic tenet of GC is that you
> can find all reachable objects: doing that requires some time and/or space
> overhead.

I can imagine allocating some memory in a special, non-garbage-collected
heap. Furthermore, generational garbage collection AFAIK has the
property of touching old objects seldom, so there is not as much cost
(as it might seem at first glance) for keeping large structures
around unchanged. And if the program is short-running you could also
increase the garbage collection threshold(s), so that collection
never sets in.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 19:35                                                       ` Randy Brukardt
  2007-02-09 19:52                                                         ` Ray Blaak
@ 2007-02-09 22:11                                                         ` Markus E Leypold
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-09 22:11 UTC (permalink / raw)



"Randy Brukardt" <randy@rrsoftware.com> writes:

> Ray Blaak writes:
>> "Randy Brukardt" <randy@rrsoftware.com> writes:
>> > I'm primarily interested in destroying the entire structure, and often I
>> > need no destruction at all: the structures exist until the termination of
>> > the program. There's no point of earlier cleanup in such programs, and
>> > surely no point in non-zero overhead to support such cleanup. I'm not
>> > aware of any zero-overhead GC algorithm, because the basic tenet of GC is
>> > that you can find all reachable objects: doing that requires some time
>> > and/or space overhead.
>>
>> I don't think anyone anywhere claims zero-overhead for GC. The whole point
>> is reasonable performance (in fact competitive with manual management) and
>> a much easier job for the programmer, in situations with highly dynamic
>> memory usage patterns.
>

> That's the crux of the disagreement, I think.

Yes indeed.

> You claim that "highly dynamic memory usage" is the normal case; I

So do I: therefore my initial suggestion that list processing, and
support for doing it efficiently, matters for languages that want to
play in the arena of application programming.

> think it is *not* the usual case (or, at least, doesn't have to be
> the usual case - languages like Java make it that unnecessarily).


>> At any rate, one uses the tools that fit their needs. If you find yourself
> in
>> situations with stable data structures, then it sounds like you don't need
> GC,
>> plain and simple, or maybe even any cleanup at all.
>


> Right, but then why should it be a required and central part of general
> purpose programming languages. 

There I disagree, but we know that.

> Surely, it should be available as an option
> for types that need it, but it certainly should not apply to all types.
>
> My programs tend to be plenty dynamic after all, but the dynamic is in the
> construction of the data structures. Destruction of them is unusual, and
> usually happens in a single operation (not piecemeal). And locals and
> temporaries are on the stack - no action at all is required to destroy them
> (you keep refering to this as "manual management", but there is nothing
> manual about it -- the compiler does all of the work).
>
>                                Randy.

I think we should find an application case (example), but we'd
probably not agree whether it is typical. May I suggest XML processing?


Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: in defense of GC
  2007-02-09 20:51                                                         ` Georg Bauhaus
@ 2007-02-09 22:19                                                           ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-09 22:19 UTC (permalink / raw)



Georg Bauhaus <bauhaus@futureapps.de> writes:

> On Fri, 2007-02-09 at 17:17 +0100, Markus E Leypold wrote:
>> Georg Bauhaus <bauhaus@futureapps.de> writes:
>> 
>> > On Thu, 2007-02-08 at 09:49 +0100, Markus E Leypold wrote:
>> >> Georg Bauhaus <bauhaus@arcor.de> writes:
>> >> 
>> >> > On Wed, 2007-02-07 at 12:19 +0100, Markus E Leypold wrote:
>> >> >> Georg Bauhaus <bauhaus@futureapps.de> writes:
>> >> >> 
>> >> >
>> >> >> > If you do this, perhaps you can find the time to use a, uh, stop-watch
>> >> >> > to measure the time it takes to produce, read, and change the programs?
>> >> >> > I.e., to technically manage significant software. The times will be
>> >> >> > important input for deciding whether or not static typing does indeed
>> >> >> > help in production.
>> >> >> > The only thing that relates to economic efficiency, if you want it :-)
>> >> >> 
>> >> >> So the how long a beginner in a specific language takes to do
>> >> >> something is "important input for deciding ..." if compared to what,
>> >> >> exactly?
>> >> >
>> >> > I don't think Ray is a beginner in functional programming.
>> >> 
>> >> >> So the how long a beginner in a specific language takes to do
>> >                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>> >> 
>> >> I should think I have been quite clear on that. 
>> >
>> > If your question is asking how much time a beginner will need
>> 
>> If that is you only question, fine. It wasn't mine.
>
> OK, I admit I have had some difficulty in parsing the beginner-sentence,
> I tried my best which is never enough. Does your question have to do
> with beginners at all?


In this subthread I didn't have a question: I just encouraged Ray to
have a look into OCaml. More like a recommendation between people who
are looking for good restaurants: tastes differ, but still one is
listening to other people's advice and one is still giving references
to places one liked :-).

In the superthread the open question seemed to be yours: typing - is it
good or bad, is type inference good or bad, etc. -- so I related your
request for measurements to your question(s) and was a bit surprised
that suddenly we're only talking about suitability for beginners.

But as I said in other posts (responses to yours and Dmitry's): I'm
not sure any more at all what we're talking about, except (rather
general) likes, dislikes and feelings of good and bad vibrations ... 

I'm admittedly not sure I'd like to posit any thesis here now or
suggest that anyone else does: After all, we don't have the
instruments here to check / test "the truth" or find empirical data
that definitely doesn't exist. 

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* RE: How come Ada isn't more popular?
  2007-02-09 22:05                                                     ` Markus E Leypold
@ 2007-02-10  1:31                                                       ` Randy Brukardt
  2007-02-10  2:18                                                         ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Randy Brukardt @ 2007-02-10  1:31 UTC (permalink / raw)
  To: comp.lang.ada

Markus E Leypold writes:
> "Randy Brukardt" <randy@rrsoftware.com> writes:
> >> Maciej Sobczak <no.spam@no.spam.com> writes:
> >> > Still, there is a strong argument is that for some class of
algorithms it
> >> > might be beneficial to be able to "drop on the floor" a bigger part
of the
> >> > graph altogether. [...] Reclaiming that abandoned part might require
> >> > implementing the same tracking logic that GC already provides out of
the box
> >> > and therefore the argument goes that the use of off-the-shelf GC can
be
> >> > beneficial for the memory-management aspect of such an algorithm.
(Any
> >> > thoughts on this?)
> >>
> >> My thoughts: not "might", but "definitely". Also, the advantage is not
> >> restricted to some class of algorithms -- it is in fact the common
case.
> >
> > YMMV, of course, but it's not my experience. It's rare to destroy only
part
> > of a data structure; it's usually the entire structure that needs to be
> > discarded.
>
> "it's usually the entire structure" => In a world without
> representation sharing. In one with representation sharing (and that
> makes the production of new trees from existing ones really
> efficient), one would have to traverse the whole subtree any time a
> tree referencing this subtree is deleted - if only to adjust the
> reference counters. In a world with GC you just overwrite one
> reference to some subtree. When the GC hits it just traverses the tree
> once (and then perhaps decides which parts to deallocate). Dependend
> on the application domain (e.g. if you produce and drop many shared
> representions between GC cycles) that might be much more efficient.

In an imperative language like Ada (and this *is* the Ada newsgroup!),
representation sharing has to be done completely behind an abstraction that
presents the appearance of deep copying (that is, no sharing). Anything else
is a recipe for bugs, as modifications occurring to one tree that magically
appear in another (supposedly independent) cannot be tolerated. The net
effect is that it can be used only in limited ways.
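Such an abstraction might be sketched like this in Ada (a hypothetical
outline, not anyone's actual code -- Adjust and Finalize would maintain a
reference count behind the private part):

   with Ada.Finalization;
   package Trees is
      type Tree is private;  -- clients see value semantics only
      function Join (Left, Right : Tree) return Tree;
      -- Join shares Left and Right instead of deep-copying them.
   private
      type Node;
      type Node_Access is access Node;
      type Tree is new Ada.Finalization.Controlled with record
         Root : Node_Access;
      end record;
      overriding procedure Adjust   (T : in out Tree);  -- increment count
      overriding procedure Finalize (T : in out Tree);  -- decrement; free at zero
   end Trees;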

That's not horrible, since one of the goals of Ada is to provide the ability
(not the requirement, remember) to get the performance level of the machine.
For that to be possible, you cannot have distributed overhead (that is,
overhead from features that you don't use).

Deep copying of trees has unacceptable overhead, as you note. But GC has a
non-zero overhead on every reference in the program. That too can be an
unacceptable overhead - thus it cannot be mandated on all types.

> >> You don't need a complex mesh for this kind of advantage to kick in,
even
> >> regular tree cleanup is greatly simplified: just let go of the
> >> subbranch you no longer need, and avoid that whole cleanup traversal.
> >
> > I'm primarily interested in destroying the entire structure, and often I
> > need no destruction at all: the structures exist until the termination
of
> > the program.
>
> This certainly is a special case? E.g a chess playing program (any
> playing program) would have to create and destroy game trees at a
> pretty good rate.

I doubt that very much, at least one that is trying to maximize the
"thinking". This is a very performance dependent application, and "creating"
and "destroying" much of anything is likely to be overhead that you can't
afford.

I haven't written a Chess playing program, but I have written Freecell and
Solitaire solvers (and, in the more distant past, programs for playing
Othello and a couple of other games). I'm pretty sure I've never created or
destroyed any trees. The Freecell solver is based on queues of pending moves
to try. The queues are implemented mainly with lists of arrays; the queue
blocks are never destroyed and new ones are allocated only if there are no
unused ones to use. Moves in the queues are ordered by heuristics so the
most likely ones that lead to a solution are tried first. Positions are
stored (compressed) in a large hash table in which each bucket is a pointer
to an array which is allocated from a custom storage pool that directly uses
Windows memory management (and supports expansion with a special call). It
can solve most initial positions in under a minute on a 500 MHz machine with
256 MB of memory (harder ones take longer, possibly much longer).
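The never-deallocate, reuse-instead pattern described here can be sketched
roughly as follows (all identifiers invented for illustration; Move_Array
and Block_Size are assumed to be declared elsewhere):

   type Move_Block;
   type Block_Access is access Move_Block;
   type Move_Block is record
      Moves : Move_Array (1 .. Block_Size);
      Next  : Block_Access;
   end record;

   Free_List : Block_Access := null;

   function Get_Block return Block_Access is
      B : Block_Access := Free_List;
   begin
      if B = null then
         B := new Move_Block;     -- allocate only when no unused block exists
      else
         Free_List := B.Next;
      end if;
      return B;
   end Get_Block;

   procedure Release (B : Block_Access) is
   begin
      B.Next    := Free_List;     -- blocks go back on the free list,
      Free_List := B;             -- never back to the heap manager
   end Release;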

> > There's no point of earlier cleanup in such programs, and
> > surely no point in non-zero overhead to support such cleanup. I'm not
aware
> > of any zero-overhead GC algorithm, because the basic tenet of GC is that
you
> > can find all reachable objects: doing that requires some time and/or
space
> > overhead.
>
> I can imagine to allocate some memory in a special, non garabage
> collected heap. Furthermore generational garbage collection AFAIK has
> the property to touch old objects seldom -- so there is not so much
> cost (as it might seem at first glance) for keeping large structures
> around unchanged. And if the program is short running you could also
> increase the garbage collection threshold(s), so that the collection
> never sets in.

The overhead of GC that I'm worrying about here is not that of the collector
itself (which is trivial in this case because nothing will need to be
collected), but rather the part that is usually hand-waved in GC papers:
determining what is actually reachable. This is a non-trivial exercise which
requires time and space overhead for every reference - whether or not it
will ever be possible to collect anything.

As Ray said, you could argue whether or not GC should be the default. It's
not an argument for Ada -- it isn't the default, and would never be the
default simply because making existing programs run slower is not an option
in all but the most serious cases. I could see a predefined GC storage pool
being added to the language, or something like that. But the language would
need other fixes to really allow GC (although I doubt anyone will stop you
from implementing a GC in Ada, it's likely to be wrong in some subtle - some
would say pathological - cases).
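For reference, Ada already provides the hook such a pool would plug into: a
user-defined pool derives from System.Storage_Pools.Root_Storage_Pool and is
attached per access type. A bare outline (no such predefined GC pool exists;
GC_Pool and Cell are invented names):

   with System;
   with System.Storage_Pools;
   with System.Storage_Elements;  use System.Storage_Elements;

   type GC_Pool is new System.Storage_Pools.Root_Storage_Pool
     with null record;

   overriding procedure Allocate
     (Pool                     : in out GC_Pool;
      Storage_Address          : out System.Address;
      Size_In_Storage_Elements : Storage_Count;
      Alignment                : Storage_Count);

   overriding procedure Deallocate   -- could be a no-op: the collector reclaims
     (Pool                     : in out GC_Pool;
      Storage_Address          : System.Address;
      Size_In_Storage_Elements : Storage_Count;
      Alignment                : Storage_Count);

   overriding function Storage_Size (Pool : GC_Pool) return Storage_Count;

   Collected : GC_Pool;
   type Cell_Access is access Cell;
   for Cell_Access'Storage_Pool use Collected;  -- GC opt-in, per access type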

                         Randy.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-10  1:31                                                       ` Randy Brukardt
@ 2007-02-10  2:18                                                         ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-10  2:18 UTC (permalink / raw)



"Randy Brukardt" <randy@rrsoftware.com> writes:

> Markus E Leypold writes:
>> "Randy Brukardt" <randy@rrsoftware.com> writes:
>> >> Maciej Sobczak <no.spam@no.spam.com> writes:
>> >> > Still, there is a strong argument is that for some class of
> algorithms it
>> >> > might be beneficial to be able to "drop on the floor" a bigger part
> of the
>> >> > graph altogether. [...] Reclaiming that abandoned part might require
>> >> > implementing the same tracking logic that GC already provides out of
> the box
>> >> > and therefore the argument goes that the use of off-the-shelf GC can
> be
>> >> > beneficial for the memory-management aspect of such an algorithm.
> (Any
>> >> > thoughts on this?)
>> >>
>> >> My thoughts: not "might", but "definitely". Also, the advantage is not
>> >> restricted to some class of algorithms -- it is in fact the common
> case.
>> >
>> > YMMV, of course, but it's not my experience. It's rare to destroy only
> part
>> > of a data structure; it's usually the entire structure that needs to be
>> > discarded.
>>
>> "it's usually the entire structure" => In a world without
>> representation sharing. In one with representation sharing (and that
>> makes the production of new trees from existing ones really
>> efficient), one would have to traverse the whole subtree any time a
>> tree referencing this subtree is deleted - if only to adjust the
>> reference counters. In a world with GC you just overwrite one
>> reference to some subtree. When the GC hits it just traverses the tree
>> once (and then perhaps decides which parts to deallocate). Dependend
>> on the application domain (e.g. if you produce and drop many shared
>> representions between GC cycles) that might be much more efficient.
>
> In an imperative language like Ada (and this *is* the Ada newsgroup!),

Yes, yes. But I *had* the impression that we're now talking about
languages in general and general purpose languages. We've come to that
point from the discussion of what distinguishes other contenders from
Ada, which in turn came from the question "Why isn't Ada more
popular?". Now we are discussing the ups and downs of GC, and those
opposed to GC discuss the problems on general grounds -- I wonder why I
should now discuss the merits only on the grounds of how that would
fit into Ada :-).

> representation sharing has to be done completely behind an abstraction that
> presents the appearance of deep copying (that is, no sharing). 

Right. I've been talking about the implementation of this sharing, how
using representation sharing with reference counting compares with
using it in a GC'ed system.

I hope I've expressed myself clearly there, since I hate to be
criticized for errors I didn't make.

So the following doesn't apply to what I said ...

  > Anything else is a recipe for bugs, as modifications occurring to
  > one tree that magically appear in another (supposedly independent)
  > cannot be tolerated. The net effect is it can be used only in
  > limited ways.


  > That's not horrible, since one of the goals of Ada is to provide the ability
  > (not the requirement, remember) to get the performance level of the machine.
  > For that to be possible, you cannot have distributed overhead (that is,
  > overhead from features that you don't use).

> Deep copying of trees has unacceptable overhead, as you note. But GC has a

I did not talk about deep copying. Yes, it's unacceptable, but that
was taken as a given; I didn't discuss it. The problem I have been
talking about was the necessity of maintaining reference counters, but
I was wrong concerning the traversal (it's not necessary; I've not been
paying attention, but after all that is not what you're criticizing
... :-). Still, AFAIK ref counting is less efficient than other GC
algorithms, but I'd like to point out that ref counting is
essentially a specialized GC.

> non-zero overhead on every reference in the program. 

That is true. I suggested, though, that in generational GC the
overhead is not as large as it seems at first (long-living structures
are less frequently traversed) and it would be possible to specially
allocate such structures outside the heap (my approach is somewhat
inverted from yours or Maciej's: I'd have GC as the default and
scope-bound or manually managed memory allocated outside the GC'ed heap).

> That too can be an unacceptable overhead - thus it cannot be
> mandated on all types.

The limits, where "unacceptable" begins, vary. I doubt that many of
those who discuss GC as having way too much overhead actually have these
requirements.


>
>> >> You don't need a complex mesh for this kind of advantage to kick in,
> even
>> >> regular tree cleanup is greatly simplified: just let go of the
>> >> subbranch you no longer need, and avoid that whole cleanup traversal.
>> >
>> > I'm primarily interested in destroying the entire structure, and often I
>> > need no destruction at all: the structures exist until the termination
> of
>> > the program.
>>
>> This certainly is a special case? E.g a chess playing program (any
>> playing program) would have to create and destroy game trees at a
>> pretty good rate.

> I doubt that very much, at least one that is trying to maximize the
> "thinking". This is a very performance dependent application, and "creating"
> and "destroying" much of anything is likely to be overhead that you can't
> afford.


I think you're wrong, but I'm also sure other examples can be
found. Your position basically boils down to: memory hardly ever
needs to be deallocated, and you can always leave it to the operating
system to destroy non-scoped memory. I don't think this position is
really tenable in general, but I don't see an approach to decide it
(here). All we can do is insist that our respective experience
suggests it.

> I haven't written a Chess playing program, but I have written Freecell and
> Solitare solvers (and, in the more distant past, programs for playing
> Othello and a couple of other games). I'm pretty sure I've never created or
> destroyed any trees.

Good. I do think this is a bit different from chess (the tree of
positions is actually the call tree of functions, which means
positions in the future have to be recreated at every move; I just
suspect that is not what one wants when playing chess or Go) -- but I
don't want to open up another avenue of discussion.


> The Freecell solver is based on queues of pending moves
> to try. The queues are implemented mainly with lists of arrays; the queue
> blocks are never destroyed and new ones are allocated only if there are no
> unused ones to use. Moves in the queues are ordered by heuristics so the
> most likely ones that lead to a solution are tried first. Positions are
> stored (compressed) in a large hash table in which each bucket is a pointer
> to an array which is allocated from a custom storage pool that directly uses
> Windows memory management (and supports expansion with a special call). It
> can solve most initial positions in under a minute on a 500 MHZ machine with
> 256Meg memory (harder ones take longer, possibly much longer).
>
>> > There's no point of earlier cleanup in such programs, and
>> > surely no point in non-zero overhead to support such cleanup. I'm not
> aware
>> > of any zero-overhead GC algorithm, because the basic tenet of GC is that
> you
>> > can find all reachable objects: doing that requires some time and/or
> space
>> > overhead.
>>
>> I can imagine to allocate some memory in a special, non garabage
>> collected heap. Furthermore generational garbage collection AFAIK has
>> the property to touch old objects seldom -- so there is not so much
>> cost (as it might seem at first glance) for keeping large structures
>> around unchanged. And if the program is short running you could also
>> increase the garbage collection threshold(s), so that the collection
>> never sets in.

> The overhead of GC that I'm worrying about here is not that of the collector
> itself (which is trivial in this case because nothing will need to be
> collected), but rather the part that is usually hand-waved in GC papers:
> determining what is actually reachable. 

I'm talking about exactly this: the process of determining what is
reachable. This involves traversing heap-allocated memory, and the
effort is (roughly) proportional to the allocated memory in the
heap. If there is no memory in the heap (no allocated blocks), there
is no overhead for traversal. So moving your large static structures
out of the GC'ed heap would remove most of that overhead.

> This is a non-trivial exercise which requires time and space
> overhead for every reference - whether or not it will ever be
> possible to collect anything.

I _do_ think that this concern for overhead in most cases is a case of
premature optimization. As I already noted in this thread: 

For all the increase in processing power we have seen in the last
decades, I would for once like to buy myself the convenient luxury of
using GC, because it's easier on the developer (candy for developers).
For once I don't want to see that processing power go into useless
blinking and flickering of smooth GUIs (candy for users). After all,
the users (indirectly) pay for the development time, and it's their
money that is being saved if the developers spend less time getting
the memory management right.

BTW: How large is the overhead really? Performance of OCaml compared
with C++/C compiled by Gcc suggests a factor of 2; measurements
performed by J Skaller suggest a factor of around 1.2 - 1.5 at first
glance for computation.

> As Ray said, you could argue whether or not GC should be the default. 

> It's not an argument for Ada -- it isn't the default, and would

Yes, the discussion is seriously OT, since we are mostly not
discussing how GC will be supported in Ada in future (or not), but the
general merits of GC.

> never be the default simply because making existing programs run
> slower is not an option in all but the most serious cases.

> I could see a predefined GC storage pool
> being added to the language, or something like that. But the language would
> need other fixes to really allow GC (although I doubt anyone will stop you
> from implementing a GC in Ada, it's likely to be wrong in some subtle - some
> would say pathological - cases).

Yes. Despite the standard allowing GC, it doesn't align too well with
the rest of Ada. 

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* RE: How come Ada isn't more popular?
@ 2007-02-10  4:18 Randy Brukardt
  2007-02-10  9:15 ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Randy Brukardt @ 2007-02-10  4:18 UTC (permalink / raw)
  To: comp.lang.ada

Dmitry A. Kazakov writes:

...
> Because it does not compose upon inheritance and aggregation.
>
> When inherited the constructing function need to be overridden, even if
> null record was the only thing added.

That's not true in Ada 2007. (That's a change I strongly opposed, BTW,
because it makes prototyping much harder. I usually start an extension with
a null record, figure out what needs to be overridden (giving those null
bodies), and then implement the actual routines and data. This no longer
works, because as soon as you add an extension component you are back to
square one.)

> Further one can barely publicly
> derive anything from T, which breaks encapsulation.

Not sure what you mean by this. It's certainly possible for a derivation to
break encapsulation (by deriving in a child unit, for instance), but you
really have to work at it.

It is annoying that we couldn't find a way to get composition to work for
anything other than :=, =, and the stream attributes. (The lack of it for
finalization routines is especially aggravating and error-prone.) But the
workarounds for constructors aren't particularly bad, especially as you can
use extension aggregates in your overridings.

      (F with <new components>);

means that all of the new components are given; and you can pass whatever
parameters down to F.
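For instance, an overriding constructor can be a thin wrapper that passes
its parameters down to the parent's constructor function (Parent, Child,
Make and Make_Parent are invented names for illustration):

   type Child is new Parent with record
      Count : Natural;
   end record;

   function Make (Name : String; Count : Natural) return Child is
   begin
      -- The ancestor part of the extension aggregate is built by the
      -- parent's own constructor function.
      return (Make_Parent (Name) with Count => Count);
   end Make;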

                  Randy.





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-10  4:18 Randy Brukardt
@ 2007-02-10  9:15 ` Dmitry A. Kazakov
  2007-02-10 13:22   ` Robert A Duff
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-10  9:15 UTC (permalink / raw)


On Fri, 9 Feb 2007 22:18:51 -0600, Randy Brukardt wrote:

> Dmitry A. Kazakov writes:
> ...
>> Because it does not compose upon inheritance and aggregation.
>>
>> When inherited the constructing function need to be overridden, even if
>> null record was the only thing added.
> 
> That's not true in Ada 2007. (That's a change I strongly opposed, BTW,
> because it makes prototyping much harder. I usually start an extension with
> a null record, figure out what needs to be overridden (giving those null
> bodies), and then implement the actual routines and data. This no longer
> works, because as soon as you add an extension component you are back to
> square one.)

Yuck. Do I understand correctly that primitive operations are no longer
covariant in the result?

   type T is tagged ...;
   function Foo return T;

   type S is new T ...;
      -- I don't need to override Foo????

>> Further one can barely publicly
>> derive anything from T, which breaks encapsulation.
> 
> Not sure what you mean by this. It's certainly possible for a deivation to
> break encapsulation (by deriving in a child unit, for instance),

Which is not always desirable or even possible, as we don't have multiple
parents (alas!).

> It is annoying that we couldn't find a way to get composition to work for
> anything other than :=, =, and stream attributes.

I think that the problem is that we have no unified composition model for
primitive operations. Things composed per magic (like ":=" or dispatching
subprograms) use different rules. Why don't we try to expose these mechanics
in the language? There certainly exists a common denominator of how
operations are composed. It is either of:

    g := f o g  -- f extends g
    g := g o f  -- f prepends g
    g := f       -- f replaces g
    g := e o g o p -- prologue and epilogue (conversion)
                   -- (like in out Natural accepting actual Integer)

All primitive operations including assignment, attributes, constructors
(from raw memory to object's invariant), destructors (backwards),
aggregates, conversions fall under some combination of these patterns.

This could radically simplify the language (fully backward compatible of
course).

> (The lack of it for
> finalization routines is especially aggrevating and error prone.)

Yes, especially because there is no name for the parent's Initialize/Finalize.

> But the
> workarounds for constructors aren't particularly bad, especially as you can
> use extension aggregates in your overridings.
> 
>       (F with <new components>);

Yes, aggregates compose (which is good), but they aren't functions. There
is an underlying thing which IMO should be named and exposed.

BTW, the aggregate above is almost equivalent to C++'s constructor
definition

    Derived::Derived (...) : F(...), <new components> {}

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-10  9:15 ` Dmitry A. Kazakov
@ 2007-02-10 13:22   ` Robert A Duff
  2007-02-10 15:54     ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Robert A Duff @ 2007-02-10 13:22 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> On Fri, 9 Feb 2007 22:18:51 -0600, Randy Brukardt wrote:
>
>> Dmitry A. Kazakov writes:
>> ...
>>> Because it does not compose upon inheritance and aggregation.
>>>
>>> When inherited the constructing function need to be overridden, even if
>>> null record was the only thing added.
>> 
>> That's not true in Ada 2007. (That's a change I strongly opposed, BTW,
>> because it makes prototyping much harder. I usually start an extension with
>> a null record, figure out what needs to be overridden (giving those null
>> bodies), and then implement the actual routines and data. This no longer
>> works, because as soon as you add an extension component you are back to
>> square one.)
>
> Yuck. Do I understand correctly, primitive operations are no more covariant
> in the result?
>
>    type T is tagged ...;
>    function Foo return T;
>
>    type S is new T ...;
>       -- I don't need to override Foo????

It depends what the "..." is.  You don't need to override Foo if it's
"with null record", which is what you were wanting above ("even if ...").

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09 14:44                                                         ` Jean-Pierre Rosen
@ 2007-02-10 13:38                                                           ` Robert A Duff
  2007-02-12  8:47                                                             ` Jean-Pierre Rosen
  0 siblings, 1 reply; 397+ messages in thread
From: Robert A Duff @ 2007-02-10 13:38 UTC (permalink / raw)


Jean-Pierre Rosen <rosen@adalog.fr> writes:

> Robert A Duff a écrit :
>> Jean-Pierre Rosen <rosen@adalog.fr> writes:
>> I think it's fair to say that Ada-ED was the first validated
>> implementation of Ada 83, and that the Rolm-Data General implementation
>> was the first validated Ada compiler for Ada 83.
>>
> I fail to see why Ada-ED does not deserve the name "compiler". 

Because the Ada-ED implementation was highly interpretive.

>...I may
> admit that Rolm was the first /industrial/ compiler, i.e. usable for
> real programs...

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-10 13:22   ` Robert A Duff
@ 2007-02-10 15:54     ` Dmitry A. Kazakov
  2007-02-12 14:23       ` Robert A Duff
  0 siblings, 1 reply; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-10 15:54 UTC (permalink / raw)


On Sat, 10 Feb 2007 08:22:59 -0500, Robert A Duff wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> On Fri, 9 Feb 2007 22:18:51 -0600, Randy Brukardt wrote:
>>
>>> Dmitry A. Kazakov writes:
>>> ...
>>>> Because it does not compose upon inheritance and aggregation.
>>>>
>>>> When inherited the constructing function need to be overridden, even if
>>>> null record was the only thing added.
>>> 
>>> That's not true in Ada 2007. (That's a change I strongly opposed, BTW,
>>> because it makes prototyping much harder. I usually start an extension with
>>> a null record, figure out what needs to be overridden (giving those null
>>> bodies), and then implement the actual routines and data. This no longer
>>> works, because as soon as you add an extension component you are back to
>>> square one.)
>>
>> Yuck. Do I understand correctly, primitive operations are no more covariant
>> in the result?
>>
>>    type T is tagged ...;
>>    function Foo return T;
>>
>>    type S is new T ...;
>>       -- I don't need to override Foo????
> 
> It depends what the "..." is.  You don't need to override Foo if it's
> "with null record", which is what you were wanting above ("even if ...").

No, I didn't mean that. "Even if" was related to the model where
constructors are functions. This model is sometimes unnatural, and the
requirement to override the constructing function is a consequence of this
unnaturalness, to me.

As for covariant functions, IMO the initial Ada 95 design is absolutely
correct, because whether a record was physically extended or not looks much
like an implementation detail to me. [*]

Constructors need not be overridden when there is a way to generate them
safely. Though, I must admit that the difference looks too subtle.

---------
* Ideally, when deriving the programmer should be able to specify his
intent: whether the new type is equivalent, a generalization (extension), a
specialization (constraining) or both/neither. 

For generalization all primitive operations with out, in out parameters and
results of the base type should be required to be overridden.

For specialization all primitive operations with in and in out parameters
should be overridden.

For both, everything needs to be overridden.

(for overriding with the same body one could provide some syntax sugar)

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 14:02                                   ` Dmitry A. Kazakov
@ 2007-02-10 18:21                                     ` adaworks
  2007-02-10 18:41                                       ` Markus E Leypold
  2007-02-10 20:29                                       ` Dmitry A. Kazakov
  0 siblings, 2 replies; 397+ messages in thread
From: adaworks @ 2007-02-10 18:21 UTC (permalink / raw)



"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> wrote in message 
news:1sh134r4r8piq$.wh1w0uitf99t$.dlg@40tude.net...
> On Fri, 09 Feb 2007 07:50:43 -0500, Robert A Duff wrote:
>
>> Maciej Sobczak <no.spam@no.spam.com> writes:
>>
>>
>> Why is that superior to just using functions as constructors?
>> Ada has:
>>
>>     type T(<>) is ...
>>
>> to indicate that clients MUST call some constructor function when
>> creating objects.  And it now allows constructor functions (and
>> aggregates) for limited types.
>
> Because it does not compose upon inheritance and aggregation.
>
> When inherited the constructing function need to be overridden, even if
> null record was the only thing added. Further one can barely publicly
> derive anything from T, which breaks encapsulation.
>
I guess I don't see this as a serious problem.    Perhaps this is because
I frequently take the Modula-2 approach and create opaque types.
These support extensible inheritance, enforce encapsulation, and
separate all the implementation details within the package body.

Granted, the use of opaque types requires the supporting capability
of access types.   But the use of access types in the pursuit of good
programming is no vice.  Rather, it is consistent with the counsel given
us over two score years ago by Maurice Wilkes.
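For readers unfamiliar with the idiom Richard describes, here is a minimal
sketch of a Modula-2-style opaque type in Ada (Ada 2005; the package and
operation names are purely illustrative). The record type is completed only
in the package body, so clients see nothing but the opaque handle:

```ada
package Counters is
   type Counter is limited private;       --  opaque to clients

   function Create (Initial : Integer) return Counter;
   procedure Increment (C : in Counter);
   function Value (C : Counter) return Integer;

private
   type Counter_Rep;                      --  incomplete ("Taft") type:
   type Counter is access Counter_Rep;    --  details live in the body
end Counters;

package body Counters is
   type Counter_Rep is record
      Count : Integer := 0;
   end record;

   function Create (Initial : Integer) return Counter is
   begin
      return new Counter_Rep'(Count => Initial);
   end Create;

   procedure Increment (C : in Counter) is
   begin
      C.Count := C.Count + 1;   --  legal: C designates a heap object
   end Increment;

   function Value (C : Counter) return Integer is
   begin
      return C.Count;
   end Value;
end Counters;
```

All implementation details, including the access type, stay inside the
package, which is the separation of concerns the post argues for.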

From my perspective, opaque types solve a lot of problems that are
inherent in C++ and Java and support an increased level of separation
of concerns in my application architecture.  That being said, one can,
with a little effort, create C++ opaque types.   I have not thought about
how this would be done in Java. Perhaps through the interface mechanism.

Richard 





^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-10 18:21                                     ` adaworks
@ 2007-02-10 18:41                                       ` Markus E Leypold
  2007-02-10 20:29                                       ` Dmitry A. Kazakov
  1 sibling, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-10 18:41 UTC (permalink / raw)



<adaworks@sbcglobal.net> writes:
>
> From my perspective, opaque types solve a lot of problems that are
> inherent in C++ and Java and support an increased level of separation
> of concerns in my application architecture.  That being said, one can,
> with a little effort, create C++ opaque types.   I have not thought about
> how this would be done in Java. Perhaps through the interface mechanism.

That is what I would advocate.

Regards -- Markus




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09 18:35                                                       ` Jeffrey R. Carter
@ 2007-02-10 19:01                                                         ` Martin Krischik
  2007-02-11 15:22                                                         ` Pascal Obry
  1 sibling, 0 replies; 397+ messages in thread
From: Martin Krischik @ 2007-02-10 19:01 UTC (permalink / raw)


Jeffrey R. Carter wrote:

> Jean-Pierre Rosen wrote:
>> 
>> Although DG claimed that, their certificate shows #2.
>> #1 was Ada-ED, from NYU, as everybody should know.
> 
> Ada-ED was an interpreter, not a compiler, as everybody should know.

Ada interpreter, cool.

Martin
-- 
mailto://krischik@users.sourceforge.net
Ada programming at: http://ada.krischik.com



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-10 18:21                                     ` adaworks
  2007-02-10 18:41                                       ` Markus E Leypold
@ 2007-02-10 20:29                                       ` Dmitry A. Kazakov
  1 sibling, 0 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-10 20:29 UTC (permalink / raw)


On Sat, 10 Feb 2007 18:21:56 GMT, adaworks@sbcglobal.net wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> wrote in message 
> news:1sh134r4r8piq$.wh1w0uitf99t$.dlg@40tude.net...
>> On Fri, 09 Feb 2007 07:50:43 -0500, Robert A Duff wrote:
>>
>>> Maciej Sobczak <no.spam@no.spam.com> writes:
>>>
>>>
>>> Why is that superior to just using functions as constructors?
>>> Ada has:
>>>
>>>     type T(<>) is ...
>>>
>>> to indicate that clients MUST call some constructor function when
>>> creating objects.  And it now allows constructor functions (and
>>> aggregates) for limited types.
>>
>> Because it does not compose upon inheritance and aggregation.
>>
>> When inherited, the constructing function needs to be overridden, even if
>> a null record was the only thing added. Further, one can barely publicly
>> derive anything from T, which breaks encapsulation.
>>
> I guess I don't see this as a serious problem.    Perhaps this is because
> I frequently take the Modula-2 approach and create opaque types.
> These support extensible inheritance, enforce encapsulation, and
> separate all the implementation details within the package body.
> 
> Granted, the use of opaque types requires the supporting capability
> of access types.   But the use of access types in the pursuit of good
> programming is no vice.  Rather, it is consistent with the counsel given
> us over two score years ago by Maurice Wilkes.
> 
> From my perspective, opaque types solve a lot of problems that are
> inherent in C++ and Java and support an increased level of separation
> of concerns in my application architecture.  That being said, one can,
> with a little effort, create C++ opaque types.   I have not thought about
>> how this would be done in Java. Perhaps through the interface mechanism.

Opaque types are nice, but I don't think that they solve the problem.
Everything is OK as long as the types are independent. But when they are
not, one would like the operations defined on access T to also be operations
of access S (where type S is new T with ...). The problem just reappears as
composition of constructing functions for access T and access S. For
example,

GtkAda represents a practical example of this problem. It goes as follows:

type Gtk_T_Record is ...;
type Gtk_T is access Gtk_T_Record'Class; -- The opaque type

procedure Gtk_New (X : out Gtk_T); -- Constructing "function"
procedure Initialize (X : access Gtk_T_Record'Class); -- Constructor
... -- All operations are defined in terms of access Gtk_T_Record.

-- The implementation of "constructors" goes as follows
procedure Gtk_New (X : out Gtk_T) is
begin
   X := new Gtk_T_Record;
   Initialize (X);
exception
   when others => -- We don't want it leaking
      Free (X);
      raise;
end Gtk_New;
procedure Initialize (X : access Gtk_T_Record'Class) is
begin
   ... -- Initialization
end Initialize;

Now with a derived type S:

type Gtk_S_Record is new Gtk_T_Record with ...;
type Gtk_S is access Gtk_S_Record'Class; -- The opaque type

procedure Gtk_New (X : out Gtk_S);
procedure Initialize (X : access Gtk_S_Record'Class);

procedure Gtk_New (X : out Gtk_S) is
begin
   X := new Gtk_S_Record;
   Initialize (X);
exception
   when others =>
      Free (X);
      raise;
end Gtk_New;

procedure Initialize (X : access Gtk_S_Record'Class) is
begin
   Parent_Package.Initialize (X);
   ... -- Custom initialization
end Initialize;

All this is quite tedious, to say the least. Neither Gtk_New nor Initialize
is composable. Further, and specific to opaque types, it is impossible to
hide the Gtk_T implementation as an access type, which breaks encapsulation
by exposing Gtk_T_Record to everybody.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09 18:35                                                       ` Jeffrey R. Carter
  2007-02-10 19:01                                                         ` Martin Krischik
@ 2007-02-11 15:22                                                         ` Pascal Obry
  2007-02-11 20:30                                                           ` Jeffrey R. Carter
  1 sibling, 1 reply; 397+ messages in thread
From: Pascal Obry @ 2007-02-11 15:22 UTC (permalink / raw)
  To: Jeffrey R. Carter

Jeffrey R. Carter wrote:
> Jean-Pierre Rosen wrote:
>>
>> Although DG claimed that, their certificate shows #2.
>> #1 was Ada-ED, from NYU, as everybody should know.
> 
> Ada-ED was an interpreter, not a compiler, as everybody should know.

Well one could argue that Ada-ED was a compiler to a virtual machine.

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|              http://www.obry.net
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver wwwkeys.pgp.net --recv-key C1082595



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-11 15:22                                                         ` Pascal Obry
@ 2007-02-11 20:30                                                           ` Jeffrey R. Carter
  2007-02-13 18:47                                                             ` Pascal Obry
  0 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-02-11 20:30 UTC (permalink / raw)


Pascal Obry wrote:
> 
> Well one could argue that Ada-ED was a compiler to a virtual machine.

I never used it, but my understanding is that that isn't an accurate 
description.

-- 
Jeff Carter
"Many times we're given rhymes that are quite unsingable."
Monty Python and the Holy Grail
57



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 19:52                                                         ` Ray Blaak
@ 2007-02-12  7:20                                                           ` Harald Korneliussen
  2007-02-12 14:12                                                             ` Robert A Duff
  0 siblings, 1 reply; 397+ messages in thread
From: Harald Korneliussen @ 2007-02-12  7:20 UTC (permalink / raw)


On 9 Feb, 20:52, Ray Blaak <rAYbl...@STRIPCAPStelus.net> wrote:
> "Randy Brukardt" <r...@rrsoftware.com> writes:
> What I refer to as manual management is for heap values, such that that for
> every "new" one must remember eventually to call "deallocate".
>
And I think it should be an easy choice that _if_ you need such values
at all (never mind that the anti-GC folks claim that they hardly ever
need them, or that they are design faults - as if a design not
structured around memory management were somehow sinful), then they
should be garbage collected. Relying on Unchecked_Deallocation is
against the spirit of Ada, as it (re)introduces almost all the manual
memory management issues that make C/C++ such a pain.
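For concreteness, a minimal sketch of the manual pattern being criticized
(types and names here are illustrative): every allocation must eventually be
matched by a call to an instantiation of Ada.Unchecked_Deallocation.

```ada
with Ada.Unchecked_Deallocation;

procedure Manual_Demo is
   type Buffer is array (1 .. 1024) of Character;
   type Buffer_Access is access Buffer;

   --  One instantiation of the generic per access type:
   procedure Free is
      new Ada.Unchecked_Deallocation (Buffer, Buffer_Access);

   P : Buffer_Access := new Buffer;
begin
   --  ... use P.all ...
   Free (P);   --  forgetting this leaks; Free also sets P to null
end Manual_Demo;
```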

As for the complaints that garbage collectors aren't general purpose
enough, I'm reminded of Babbage's (allegorical) lament when the
British government cut funding for the analytical engine: "Propose to
an Englishman any principle, or any instrument, however admirable, and
you will observe that the whole effort of the English mind is directed
to find a difficulty, a defect, or an impossibility in it. If you
speak to him of a machine for peeling a potato, he will pronounce it
impossible: if you peel a potato with it before his eyes, he will
declare it useless, because it will not slice a pineapple."
I've yet to see an explanation of why higher-order wrappers for these
resources don't do the job well enough. You write a function F that
takes a filename and a function argument telling what to do with the
file. F then opens the file, does its thing, and closes the file. In
Scheme this is readily available as call-with-input-file /
call-with-output-file; in Haskell it's trivially done with the more
general bracket function.
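The same wrapper can be sketched in Ada with an anonymous
access-to-procedure parameter (Ada 2005; the procedure name is illustrative):

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure With_Input_File
  (Name    : String;
   Process : not null access procedure (File : in out File_Type))
is
   File : File_Type;
begin
   Open (File, In_File, Name);
   Process (File);               --  the caller's work happens here
   Close (File);
exception
   when others =>
      if Is_Open (File) then
         Close (File);           --  the file is closed on any path
      end if;
      raise;
end With_Input_File;
```

A caller passes the 'Access of a local procedure, and never touches Open or
Close directly.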




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-10 13:38                                                           ` Robert A Duff
@ 2007-02-12  8:47                                                             ` Jean-Pierre Rosen
  2007-02-12 15:31                                                               ` Jeffrey R. Carter
  0 siblings, 1 reply; 397+ messages in thread
From: Jean-Pierre Rosen @ 2007-02-12  8:47 UTC (permalink / raw)


Robert A Duff wrote:
>> I fail to see why Ada-ED does not deserve the name "compiler". 
> 
> Because the Ada-ED implementation was highly interpretive.
> 
So what? You still have to transform source text into another 
representation, and do numerous checks at the same time. That's what I 
call "compiling".

-- 
---------------------------------------------------------
            J-P. Rosen (rosen@adalog.fr)
Visit Adalog's web site at http://www.adalog.fr



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-09 16:23                                                       ` Markus E Leypold
@ 2007-02-12  8:52                                                         ` Maciej Sobczak
  2007-02-12 12:56                                                           ` Markus E Leypold
  0 siblings, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-12  8:52 UTC (permalink / raw)


Markus E Leypold wrote:

>>>> I don't understand. There is no legal way for the program to verify
>>>> that anything was indeed deallocated, so it doesn't make much sense to
>>>> say that this behaviour is required.
> 
> This is a 'non sequitur', since it makes sense to say the behaviour is
> required to fix certain real time properties. Regardless of whether it
> can be detected in the program (and it could, by observing the wall
> clock).

Observing the wall clock does not help much in a language where even 
null; can raise exceptions. The standard does not even guarantee that any 
given sequence of instructions will give consistent timings when run twice.

Definitely, observing the wall clock is of no use for verifying memory 
deallocation, since deallocation might have a positive effect on the 
timing, as well as a negative one or none at all.

>> As was already pointed out in this thread, with some operating systems
>> memory reclamation might not be meaningful at all unless the whole
>> program is terminated.
> 
> I don't even ask to be shown such an operating system...

On systems with virtual memory, deallocations that don't span at least 
one full page (that condition can be met after joining with other free 
blocks, though) will certainly not deallocate anything to the operating 
system.


-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-09 19:41                                     ` Randy Brukardt
@ 2007-02-12  9:07                                       ` Maciej Sobczak
  2007-02-12 20:56                                         ` Randy Brukardt
  0 siblings, 1 reply; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-12  9:07 UTC (permalink / raw)


Randy Brukardt wrote:

>> I treat it as an idiom that allows me to finally achieve the goal, but
>> not as a direct support for the feature that would be most accurate.
> 
> Fair enough, but then why are you here?

I don't understand. Is the fact that I miss some language feature 
disqualifying me from learning more about the language? :-)

> Ada is all about providing building
> blocks that allow you to accomplish a purpose, rather than whiz-bang
> features that provide the entire purpose in a neat but inflexible package.

Is a direct support for constructors a whiz-bang but inflexible package?
In what way?

> There are many examples of that: [...]

Sure, but there are also examples of missing features.
For every language there is at least one programmer that will miss 
something. I miss (for example) constructors in Ada.


-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-12  8:52                                                         ` Maciej Sobczak
@ 2007-02-12 12:56                                                           ` Markus E Leypold
  0 siblings, 0 replies; 397+ messages in thread
From: Markus E Leypold @ 2007-02-12 12:56 UTC (permalink / raw)


Maciej Sobczak <no.spam@no.spam.com> writes:

> Markus E Leypold wrote:
>
>>>>> I don't understand. There is no legal way for the program to verify
>>>>> that anything was indeed deallocated, so it doesn't make much sense to
>>>>> say that this behaviour is required.
>> This is a 'non sequitur', since it makes sense to say the behaviour
>> is
>> required to fix certain real time properties. Regardless of whether it
>> can be detected in the program (and it could, by observing the wall
>> clock).
>
> Observing the wall clock does not help much in a language where even
> null; can raise exceptions. The standard does not even guarantee that any
> given sequence of instructions will give consistent timings when run
> twice.
>
> Definitely, observing the wall clock is of no use to verify memory
> deallocation, since deallocation might have positive effect on the
> timing as well as negative or none at all.

If Unchecked_Deallocation doesn't "really" deallocate memory, then you
either run out of memory sooner or later (verifiable, and it makes
sense to require that this doesn't occur) or you have a garbage
collector and intermittent garbage collection runs which are visible
in the real time behaviour. So it makes sense to say the behaviour
(real deallocation to return memory to the free list) is required and
checkable, regardless of the question whether _the program_ can verify this. 

And as I said, from "no legal way for the program to verify" to
"doesn't make much sense to say that this behaviour is required" is a
non sequitur, since behaviour may entail aspects that cannot be
described solely in terms of the data produced.

>>> As was already pointed out in this thread, with some operating systems
>>> memory reclamation might not be meaningful at all unless the whole
>>> program is terminated.
>> I don't even ask to be shown such an operating system...
>
> On systems with virtual memory, deallocations that don't span at least
> one full page (that condition can be met after joining with other free
> blocks, though) will certainly not deallocate anything to the
> operating system.

Excuse me: I misread "memory reclamation might not be meaningful at
all unless the whole program is terminated" as: No memory is ever
deallocated. Which is still the only meaningful reading of your
sentence since deallocation to the internal free block list is still
meaningful and not a no-op.

Regards -- Markus



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-12  7:20                                                           ` Harald Korneliussen
@ 2007-02-12 14:12                                                             ` Robert A Duff
  0 siblings, 0 replies; 397+ messages in thread
From: Robert A Duff @ 2007-02-12 14:12 UTC (permalink / raw)


"Harald Korneliussen" <vintermann@gmail.com> writes:

> As for the complaints that garbage collectors aren't general purpose
> enough, I'm reminded of Babbage's (allegorical) lament when the
> British government cut funding for the analytical engine: "Propose to
> an Englishman any principle, or any instrument, however admirable, and
> you will observe that the whole effort of the English mind is directed
> to find a difficulty, a defect, or an impossibility in it. If you
> speak to him of a machine for peeling a potato, he will pronounce it
> impossible: if you peel a potato with it before his eyes, he will
> declare it useless, because it will not slice a pineapple."

I like that quote!  Thank you!

GC is for managing memory, and it can work well in many situations.
It is indeed rather silly to complain that GC is no good because it
doesn't manage other resources very well.

On the other hand, it is a common occurrence in our industry that
somebody invents a good potato peeler, and then wants to throw out every
other tool, and use potato peelers for everything.  After all, "new"
means "improved", right?  ;-)

Java, Lisp, Smalltalk, etc do this with GC.  That is, once we have GC
available, we might as well allocate (almost) everything in the heap.
That makes GC less efficient (in cases where static or stack allocation
would work).  It leads to making everything aliased, which causes bugs
(except in a pure functional context -- i.e. none of the three languages
mentioned above).  And it leads to Java's completely broken notion of
finalization, which is just asking for race conditions.

Anyway, I claim that there is no single memory-management strategy that
is suitable in all situations.  We have GC (moving, nonmoving,
generational, incremental, ...), manual new/free, manual mark/release,
reference counting (deferred, nondeferred, ...), region-based (manual,
region inference), stack allocation, static allocation, etc.  You show
me a memory-management strategy, and I can show you a program that will
make that strategy look bad.  ;-)

> I've yet to see an explanation why higher-order wrappers for these
> resources don't do the job well enough.

I agree -- that technique can work just fine in many cases.
It would be a lot more tolerable in Ada if we had anonymous
functions (like "lambda" in Lisp).

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-10 15:54     ` Dmitry A. Kazakov
@ 2007-02-12 14:23       ` Robert A Duff
  2007-02-12 15:49         ` Dmitry A. Kazakov
  0 siblings, 1 reply; 397+ messages in thread
From: Robert A Duff @ 2007-02-12 14:23 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:

> * Ideally, when deriving the programmer should be able to specify his
> intent: whether the new type is equivalent, a generalization (extension), a
> specialization (constraining) or both/neither. 

I've been thinking the same thing.  What language has that capability?

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-12  8:47                                                             ` Jean-Pierre Rosen
@ 2007-02-12 15:31                                                               ` Jeffrey R. Carter
  0 siblings, 0 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-02-12 15:31 UTC (permalink / raw)


Jean-Pierre Rosen wrote:
>
> So what? You still have to transform source text into another 
> representation, and do numerous checks at the same time. That's what I 
> call "compiling".

Then your definition differs from mine, Duff's, and numerous other people's.

-- 
Jeff Carter
"So if I understand 'The Matrix Reloaded' correctly, the Matrix is
basically a Microsoft operating system--it runs for a while and
then crashes and reboots. By design, no less. Neo is just a
memory leak that's too hard to fix, so they left him in ... The
users don't complain because they're packed in slush and kept
sedated."
Marin D. Condic
65



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-12 14:23       ` Robert A Duff
@ 2007-02-12 15:49         ` Dmitry A. Kazakov
  0 siblings, 0 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-12 15:49 UTC (permalink / raw)


On Mon, 12 Feb 2007 09:23:18 -0500, Robert A Duff wrote:

> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> writes:
> 
>> * Ideally, when deriving the programmer should be able to specify his
>> intent: whether the new type is equivalent, a generalization (extension), a
>> specialization (constraining) or both/neither. 
> 
> I've been thinking the same thing.  What language has that capability?

Ada 2010? (:-))

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* RE: How come Ada isn't more popular?
  2007-02-12  9:07                                       ` Maciej Sobczak
@ 2007-02-12 20:56                                         ` Randy Brukardt
  2007-02-13  9:02                                           ` Maciej Sobczak
  2007-02-14 10:12                                           ` Dmitry A. Kazakov
  0 siblings, 2 replies; 397+ messages in thread
From: Randy Brukardt @ 2007-02-12 20:56 UTC (permalink / raw)
  To: comp.lang.ada

Maciej Sobczak writes:
> Randy Brukardt wrote:
>
> >> I treat it as an idiom that allows me to finally achieve the goal, but
> >> not as a direct support for the feature that would be most accurate.
> >
> > Fair enough, but then why are you here?
>
> I don't understand. Is the fact that I miss some language feature
> disqualifying me from learning more about the language? :-)

Sorry, that comment seems harsher than I intended. All I meant to say is
that it is pointless to complain about one of the fundamental design choices
of Ada (or any other language). It makes as much sense as complaining that
it is hard to write an imperative program in a functional language.

> > Ada is all about providing building
> > blocks that allow you to accomplish a purpose, rather than whiz-bang
> > features that provide the entire purpose in a neat but
> inflexible package.
>
> Is a direct support for constructors a whiz-bang but inflexible package?
> In what way?

Because there is no single concept of a constructor. You need at a minimum a
copy-constructor and a default-constructor, and then you still have
flexibility issues. Ada chose to decompose these things into their
constituent parts.

Yes, Ada does have a problem with composition, but that is a general
language problem, not one that is specific to constructors or any other
specific feature. It would be much better to fix that composition problem
generally rather than to bolt on more special features that do compose,
while still leaving the general case unfixed.

> > There are many examples of that: [...]
>
> Sure, but there are also examples of missing features.
> For every language there is at least one programmer that will miss
> something. I miss (for example) constructors in Ada.

You haven't clearly explained what the difference between a function and a
constructor is. When we (the ARG) looked at this problem, we eventually
decided (at least from an Ada semantics perspective) that there was in fact
no difference. Thus we decided that it didn't make sense to have a special
feature for the purpose. (That certainly was not where we started out when
considering the issues.)
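The building-block view Randy describes can be illustrated with the idiom
quoted earlier in the thread: unknown discriminants on the partial view make
an ordinary function play the constructor role. A minimal sketch (all names
are illustrative):

```ada
package Accounts is
   type Account (<>) is private;   --  "(<>)": clients cannot declare an
                                   --  Account without calling a function
   function Open (Initial : Integer) return Account;
   function Balance (A : Account) return Integer;

private
   type Account is record
      Amount : Integer;
   end record;
end Accounts;

package body Accounts is
   function Open (Initial : Integer) return Account is
   begin
      return (Amount => Initial);  --  the "constructor" is a plain function
   end Open;

   function Balance (A : Account) return Integer is
   begin
      return A.Amount;
   end Balance;
end Accounts;
```

A client must write `A : Accounts.Account := Accounts.Open (100);` --
declaring an uninitialized Account is rejected at compile time, which is the
forced-initialization effect under discussion.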

                           Randy.




^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-12 20:56                                         ` Randy Brukardt
@ 2007-02-13  9:02                                           ` Maciej Sobczak
  2007-02-14 10:12                                           ` Dmitry A. Kazakov
  1 sibling, 0 replies; 397+ messages in thread
From: Maciej Sobczak @ 2007-02-13  9:02 UTC (permalink / raw)


Randy Brukardt wrote:

>> Is a direct support for constructors a whiz-bang but inflexible package?
>> In what way?
> 
> Because there is no single concept of a constructor. You need at a minimum a
> copy-constructor and a default-constructor, and then you still have
> flexibility issues. Ada chose to decompose these things into their
> constituent parts.

I understand that.

> You haven't clearly explained what the difference between a function and a
> constructor is. When we (the ARG) looked at this problem, we eventually
> decided (at least from an Ada semantics perspective) that there was in fact
> no difference.

The difference (as I perceive it) is in the coupling that the "function" 
has with its corresponding type. If it is a function just as any other, 
then the fact that it is a constructor function is only a matter of 
convention. Sometimes it is just a naming convention (for example, when 
the function is called "Initialize" or "Create", etc. then it might be a 
hint that it is a constructor function), but the fact is that the term 
"constructor function" is used by Ada programmers, and that means that 
they treat it in a somewhat special way. By coupling the constructor 
with the type, the intent is expressed more clearly and then the purpose 
is not a result of naming convention.
As an example, consider (C++):

template <class T, typename U>
void fun(const U &u)
{
     T t(u);

     // or:
     T *p = new T(u);

     // ...
}

This is truly generic with only two parameters - there is no need to 
drag around the name of the constructor function to instantiate the 
template.

> Thus we decided that it didn't make sense to have a special
> feature for the purpose.

Which I understand. Still, achieving the effect of forced initialization 
requires me to reach for concepts that are not directly related to the 
problem at hand.

-- 
Maciej Sobczak : http://www.msobczak.com/
Programming    : http://www.msobczak.com/prog/



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-11 20:30                                                           ` Jeffrey R. Carter
@ 2007-02-13 18:47                                                             ` Pascal Obry
  2007-02-13 23:08                                                               ` Jeffrey R. Carter
  2007-02-14 11:10                                                               ` Jean-Pierre Rosen
  0 siblings, 2 replies; 397+ messages in thread
From: Pascal Obry @ 2007-02-13 18:47 UTC (permalink / raw)
  To: Jeffrey R. Carter

Jeffrey R. Carter wrote:
> Pascal Obry wrote:
>>
>> Well one could argue that Ada-ED was a compiler to a virtual machine.
> 
> I never used it, but my understanding is that that isn't an accurate
> description.

I've used it... a long time ago, and IIRC it was using a kind of p-code a la
Pascal. That is just a kind of virtual machine to me...

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|              http://www.obry.net
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver wwwkeys.pgp.net --recv-key C1082595



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-13 18:47                                                             ` Pascal Obry
@ 2007-02-13 23:08                                                               ` Jeffrey R. Carter
  2007-02-14 11:13                                                                 ` Jean-Pierre Rosen
  2007-02-14 19:47                                                                 ` Robert A Duff
  2007-02-14 11:10                                                               ` Jean-Pierre Rosen
  1 sibling, 2 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-02-13 23:08 UTC (permalink / raw)


Pascal Obry wrote:
> 
> I've used it... a long time ago, and IIRC it was using a kind of p-code a la
> Pascal. That is just a kind of virtual machine to me...

I guess it's a matter of semantics. If the program reads the source and 
performs the actions required by it, I consider it an interpreter. What 
activities it uses internally to perform that interpretation are 
irrelevant.

If the program converts the source into another form which can later be 
executed by appropriate HW or an emulator, then I call it a compiler. 
UCSD Pascal had a Pascal-to-P-code compiler. The P-code could be 
executed later, on a P-code machine or, more commonly, through an 
emulator. (I'm not aware of there ever being a P-code machine, which is 
why it's often referred to as a virtual machine. But there's no reason 
there couldn't have been one.) Java works similarly.

My understanding is that Ada-Ed was in the former category.

-- 
Jeff Carter
"C++ is like giving an AK-47 to a monk, shooting him
full of crack and letting him loose in a mall and
expecting him to balance your checking account
'when he has the time.'"
Drew Olbrich
52



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-12 20:56                                         ` Randy Brukardt
  2007-02-13  9:02                                           ` Maciej Sobczak
@ 2007-02-14 10:12                                           ` Dmitry A. Kazakov
  1 sibling, 0 replies; 397+ messages in thread
From: Dmitry A. Kazakov @ 2007-02-14 10:12 UTC (permalink / raw)


On Mon, 12 Feb 2007 14:56:39 -0600, Randy Brukardt wrote:

> Maciej Sobczak writes:

>> Is a direct support for constructors a whiz-bang but inflexible package?
>> In what way?
> 
> Because there is no single concept of a constructor.

Yes. There are constructors of the components, the constructors of the
specific types inherited from, the constructor of the specific type and
potentially the constructor of the class.

> You need at a minimum a
> copy-constructor and a default-constructor,

Well, but there is nothing to do about it. For that matter Ada.Finalization
too has Adjust and Initialize. The difference between creating a copy and
creating a new thing cannot be eliminated by any abstraction. They are
semantically different. What constructors offer is a sane way to express
initialization, by-value parameter passing and assignment.

BTW, I fully agree that there is no need to introduce constructors as a
special language object. It would be sufficient to declare their existence
and to allow some of their parts to be defined through user-defined
primitive and class-wide operations.

It is just that we need a little more language support than
Ada.Finalization can offer, and, above all, it must be available for all
types without exception.

> Ada chose to decompose these things into their
> constituent parts.

Which was obviously impossible. For example, the assignment problem: is it
a function or a procedure? How are the constraints of the left side
influenced by the right side? etc.
 
> Yes, Ada does have a problem with composition, but that is a general
> language problem, not one that is specific to constructors or any other
> specific feature.

Yes!

> It would be much better to fix that composition problem
> generally rather than to bolt on more special features that do compose,
> while still leaving the general case unfixed.

Certainly.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-13 18:47                                                             ` Pascal Obry
  2007-02-13 23:08                                                               ` Jeffrey R. Carter
@ 2007-02-14 11:10                                                               ` Jean-Pierre Rosen
  2007-02-14 16:29                                                                 ` Jeffrey R. Carter
  1 sibling, 1 reply; 397+ messages in thread
From: Jean-Pierre Rosen @ 2007-02-14 11:10 UTC (permalink / raw)


Pascal Obry a écrit :
> Jeffrey R. Carter a écrit :
>> Pascal Obry wrote:
>>> Well one could argue that Ada-ED was a compiler to a virtual machine.
>> I never used it, but my understanding is that that isn't an accurate
>> description.
> 
> I've used it... long time ago and IIRC it was using a kind of p-code a-la
> Pascal. This is just a kind of virtual machine to me...
> 
Since we are in historical mode...

The first version of Ada-ED directly interpreted SETL structures. 
The second one generated code for a virtual machine.

If you are really interested, the description of the virtual machine can 
be found in the PhD thesis of P. Kruchten and J-P. Rosen ;-). Only 
available on paper, though (this was in 1985).
-- 
---------------------------------------------------------
            J-P. Rosen (rosen@adalog.fr)
Visit Adalog's web site at http://www.adalog.fr



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-13 23:08                                                               ` Jeffrey R. Carter
@ 2007-02-14 11:13                                                                 ` Jean-Pierre Rosen
  2007-02-14 16:29                                                                   ` Jeffrey R. Carter
  2007-02-14 19:47                                                                 ` Robert A Duff
  1 sibling, 1 reply; 397+ messages in thread
From: Jean-Pierre Rosen @ 2007-02-14 11:13 UTC (permalink / raw)


Jeffrey R. Carter a écrit :
> The P-code could be 
> executed later, on a P-code machine or, more commonly, through an 
> emulator. (I'm not aware of there ever being a P-code machine, which is 
> why it's often referred to as a virtual machine. But there's no reason 
> there couldn't have been one.) 
There was one, by Western-Digital:
http://en.wikipedia.org/wiki/Pascal_MicroEngine

-- 
---------------------------------------------------------
            J-P. Rosen (rosen@adalog.fr)
Visit Adalog's web site at http://www.adalog.fr



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-14 11:10                                                               ` Jean-Pierre Rosen
@ 2007-02-14 16:29                                                                 ` Jeffrey R. Carter
  2007-02-15  8:39                                                                   ` Jean-Pierre Rosen
  0 siblings, 1 reply; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-02-14 16:29 UTC (permalink / raw)


Jean-Pierre Rosen wrote:
> 
> The first version of Ada-ED directly interpreted SETL structures. 
> The second one generated code for a virtual machine.

I guess the question, then, is which version received certificate #1?

-- 
Jeff Carter
"I blow my nose on you."
Monty Python & the Holy Grail
03



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-14 11:13                                                                 ` Jean-Pierre Rosen
@ 2007-02-14 16:29                                                                   ` Jeffrey R. Carter
  0 siblings, 0 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-02-14 16:29 UTC (permalink / raw)


Jean-Pierre Rosen wrote:
> There was one [P-code machine], by Western-Digital:
> http://en.wikipedia.org/wiki/Pascal_MicroEngine

Cool.

-- 
Jeff Carter
"I blow my nose on you."
Monty Python & the Holy Grail
03



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-13 23:08                                                               ` Jeffrey R. Carter
  2007-02-14 11:13                                                                 ` Jean-Pierre Rosen
@ 2007-02-14 19:47                                                                 ` Robert A Duff
  1 sibling, 0 replies; 397+ messages in thread
From: Robert A Duff @ 2007-02-14 19:47 UTC (permalink / raw)


"Jeffrey R. Carter" <jrcarter@acm.org> writes:

> If the program converts the source into another form which can later be
> executed by appropriate HW or an emulator, then I call it a
> compiler.

Well, Ada requires certain errors to be detected at compile time,
so it is impossible to write a _pure_ interpreter for Ada -- something
like a Unix shell, which executes the first line before even looking
at the second line.  An Ada implementation has to look at the entire
program before running it.

But still, the part of Ada-Ed that runs the SETL structures or the bytes
or whatever, is an interpreter.  I claim Ada-Ed is a hybrid
implementation of Ada -- neither pure interpreter nor pure compiler.

- Bob



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-14 16:29                                                                 ` Jeffrey R. Carter
@ 2007-02-15  8:39                                                                   ` Jean-Pierre Rosen
  2007-02-15 17:14                                                                     ` Jeffrey R. Carter
  0 siblings, 1 reply; 397+ messages in thread
From: Jean-Pierre Rosen @ 2007-02-15  8:39 UTC (permalink / raw)


Jeffrey R. Carter a écrit :
> Jean-Pierre Rosen wrote:
>>
>> The first version of Ada-ED directly interpreted SETL structures. 
>> The second one generated code for a virtual machine.
> 
> I guess the question, then, is which version received certificate #1?
> 
It was the full SETL version. I don't argue that it was interpreted. I 
just mean that a "compiler" is any tool that processes a programming 
language (which term would you use instead?).

The difference between HW code generation and "interpretation" is a very 
thin implementation detail. As mentioned before, is P-Code 
interpretation? What if P-Code is cast in hardware? And isn't a "true" 
processor something that interprets its own code with micro-instructions?

-- 
---------------------------------------------------------
            J-P. Rosen (rosen@adalog.fr)
Visit Adalog's web site at http://www.adalog.fr



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: GC in Ada
  2007-02-15  8:39                                                                   ` Jean-Pierre Rosen
@ 2007-02-15 17:14                                                                     ` Jeffrey R. Carter
  0 siblings, 0 replies; 397+ messages in thread
From: Jeffrey R. Carter @ 2007-02-15 17:14 UTC (permalink / raw)


Jean-Pierre Rosen wrote:
>>
> It was the full SETL version. I don't argue that it was interpreted. I 
> just mean that a "compiler" is any tool that processes a programming 
> language (which term would you use instead?).

That's a pretty broad definition. It covers pretty printers and 
AdaSubst, for example.

I don't have a term for any tool that processes a programming language.

-- 
Jeff Carter
"Go and boil your bottoms."
Monty Python & the Holy Grail
01



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-02-06 11:01           ` Ludovic Brenta
@ 2007-02-26  5:47             ` Dave Thompson
  0 siblings, 0 replies; 397+ messages in thread
From: Dave Thompson @ 2007-02-26  5:47 UTC (permalink / raw)


On 6 Feb 2007 03:01:41 -0800, "Ludovic Brenta"
<ludovic@ludovic-brenta.org> wrote:

> On Feb 6, 10:54 am, Dave Thompson wrote:
> > On Wed, 24 Jan 2007 09:07:15 +0100, Ludovic Brenta wrote:
> [snip]
> >> (*) consider that when we increment the address by one, it then
> >> references the next byte; whereas if we increment the pointer by one,
> >> it points to the next "unsigned long", i.e. 2, 4 or 8 bytes and not 1
> >> byte further.  C makes no distinction between addresses and pointers,
> >> lacking expressiveness in a crucial area.
> >
> > Wrong. C, even before C89, does know about pointer targets (strides).
> > Only _very_ early, Bell-Labs-only, pre-K&R1 C that was still in
> > transition from B had pointer === integer.
> 
> Wrong? I think we're in agreement here. I was explaining that a
> pointer is not the same thing as an address, since incrementing a
> pointer gives a different result (next object) than incrementing an
> address (next byte).
> 
Okay, I agree with that. I read your "no distinction" to mean
"addresses and pointers are/appear the same [in C]" or as it is more
commonly phrased (and wrong) on c.l.c "pointers are just addresses".

> In C, address arithmetic is implemented in terms of char* and size_t

Quibble: cv-qual un/signed/plain char *, and any integer type but
preferring size_t and ptrdiff_t, or perhaps ssize_t in POSIX

> only because sizeof(char) == 1; I think that's a hack. In contrast,
> address arithmetic in Ada is in terms of System.Address and
> Storage_Offset, which is much more explicit.
> 
Ada is certainly more explicit, as usual. In 1970 there was still
current or at least recent extensive experience with word-addressed
machines where you in fact did not have consistent (flat) addresses of
all possible data elements, as the now-popular understanding expects
and the Ada model demands (though presumably it could be emulated).

I don't know if you mean 'hack' in its older-computer-specific meaning
of 'clever but arcane', or its more general and media-preferred
meaning of 'cheap', 'sloppy', and even 'criminal'. Given the
environment in which and the goals for which C was created, I think
this was a reasonable design choice at the time, and qualified for the
former meaning -- if we weren't still stuck with it (and other
infelicities) today.

FWLIW C99 adds standard names (typedefs) [u]intptr_t for integer types
able to hold without loss (but with explicit conversions) any data
pointer value, if the implementation has such (and all I know of do).
You still aren't officially guaranteed arithmetic on these values, but
on all known modern platforms it does work.

> > > set_bit_in_register (0xDEADBEEF);
> >
> > The type mismatch is a constraint violation (which must be diagnosed)
> > if the prototype declaration (or definition) is visible, and <snip>
> So 0xDEADBEEF is an int, and there is no implicit conversion to
> unsigned long*. Is that what you're saying? OK, now I see my knowledge
> of C is fading away...
> 
Right. Or rather, 0xDEADBEEF is some integer type, exactly which
depending on your platform -- on most common platforms today with
32-bit signed and unsigned int it will be unsigned int -- and there is
no implicit conversion from any integer type to any pointer, except as
a very hysterical raisin special case the constant (manifest) zero to
a null pointer (of any type). (And even that is better expressed with
the standard macro NULL instead.)

- formerly david.thompson1 || achar(64) || worldnet.att.net



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-23  5:53 How come Ada isn't more popular? artifact.one
                   ` (7 preceding siblings ...)
  2007-01-24  0:12 ` JPWoodruff
@ 2007-03-05  2:19 ` Brian May
  8 siblings, 0 replies; 397+ messages in thread
From: Brian May @ 2007-03-05  2:19 UTC (permalink / raw)


>>>>> "artifact" == artifact one <artifact.one@googlemail.com> writes:

    artifact> My question is: how come Ada isn't more popular?

Reason 1: You have to wade through 400+ messages on the topic on
comp.lang.ada to find out the answer!

(a summary would be good... but I don't have time)

    artifact> This isn't intended to start a flame war, I'm genuinely
    artifact> interested.

Too late ;-)

Actually, more a discussion than a flame war, but anyway.

I know somebody who was forced to program in Ada and hated it.

I guess this just shows different people think in different ways.

Ada is very strict, and this is one of the reasons why I like it.
-- 
Brian May <bam@snoopy.apana.org.au>



^ permalink raw reply	[flat|nested] 397+ messages in thread

* Re: How come Ada isn't more popular?
  2007-01-24 21:31                         ` Markus E Leypold
@ 2007-03-19  2:09                           ` adaworks
  0 siblings, 0 replies; 397+ messages in thread
From: adaworks @ 2007-03-19  2:09 UTC (permalink / raw)


I came across an article, "C++: The Forgotten Trojan Horse,"
by Eric Johnson, in a recent book edited by Joel Spolsky.
Spolsky's book is titled "The Best Software
Writing I," and it is published by Apress.

The article begins, "I find C++ interesting.  No, not because
my compiler can spit out an incoherent set of errors ... there's
a lesson to be learned about how it conquered an existing
community."

Johnson makes a good case for the insidious way that C++
found its way to dominance, not because of its superiority
as a programming language but for entirely different reasons.

A language achieves popularity or is ignored according to a
set of circumstances that are quite independent of what
any adopter of that language might consider as quality. It
would seem that quality is the last factor to be considered,
otherwise, C++ would never have become so dominant.

In more recent times, Java has campaigned for the role of
dominant language.   However, Java is even more bureaucratic
than Ada and even the simplest tasks are swollen into excessive
syntactic and structural absurdities.

More recently, I am seeing a trend away from C++ and Java
toward more "agile" languages such as Python and Ruby.  In
particular, Ruby seems to have captured the interest of anyone
who has looked carefully at it.  Python is wonderfully easy to
learn, and powerful in what can be done in a few lines of code.
I particularly like the fact that functions are first-class objects
since, for mathematical programming, I like to use functional
programming.    Ruby is the same in this respect.

What I have noticed, though, is that these easy-to-use languages
do not scale up as well for really large software systems as C++,
Eiffel, and Ada.   Advocates of Python and Ruby disagree, of
course, but advocacy is not a good reason to believe them.  Also,
those who prefer Python and Ruby seem to have rejected the
importance of type-safety, at least in the literature.   As I read
their objections to type-safety, it is clear that they are framing
those objections in their experience with C/C++ (where type
safety is a hideous joke) or Java (where type-safety falls short
of what one would find in Ada).

So, the two languages that seem to best scale up for safety-critical
software are still Eiffel and Ada.

I have come to believe that C++ should always be the language
of last resort for any serious programming project.

Java has a place, but it is in a twilight zone that is often better filled
with Python or Ruby.   We could lose Java today and the world
would not be worse off for it because Ruby would easily fill the
void.

What does a programmer choose as the language s/he uses at home
or for recreational programming?    Many in this forum choose Ada.
However, even those in this forum, as they discover Python and Ruby,
are probably enjoying the option of making small programs work
quickly and easily for everything from CGI to sockets.

This does not answer the question of why Ada is not more popular,
but one could pose an alternative question of why an error-prone
language should have ever become so popular.   Why would anyone
choose a language that is error-prone and expect an outcome that is
error-free?   Perhaps error-free software is not a goal.  Perhaps the
goal is simply to see how much code we can lay down in a given period
of time.

Richard Riehle





^ permalink raw reply	[flat|nested] 397+ messages in thread

end of thread, other threads:[~2007-03-19  2:09 UTC | newest]

Thread overview: 397+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2007-01-23  5:53 How come Ada isn't more popular? artifact.one
2007-01-23  6:37 ` adaworks
2007-01-23  6:50   ` artifact.one
2007-01-23 14:24   ` Arthur Evans Jr
2007-01-23 20:11     ` Jeffrey R. Carter
2007-01-23 21:14       ` Markus E Leypold
2007-01-23 15:23   ` Ed Falis
2007-01-23 20:09   ` Jeffrey R. Carter
2007-01-24  8:50     ` Dmitry A. Kazakov
2007-01-24 20:23       ` Jeffrey R. Carter
2007-01-24 11:06     ` gautier_niouzes
2007-01-24 19:25       ` tmoran
2007-01-25  4:46         ` Gautier
2007-01-25  9:29           ` Markus E Leypold
2007-01-27 16:59             ` Stephen Leake
2007-01-27 20:40               ` Markus E Leypold
2007-01-27 21:19                 ` Markus E Leypold
2007-01-28  8:44                   ` Ray Blaak
2007-01-29  8:56                 ` Maciej Sobczak
2007-01-29 14:21                   ` Markus E Leypold
2007-01-31  9:23                     ` Maciej Sobczak
2007-01-31 10:24                       ` Markus E Leypold
2007-02-02  8:42                         ` Maciej Sobczak
2007-02-02  9:32                           ` Alex R. Mosteo
2007-02-02 11:04                             ` Maciej Sobczak
2007-02-02 13:57                           ` Markus E Leypold
2007-02-03  9:44                             ` Dmitry A. Kazakov
2007-02-03 14:51                               ` Markus E Leypold
2007-02-04 17:55                                 ` Dmitry A. Kazakov
2007-02-04 20:18                                   ` Markus E Leypold
2007-02-04 21:29                                     ` Dmitry A. Kazakov
2007-02-04 22:33                                       ` Markus E Leypold
2007-02-05  9:20                                         ` Dmitry A. Kazakov
2007-02-05 12:16                                           ` Harald Korneliussen
2007-02-05 14:06                                             ` Dmitry A. Kazakov
2007-02-05 13:53                                           ` Markus E Leypold
2007-02-05  9:59                             ` Maciej Sobczak
2007-02-05 13:43                               ` Markus E Leypold
2007-02-06  9:15                                 ` Maciej Sobczak
2007-02-06 11:45                                   ` Markus E Leypold
2007-02-06 14:16                                     ` Maciej Sobczak
2007-02-06 15:44                                       ` Markus E Leypold
2007-02-06 17:40                                         ` Dmitry A. Kazakov
2007-02-07  8:55                                         ` Maciej Sobczak
2007-02-07  9:30                                           ` GC in Ada Martin Krischik
2007-02-07 11:08                                             ` Markus E Leypold
2007-02-07 11:15                                             ` Maciej Sobczak
2007-02-07 11:53                                               ` Martin Krischik
2007-02-07 12:22                                                 ` Markus E Leypold
2007-02-08  7:26                                                   ` Martin Krischik
2007-02-08  9:33                                                     ` Markus E Leypold
2007-02-09 13:37                                                       ` Martin Krischik
2007-02-09 13:47                                                       ` Georg Bauhaus
2007-02-09 15:29                                                         ` Maciej Sobczak
2007-02-09 20:52                                                           ` Georg Bauhaus
2007-02-08  7:48                                                 ` Maciej Sobczak
2007-02-08  8:20                                                   ` Martin Krischik
2007-02-08  8:43                                                   ` Markus E Leypold
2007-02-09 14:20                                                     ` Maciej Sobczak
2007-02-09 16:23                                                       ` Markus E Leypold
2007-02-12  8:52                                                         ` Maciej Sobczak
2007-02-12 12:56                                                           ` Markus E Leypold
2007-02-08 18:24                                                   ` Jeffrey R. Carter
2007-02-09  8:57                                                     ` Jean-Pierre Rosen
2007-02-09 12:57                                                       ` Robert A Duff
2007-02-09 14:44                                                         ` Jean-Pierre Rosen
2007-02-10 13:38                                                           ` Robert A Duff
2007-02-12  8:47                                                             ` Jean-Pierre Rosen
2007-02-12 15:31                                                               ` Jeffrey R. Carter
2007-02-09 18:35                                                       ` Jeffrey R. Carter
2007-02-10 19:01                                                         ` Martin Krischik
2007-02-11 15:22                                                         ` Pascal Obry
2007-02-11 20:30                                                           ` Jeffrey R. Carter
2007-02-13 18:47                                                             ` Pascal Obry
2007-02-13 23:08                                                               ` Jeffrey R. Carter
2007-02-14 11:13                                                                 ` Jean-Pierre Rosen
2007-02-14 16:29                                                                   ` Jeffrey R. Carter
2007-02-14 19:47                                                                 ` Robert A Duff
2007-02-14 11:10                                                               ` Jean-Pierre Rosen
2007-02-14 16:29                                                                 ` Jeffrey R. Carter
2007-02-15  8:39                                                                   ` Jean-Pierre Rosen
2007-02-15 17:14                                                                     ` Jeffrey R. Carter
2007-02-08 18:38                                                 ` Dmitry A. Kazakov
2007-02-09  7:58                                                   ` Maciej Sobczak
2007-02-09 10:07                                                   ` Martin Krischik
2007-02-09 14:10                                                     ` Dmitry A. Kazakov
2007-02-07 12:19                                               ` Markus E Leypold
2007-02-08  7:54                                                 ` Maciej Sobczak
2007-02-08  9:49                                                   ` Markus E Leypold
2007-02-07 10:10                                           ` How come Ada isn't more popular? Georg Bauhaus
2007-02-07 10:56                                           ` Markus E Leypold
2007-02-07 22:58                                             ` Georg Bauhaus
2007-02-08  9:04                                             ` Maciej Sobczak
2007-02-08 10:01                                               ` Markus E Leypold
2007-02-06 17:47                                       ` Ray Blaak
2007-02-06 18:05                                         ` Dmitry A. Kazakov
2007-02-06 18:28                                           ` Markus E Leypold
2007-02-07  7:54                                           ` Maciej Sobczak
2007-02-07  9:42                                             ` Markus E Leypold
2007-02-08  8:10                                               ` Maciej Sobczak
2007-02-08 18:14                                             ` Dmitry A. Kazakov
2007-02-09  8:17                                               ` Maciej Sobczak
2007-02-09 14:02                                                 ` Dmitry A. Kazakov
2007-02-09 18:08                                                   ` Ray Blaak
2007-02-09 18:43                                                     ` Dmitry A. Kazakov
2007-02-09 18:57                                                       ` Ray Blaak
2007-02-09 18:03                                                 ` Ray Blaak
2007-02-09 18:47                                                   ` Randy Brukardt
2007-02-09 19:02                                                     ` Ray Blaak
2007-02-09 19:35                                                       ` Randy Brukardt
2007-02-09 19:52                                                         ` Ray Blaak
2007-02-12  7:20                                                           ` Harald Korneliussen
2007-02-12 14:12                                                             ` Robert A Duff
2007-02-09 22:11                                                         ` Markus E Leypold
2007-02-09 22:05                                                     ` Markus E Leypold
2007-02-10  1:31                                                       ` Randy Brukardt
2007-02-10  2:18                                                         ` Markus E Leypold
2007-02-05 19:05                               ` Ray Blaak
2007-02-09  8:01                           ` adaworks
2007-02-09  9:07                             ` Jean-Pierre Rosen
2007-02-09 10:36                               ` Maciej Sobczak
2007-02-09 12:50                                 ` Robert A Duff
2007-02-09 14:02                                   ` Dmitry A. Kazakov
2007-02-10 18:21                                     ` adaworks
2007-02-10 18:41                                       ` Markus E Leypold
2007-02-10 20:29                                       ` Dmitry A. Kazakov
2007-02-09 14:12                                   ` Maciej Sobczak
2007-02-09 19:41                                     ` Randy Brukardt
2007-02-12  9:07                                       ` Maciej Sobczak
2007-02-12 20:56                                         ` Randy Brukardt
2007-02-13  9:02                                           ` Maciej Sobczak
2007-02-14 10:12                                           ` Dmitry A. Kazakov
2007-02-09  9:21                             ` Markus E Leypold
2007-01-25 21:42           ` Randy Brukardt
2007-01-28 19:32             ` Gautier
2007-01-30 19:41               ` tmoran
2007-01-25 22:21           ` Jeffrey R. Carter
2007-01-25 11:31   ` Ali Bendriss
2007-01-27  5:12     ` Charles D Hixson
2007-01-27  9:52       ` Markus E Leypold
2007-01-27 22:01         ` Charles D Hixson
2007-01-27 23:24           ` Markus E Leypold
2007-01-28  9:14             ` Dmitry A. Kazakov
2007-01-28 15:06               ` Markus E Leypold
2007-01-29 14:37                 ` Dmitry A. Kazakov
2007-01-29 15:50                   ` Markus E Leypold
2007-01-30 19:58                     ` Robert A Duff
2007-01-30 21:52                       ` Markus E Leypold
2007-01-31 22:49                         ` Robert A Duff
2007-01-31 23:07                           ` (see below)
2007-01-31 23:18                             ` Robert A Duff
2007-01-31 23:36                               ` (see below)
2007-02-01  7:57                           ` Markus E Leypold
2007-01-31 17:49                       ` Ed Falis
2007-01-31 22:53                         ` Robert A Duff
2007-01-31 10:55                     ` Dmitry A. Kazakov
2007-01-31 15:16                       ` Markus E Leypold
2007-02-01 14:22                         ` Dmitry A. Kazakov
2007-02-01 15:18                           ` Markus E Leypold
2007-02-01 16:26                           ` Georg Bauhaus
2007-02-01 17:36                             ` Markus E Leypold
2007-02-01 20:53                               ` Georg Bauhaus
2007-02-01 21:57                                 ` Markus E Leypold
2007-02-01 22:03                                 ` Markus E Leypold
2007-02-01 23:40                                 ` Markus E Leypold
2007-02-03 16:54                                   ` Georg Bauhaus
2007-02-03 18:39                                     ` Dmitry A. Kazakov
2007-02-03 20:06                                     ` Markus E Leypold
2007-02-05  0:06                                       ` Markus E Leypold
2007-02-05 13:58                                         ` Georg Bauhaus
2007-02-05 14:23                                           ` Markus E Leypold
2007-02-02  7:17                                 ` Harald Korneliussen
2007-02-05  0:39                               ` Robert A Duff
2007-02-05  1:00                                 ` Markus E Leypold
2007-02-02  9:20                             ` Dmitry A. Kazakov
2007-02-02 12:34                               ` Markus E Leypold
2007-02-03  9:45                                 ` Dmitry A. Kazakov
2007-02-03 14:16                                   ` Markus E Leypold
2007-02-04 19:33                                     ` Dmitry A. Kazakov
2007-02-04 20:44                                       ` Markus E Leypold
2007-02-04 23:00                                         ` Dmitry A. Kazakov
2007-02-04 23:21                                           ` Markus E Leypold
2007-02-02 14:27                               ` Georg Bauhaus
2007-02-02 16:07                                 ` Dmitry A. Kazakov
2007-02-01 19:31                           ` Ray Blaak
2007-02-01 22:54                             ` Randy Brukardt
2007-02-02  1:37                               ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
2007-02-02  9:35                                 ` Dmitry A. Kazakov
2007-02-02 12:44                                   ` in defense of GC Markus E Leypold
2007-02-03 10:13                                     ` Dmitry A. Kazakov
2007-02-03 14:28                                       ` Markus E Leypold
2007-02-04 18:38                                         ` Dmitry A. Kazakov
2007-02-04 20:24                                           ` Markus E Leypold
2007-02-04 21:57                                             ` Dmitry A. Kazakov
2007-02-04 22:47                                               ` Markus E Leypold
2007-02-04 23:08                                                 ` Markus E Leypold
2007-02-05 15:57                                                   ` Markus E Leypold
2007-02-05  8:47                                                 ` Dmitry A. Kazakov
2007-02-05 14:03                                                   ` Markus E Leypold
2007-02-05  0:23                                         ` Robert A Duff
2007-02-05  0:55                                           ` Markus E Leypold
2007-02-06  0:01                                             ` Robert A Duff
2007-02-06  1:06                                               ` Markus E Leypold
2007-02-05  1:00                                           ` Ray Blaak
2007-02-05  1:19                                             ` Markus E Leypold
2007-02-06  8:32                                               ` Ray Blaak
2007-02-06 11:07                                                 ` Markus E Leypold
2007-02-06 18:01                                                   ` Ray Blaak
2007-02-06 18:25                                                     ` Markus E Leypold
2007-02-06 19:42                                                     ` Ray Blaak
2007-02-06  0:18                                             ` Robert A Duff
2007-02-06  0:59                                               ` Ray Blaak
2007-02-06  1:07                                               ` Markus E Leypold
2007-02-02 18:15                                   ` in defense of GC (was Re: How come Ada isn't more popular?) Ray Blaak
2007-02-02 19:35                                     ` Adam Beneschan
2007-02-02 20:04                                     ` Dmitry A. Kazakov
2007-02-02 22:40                                       ` Ray Blaak
2007-02-03 10:00                                         ` Dmitry A. Kazakov
2007-02-03 14:30                                           ` in defense of GC Markus E Leypold
2007-02-02 12:36                                 ` Markus E Leypold
2007-02-02 21:50                                 ` in defense of GC (was Re: How come Ada isn't more popular?) Gautier
2007-02-04  8:19                                   ` Ray Blaak
2007-02-04 17:36                                     ` Hyman Rosen
2007-02-04 21:21                                       ` Ray Blaak
2007-02-05  1:12                                 ` Robert A Duff
2007-02-05  9:06                                   ` Ray Blaak
2007-02-06  0:28                                     ` in defense of GC Robert A Duff
2007-02-06  8:24                                       ` Ray Blaak
2007-02-06 11:50                                         ` Markus E Leypold
2007-02-07  7:44                                           ` Ray Blaak
2007-02-07  8:54                                             ` Georg Bauhaus
2007-02-07 11:19                                               ` Markus E Leypold
2007-02-07 23:32                                                 ` Georg Bauhaus
2007-02-08  8:49                                                   ` Markus E Leypold
2007-02-09 14:09                                                     ` Georg Bauhaus
2007-02-09 16:17                                                       ` Markus E Leypold
2007-02-09 20:51                                                         ` Georg Bauhaus
2007-02-09 22:19                                                           ` Markus E Leypold
2007-02-08  9:24                                                   ` Markus E Leypold
2007-02-09 15:08                                                     ` Georg Bauhaus
2007-02-07 19:01                                               ` Ray Blaak
2007-02-07 11:17                                             ` Markus E Leypold
2007-01-29 16:23                 ` How come Ada isn't more popular? Georg Bauhaus
2007-01-29 16:56                   ` Markus E Leypold
2007-01-29 23:56       ` Randy Brukardt
2007-01-23  6:58 ` AW: " Grein, Christoph (Fa. ESG)
2007-01-23 10:31   ` Talulah
2007-01-23 13:48     ` Anders Wirzenius
2007-01-23 20:17     ` Jeffrey R. Carter
2007-01-23 20:43       ` Pascal Obry
2007-01-24  9:42       ` Maciej Sobczak
2007-01-24 20:48         ` Jeffrey R. Carter
2007-01-23 10:02 ` Stephen Leake
2007-01-23 16:49   ` adaworks
2007-01-23 17:40     ` Markus E Leypold
2007-01-24 12:51       ` Peter Hermann
2007-01-24 14:42         ` Markus E Leypold
2007-01-23 20:10   ` Jeffrey R. Carter
2007-01-23 22:37     ` Frank J. Lhota
2007-01-24  7:27       ` Jeffrey R. Carter
2007-01-24  9:50         ` Maciej Sobczak
2007-01-24 20:25           ` Jeffrey R. Carter
2007-01-24 21:34             ` Markus E Leypold
2007-01-25  9:23               ` Markus E Leypold
2007-01-26  7:59               ` Maciej Sobczak
2007-01-26 20:05                 ` Jeffrey R. Carter
2007-01-26 22:43                   ` Markus E Leypold
2007-01-23 21:19   ` Björn Persson
2007-01-23 10:38 ` Alex R. Mosteo
2007-01-23 12:58   ` gautier_niouzes
2007-01-23 21:56   ` Dr. Adrian Wrigley
2007-01-24 13:52     ` Alex R. Mosteo
2007-01-24 19:25     ` tmoran
2007-01-24 19:38     ` artifact.one
2007-01-26  2:50     ` Keith Thompson
2007-01-26  5:29     ` Gautier
2007-01-27  5:22     ` Charles D Hixson
2007-01-23 19:16 ` Tero Koskinen
2007-01-23 21:12   ` Ludovic Brenta
2007-01-24  9:59     ` Maciej Sobczak
2007-01-24 18:22       ` Yves Bailly
2007-01-24 19:18       ` Markus E Leypold
2007-01-25  8:37         ` Maciej Sobczak
2007-01-25  9:40           ` Markus E Leypold
2007-01-26  8:52             ` Ludovic Brenta
2007-01-26 11:40               ` Markus E Leypold
2007-01-27 16:56             ` Stephen Leake
2007-01-27 19:58               ` Markus E Leypold
2007-01-28 17:12                 ` Ed Falis
2007-01-28 18:38                   ` Markus E Leypold
2007-01-25 10:13           ` Harald Korneliussen
2007-01-25 12:54             ` Markus E Leypold
2007-01-26  7:03               ` Harald Korneliussen
2007-01-25 13:08             ` Markus E Leypold
2007-01-25 22:36             ` Jeffrey R. Carter
2007-01-25 23:26               ` Markus E Leypold
2007-01-26  4:23                 ` Jeffrey R. Carter
2007-01-26 11:35                   ` Markus E Leypold
2007-01-26 20:22                     ` Jeffrey R. Carter
2007-01-26 23:04                       ` Markus E Leypold
2007-01-27 19:57                         ` Frank J. Lhota
2007-01-28 20:43                         ` adaworks
2007-01-28 22:57                           ` Markus E Leypold
2007-01-29  1:04                           ` Jeffrey R. Carter
2007-01-28 20:32                   ` adaworks
2007-01-28 21:12                     ` Cesar Rabak
2007-01-28 22:43                       ` Markus E Leypold
2007-01-29 22:40                         ` Cesar Rabak
2007-01-30  9:31                           ` Markus E Leypold
2007-01-30 16:19                           ` adaworks
2007-01-30 21:05                             ` Jeffrey Creem
2007-01-31  7:59                               ` AW: " Grein, Christoph (Fa. ESG)
2007-02-03 16:33                                 ` Martin Krischik
2007-01-28 22:38                     ` Markus E Leypold
2007-01-29 16:16                       ` adaworks
2007-01-29 16:35                         ` Markus E Leypold
2007-01-29  1:02                     ` Jeffrey R. Carter
2007-01-30  0:21                       ` Randy Brukardt
2007-01-26  7:21                 ` Harald Korneliussen
2007-01-26  7:16               ` Harald Korneliussen
2007-01-27  5:30             ` Charles D Hixson
2007-01-24 20:10   ` Cesar Rabak
2007-01-23 20:02 ` Jeffrey R. Carter
2007-01-24  7:18   ` adaworks
2007-01-24 14:19   ` Alex R. Mosteo
2007-01-24 15:27     ` Poll on background of Ada people (was: How come Ada isn't more po) Larry Kilgallen
2007-01-23 21:36 ` How come Ada isn't more popular? kevin  cline
2007-01-23 22:18   ` Martin Dowie
2007-01-24  4:14     ` Alexander E. Kopilovich
2007-01-24  7:30       ` Jeffrey R. Carter
2007-01-24 20:15         ` Alexander E. Kopilovich
2007-01-25 22:16           ` Jeffrey R. Carter
2007-01-25 23:32             ` Markus E Leypold
2007-01-26  8:50               ` AW: " Grein, Christoph (Fa. ESG)
2007-01-26 11:52                 ` Markus E Leypold
2007-01-29  6:16                   ` AW: " Grein, Christoph (Fa. ESG)
2007-01-29 14:31                     ` Markus E Leypold
2007-01-26  8:56               ` Ludovic Brenta
2007-01-26 11:49                 ` Markus E Leypold
2007-01-26 22:05             ` Alexander E. Kopilovich
2007-01-24  7:31     ` Jeffrey R. Carter
2007-01-24  7:42     ` kevin  cline
2007-01-24  8:07       ` Ludovic Brenta
2007-01-24 12:12         ` Markus E Leypold
2007-01-24 12:48           ` Ludovic Brenta
2007-01-24 14:49             ` Markus E Leypold
2007-01-24 13:40           ` Pascal Obry
2007-01-24 14:50             ` Markus E Leypold
2007-01-24 17:22               ` Pascal Obry
2007-01-24 17:56                 ` Markus E Leypold
2007-01-24 18:09                   ` Pascal Obry
2007-01-24 19:37                     ` Markus E Leypold
2007-01-24 19:52                       ` Pascal Obry
2007-01-24 21:31                         ` Markus E Leypold
2007-03-19  2:09                           ` adaworks
2007-01-25  7:52                     ` Harald Korneliussen
2007-01-24 16:25         ` Adam Beneschan
2007-01-24 17:03           ` Niklas Holsti
2007-01-25 15:37           ` Bob Spooner
2007-02-06  9:54         ` Dave Thompson
2007-02-06 11:01           ` Ludovic Brenta
2007-02-26  5:47             ` Dave Thompson
2007-01-24 16:14       ` adaworks
2007-01-25  0:22         ` kevin  cline
2007-01-25  6:04           ` adaworks
2007-01-25 10:37             ` Maciej Sobczak
2007-01-25 23:36               ` Markus E Leypold
2007-01-25 10:42           ` Dmitry A. Kazakov
2007-01-25  8:27         ` Harald Korneliussen
2007-01-25  4:50       ` Alexander E. Kopilovich
2007-01-27  5:43       ` Charles D Hixson
2007-01-27  8:38         ` Dmitry A. Kazakov
2007-01-28 12:11           ` Michael Bode
2007-01-28 15:20             ` Markus E Leypold
2007-01-29  9:44               ` Martin Krischik
2007-01-27 13:06         ` Gautier
2007-01-27 16:28           ` Ludovic Brenta
2007-01-28  0:55           ` Charles D Hixson
2007-01-28  1:18             ` Ludovic Brenta
2007-01-28 17:06             ` Jeffrey R. Carter
2007-01-28 21:11             ` adaworks
2007-01-24 19:33   ` Arthur Evans Jr
     [not found]     ` <egYth.15026$w91.10597@newsread1.news.pas.earthlink.net>
2007-01-25 22:34       ` Jeffrey R. Carter
2007-01-25 22:55         ` Robert A Duff
2007-01-26 19:59           ` Jeffrey R. Carter
2007-01-27  3:54         ` Randy Brukardt
2007-01-24  0:12 ` JPWoodruff
2007-01-24 10:32   ` gautier_niouzes
2007-01-25  1:01   ` Alexander E. Kopilovich
2007-01-26  5:01     ` JPWoodruff
2007-03-05  2:19 ` Brian May
  -- strict thread matches above, loose matches on Subject: below --
2007-02-10  4:18 Randy Brukardt
2007-02-10  9:15 ` Dmitry A. Kazakov
2007-02-10 13:22   ` Robert A Duff
2007-02-10 15:54     ` Dmitry A. Kazakov
2007-02-12 14:23       ` Robert A Duff
2007-02-12 15:49         ` Dmitry A. Kazakov

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox