comp.lang.ada
* Interesting article on ARG work
@ 2018-04-02  3:32 Randy Brukardt
  2018-04-02 14:49 ` Dan'l Miller
  2018-04-06 13:35 ` Interesting article on ARG work Marius Amado-Alves
  0 siblings, 2 replies; 57+ messages in thread
From: Randy Brukardt @ 2018-04-02  3:32 UTC (permalink / raw)


I just ran across an article about the ARG's recent work. Read it at 
http://www.adaic.org/articles/new-ada-features/.

Randy.



^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-02  3:32 Interesting article on ARG work Randy Brukardt
@ 2018-04-02 14:49 ` Dan'l Miller
  2018-04-03 16:34   ` Bojan Bozovic
  2018-04-06 13:35 ` Interesting article on ARG work Marius Amado-Alves
  1 sibling, 1 reply; 57+ messages in thread
From: Dan'l Miller @ 2018-04-02 14:49 UTC (permalink / raw)


On Sunday, April 1, 2018 at 10:32:50 PM UTC-5, Randy Brukardt wrote:
> I just ran across an article about the ARG's recent work. Read it at 
> http://www.adaic.org/articles/new-ada-features/.

The name Munificent Vigil sounds as though it is a security-minded language from the same team who gave us Squeamish Ossifrage.

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-02 14:49 ` Dan'l Miller
@ 2018-04-03 16:34   ` Bojan Bozovic
  2018-04-03 22:33     ` Randy Brukardt
  0 siblings, 1 reply; 57+ messages in thread
From: Bojan Bozovic @ 2018-04-03 16:34 UTC (permalink / raw)


Interesting read. Now, I am writing this only because I am in correspondence with the ARG editor: is it possible (or wise) to add, alongside
Pragma assert(condition,message);
the construct
Assertion name is condition with message;
Or is it just a horrible idea?
That's my 2 cents. I'm not deleting my messages even when I'm wrong, as it hurts the flow of discussion.

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-03 16:34   ` Bojan Bozovic
@ 2018-04-03 22:33     ` Randy Brukardt
  2018-04-04  2:12       ` Bojan Bozovic
  2018-04-04 15:05       ` Dan'l Miller
  0 siblings, 2 replies; 57+ messages in thread
From: Randy Brukardt @ 2018-04-03 22:33 UTC (permalink / raw)


I'm not sure what your idea is. The pragma Assert exists already, what is it 
that you can't do with it that you need?

                       Randy.

"Bojan Bozovic" <bozovic.bojan@gmail.com> wrote in message 
news:62ee0aac-49da-4925-b9aa-a16695b3fc45@googlegroups.com...
> Interesting read. Now, I am writing this only because I am with 
> correspondence with ARG editor, is it possible (or wise) to add to
> Pragma assert(condition,message);
> Construct
> Assertion name is condition with message;
> Or it's just a horrible idea?
> Then it's my 2 cents. I'm not deleting my messages even when I'm wrong as 
> it hurts the flow of discussion. 


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-03 22:33     ` Randy Brukardt
@ 2018-04-04  2:12       ` Bojan Bozovic
  2018-04-04 15:05       ` Dan'l Miller
  1 sibling, 0 replies; 57+ messages in thread
From: Bojan Bozovic @ 2018-04-04  2:12 UTC (permalink / raw)


Indeed, maybe it's just a horrible idea to try to put it into the core language, even with the name above being optional, so forget about my proposal. Pragma Blackhole will be useful, so keep up the good work!


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-03 22:33     ` Randy Brukardt
  2018-04-04  2:12       ` Bojan Bozovic
@ 2018-04-04 15:05       ` Dan'l Miller
  2018-04-04 15:30         ` gerdien.de.kruyf
  2018-04-04 22:30         ` Randy Brukardt
  1 sibling, 2 replies; 57+ messages in thread
From: Dan'l Miller @ 2018-04-04 15:05 UTC (permalink / raw)


On Tuesday, April 3, 2018 at 5:33:47 PM UTC-5, Randy Brukardt wrote:
> I'm not sure what your idea is. The pragma Assert exists already, what is it 
> that you can't do with it that you need?
> 
>                        Randy.
> 
> "Bojan Bozovic" wrote in message 
> > Pragma assert(condition,message);
> > Construct
> > Assertion name is condition with message;

Randy, despite your humor in the original posting, I think that he is (seriously) proposing that assertions optionally be named so that the named condition and message pair is defined in one place and then utilized at the (numerous) places where the assertion appears, so that the condition & message do not vary due to the drift of time, forgetfulness, and whim of different programmers.  Something to the effect of:

in spec:
ASSERTION MyAssertion IS conditionExpressionOrName WITH messageStringLiteralOrName;

elsewhere, potentially numerous times:
PRAGMA ASSERT( MyAssertion );

Bojan, blackhole is not an actual feature (nor is Munificent Vigil an actual programming language).  Look at the 01 April 2018 date of posting.  As a language principle of permitting everything to be declared with a name and then utilized by mentioning the name to assure uniformity at all points of utilization, your idea is neither humorous nor ridiculous, even though it is in reply to a joke.  You actually imply one of my criticisms of Ada:  there is no MetaAda that declares all language principles (preferably in code that compiler vendors would take as input when writing a compiler, instead of taking in English prose).  “Everything shall be able to be named; that name shall be utilizable to assure uniformity at each point of usage to the named declaration.” would be one of those statements in MetaAda, most likely stated in something resembling a language isomorphic to symbolic logic instead of quick-&-dirty English prose here.

(It is actually my criticism of all programming languages, but in all these decades & decades Ada is the only one to have come the closest so far to obeying a MetaAda-that-was-never-codified by mere habit of self-discipline of its designers.  Only Algol68 attempted something even remotely in this category of what I call MetaAda here but unfortunately Algol68's designers chose cryptic van Wijngaarden 2-level grammar as the mode of bizarre [partial] expression of codifying the syntax & semantics of the language—and then not for the purpose of overarching story-arc big-picture principles.  I suppose that I should give a nod to Perl6's Parrot too, but Parrot's purpose seems drastically different, but perhaps that is merely my shallow understanding of Parrot showing there.)

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-04 15:05       ` Dan'l Miller
@ 2018-04-04 15:30         ` gerdien.de.kruyf
  2018-04-04 16:09           ` Dan'l Miller
  2018-04-04 22:30         ` Randy Brukardt
  1 sibling, 1 reply; 57+ messages in thread
From: gerdien.de.kruyf @ 2018-04-04 15:30 UTC (permalink / raw)


On Wednesday, 4 April 2018 17:05:35 UTC+2, Dan'l Miller  wrote:
> On Tuesday, April 3, 2018 at 5:33:47 PM UTC-5, Randy Brukardt wrote:
> > I'm not sure what your idea is. The pragma Assert exists already, what is it 
> > that you can't do with it that you need?
> > 
> >                        Randy.
> > 
> > "Bojan Bozovic" wrote in message 
> > > Pragma assert(condition,message);
> > > Construct
> > > Assertion name is condition with message;
> 
> Randy, despite your humor in the original posting, I think that he is (seriously) proposing that assertions optionally be named so that the named condition and message pair is defined in one place and then utilized at the (numerous) places where the assertion appears, so that the condition & message do not vary due to the drift of time, forgetfulness, and whim of different programmers.  Something to the effect of:
> 
> in spec:
> ASSERTION MyAssertion IS conditionExpressionOrName WITH messageStringLiteralOrName;
> 
> elsewhere, potentially numerous times:
> PRAGMA ASSERT( MyAssertion );
> 
> Bojan, blackhole is not an actual feature (nor is Munificent Vigil an actual programming language).  Look at the 01 April 2018 date of posting.  As a language principle of permitting everything to be declared with a name and then utilized by mentioning the name to assure uniformity at all points of utilization, your idea is neither humorous nor ridiculous, even though it is in reply to a joke.  You actually imply one of my criticisms of Ada:  there is no MetaAda that declares all language principles (preferably in code that compiler vendors would take as input when writing a compiler, instead of taking in English prose).  “Everything shall be able to be named; that name shall be utilizable to assure uniformity at each point of usage to the named declaration.” would be one of those statements in MetaAda, most likely stated in something resembling a language isomorphic to symbolic logic instead of quick-&-dirty English prose here.
> 
> (It is actually my criticism of all programming languages, but in all these decades & decades Ada is the only one to have come the closest so far to obeying a MetaAda-that-was-never-codified by mere habit of self-discipline of its designers.  Only Algol68 attempted something even remotely in this category of what I call MetaAda here but unfortunately Algol68's designers chose cryptic van Wijngaarden 2-level grammar as the mode of bizarre [partial] expression of codifying the syntax & semantics of the language—and then not for the purpose of overarching story-arc big-picture principles.  I suppose that I should give a nod to Perl6's Parrot too, but Parrot's purpose seems drastically different, but perhaps that is merely my shallow understanding of Parrot showing there.)

http://www.dtic.mil/docs/citations/ADA177802
A W-Grammar Description for ADA.

enjoy.

j.


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-04 15:30         ` gerdien.de.kruyf
@ 2018-04-04 16:09           ` Dan'l Miller
  0 siblings, 0 replies; 57+ messages in thread
From: Dan'l Miller @ 2018-04-04 16:09 UTC (permalink / raw)


On Wednesday, April 4, 2018 at 10:30:04 AM UTC-5, gerdien....@gmail.com wrote:
> http://www.dtic.mil/docs/citations/ADA177802
> A W-Grammar Description for ADA.
> 
> enjoy.

Nice!

“More study is needed in the area of formal expression for semantics-especially in the area of an expression medium. W-grammars may prove too cumbersome for use in situations where the individual grammatical symbols must be manipulated. This thesis proves the possibility of describing Ada's static semantics formally, but some other method such as axiomatic definition might produce a more useful form for such uses as formal correctness proofs.”
—page 43 therein

I'd agree with that wholeheartedly.  Roy Flowers's “axiomatic” therein ≈ my “symbolic logic” above.

Randy, if you really want to make Ada bullet-proof for safety-critical systems, then the ARM as English prose needs to eventually be retired (e.g., for the next revision of Ada post-Ada2020), replaced by something like a MetaAda based on axioms and deductive logic thereof with something analogous to SPARK (or the OCaml-based Coq) finding all the illogicalness and lack of covered cases.  (Bojan's lack of named pragma-asserts would be perhaps the very least among the thousand conflicts & gaps & wrinkles & dohs! that such analysis would find.)  And the first A of the AARM would still exist in a different format as why-commentary on the axioms & their logical deductions.  Indeed, there might be a generated English-prose dump of the MetaAda axioms & logical deductions, where the ARM-as-prose would no longer be human-authored and the AARM would be the generated-ARM dump but including the ARG's why-comments throughout the MetaAda symbolic-logic code.

Right now, the ARG mailing list's content as well as some of the deep technically-insightful content of comp.lang.ada's English prose are effectively a mechanical-Turk precursor of the SPARK-esque/Coq-esque thingy that would grind on the axioms and logical deductions in Ada2025's MetaAda ARM-replacement.


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-04 15:05       ` Dan'l Miller
  2018-04-04 15:30         ` gerdien.de.kruyf
@ 2018-04-04 22:30         ` Randy Brukardt
  2018-04-04 22:43           ` Paul Rubin
                             ` (2 more replies)
  1 sibling, 3 replies; 57+ messages in thread
From: Randy Brukardt @ 2018-04-04 22:30 UTC (permalink / raw)


"Dan'l Miller" <optikos@verizon.net> wrote in message 
news:9879872e-c18a-4667-afe5-41ce0f54559f@googlegroups.com...
>On Tuesday, April 3, 2018 at 5:33:47 PM UTC-5, Randy Brukardt wrote:
>> I'm not sure what your idea is. The pragma Assert exists already, what is 
>> it
>> that you can't do with it that you need?
...
>> "Bojan Bozovic" wrote in message
>> > Pragma assert(condition,message);
>> > Construct
>> > Assertion name is condition with message;
>
>Randy, despite your humor in the original posting, I think that he is 
>(seriously) proposing ...

I presumed he was serious, I just didn't understand what he was proposing...

>...that assertions optionally be named so that the named condition and
>message pair is defined in one place and then utilized at the (numerous)
>places where the assertion appears, so that the condition & message do
>not vary due to the drift of time, forgetfulness, and whim of different
>programmers.  ...

Humm, I suppose that could have been it. Or it could be that that is 
something you want...

There is a much more general proposal for Ada 2020 called "ghost code" - a 
silly name for code and declarations intended only to be used by assertions. 
(The idea being that if it is marked and enforced as such, it can be removed 
when the Assertion_Policy is Ignore.)

Using that (which may or may not make it into Ada 2020 -- we haven't yet 
discussed it at a meeting), one could use a ghost function for this purpose:

      function My_Assertion (...) return Boolean is
          (if Condition then raise Assertion_Error with Message else True)
          with Ghost;

     pragma Assert (My_Assertion (...));

(Note: The "..." here is any objects that Condition needs to be evaluated.)

This works now (without the "ghost") and is more general in that one can 
pass parameters if needed, which would be a problem with the proposed syntax.
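
For concreteness, here is a minimal, self-contained sketch of that pattern as 
it can be written in current Ada without the Ghost aspect. The names 
Named_Assert_Demo, Stack_OK, Depth, Max_Depth and the message text are 
invented for illustration, and the condition is written so that the raise 
fires only when the check fails:

    with Ada.Assertions;           --  for Assertion_Error
    procedure Named_Assert_Demo is

       Depth     : Natural := 3;
       Max_Depth : constant := 100;

       --  Named, reusable check: one condition/message pair defined in
       --  one place.
       function Stack_OK return Boolean is
         (if Depth <= Max_Depth
          then True
          else raise Ada.Assertions.Assertion_Error with "stack overflow");

    begin
       pragma Assert (Stack_OK);   --  reused wherever the named check is wanted
       Depth := Depth + 1;
       pragma Assert (Stack_OK);
    end Named_Assert_Demo;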

>... there is no MetaAda that declares all language principles (preferably
> in code that compiler vendors would take as input when writing a
>compiler, instead of taking in English prose).  "Everything shall be
>able to be named; that name shall be utilizable to assure uniformity at
>each point of usage to the named declaration." would be one of those
>statements in MetaAda, most likely stated in something resembling a
>language isomorphic to symbolic logic instead of quick-&-dirty English
>prose here.

I suspect it would be a huge job to get consensus on such MetaAda rules. 
While I tend to agree with your rule, it's clear that the Ada 95 designers 
did not. In particular, anonymous access types have various capabilities not 
available to their named counterparts (going back to Ada 95 anonymous access 
parameters). This has gotten worse with each new version. I've spent some 
time complaining about that, but there is only so much that any one person 
can do.

                         Randy.





^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-04 22:30         ` Randy Brukardt
@ 2018-04-04 22:43           ` Paul Rubin
  2018-04-05  0:44             ` Mehdi Saada
  2018-04-05  2:05           ` Bojan Bozovic
  2018-04-05  7:21           ` Dmitry A. Kazakov
  2 siblings, 1 reply; 57+ messages in thread
From: Paul Rubin @ 2018-04-04 22:43 UTC (permalink / raw)


"Randy Brukardt" <randy@rrsoftware.com> writes:
> There is a much more general proposal for Ada 2020 called "ghost code" - a 
> silly name for code and declarations intended only to be used by assertions. 

Is that different from contracts?

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-04 22:43           ` Paul Rubin
@ 2018-04-05  0:44             ` Mehdi Saada
  2018-04-05 21:23               ` Randy Brukardt
  0 siblings, 1 reply; 57+ messages in thread
From: Mehdi Saada @ 2018-04-05  0:44 UTC (permalink / raw)


> Is that different from contracts?
It's an expansion of contracts. AFAIK, one can define in SPARK whole compilation units whose only purpose is to help write contracts, for arbitrarily complex dynamic or static checking.
The principle is that, with a specific assertion policy, anything marked with the Ghost aspect would disappear from the code (and object code), but the program would remain functionally identical.

One thing I found (a bit) irritating from a beginner's point of view is that normal visibility rules still hold in contracts/assertions. It would be much easier to write contracts about conditions involving (at least) things declared in the private part. It would be nonsensical to let them reach the body, that I can understand. But why not the package's private part?
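
As a sketch of that SPARK-style usage (assuming a toolchain such as GNAT/SPARK 
where the Ghost aspect is available; Counters, Valid and Bump are invented 
names for illustration), a ghost function can be declared in the visible part, 
so contracts may name it, and completed in the private part:

    package Counters is

       Max : constant := 1_000;

       type Counter is private;

       --  Ghost function: exists only for use in contracts/assertions and
       --  can be dropped entirely when assertions are ignored.
       function Valid (C : Counter) return Boolean
         with Ghost;

       procedure Bump (C : in out Counter)
         with Post => Valid (C);

    private

       type Counter is record
          Value : Natural := 0;
       end record;

       function Valid (C : Counter) return Boolean is (C.Value <= Max);

    end Counters;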


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-04 22:30         ` Randy Brukardt
  2018-04-04 22:43           ` Paul Rubin
@ 2018-04-05  2:05           ` Bojan Bozovic
  2018-04-05 22:12             ` Randy Brukardt
  2018-04-05  7:21           ` Dmitry A. Kazakov
  2 siblings, 1 reply; 57+ messages in thread
From: Bojan Bozovic @ 2018-04-05  2:05 UTC (permalink / raw)


On Thursday, April 5, 2018 at 12:30:46 AM UTC+2, Randy Brukardt wrote:
> "Dan'l Miller" <optikos@verizon.net> wrote in message 
> news:9879872e-c18a-4667-afe5-41ce0f54559f@googlegroups.com...
> >On Tuesday, April 3, 2018 at 5:33:47 PM UTC-5, Randy Brukardt wrote:
> >> I'm not sure what your idea is. The pragma Assert exists already, what is 
> >> it
> >> that you can't do with it that you need?
> ...
> >> "Bojan Bozovic" wrote in message
> >> > Pragma assert(condition,message);
> >> > Construct
> >> > Assertion name is condition with message;
> >
> >Randy, despite your humor in the original posting, I think that he is 
> >(seriously) proposing ...
> 
> I presumed he was serious, I just didn't understand what he was proposing...
> 
> >...that assertions optionally be named so that the named condition and
> >message pair is defined in one place and then utilized at the (numerous)
> >places where the assertion appears, so that the condition & message do
> >not vary due to the drift of time, forgetfulness, and whim of different
> >programmers.  ...
> 
> Humm, I suppose that could have been it. Or it could be that that is 
> something you want...
> 
> There is a much more general proposal for Ada 2020 called "ghost code" - a 
> silly name for code and declarations intended only to be used by assertions. 
> (The idea being that if it is marked and enforced as such, it can be removed 
> when the Assertion_Policy is Ignore.)
> 
> Using that (which may or may not make it into Ada 2020 -- we haven't yet 
> discussed it at a meeting), one could use a ghost function for this purpose:
> 
>       function My_Assertion (...) return Boolean is
>           (if Condition then raise Assertion_Error with Message else True)
>           with Ghost;
> 
>      pragma Assert (My_Assertion (...));
> 
> (Note: The "..." here is any objects that Condition needs to be evaluated.)
> 
> This works now (without the "ghost") and is more general in that one can 
> pass parameters if needed which would be a problem with the syntax proposed.
> 
> >... there is no MetaAda that declares all language principles (preferably
> > in code that compiler vendors would take as input when writing a
> >compiler, instead of taking in English prose).  "Everything shall be
> >able to be named; that name shall be utilizable to assure uniformity at
> >each point of usage to the named declaration." would be one of those
> >statements in MetaAda, most likely stated in something resembling a
> >language isomorphic to symbolic logic instead of quick-&-dirty English
> >prose here.
> 
> I suspect it would be a huge job to get consensus on such MetaAda rules. 
> While I tend to agree with your rule, it's clear that the Ada 95 designers 
> did not. In particular, anonymous access types have various capabilities not 
> available to their named counterparts (going back to Ada 95 anonymous access 
> parameters). This has gotten worse with each new version. I've spent some 
> time complaining about that, but there is only so much that any one person 
> can do.
> 
>                          Randy.

Ah, I was a victim of an April 1st joke! I wasn't joking though. My proposal was to add assertions in general to Ada, as something in the core language, like exceptions or tasks, which would function just like the "ghost code" you had in mind when

pragma Assertion_Policy (Check);

is used. Maybe not use an assertion keyword but use functions with Ghost? I really can't say which approach would work best in terms of function, readability and conciseness. That is for the ARG to decide.

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-04 22:30         ` Randy Brukardt
  2018-04-04 22:43           ` Paul Rubin
  2018-04-05  2:05           ` Bojan Bozovic
@ 2018-04-05  7:21           ` Dmitry A. Kazakov
  2018-04-05 22:18             ` Randy Brukardt
  2 siblings, 1 reply; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-05  7:21 UTC (permalink / raw)


On 05/04/2018 00:30, Randy Brukardt wrote:

> There is a much more general proposal for Ada 2020 called "ghost code" - a
> silly name for code and declarations intended only to be used by assertions.
> (The idea being that if it is marked and enforced as such, it can be removed
> when the Assertion_Policy is Ignore.)
> 
> Using that (which may or may not make it into Ada 2020 -- we haven't yet
> discussed it at a meeting), one could use a ghost function for this purpose:
> 
>        function My_Assertion (...) return Boolean is
>            (if Condition then raise Assertion_Error with Message else True)
>            with Ghost;
> 
>       pragma Assert (My_Assertion (...));
> 
> (Note: The "..." here is any objects that Condition needs to be evaluated.)

Assertion code is quite useless from my point of view, but what about 
debugging code? It is quite tedious to comment it in and out all the 
time (along with the with/use clauses required for it).

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-05  0:44             ` Mehdi Saada
@ 2018-04-05 21:23               ` Randy Brukardt
  0 siblings, 0 replies; 57+ messages in thread
From: Randy Brukardt @ 2018-04-05 21:23 UTC (permalink / raw)


"Mehdi Saada" <00120260a@gmail.com> wrote in message 
news:1e3a9818-7c9a-4a8e-a6fb-ebaabd2570f7@googlegroups.com...
...
>.. One thing I found (a bit) irritating from a beginner point of view,
> is that normal visibility rules still still holds in contracts/assertions. 
> It
> would be much easier to write contracts about conditions of (at
> least) things declared in the private part. It would be nonsensical to
> let them reach the body though, that I can understand. But why not
> packages' private part ?

A subprogram contract is just that - a contract between the caller and 
called subprogram. That means the caller needs to be able to understand the 
contents of the contract, and given that they are not supposed to depend on 
the private part, the contract can't either.

Imagine that items in the private part were allowed. Perhaps we'd have a 
contract like:

    package Something is
        type Bar is ...;
        procedure Proc (Foo : in out Bar)
           with Pre => Is_Groddy (Foo);
    private
        function Is_Groddy (Foo : in Bar) return Boolean;
    end Something;

Now,  the caller is not supposed to call Something.Proc when Is_Groddy is 
false (that's the contract). Perhaps the caller wants to make the test 
themselves (think of how often you test for null with access types before 
making a call):

     if Is_Groddy (My_Bar) then
          Proc (My_Bar);
     else
           -- Some alternative code.
     end if;

But the above is illegal, because Is_Groddy is in the private part and not 
available to the callers of Proc. This leaves the programmer with only bad 
options: make the call unconditionally and hope that My_Bar is in fact 
Groddy -- or handle the Assertion_Error (generally more expensive than a 
test). (And if you are doing static checking, you end up depending 
explicitly on the private part, meaning you are no longer hiding it in any 
way.)

Regardless of what private information was in the contract, it is not useful 
to the caller (unless they also break privacy and inspect the private part). 
That pretty much defeats the purpose of making it private in the first 
place.
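
As a sketch of the conventional arrangement this implies (filling in the 
example above with illustrative details; the completions of Bar and Is_Groddy 
are invented), the query used in the contract is declared in the visible part 
so that callers can both read the precondition and test it themselves:

    package Something is

       type Bar is private;

       function Is_Groddy (Foo : in Bar) return Boolean;

       procedure Proc (Foo : in out Bar)
          with Pre => Is_Groddy (Foo);

    private

       type Bar is record
          Level : Natural := 0;
       end record;

       function Is_Groddy (Foo : in Bar) return Boolean is (Foo.Level > 0);

    end Something;

With this layout, the caller's "if Is_Groddy (My_Bar) then Proc (My_Bar); ..." 
test shown earlier becomes legal, and no privacy is broken.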

                                 Randy.



^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-05  2:05           ` Bojan Bozovic
@ 2018-04-05 22:12             ` Randy Brukardt
  2018-04-06 13:35               ` Bojan Bozovic
  0 siblings, 1 reply; 57+ messages in thread
From: Randy Brukardt @ 2018-04-05 22:12 UTC (permalink / raw)


"Bojan Bozovic" <bozovic.bojan@gmail.com> wrote in message 
news:17b6d5b9-5909-4f0c-ab25-9b3cf4fd0450@googlegroups.com...
...
> Ah I was victim of April the 1st joke! I wasn't joking though. My
> proposal was to add assertions in general to Ada, as something in
> the core language, like exceptions are, or tasks, which would
> function just as "ghost code" you had in mind when
>
> pragma assertion_policy (check);
>
> is used. Maybe not use assertion keyword but use functions with
> ghost? I really can't say which approach wold work best, from the
>  point of function, readability and conciseness. That is for ARG to
> decide.

But what would the point be? A pragma is a first-class part of the language 
(it's not optional!). And a pragma can be used in places that a statement 
cannot be used (that is, with declarations). To get even equivalent 
functionality that way would be fairly complex (needing both statements and 
declarations). Sounds like a lot of work for very little gain.

Personally, I don't see a lot of value in pragma Assert in the first place 
(as opposed to the other contract assertions). I'm in the camp that 
suppressing/ignoring checks ought to be a last resort, only to be used when 
performance goals can't be met any other way (using a better algorithm is 
always a better way). As such, "cheap" assertions might as well just be part 
of the code; I would never want to remove them. Often, the compiler can 
prove that they're true (so there is no cost), and if not, a well-defined 
failure is better than erroneous execution where anything can happen.

For instance, the Janus/Ada compiler has a lot of code like:
      if Node.Solution = null then
           Internal_Error ("Missing solution");

This could have been modeled as an assertion:

    pragma Assert (Node.Solution /= null, "Missing solution");

but the Internal_Error formulation allows calling a fairly complex error management 
routine (which, if trace mode is on, gives the user various interactive 
debugging options). And we wouldn't want to remove the check in any case; 
it's much better to detect the problem in a controlled way rather than an 
uncontrolled one (especially if all checking has been suppressed).

Some assertions are expensive to run, and thus shouldn't run all of the 
time. For those, I generally find that it is best to tie them to the 
debugging mode for whatever subsystem that they are related to. (Pretty much 
every program I write has a variety of debugging modes, so that one has a 
limited amount of debugging information to look through to find problems. I 
usually write my programs with a substantial amount of debugging from the 
start, and then usually keep any debugging added to find specific problems. 
Whenever I haven't had substantial debugging, I've always found I've had to 
add it, in order to get any confidence that the code is doing the right 
thing.) Since there are multiple such debugging modes, tying everything to a 
single "check/ignore" flag is much too coarse for my purposes. So again 
there isn't much use for pragma Assert.
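
A minimal sketch of that per-subsystem arrangement, with invented names 
(Debug, Trace_Solver, Expensive_Invariant_Holds) and a short-circuit so the 
costly check is evaluated only when its subsystem's debugging mode is on:

    package Debug is
       Trace_Symbols : constant Boolean := False;
       Trace_Solver  : constant Boolean := True;
    end Debug;

    with Debug;
    procedure Solve_Step is

       --  Stand-in for a costly whole-structure consistency walk.
       function Expensive_Invariant_Holds return Boolean is (True);

    begin
       --  Evaluated only when the solver's own debugging mode is on; not
       --  tied to a single global check/ignore policy.
       pragma Assert
         (not Debug.Trace_Solver or else Expensive_Invariant_Holds,
          "solver invariant broken");

       null;  --  real solver work would go here
    end Solve_Step;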

                         Randy.



^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-05  7:21           ` Dmitry A. Kazakov
@ 2018-04-05 22:18             ` Randy Brukardt
  2018-04-06  7:30               ` Dmitry A. Kazakov
  2018-04-06 23:49               ` Dan'l Miller
  0 siblings, 2 replies; 57+ messages in thread
From: Randy Brukardt @ 2018-04-05 22:18 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> wrote in message 
news:pa4ip3$1ton$1@gioia.aioe.org...
> On 05/04/2018 00:30, Randy Brukardt wrote:
>
>> There is a much more general proposal for Ada 2020 called "ghost code" - 
>> a
>> silly name for code and declarations intended only to be used by 
>> assertions.
>> (The idea being that if it is marked and enforced as such, it can be 
>> removed
>> when the Assertion_Policy is Ignore.)
>>
>> Using that (which may or may not make it into Ada 2020 -- we haven't yet
>> discussed it at a meeting), one could use a ghost function for this 
>> purpose:
>>
>>        function My_Assertion (...) return Boolean is
>>            (if Condition then raise Assertion_Error with Message else 
>> True)
>>            with Ghost;
>>
>>       pragma Assert (My_Assertion (...));
>>
>> (Note: The "..." here is any objects that Condition needs to be 
>> evaluated.)
>
> Assertion code is quite useless from my point of view, but what about 
> debugging code? It is quite tedious to comment it in and out all the time 
> (and with/use clauses required for it).

My assumption has always been that Ghost code could be used for that as 
well, but since it's more of an idea than a proposal, it's hard to say 
anything definitive.

If you used Janus/Ada, you'd have a built-in solution for that (sadly, 
incompatible with Ada 2020): the @ conditional compilation marker. It is 
interpreted as either "--" or " " depending on a compilation flag, so it can 
easily add or remove anything. We originally invented it to deal with 
debugging/assertion code in the Janus/Ada compiler; probably about 25% of 
the code in Janus/Ada is marked that way. Since it is lexical, it can 
comment out anything, including declarations, pragmas, and with clauses.

In programs intended to be portable, I just use static Boolean flags for 
debugging code. Any compiler with decent dead code elimination (which would 
be about all of them) will get rid of most/all of the code that way. But 
that doesn't work on pragmas, context clauses, or declarations, so it isn't 
quite as through as @. (With Janus/Ada, which does dead subprogram 
elimination during binding, including for tagged type primitives, the 
difference isn't that substantial unless large data declarations are 
involved.)
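
As a sketch of that static-flag pattern (Trace_Demo, Debug_Mode and the traced 
value are invented for illustration): with Debug_Mode a static False, ordinary 
dead-code elimination drops the whole if-branch, but the with clause and any 
declarations it needs remain, which is exactly the limitation noted above.

    with Ada.Text_IO;   --  cannot be eliminated this way
    procedure Trace_Demo is

       Debug_Mode : constant Boolean := False;  --  True for a debug build

       Total : Natural := 0;

    begin
       Total := Total + 42;

       if Debug_Mode then
          Ada.Text_IO.Put_Line ("Total is now" & Natural'Image (Total));
       end if;
    end Trace_Demo;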

               Randy.


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-05 22:18             ` Randy Brukardt
@ 2018-04-06  7:30               ` Dmitry A. Kazakov
  2018-04-07  2:25                 ` Randy Brukardt
  2018-04-06 23:49               ` Dan'l Miller
  1 sibling, 1 reply; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-06  7:30 UTC (permalink / raw)


On 06/04/2018 00:18, Randy Brukardt wrote:
> "Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> wrote in message
> news:pa4ip3$1ton$1@gioia.aioe.org...
>> On 05/04/2018 00:30, Randy Brukardt wrote:
>>
>>> There is a much more general proposal for Ada 2020 called "ghost code" -
>>> a
>>> silly name for code and declarations intended only to be used by
>>> assertions.
>>> (The idea being that if it is marked and enforced as such, it can be
>>> removed
>>> when the Assertion_Policy is Ignore.)
>>>
>>> Using that (which may or may not make it into Ada 2020 -- we haven't yet
>>> discussed it at a meeting), one could use a ghost function for this
>>> purpose:
>>>
>>>         function My_Assertion (...) return Boolean is
>>>             (if Condition then raise Assertion_Error with Message else
>>> True)
>>>             with Ghost;
>>>
>>>        pragma Assert (My_Assertion (...));
>>>
>>> (Note: The "..." here is any objects that Condition needs to be
>>> evaluated.)
>>
>> Assertion code is quite useless from my point of view, but what about
>> debugging code? It is quite tedious to comment it in and out all the time
>> (and with/use clauses required for it).
> 
> My assumption has always been that Ghost code could be used for that as
> well, but since it's more of an idea than a proposal, it's hard to say
> anything definitive.
> 
> If you used Janus/Ada, you'd have a built-in solution for that (sadly,
> incompatible with Ada 2020): the @ conditional compilation marker. It is
> interpreted as either "--" or " " depending on a compilation flag, so it can
> easily add or remove anything. We originally invented it to deal with
> debugging/assertion code in the Janus/Ada compiler; probably about 25% of
> the code in Janus/Ada is marked that way. Since it is lexical, it can
> comment out anything, including declarations, pragmas, and with clauses.
> 
> In programs intended to be portable, I just use static Boolean flags for
> debugging code. Any compiler with decent dead code elimination (which would
> be about all of them) will get rid of most/all of the code that way. But
> that doesn't work on pragmas, context clauses, or declarations, so it isn't
> quite as through as @. (With Janus/Ada, which does dead subprogram
> elimination during binding, including for tagged type primitives, the
> difference isn't that substantial unless large data declarations are
> involved.)

Yes, it is frequently "with Ada.Text_IO" or some internal package like 
Dump_Pool etc. One could easily imagine record components provided for 
debugging purposes and attributes selected differently, e.g. 
X'Storage_Pool etc.

Another "requirement" is integration into the IDE. It should be easy for the 
IDE to hide all this code when debugging is inactive and colorize it 
differently when active.

The Janus/Ada solution looks better than a pragma. And it could be extended to 
provide multiple sections of debugging code which could be activated and 
deactivated independently.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-05 22:12             ` Randy Brukardt
@ 2018-04-06 13:35               ` Bojan Bozovic
  2018-04-07  2:01                 ` Randy Brukardt
  0 siblings, 1 reply; 57+ messages in thread
From: Bojan Bozovic @ 2018-04-06 13:35 UTC (permalink / raw)


On Friday, April 6, 2018 at 12:12:04 AM UTC+2, Randy Brukardt wrote:
> "Bojan Bozovic" <bozovic.bojan@gmail.com> wrote in message 
> news:17b6d5b9-5909-4f0c-ab25-9b3cf4fd0450@googlegroups.com...
> ...
> > Ah I was victim of April the 1st joke! I wasn't joking though. My
> > proposal was to add assertions in general to Ada, as something in
> > the core language, like exceptions are, or tasks, which would
> > function just as "ghost code" you had in mind when
> >
> > pragma assertion_policy (check);
> >
> > is used. Maybe not use assertion keyword but use functions with
> > ghost? I really can't say which approach wold work best, from the
> >  point of function, readability and conciseness. That is for ARG to
> > decide.
> 
> But what would the point be? A pragma is a first-class part of the language 
> (it's not optional!). And a pragma can be used in places that a statement 
> cannot be used (that is, with declarations). To get even equivalent 
> functionality that way would be fairly complex (needing both statements and 
> declarations). Sounds like a lot of work for very little gain.
> 
> Personally, I don't see a lot of value in pragma Assert in the first place 
> (as opposed to the other contract assertions). I'm in the camp that 
> suppressing/ignoring checks ought to be a last resort, only to be used when 
> performance goals can't be met any other way (using a better algorithm is 
> always a better way). As such, "cheap" assertions might as well just be part 
> of the code; I would never want to remove them. Often, the compiler can 
> prove that they're true (so there is no cost), and if not, a well-defined 
> failure is better than erroneous execution where anything can happen.
> 
> For instance, the Janus/Ada compiler has a lot of code like:
>       if Node.Solution = null then
>            Internal_Error ("Missing solution");
> 
> This could have been modeled as an assertion:
> 
>     pragma Assert (Node.Solution /= null, "Missing solution");
> 
> but the above modeling allows calling a fairly complex error management 
> routine (which, if trace mode is on, gives the user various interactive 
> debugging options). And we wouldn't want to remove the check in any case; 
> it's much better to detect the problem in a controlled way rather than an 
> uncontrolled one (especially if all checking has been suppressed).
> 
> Some assertions are expensive to run, and thus shouldn't run all of the 
> time. For those, I generally find that it is best to tie them to the 
> debugging mode for whatever subsystem that they are related to. (Pretty much 
> every program I write has a variety of debugging modes, so that one has a 
> limited amount of debugging information to look through to find problems. I 
> usually write my programs with a substantial amount of debugging from the 
> start, and then usually keep any debugging added to find specific problems. 
> Whenever I haven't had substantial debugging, I've always found I've had to 
> add it, in order to get any confidence that the code is doing the right 
> thing.) Since there are multiple such debugging modes, tying everything to a 
> single "check/ignore" flag is much too coarse for my purposes. So again 
> there isn't much use for pragma Assert.
> 
>                          Randy.

If assertions/invariants encompassed type invariants and could be used in loops as well, like in Eiffel, readability would be better in my humble opinion; Ada has pragma Assert, which can be used in loops, and Static_Predicate and Dynamic_Predicate on types.

Maybe assertion blocks, as in Eiffel loops, would work better? Like an exception block in Ada?

The Eiffel approach must have been discussed by the ARG; however, readability is still important for the human programmer, even though it isn't important to the compiler.
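
Something close to this is expressible today, if less declaratively than in 
Eiffel. A minimal sketch with invented names (Sum_First_N and its locals) uses 
pragma Assert as a manual loop invariant, re-checked on every iteration 
(GNAT/SPARK additionally offer a Loop_Invariant pragma aimed at proof tools, 
which is closer in spirit to the Eiffel form):

    function Sum_First_N (N : Natural) return Natural is
       Sum : Natural := 0;
    begin
       for I in 1 .. N loop
          Sum := Sum + I;
          --  Manual invariant: the running total matches the closed form.
          pragma Assert (Sum = I * (I + 1) / 2, "partial-sum invariant broken");
       end loop;
       return Sum;
    end Sum_First_N;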


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-02  3:32 Interesting article on ARG work Randy Brukardt
  2018-04-02 14:49 ` Dan'l Miller
@ 2018-04-06 13:35 ` Marius Amado-Alves
  2018-04-07  2:15   ` Randy Brukardt
  1 sibling, 1 reply; 57+ messages in thread
From: Marius Amado-Alves @ 2018-04-06 13:35 UTC (permalink / raw)


Tremendous, thank you.

(I don't know names, but I can spot *fantasy* passages like "code that doesn't meet its specification" and "cut the number of bugs delivered to customers by 90%".)


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-05 22:18             ` Randy Brukardt
  2018-04-06  7:30               ` Dmitry A. Kazakov
@ 2018-04-06 23:49               ` Dan'l Miller
  2018-04-12 10:21                 ` Marius Amado-Alves
  1 sibling, 1 reply; 57+ messages in thread
From: Dan'l Miller @ 2018-04-06 23:49 UTC (permalink / raw)


Attention Shark8,
Multiple branches of the thread-tree of this posting (other than the original joke) point to another category of modularity beyond the one discussed recently: control over the degree to which different categories or types of debugging/tracing/performance-analysis code (or, as AOP calls them, aspects) are brought in or taken out.  Byron could tackle that problem well in order to attract more developers/corporate-sponsors.


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-06 13:35               ` Bojan Bozovic
@ 2018-04-07  2:01                 ` Randy Brukardt
  0 siblings, 0 replies; 57+ messages in thread
From: Randy Brukardt @ 2018-04-07  2:01 UTC (permalink / raw)



"Bojan Bozovic" <bozovic.bojan@gmail.com> wrote in message 
news:2be6c759-1a2f-48fc-81b1-ffab1d1c957d@googlegroups.com...
...
>If assertions/invariants encompassed type invariants and were used
>in loops as well, like in Eiffel, readability would be better in my
>humble opinion, while Ada have pragma assert that can be used in
>loops, and static_predicate and dynamic_predicate on types.
>
>Maybe assertion blocks would work better as in Eiffel for loops?
>Like exception block in Ada?

No idea. There are definitely other kinds of contracts not currently defined 
in Ada -- loop invariants certainly are one of those. Ada 2020 adds 
Default_Initial_Condition as a contract, and Stable_Properties as a 
technique for managing contracts (postconditions). We haven't talked much 
about other contracts, but certainly we'll do so in the future.

Such contracts would certainly be in addition to pragma Assert (which is 
general, but completely manual). And of course the form hasn't been decided.

Giving a different form to existing contracts isn't likely to get much 
interest. Additional contracts might (or might not) be different.

                          Randy.


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-06 13:35 ` Interesting article on ARG work Marius Amado-Alves
@ 2018-04-07  2:15   ` Randy Brukardt
  0 siblings, 0 replies; 57+ messages in thread
From: Randy Brukardt @ 2018-04-07  2:15 UTC (permalink / raw)


"Marius Amado-Alves" <amado.alves@gmail.com> wrote in message 
news:532d53c6-268c-468d-9300-67e08fd958c2@googlegroups.com...
> Tremendous, thanks you.
>
> (I dont know names, but I can spot *fantasy* passages like "code that
>doesn't meet its specification" and "cut the number of bugs delivered to
>customers by 90%")

Fantasy? The Blackhole policy would ensure that 90% of the code delivered to 
customers hardly ever executes, so it couldn't possibly have caused bugs. ;-) 
;-)

Moreover, since a clueless implementer took the terminology literally, they 
had sucked their entire development group and all of their beta testers into 
a blackhole. One careful look at the AI would show that wasn't exactly the 
intent. But it definitely ensures that far fewer bugs get to customers. :-)

                     Randy.





^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-06  7:30               ` Dmitry A. Kazakov
@ 2018-04-07  2:25                 ` Randy Brukardt
  2018-04-07 10:11                   ` Dmitry A. Kazakov
  0 siblings, 1 reply; 57+ messages in thread
From: Randy Brukardt @ 2018-04-07  2:25 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> wrote in message 
news:pa77nc$1pu1$1@gioia.aioe.org...
> On 06/04/2018 00:18, Randy Brukardt wrote:
...
>> In programs intended to be portable, I just use static Boolean flags for
>> debugging code. Any compiler with decent dead code elimination (which 
>> would
>> be about all of them) will get rid of most/all of the code that way. But
>> that doesn't work on pragmas, context clauses, or declarations, so it 
>> isn't
>> quite as through as @. (With Janus/Ada, which does dead subprogram
>> elimination during binding, including for tagged type primitives, the
>> difference isn't that substantial unless large data declarations are
>> involved.)
>
> Yes, it is frequently "with Ada.Text_IO" or some internal package like 
> Dump_Pool etc. One could easily imagine record components provided for 
> debugging purpose and attributes selected differently, e.g. X'Storage_Pool 
> etc.

With Janus/Ada's trimming technology, the only part of a package that has to 
appear in the final program is the elaboration code. So for a package like 
Text_IO, 98% of it would get discarded if not used. That only fails if there 
is a large data area (data isn't trimmed) and especially if there is a 
significant cost to initialize that data.

Still, there are cases where the lexical switching works better; we use it 
to suppress (or not) other checks, remove the interactive debugging 
features, and the like. (However, in practice, the last couple of Janus/Ada 
versions were delivered with all of that code intact; the code memory use of 
the compiler is almost irrelevant on modern machines, and it means that 
failures are much more likely to be predictable.)

> Another "requirement" is integration into IDE. It should be easy for the 
> IDE to hide all this code when debugging is inactive and colorize it 
> differently when active.

That sounds like a good idea.

> Janus/Ada solution looks better than pragma. And it could be extended to 
> provide multiple sections of debugging code which could be activated and 
> deactivated independently.

The biggest problem with a lexical solution (and the reason that Ichbiah 
hated them) is that it's trivial to accidentally create a program that is 
only legal in one mode or the other. One has to continually compile 
everything in each possible mode to ensure that there aren't silly syntax 
errors involved. A common mistake that I make all of the time is:

    if Foo then
           Bar;
    else
@        Internal_Error;
    end if;

With the conditional compilation off, the "Internal_Error" is commented out 
and the above isn't syntactically legal. Ichbiah wanted to eliminate that 
from Ada, so there's no Include (a common feature of other languages back in 
the day, and heavily used in the very early versions of Janus/Ada, long gone 
now), no preprocessor, and so on -- everything is handled within the usual 
syntax of the language. Which doesn't always work that well...

                    Randy.



^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-07  2:25                 ` Randy Brukardt
@ 2018-04-07 10:11                   ` Dmitry A. Kazakov
  2018-04-07 15:27                     ` Dan'l Miller
  0 siblings, 1 reply; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-07 10:11 UTC (permalink / raw)


On 2018-04-07 04:25, Randy Brukardt wrote:

> The biggest problem with a lexical solution (and the reason that Ichbiah
> hated them) is that it's trivial to accidentally create a program that is
> only legal in one mode or the other. One has to continually compile
> everything in each possible mode to ensure that there aren't silly syntax
> errors involved. A common mistake that I make all of the time is:
> 
>      if Foo then
>             Bar;
>      else
> @        Internal_Error;
>      end if;
> 
> With the conditional compilation off, the "Internal_Error" is commented out
> and the above isn't syntactically legal.

Yes, this is important but I think this could be resolved, e.g. by 
compiling conditionals into corresponding null-effect constructs.

> Ichbiah wanted to eliminate that
> from Ada, so there's no Include (a common feature of other languages back in
> the day, and heavily used in the very early versions of Janus/Ada, long gone
> now), no preprocessor, and so on -- everything is handled within the usual
> syntax of the language. Which doesn't always work that well...

Though the dangers of preprocessing outweigh any inconvenience.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-07 10:11                   ` Dmitry A. Kazakov
@ 2018-04-07 15:27                     ` Dan'l Miller
  2018-04-07 15:59                       ` Dmitry A. Kazakov
  2018-04-09 20:14                       ` Randy Brukardt
  0 siblings, 2 replies; 57+ messages in thread
From: Dan'l Miller @ 2018-04-07 15:27 UTC (permalink / raw)


On Saturday, April 7, 2018 at 5:11:35 AM UTC-5, Dmitry A. Kazakov wrote:
> On 2018-04-07 04:25, Randy Brukardt wrote:
> 
> > The biggest problem with a lexical solution (and the reason that Ichbiah
> > hated them) is that it's trivial to accidentally create a program that is
> > only legal in one mode or the other. One has to continually compile
> > everything in each possible mode to ensure that there aren't silly syntax
> > errors involved. A common mistake that I make all of the time is:
> > 
> >      if Foo then
> >             Bar;
> >      else
> > @        Internal_Error;
> >      end if;
> > 
> > With the conditional compilation off, the "Internal_Error" is commented out
> > and the above isn't syntactically legal.
>
> Yes, this is important but I think this could be resolved, e.g. by 
> compiling conditionals into corresponding null-effect constructs. 

Yeah, right.  This is an artificially contrived “problem” that is in fact not extant in reality.  Simply put the @ symbols in front of the •entire• construct to be elided, namely the whole else branch:

    if Foo then
           Bar;
@ else
@        Internal_Error;
    end if;

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-07 15:27                     ` Dan'l Miller
@ 2018-04-07 15:59                       ` Dmitry A. Kazakov
  2018-04-08  0:14                         ` Dan'l Miller
  2018-04-09 20:14                       ` Randy Brukardt
  1 sibling, 1 reply; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-07 15:59 UTC (permalink / raw)


On 2018-04-07 17:27, Dan'l Miller wrote:
> On Saturday, April 7, 2018 at 5:11:35 AM UTC-5, Dmitry A. Kazakov wrote:
>> On 2018-04-07 04:25, Randy Brukardt wrote:
>>
>>> The biggest problem with a lexical solution (and the reason that Ichbiah
>>> hated them) is that it's trivial to accidentally create a program that is
>>> only legal in one mode or the other. One has to continually compile
>>> everything in each possible mode to ensure that there aren't silly syntax
>>> errors involved. A common mistake that I make all of the time is:
>>>
>>>       if Foo then
>>>              Bar;
>>>       else
>>> @        Internal_Error;
>>>       end if;
>>>
>>> With the conditional compilation off, the "Internal_Error" is commented out
>>> and the above isn't syntactically legal.
>>
>> Yes, this is important but I think this could be resolved, e.g. by
>> compiling conditionals into corresponding null-effect constructs.
> 
> Yeah, right.  This is an artificially contrived “problem” that is in fact not extant in reality.  Simply put the @ symbols in front of the •entire• construct to elide:  the entire else branch.
> 
>      if Foo then
>             Bar;
> @ else
> @        Internal_Error;
>      endif;
> 

"@ else"
"@        Internal_Error;"

is not a valid sequence of statements. The point is that, like with a 
pragma, the conditional text must be a syntactic entity, e.g. a sequence of 
statements, a Boolean-valued expression, or a declaration, not just a 
sequence of characters.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-07 15:59                       ` Dmitry A. Kazakov
@ 2018-04-08  0:14                         ` Dan'l Miller
  2018-04-08  7:46                           ` Dmitry A. Kazakov
  0 siblings, 1 reply; 57+ messages in thread
From: Dan'l Miller @ 2018-04-08  0:14 UTC (permalink / raw)


On Saturday, April 7, 2018 at 11:00:02 AM UTC-5, Dmitry A. Kazakov wrote:
> On 2018-04-07 17:27, Dan'l Miller wrote:
> > On Saturday, April 7, 2018 at 5:11:35 AM UTC-5, Dmitry A. Kazakov wrote:
> >> On 2018-04-07 04:25, Randy Brukardt wrote:
> >>
> >>> The biggest problem with a lexical solution (and the reason that Ichbiah
> >>> hated them) is that it's trivial to accidentally create a program that is
> >>> only legal in one mode or the other. One has to continually compile
> >>> everything in each possible mode to ensure that there aren't silly syntax
> >>> errors involved. A common mistake that I make all of the time is:
> >>>
> >>>       if Foo then
> >>>              Bar;
> >>>       else
> >>> @        Internal_Error;
> >>>       end if;
> >>>
> >>> With the conditional compilation off, the "Internal_Error" is commented out
> >>> and the above isn't syntactically legal.
> >>
> >> Yes, this is important but I think this could be resolved, e.g. by
> >> compiling conditionals into corresponding null-effect constructs.
> > 
> > Yeah, right.  This is an artificially contrived “problem” that is in fact not extant in reality.  Simply put the @ symbols in front of the •entire• construct to elide:  the entire else branch.
> > 
> >      if Foo then
> >             Bar;
> > @ else
> > @        Internal_Error;
> >      endif;
> > 
> 
> "@ else"
> "@        Internal_Error;"
> 
> Is not a valid sequence of statements.

That is a non sequitur, Dmitry. 

     if Foo then 
           Bar; 
@ else 
@        Internal_Error; 
     end if;

produces either this legal Ada if-else statement, where ␠ is the ASCII/ISO646 space character:

     if Foo then 
           Bar; 
␠␠ else 
␠␠       Internal_Error; 
     end if;

or this legal Ada if statement:

     if Foo then 
           Bar; 
-- else 
--       Internal_Error; 
     end if;

Compare & contrast, and then you will understand what Randy was saying.

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-08  0:14                         ` Dan'l Miller
@ 2018-04-08  7:46                           ` Dmitry A. Kazakov
  2018-04-08 19:48                             ` Dan'l Miller
  0 siblings, 1 reply; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-08  7:46 UTC (permalink / raw)


On 2018-04-08 02:14, Dan'l Miller wrote:
> On Saturday, April 7, 2018 at 11:00:02 AM UTC-5, Dmitry A. Kazakov wrote:
>> On 2018-04-07 17:27, Dan'l Miller wrote:
>>> On Saturday, April 7, 2018 at 5:11:35 AM UTC-5, Dmitry A. Kazakov wrote:
>>>> On 2018-04-07 04:25, Randy Brukardt wrote:
>>>>
>>>>> The biggest problem with a lexical solution (and the reason that Ichbiah
>>>>> hated them) is that it's trivial to accidentally create a program that is
>>>>> only legal in one mode or the other. One has to continually compile
>>>>> everything in each possible mode to ensure that there aren't silly syntax
>>>>> errors involved. A common mistake that I make all of the time is:
>>>>>
>>>>>        if Foo then
>>>>>               Bar;
>>>>>        else
>>>>> @        Internal_Error;
>>>>>        end if;
>>>>>
>>>>> With the conditional compilation off, the "Internal_Error" is commented out
>>>>> and the above isn't syntactically legal.
>>>>
>>>> Yes, this is important but I think this could be resolved, e.g. by
>>>> compiling conditionals into corresponding null-effect constructs.
>>>
>>> Yeah, right.  This is an artificially contrived “problem” that is in fact not extant in reality.  Simply put the @ symbols in front of the •entire• construct to elide:  the entire else branch.
>>>
>>>       if Foo then
>>>              Bar;
>>> @ else
>>> @        Internal_Error;
>>>       endif;
>>>
>>
>> "@ else"
>> "@        Internal_Error;"
>>
>> Is not a valid sequence of statements.
> 
> That is a non sequitur, Dmitry.
> 
>       if Foo then
>             Bar;
> @ else
> @        Internal_Error;
>       endif;
> 
> produces either this legal Ada if-else statement,

[...]

ARM 5.1 (2/3) defines a sequence of statements. "else Internal_Error;" is 
not a sequence of statements. The rule is simple: you can insert only 
complete statements.

    if Foo then
@     Bar;
@  end if;
@  if Boo then
       Baz;
    end if;

would not be legal either.

More complicated rules are required for declarations, e.g. that 
conditionally declared entities would not be visible outside conditional 
code:

declare
@  X : Integer;
    Y : Integer;
begin
    Y := X; -- This is illegal
@  Y := X; -- This is OK
end;

or

@ with Text_IO;
use Text_IO; -- No, that does not work

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-08  7:46                           ` Dmitry A. Kazakov
@ 2018-04-08 19:48                             ` Dan'l Miller
  2018-04-08 20:09                               ` Dmitry A. Kazakov
  2018-04-09 16:12                               ` Niklas Holsti
  0 siblings, 2 replies; 57+ messages in thread
From: Dan'l Miller @ 2018-04-08 19:48 UTC (permalink / raw)


On Sunday, April 8, 2018 at 2:46:15 AM UTC-5, Dmitry A. Kazakov wrote:
>     if Foo then
> @     Bar;
> @  end if;
> @  if Boo then
>        Baz;
>     end if;
> 
> would not be legal either.

Where ␠ is the ASCII/ISO646 space character (presented as a graphical character for visualization herein),

    if Foo then
␠     Bar;
␠  end if;
␠  if Boo then
       Baz;
    end if;

and

    if Foo then
--     Bar;
--  end if;
--  if Boo then
       Baz;
    end if;

both are legal Ada statements, spurious and non sequitur evocations of the ARM notwithstanding.

In general, Dmitry, you seem to not understand that a generalized left-right (GLR) parser would parse both sides of the @ concurrently.  Your entire premise is that it is impossible to parse both presence and absence of @ at the same time.  In GLR, you are incorrect.

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-08 19:48                             ` Dan'l Miller
@ 2018-04-08 20:09                               ` Dmitry A. Kazakov
  2018-04-09  3:50                                 ` Dan'l Miller
  2018-04-09 16:12                               ` Niklas Holsti
  1 sibling, 1 reply; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-08 20:09 UTC (permalink / raw)


On 2018-04-08 21:48, Dan'l Miller wrote:
> On Sunday, April 8, 2018 at 2:46:15 AM UTC-5, Dmitry A. Kazakov wrote:
>>      if Foo then
>> @     Bar;
>> @  end if;
>> @  if Boo then
>>         Baz;
>>      end if;
>>
>> would not be legal either.
> 
> Where ␠ is the ASCII/ISO646 space character (presented as a graphical character for visualization herein),
> 
>      if Foo then
> ␠     Bar;
> ␠  end if;
> ␠  if Boo then
>         Baz;
>      end if;
> 
> and
> 
>      if Foo then
> --     Bar;
> --  end if;
> --  if Boo then
>         Baz;
>      end if;
> 
> both are legal Ada statements, spurious and non sequitur evocations of the ARM notwithstanding.
> 
> In general, Dmitry, you seem to not understand that a generalized left-right (GLR) parser would parse both sides of the @ concurrently.  Your entire premise is that it is impossible to parse both presence and absence of @ at the same time.  In GLR, you are incorrect.

No, I don't understand what you are trying to say or how it is related 
to the issue discussed.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-08 20:09                               ` Dmitry A. Kazakov
@ 2018-04-09  3:50                                 ` Dan'l Miller
  2018-04-09  6:40                                   ` Jan de Kruyf
  2018-04-09  7:43                                   ` Dmitry A. Kazakov
  0 siblings, 2 replies; 57+ messages in thread
From: Dan'l Miller @ 2018-04-09  3:50 UTC (permalink / raw)


On Sunday, April 8, 2018 at 3:09:55 PM UTC-5, Dmitry A. Kazakov wrote:
> On 2018-04-08 21:48, Dan'l Miller wrote:
> > On Sunday, April 8, 2018 at 2:46:15 AM UTC-5, Dmitry A. Kazakov wrote:
> >>      if Foo then
> >> @     Bar;
> >> @  end if;
> >> @  if Boo then
> >>         Baz;
> >>      end if;
> >>
> >> would not be legal either.
> > 
> > Where ␠ is the ASCII/ISO646 space character (presented as a graphical character for visualization herein),
> > 
> >      if Foo then
> > ␠     Bar;
> > ␠  end if;
> > ␠  if Boo then
> >         Baz;
> >      end if;
> > 
> > and
> > 
> >      if Foo then
> > --     Bar;
> > --  end if;
> > --  if Boo then
> >         Baz;
> >      end if;
> > 
> > both are legal Ada statements, spurious and non sequitur evocations of the ARM notwithstanding.
> > 
> > In general, Dmitry, you seem to not understand that a generalized left-right (GLR) parser would parse both sides of the @ concurrently.  Your entire premise is that it is impossible to parse both presence and absence of @ at the same time.  In GLR, you are incorrect.
> 
> No, I don't understand what are you trying to say or how is it related 
> to the issue discussed.

I am saying that your insistence on conformance of the extent of the @-code to any portion of the BNF (or anything else) in the ARM is 100% misguided, tantamount to being a troller fisherman.

I am saying that you are 100% incorrect that only the presence or only the absence of the @-code will be compiled in the most-exemplary implementation, leaving the other uncompiled variant to bit-rot uncompiled for extended periods of time.  I am saying that you sure appear to be quite ignorant of GLR parsers' ability to parse both the presence and absence of the @-code ••concurrently in the same invocation•• of the compiler in O(n) time in this case (because the @ mechanism does not suffer combinatorial explosion of, say, C preprocessor's arbitrarily-nested #if...#endif constructs, which in their worst case can evoke GLR parsers' O(n³)-growth time on top of the combinatorial explosion's growth rate).  GLR parsers accomplish this by forking/bifurcating the push-down automata's reduced-production stack before rejoining the forks into traditional LR mode:  one fork of the bifurcated reduced-production stack for •presence• of the @-code and one fork of the bifurcated reduced-production stack for the •absence• of the @-code.  I am saying that you apparently have not even read the Wikipedia article on GLR parsers:  https://en.wikipedia.org/wiki/GLR_parser

I am saying that nearly everything that you have said among your multiple replies to Randy's @-code is hogwash and horse-hooey.  Perhaps.  Every.  Word.  Of.  It.  (especially regarding your pontifications in the large for all possible implementations that no well-oiled-machine spectacular implementation of the @-code is possible; perhaps you might hit the nail on the head on a subpoint here and there regarding some less-than-exemplary implementation).

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09  3:50                                 ` Dan'l Miller
@ 2018-04-09  6:40                                   ` Jan de Kruyf
  2018-04-09  7:43                                   ` Dmitry A. Kazakov
  1 sibling, 0 replies; 57+ messages in thread
From: Jan de Kruyf @ 2018-04-09  6:40 UTC (permalink / raw)


On Monday, 9 April 2018 05:50:32 UTC+2, Dan'l Miller  wrote:
> On Sunday, April 8, 2018 at 3:09:55 PM UTC-5, Dmitry A. Kazakov wrote:
> > On 2018-04-08 21:48, Dan'l Miller wrote:
> > > On Sunday, April 8, 2018 at 2:46:15 AM UTC-5, Dmitry A. Kazakov wrote:
> > >>      if Foo then
> > >> @     Bar;
> > >> @  end if;
> > >> @  if Boo then
> > >>         Baz;
> > >>      end if;
> > >>
> > >> would not be legal either.
> > > 
> > > Where ␠ is the ASCII/ISO646 space character (presented as a graphical character for visualization herein),
> > > 
> > >      if Foo then
> > > ␠     Bar;
> > > ␠  end if;
> > > ␠  if Boo then
> > >         Baz;
> > >      end if;
> > > 
> > > and
> > > 
> > >      if Foo then
> > > --     Bar;
> > > --  end if;
> > > --  if Boo then
> > >         Baz;
> > >      end if;
> > > 
> > > both are legal Ada statements, spurious and non sequitur evocations of the ARM notwithstanding.
> > > 
> > > In general, Dmitry, you seem to not understand that a generalized left-right (GLR) parser would parse both sides of the @ concurrently.  Your entire premise is that it is impossible to parse both presence and absence of @ at the same time.  In GLR, you are incorrect.
> > 
> > No, I don't understand what are you trying to say or how is it related 
> > to the issue discussed.
> 
> I am saying that your insistence on conformance of the extent of the @-code to any portion of the BNF (or anything else) in the ARM is 100% misguided, tantamount to being a troller fisherman.
> 
> I am saying that you are 100% incorrect that only the presence or only the absence of the @-code will be compiled in the most-exemplary implementation, leaving the other uncompiled variant to bit-rot uncompiled for extended periods of time.  I am saying that you sure appear to be quite ignorant of GLR parsers' ability to parse both the presence and absence of the @-code ••concurrently in the same invocation•• of the compiler in O(n) time in this case (because the @ mechanism does not suffer combinatorial explosion of, say, C preprocessor's arbitrarily-nested #if...#endif constructs, which in their worst case can evoke GLR parsers' O(n³)-growth time on top of the combinatorial explosion's growth rate).  GLR parsers accomplish this by forking/bifurcating the push-down automata's reduced-production stack before rejoining the forks into traditional LR mode:  one fork of the bifurcated reduced-production stack for •presence• of the @-code and one fork of the bifurcated reduced-production stack for the •absence• of the @-code.  I am saying that you apparently have not even read the Wikipedia article on GLR parsers:  https://en.wikipedia.org/wiki/GLR_parser
> 
> I am saying that nearly everything that you have said among your multiple replies to Randy's @-code is hogwash and horse-hooey.  Perhaps.  Every.  Word.  Of.  It.  (especially regarding your pontifications in the large for all possible implementations that no well-oiled-machine spectacular implementation of the @-code is possible; perhaps you might hit the nail on the head on a subpoint here and there regarding some less-than-exemplary implementation).

Dan'l, please don't get overexcited; we all live and learn, you know. Even Dmitry, and yes, even I (though I hate to admit it!)

You are probably quite right about what you see, from the place where you stand. But Dmitry stands in a different place and consequently sees different things.

Other than that, I'd like to stress that most in the group know Dmitry's worth to us. I would say he is quite valuable.

Cheers,

j.

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09  3:50                                 ` Dan'l Miller
  2018-04-09  6:40                                   ` Jan de Kruyf
@ 2018-04-09  7:43                                   ` Dmitry A. Kazakov
  2018-04-09 13:40                                     ` Dan'l Miller
  1 sibling, 1 reply; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-09  7:43 UTC (permalink / raw)


On 09/04/2018 05:50, Dan'l Miller wrote:

> I am saying that you are 100% incorrect that only the presence or only the absence of the @-code will be compiled in the most-exemplary implementation, leaving the other uncompiled variant to bit-rot uncompiled for extended periods of time.

I didn't say that. On the contrary, all combinations must be compiled, 
independently of which parts are presently active or inactive, just the 
same way a pragma is compiled regardless of whether the compiler is going 
to ignore it in the end. That is the difference from preprocessing.

> I am saying that you sure appear to be quite ignorant of GLR parsers' 
> ability to parse both the presence and absence of the @-code 
> ••concurrently in the same invocation•• of the compiler in O(n) time

Correct. I don't care about LR etc. parsers and consider the whole branch 
of formal grammars with their accompanying parsers useless. None of the 
compilers for domain-specific languages I have developed over the years 
ever used that stuff.

If you want to say that your parser could not handle that, I would not 
question it. Maybe. I doubt it, but I do not care, for the reasons given 
above.

I also consider the idea of source-code generation in general, and from a 
formal grammar in particular, incredibly harmful from a software-development 
point of view. There must be no meta-languages involved except where 
absolutely necessary (e.g. in correctness proofs).

> I am saying that nearly everything that you have said among your multiple replies to Randy's @-code is hogwash and horse-hooey.  Perhaps.  Every.  Word.  Of.  It.

It is easy to be perfect. To be wrong in every word is almost impossible 
(due to the Liar Antinomy). So thanks. (:-))

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09  7:43                                   ` Dmitry A. Kazakov
@ 2018-04-09 13:40                                     ` Dan'l Miller
  2018-04-09 14:13                                       ` Dmitry A. Kazakov
  0 siblings, 1 reply; 57+ messages in thread
From: Dan'l Miller @ 2018-04-09 13:40 UTC (permalink / raw)


On Monday, April 9, 2018 at 2:43:46 AM UTC-5, Dmitry A. Kazakov wrote:
> On 09/04/2018 05:50, Dan'l Miller wrote:
> 
> > I am saying that you are 100% incorrect that only the presence or only the absence of the @-code will be compiled in the most-exemplary implementation, leaving the other uncompiled variant to bit-rot uncompiled for extended periods of time.
> 
> I didn't say that.

Oh yes you did by examples, where you (and even enlisting the late Ichbiah at one point) give examples of bit-rot where some unwise programmer of @-code left a presence/absence branch of  the @-code uncompilable henceforth.  Your examples were clearly making the case that Mr. Unwise Programmer was going to stink up the codebase with @-code that was compiled only with @-code present that would break in would-be builds where @-code was absent that Mr. Unwise Programmer cavalierly failed to test-build, but that the next beleaguered programmer would need to rectify upon building with @-code absent.  Your entire premise was based on the mistaken claim that the build-time would not build both presence and absence of the @-code in one shot to show Mr. Unwise Programmer the error of his ways right then and there upfront in his face.  Let us look at 2 of those examples:

Dmitry wrote on 08 April 2018:
> More complicated rules are required for declarations, e.g. that 
> conditionally declared entities would not be visible outside conditional 
> code: 
>
> declare 
> @  X : Integer; 
>     Y : Integer; 
> begin 
>     Y := X; -- This is illegal 
> @  Y := X; -- This is OK 
> end; 
>
> or 
>
> @ with Text_IO; 
> use Text_IO; -- No, that does not work 

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 13:40                                     ` Dan'l Miller
@ 2018-04-09 14:13                                       ` Dmitry A. Kazakov
  2018-04-09 14:36                                         ` Dan'l Miller
  0 siblings, 1 reply; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-09 14:13 UTC (permalink / raw)


On 09/04/2018 15:40, Dan'l Miller wrote:
> On Monday, April 9, 2018 at 2:43:46 AM UTC-5, Dmitry A. Kazakov wrote:
>> On 09/04/2018 05:50, Dan'l Miller wrote:
>>
>>> I am saying that you are 100% incorrect that only the presence or only the absence of the @-code will be compiled in the most-exemplary implementation, leaving the other uncompiled variant to bit-rot uncompiled for extended periods of time.
>>
>> I didn't say that.
> 
> Oh yes you did by examples, where you (and even enlisting the late Ichbiah at one point) give examples of bit-rot where some unwise programmer of @-code left a presence/absence branch of  the @-code uncompilable henceforth.  Your examples were clearly making the case that Mr. Unwise Programmer was going to stink up the codebase with @-code that was compiled only with @-code present that would break in would-be builds where @-code was absent that Mr. Unwise Programmer cavalierly failed to test-build, but that the next beleaguered programmer would need to rectify upon building with @-code absent.  Your entire premise was based on the mistaken claim that the build-time would not build both presence and absence of the @-code in one shot to show Mr. Unwise Programmer the error of his ways right then and there upfront in his face.  Let us look at 2 of those examples:
> 
> Dmitry wrote on 08 April 2018:
>> More complicated rules are required for declarations, e.g. that
>> conditionally declared entities would not be visible outside conditional
>> code:
>>
>> declare
>> @  X : Integer;
>>      Y : Integer;
>> begin
>>      Y := X; -- This is illegal
>> @  Y := X; -- This is OK
>> end;
>>
>> or
>>
>> @ with Text_IO;
>> use Text_IO; -- No, that does not work

Both examples present code that would be illegal. What is your point? To 
have it legal?

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 14:13                                       ` Dmitry A. Kazakov
@ 2018-04-09 14:36                                         ` Dan'l Miller
  2018-04-09 14:44                                           ` Dmitry A. Kazakov
  0 siblings, 1 reply; 57+ messages in thread
From: Dan'l Miller @ 2018-04-09 14:36 UTC (permalink / raw)


On Monday, April 9, 2018 at 9:13:26 AM UTC-5, Dmitry A. Kazakov wrote:
> On 09/04/2018 15:40, Dan'l Miller wrote:
> > On Monday, April 9, 2018 at 2:43:46 AM UTC-5, Dmitry A. Kazakov wrote:
> >> On 09/04/2018 05:50, Dan'l Miller wrote:
> >>
> >>> I am saying that you are 100% incorrect that only the presence or only the absence of the @-code will be compiled in the most-exemplary implementation, leaving the other uncompiled variant to bit-rot uncompiled for extended periods of time.
> >>
> >> I didn't say that.
> > 
> > Oh yes you did by examples, where you (and even enlisting the late Ichbiah at one point) give examples of bit-rot where some unwise programmer of @-code left a presence/absence branch of  the @-code uncompilable henceforth.  Your examples were clearly making the case that Mr. Unwise Programmer was going to stink up the codebase with @-code that was compiled only with @-code present that would break in would-be builds where @-code was absent that Mr. Unwise Programmer cavalierly failed to test-build, but that the next beleaguered programmer would need to rectify upon building with @-code absent.  Your entire premise was based on the mistaken claim that the build-time would not build both presence and absence of the @-code in one shot to show Mr. Unwise Programmer the error of his ways right then and there upfront in his face.  Let us look at 2 of those examples:
> > 
> > Dmitry wrote on 08 April 2018:
> >> More complicated rules are required for declarations, e.g. that
> >> conditionally declared entities would not be visible outside conditional
> >> code:
> >>
> >> declare
> >> @  X : Integer;
> >>      Y : Integer;
> >> begin
> >>      Y := X; -- This is illegal
> >> @  Y := X; -- This is OK
> >> end;
> >>
> >> or
> >>
> >> @ with Text_IO;
> >> use Text_IO; -- No, that does not work
> 
> Both examples present would be illegal code. What is your point? To have 
> it legal?

I have already answered that completely in my prior replies:  an exemplary implementation would compile both sides of the @-code presence and absence to reveal to Mr. Unwise Programmer his uncouthness right away—either via GLR, GLL, or brute-force invoking the compiler twice, once with and once without @-code.  Assuming that you read at all before posting, my point now is becoming:  methinks something else is going on here with you, Dmitry.


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 14:36                                         ` Dan'l Miller
@ 2018-04-09 14:44                                           ` Dmitry A. Kazakov
  2018-04-09 15:03                                             ` Dan'l Miller
  0 siblings, 1 reply; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-09 14:44 UTC (permalink / raw)


On 09/04/2018 16:36, Dan'l Miller wrote:
> On Monday, April 9, 2018 at 9:13:26 AM UTC-5, Dmitry A. Kazakov wrote:
>> On 09/04/2018 15:40, Dan'l Miller wrote:
>>> On Monday, April 9, 2018 at 2:43:46 AM UTC-5, Dmitry A. Kazakov wrote:
>>>> On 09/04/2018 05:50, Dan'l Miller wrote:
>>>>
>>>>> I am saying that you are 100% incorrect that only the presence or only the absence of the @-code will be compiled in the most-exemplary implementation, leaving the other uncompiled variant to bit-rot uncompiled for extended periods of time.
>>>>
>>>> I didn't say that.
>>>
>>> Oh yes you did by examples, where you (and even enlisting the late Ichbiah at one point) give examples of bit-rot where some unwise programmer of @-code left a presence/absence branch of  the @-code uncompilable henceforth.  Your examples were clearly making the case that Mr. Unwise Programmer was going to stink up the codebase with @-code that was compiled only with @-code present that would break in would-be builds where @-code was absent that Mr. Unwise Programmer cavalierly failed to test-build, but that the next beleaguered programmer would need to rectify upon building with @-code absent.  Your entire premise was based on the mistaken claim that the build-time would not build both presence and absence of the @-code in one shot to show Mr. Unwise Programmer the error of his ways right then and there upfront in his face.  Let us look at 2 of those examples:
>>>
>>> Dmitry wrote on 08 April 2018:
>>>> More complicated rules are required for declarations, e.g. that
>>>> conditionally declared entities would not be visible outside conditional
>>>> code:
>>>>
>>>> declare
>>>> @  X : Integer;
>>>>       Y : Integer;
>>>> begin
>>>>       Y := X; -- This is illegal
>>>> @  Y := X; -- This is OK
>>>> end;
>>>>
>>>> or
>>>>
>>>> @ with Text_IO;
>>>> use Text_IO; -- No, that does not work
>>
>> Both examples present would be illegal code. What is your point? To have
>> it legal?
> 
> I have already answered that completely in my prior replies:  an exemplary implementation would compile both sides of the @-code presence and absence to reveal to Mr. Unwise Programmer his uncouthness right away—either via GLR, GLL, or brute-force invoking the compiler twice, once with and once without @-code.

And the problem/point is?

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 14:44                                           ` Dmitry A. Kazakov
@ 2018-04-09 15:03                                             ` Dan'l Miller
  0 siblings, 0 replies; 57+ messages in thread
From: Dan'l Miller @ 2018-04-09 15:03 UTC (permalink / raw)


On Monday, April 9, 2018 at 9:44:17 AM UTC-5, Dmitry A. Kazakov wrote:
> On 09/04/2018 16:36, Dan'l Miller wrote:
> >> Both examples present would be illegal code. What is your point? To have
> >> it legal?
> > 
> > I have already answered that completely in my prior replies:  an exemplary implementation would compile both sides of the @-code presence and absence to reveal to Mr. Unwise Programmer his uncouthness right away—either via GLR, GLL, or brute-force invoking the compiler twice, once with and once without @-code.
> 
> And the problem/point is?

> > Assuming that you read at all before posting, my point now is becoming:
> > methinks something else is going on here with you, Dmitry. 

troller fisherman—an elaborate and sophisticated one most of the time in replies to other top-level postings, but you were sloppier this time herein, accidentally more directly revealing what you do

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-08 19:48                             ` Dan'l Miller
  2018-04-08 20:09                               ` Dmitry A. Kazakov
@ 2018-04-09 16:12                               ` Niklas Holsti
  2018-04-09 16:30                                 ` Dmitry A. Kazakov
  2018-04-09 18:08                                 ` Dan'l Miller
  1 sibling, 2 replies; 57+ messages in thread
From: Niklas Holsti @ 2018-04-09 16:12 UTC (permalink / raw)


On 18-04-08 22:48 , Dan'l Miller wrote:
> On Sunday, April 8, 2018 at 2:46:15 AM UTC-5, Dmitry A. Kazakov wrote:
>>     if Foo then
>> @     Bar;
>> @  end if;
>> @  if Boo then
>>        Baz;
>>     end if;
>>
>> would not be legal either.
>
> Where ␠ is the ASCII/ISO646 space character (presented as
> a graphical character for visualization herein),
>
>     if Foo then
> ␠     Bar;
> ␠  end if;
> ␠  if Boo then
>        Baz;
>     end if;
>
> and
>
>     if Foo then
> --     Bar;
> --  end if;
> --  if Boo then
>        Baz;
>     end if;
>
> both are legal Ada statements, spurious and non sequitur
> evocations of the ARM notwithstanding.

I'm sure that Dmitry understands fully how the "@" mechanism currently 
works in Janus/Ada: as a lexical marker that either makes the line 
visible to the parser and compiler, or makes it into a comment line that 
is essentially invisible (white space) to the parser and compiler.

As Randy showed, it is then easy to make editing mistakes that make one 
of the two resulting source-code forms grammatically incorrect.

I think Dmitry is suggesting that to avoid such problems, the inclusion 
or exclusion of code parts should not be done on the lexical level, line 
by line, but should be better integrated into the grammatical structure 
so that the included or excluded pieces of code would be whole 
grammatical structures, with grammatically defined boundaries.

> In general, Dmitry, you seem to not understand that a generalized
> left-right (GLR) parser would parse both sides of the @ concurrently.
> Your entire premise is that it is impossible to parse both presence
> and absence of @ at the same time.  In GLR, you are incorrect.

I don't think that GLR parsing solves this problem. Firstly, I assume 
that you are thinking of non-deterministic grammar rules in which  "@ 
<construct>" could be parsed either into "<construct>", understanding 
the "@" option as "enabled", or into whitespace, understanding the "@" 
option as "disabled". Well and good, but the GLR parser would apply this 
nondeterministic choice separately at each occurrence of a "@", so there 
would be an exponentially large set of alternative parse attempts.

For example, the text

     if Foo then
        Bar;
@   else
@      Baz;
     end if;

would be parsed in four ways, according to the four combinations of 
choices (enabled/disabled) for the two @ signs:

First way (disabled - disabled):

    if Foo then
       Bar;
    end if;

Second way (enabled - disabled):

    if Foo then
       Bar;
    else
    end if;

Third way (disabled - enabled):

    if Foo then
       Bar;
       Baz;
    end if;

Fourth way (enabled - enabled):

    if Foo then
       Bar;
    else
       Baz;
    end if;

Only the first and fourth ways correspond to how "@" works in Janus/Ada. 
The second way is even syntactically illegal, but an error message from 
a GLR parser would be quite confusing to a Janus/Ada user.

For GLR to work for this problem, you would have to extend it with some 
kind of "uniform choice" rule, by which all non-determinism depending on 
the use of "@" should follow a single, global, non-deterministic choice.

-- 
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 16:12                               ` Niklas Holsti
@ 2018-04-09 16:30                                 ` Dmitry A. Kazakov
  2018-04-09 16:45                                   ` Niklas Holsti
  2018-04-09 20:24                                   ` Randy Brukardt
  2018-04-09 18:08                                 ` Dan'l Miller
  1 sibling, 2 replies; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-09 16:30 UTC (permalink / raw)


On 2018-04-09 18:12, Niklas Holsti wrote:

> For example, the text
> 
>      if Foo then
>         Bar;
> @   else
> @      Baz;
>      end if;
> 
> would be parsed in four ways, according to the four combinations of 
> choices (enabled/disabled) for the two @ signs:

[...]

It is a trivial recursive-descent parser. "else" cannot follow "@", 
because there is no statement starting with "else". Full stop.

You compile all code, always, taking choices at the stage of code 
generation. Simple as tooth powder.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 16:30                                 ` Dmitry A. Kazakov
@ 2018-04-09 16:45                                   ` Niklas Holsti
  2018-04-09 17:33                                     ` Dan'l Miller
  2018-04-09 19:47                                     ` Dmitry A. Kazakov
  2018-04-09 20:24                                   ` Randy Brukardt
  1 sibling, 2 replies; 57+ messages in thread
From: Niklas Holsti @ 2018-04-09 16:45 UTC (permalink / raw)


On 18-04-09 19:30 , Dmitry A. Kazakov wrote:
> On 2018-04-09 18:12, Niklas Holsti wrote:
>
>> For example, the text
>>
>>      if Foo then
>>         Bar;
>> @   else
>> @      Baz;
>>      end if;
>>
>> would be parsed in four ways, according to the four combinations of
>> choices (enabled/disabled) for the two @ signs:
>
> [...]
>
> It is a trivial recursive-descent parser. "else" cannot follow "@",
> because there is no statement starting with "else". Full stop.
>
> You compile all code, always, taking choices at the stage of code
> generation. Simple as tooth powder.

I was only commenting on the inapplicability of a GLR parser for this 
problem.

You seem to be suggesting a specific way to use "@" to include/exclude 
entire statements (not text lines) from the effective code. Could you 
please make your suggestion explicit? For example, with regard to the 
need (or perhaps not) to have at least a "null;" statement in each branch.

-- 
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 16:45                                   ` Niklas Holsti
@ 2018-04-09 17:33                                     ` Dan'l Miller
  2018-04-09 19:47                                     ` Dmitry A. Kazakov
  1 sibling, 0 replies; 57+ messages in thread
From: Dan'l Miller @ 2018-04-09 17:33 UTC (permalink / raw)


On Monday, April 9, 2018 at 11:45:29 AM UTC-5, Niklas Holsti wrote:
> On 18-04-09 19:30 , Dmitry A. Kazakov wrote:
> > On 2018-04-09 18:12, Niklas Holsti wrote:
> > ...
> > It is a trivial recursive-descent parser. "else" cannot follow "@",
> > because there is no statement starting with "else". Full stop.
> > ...
>
> You seem to be suggesting a specific way to use "@" to include/exclude 
> entire statements (not text lines) from the effective code.

Yes, that is the red herring that Dmitry is repeatedly gratuitously injecting for no good reason other than to be a troller fisherman to elicit replies.


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 16:12                               ` Niklas Holsti
  2018-04-09 16:30                                 ` Dmitry A. Kazakov
@ 2018-04-09 18:08                                 ` Dan'l Miller
  2018-04-09 21:17                                   ` Niklas Holsti
  1 sibling, 1 reply; 57+ messages in thread
From: Dan'l Miller @ 2018-04-09 18:08 UTC (permalink / raw)


On Monday, April 9, 2018 at 11:12:56 AM UTC-5, Niklas Holsti wrote:
> On 18-04-08 22:48 , Dan'l Miller wrote:
> ... but the GLR parser would apply this 
> nondeterministic choice separately at each occurrence of a "@", so there 
> would be an exponentially large set of alternative parse attempts.

Good Lord!  What an especially inappropriate design to intentionally cause combinatorial explosion.  Why look for the worst possible design?  The following URL's design would be wiser and practical (e.g., where @ is isomorphic to #if DEBUG in the C preprocessor and thus exactly one CONFIG…X in this paper).  This paper explains quite lucidly the fork-merge of the temporarily bifurcated reduced-production stack.  Note that this paper solves the general case of numerous conditional-compilation CONFIG…Xs.  Here with @-code there is exactly one such conditional-compilation configurator per compilation unit.  The reduced-production stack will temporarily bifurcate into exactly 2 parallel reduced-production stacks (and have exactly 2 alternative AST subtrees extant concurrently for the affected lines of source code).  Then those 2 bifurcations will be merged again into a traditional LR reduced-production stack upon lack of an @ prefix after a contiguous sequence of @-code lines.  The GLR is driven by a bifurcatable lexical-analysis layer that intentionally (dynamically) introduces an ambiguity.  Niklas, you seem to be thinking that @ would appear in the GLRed BNF.  No, it would not, at least not in an exemplary implementation.  In GLR, the origin of the ambiguity can be from a variety of origins:  codified in the grammar itself, dynamically injected by the lexical-analysis layer, a programmer's unwisely-excessively-deeply nested conditional-compilation via a preprocessor, and so forth.

https://cs.nyu.edu/rgrimm/papers/pldi12.pdf

And if bottom-up parsing is too out-of-vogue for anyone out there, then there exists GLL for bifurcating the top-down recursive-descent call-stack, such as via generators.  It is merely a difference of which stack is being bifurcated into 2 temporarily-both-entertained stacks (to produce 2 alternative AST subtrees extant concurrently for the affected lines of source code) before being merged back together as
1) a traditional LL call-stack of guesses that turned out correct
or
2) a traditional LR reduced-production stack of •reactive•-system's observance of a stream of tokens.  (But I digress to that Rx/Ix prior discussion here on comp.lang.ada.)


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 16:45                                   ` Niklas Holsti
  2018-04-09 17:33                                     ` Dan'l Miller
@ 2018-04-09 19:47                                     ` Dmitry A. Kazakov
  1 sibling, 0 replies; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-09 19:47 UTC (permalink / raw)


On 2018-04-09 18:45, Niklas Holsti wrote:

> You seem to be suggesting a specific way to use "@" to include/exclude 
> entire statements (not text lines) from the effective code.

Right:

- statements in the context of control flow;
- declarations in the context of declarations;
- Boolean expression within an expression (replaced by False when removed).

Conditional declarations are invisible outside conditional code:

    declare
       @ X : Integer;
    begin
       X := 1; -- Illegal, X is not visible

Unconditionally declared objects are immutable within conditional code.

    declare
       X : Integer;
    begin
       @ X := 1; -- Illegal, X is read-only

Maybe this rule should be relaxed a bit.

> Could you 
> please make your suggestion explicit? For example, with regard to the 
> need (or perhaps not) to have at least a "null;" statement in each branch.

All the usual rules regarding null and return statements apply:

    if Foo then
       @ null;
       -- Illegal, statement is expected
    end if;

    function Foo return Integer is
    begin
       null;
       @ return 123;
    end Foo; -- Illegal, no return statement

But this is legal:

    function Foo return Integer is
    begin
       @ return 123;
       return 122;
    end Foo;

BTW, I don't like @ much. Maybe there should be a better syntax, 
especially to support multiple conditional spaces, e.g. for different 
things to debug. Some sort of bracketing construct would work better:

    pragma Debug (Pool) do
       -- Debugging code for Pool
    end Debug;
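
For a single call, GNAT's implementation-defined pragma Debug already 
behaves somewhat like such a bracket. A minimal sketch, assuming GNAT 
(Debug_Demo and Trace are made-up names; the call is analyzed in every 
build but generates code only when the Debug/assertion policy is Check, 
e.g. with -gnata):

   with Ada.Text_IO;
   procedure Debug_Demo is
      procedure Trace (Msg : String) is
      begin
         Ada.Text_IO.Put_Line ("trace: " & Msg);
      end Trace;
      X : Integer := 41;
   begin
      X := X + 1;
      --  Compiled and legality-checked always; executed only in debug builds.
      pragma Debug (Trace ("X is now" & Integer'Image (X)));
   end Debug_Demo;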

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-07 15:27                     ` Dan'l Miller
  2018-04-07 15:59                       ` Dmitry A. Kazakov
@ 2018-04-09 20:14                       ` Randy Brukardt
  1 sibling, 0 replies; 57+ messages in thread
From: Randy Brukardt @ 2018-04-09 20:14 UTC (permalink / raw)


"Dan'l Miller" <optikos@verizon.net> wrote in message 
news:80db2d05-744f-4201-ba1b-4436f8040491@googlegroups.com...
On Saturday, April 7, 2018 at 5:11:35 AM UTC-5, Dmitry A. Kazakov wrote:
>> Yes, this is important but I think this could be resolved, e.g. by
>> compiling conditionals into corresponding null-effect constructs.

>Yeah, right.  This is an artificially contrived "problem" that is in fact
>not extant in reality.  Simply put the @ symbols in front of the .entire.
>construct to elide:  the entire else branch.
>
>    if Foo then
>           Bar;
>@ else
>@        Internal_Error;
>    endif;

Of course the fix is easy. But your definition of "artificially contrived" 
is pretty strange -- this is a mistake I make literally every week that I 
work on Janus/Ada. (I compile it both ways periodically to detect such 
mistakes; there are always several each time.)

The existence of an easy work-around has no effect on the human-factors part 
of this -- if there is an easy way to make a mistake, it will be made (and 
frequently).

Case in point: I've been using this feature regularly for 38+ years, I'm 
well aware of this concern, and I *still* make it regularly. (I do often see 
the mistake before compiling, but going back to fix it is just an extra 
step.) That's pretty much the definition of dangerous language design.
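
For completeness: the plain-Ada idiom of a static Boolean guard avoids this 
particular failure mode, because both arms are always parsed and checked and 
the dead one is left for the code generator to drop (though that does mean 
the compiler still has to look at the debug branch in every build). A 
minimal sketch; Guarded, Debug_Build, Bar and Internal_Error are made-up 
names:

   procedure Guarded is
      Debug_Build : constant Boolean := False;  --  flip to True for a debug build
      Foo : constant Boolean := True;
      procedure Bar is null;
      procedure Internal_Error is null;
   begin
      if Foo then
         Bar;
      elsif Debug_Build then
         --  Legal Ada in both configurations; dead code when
         --  Debug_Build is False, so the compiler can eliminate it.
         Internal_Error;
      end if;
   end Guarded;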

                      Randy. 


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 16:30                                 ` Dmitry A. Kazakov
  2018-04-09 16:45                                   ` Niklas Holsti
@ 2018-04-09 20:24                                   ` Randy Brukardt
  2018-04-10  8:17                                     ` Dmitry A. Kazakov
  1 sibling, 1 reply; 57+ messages in thread
From: Randy Brukardt @ 2018-04-09 20:24 UTC (permalink / raw)


Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> wrote in message 
news:pag4fd$pn4$1@gioia.aioe.org...
> On 2018-04-09 18:12, Niklas Holsti wrote:
>
>> For example, the text
>>
>>  if Foo then
>>  Bar;
>> @ else
>> @ Baz;
>>  end if;
>>
>> would be parsed in four ways, according to the four combinations of 
>> choices (enabled/disabled) for the two @ signs:
>
> [...]
>
> It is a trivial recursive-descent parser. "else" cannot follow "@", 
> because there is no statement starting with "else". Full stop.

Of course, that would very much emasculate the feature (probably 25% of the 
@-code in Janus/Ada is to remove various "elsif" and "else" cases). 
[Remember, we designed this code to work on very small memory hosts, so 
anything not critical to the operation of the compiler is removed. That is 
one reason that I don't use it any more in production code.]

Another common use that you wouldn't allow is to remove the exception 
handler from some code. (For Janus/Ada, the existence of an exception 
handler has a cost, so if it is not needed for production code, it is best 
completely eliminated.)

I think you could come up with a rule that would be less harmful to the 
feature than requiring full statements, especially for composite statements 
like if and case and blocks.

                   Randy. 


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 18:08                                 ` Dan'l Miller
@ 2018-04-09 21:17                                   ` Niklas Holsti
  2018-04-09 22:09                                     ` Dan'l Miller
  0 siblings, 1 reply; 57+ messages in thread
From: Niklas Holsti @ 2018-04-09 21:17 UTC (permalink / raw)


On 18-04-09 21:08 , Dan'l Miller wrote:
> On Monday, April 9, 2018 at 11:12:56 AM UTC-5, Niklas Holsti wrote:
>> On 18-04-08 22:48 , Dan'l Miller wrote: ... but the GLR parser
>> would apply this nondeterministic choice separately at each
>> occurrence of a "@", so there would be an exponentially large set
>> of alternative parse attempts.
>
> Good Lord!  What an especially inappropriate design to intentionally
> cause combinatorial explosion.

How kind you are.

That is AIUI what plain GLR would result in, if one does not extend GLR 
with something more.

> The following URL's design would be wiser and practical
> (e.g., where @ is isomorphic to #if DEBUG in the C preprocessor and
> thus exactly one CONFIG…X in this paper).

The reference Dan'l introduced is:

   [1] https://cs.nyu.edu/rgrimm/papers/pldi12.pdf

Quote from that paper, describing the SuperC parsing method, called 
Fork-Merge LR: "our work is inspired by GLR parsing ... which also forks 
and merges subparsers. But whereas GLR parsers match different 
productions to the same input fragment, FMLR matches the same 
production to different input fragments. Furthermore, unlike GLR and 
TypeChef, FMLR parsers can reuse existing LR grammars and parser table 
generators; only the parser engine is new."

It seems to be rather different from ordinary GLR.

> The reduced-production stack will temporarily bifurcate into exactly
> 2 parallel reduced-production stacks (and have exactly 2 alternative
> AST subtrees extant concurrently for the affected lines of source code).

If there are 2 alternative AST subtrees at each choice point, the total 
number of possible complete ASTs (which is what the later stages of an 
ordinary compiler would have to process) is exponential in the number of 
choice points (that is, "@" instances) -- assuming that the alternatives 
can be chosen independently, which is what plain GLR would give, as I 
understand it.

> Then those 2 bifurcations will be
> merged again into a traditional LR reduced-production stack upon lack
> of an @ prefix after a contiguous sequence of @-code lines.

I didn't claim that the parser would take exponential resources 
(although it may happen in [1]) just that the number of possible parses 
(ASTs) is exponential.

If the parser produces an "AST-with-alternatives" structure with 
alternative subtrees at N points, this "AST-with-alternatives" structure 
itself is not exponential in size, but it represents 2**N ordinary ASTs. 
Assuming, again, that we are allowed to choose any combination of the N 
alternative subtrees for producing the ordinary ASTs from the 
"AST-with-alternatives".

> Niklas, you seem to be
> thinking that @ would appear in the GLRed BNF.  No, it would not, at
> least not in an exemplary implementation.  In GLR, the origin of the
> ambiguity can be from a variety of origins:  codified in the grammar
> itself,

I suppose that means nondeterministic (ambiguous) productions, which is 
what I was talking of. With "@", or the equivalent, visible in the 
grammar, to make it ambiguous where a choice between alternatives is wanted.

> dynamically injected by the lexical-analysis layer,

The parser sees what the lexer produces. If the lexer produces an "@", 
or the equivalent "#ifdef" or whatever, then that must be in the grammar 
in order to be parsed.

If the lexer somehow compels the parser to fork or join at certain 
points in the token stream, this can just as well be described formally 
as inserting a special token or token sequence in the stream. In [1] it 
seems that the lexer and preprocessor stages produce C source which 
still contains #ifdef-#else-#endif structures, forcing the parser to 
fork at the #ifdef/#else, and join/merge at the #endif.

> a programmer's unwisely-excessively-deeply nested
> conditional-compilation via a preprocessor, and so forth.

Preprocessing is not parsing.

Ref. [1] indeed describes extensive automatic restructuring of the 
source code, mostly at the lexer and preprocessor levels, before the 
actual parsing.

One of the automatic transformations applied in [1] hoists #ifdef 
conditionals to a level where their branches are complete C syntactical 
"constructs". For example, in this original source code the #ifdef 
conditional part is not a complete C statement:

    #ifdef A
       if (cond)
          x = y;
       else
    #endif
          x = z;

In [1], this code is transformed (by the parser, it is said) into an 
equivalent form where the ifdef-controlled alternatives are C statements:

    #ifdef A
       if (cond)
          x = y;
       else
          x = z;
    #else
       x = z;
    #endif

It seems to me that this is similar to what Dmitry is suggesting, and 
that Dmitry would require programmers to write the second form rather 
than the first, although it is longer and has some duplicated code. For 
readers unused to #ifdef's (such as I) the second form is clearer.

The essential point in which [1] departs from basic GLR is that [1] 
produces an AST-with-alternatives structure where each alternative is 
labelled with the logical condition (#ifdef condition) for choosing this 
alternative.

If we assume, as in [1], that the "configuration variables" in the 
#ifdef conditions are constant (that is, have the same value at all 
choice points in the AST-with-alternatives), then the alternative 
subtrees chosen at the various choice points are correlated, and we no 
longer have an exponential number of ASTs (unless there are 
exponentially many combinations of control-variable values, which may 
happen).

This is exactly the kind of "uniform choice" principle that I spoke of, 
in my preceding post, as a necessary extension of GLR, for this job. 
Thus I think [1] reinforces the point I made there.

Note that [1] only claims to *parse* C in an "all configurations at 
once" manner. It explicitly excludes *compilation* from its scope. The 
parser in [1] does do a little semantic analysis (classifying 
identifiers as "typdef" or "other"), but it certainly cannot guarantee 
that a successfully parsed C program will also compile successfully in 
some configuration, let alone in all configurations. For an Ada 
"conditional compilation" facility, one would like to have the latter 
guarantee, or at least a guarantee of legality in all configurations.

-- 
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 21:17                                   ` Niklas Holsti
@ 2018-04-09 22:09                                     ` Dan'l Miller
  2018-04-10 19:23                                       ` Niklas Holsti
  0 siblings, 1 reply; 57+ messages in thread
From: Dan'l Miller @ 2018-04-09 22:09 UTC (permalink / raw)


On Monday, April 9, 2018 at 4:17:42 PM UTC-5, Niklas Holsti wrote:
> If there are 2 alternative AST subtrees at each choice point, the total 
> number of possible complete ASTs (which is what the later stages of an 
> ordinary compiler would have to process) is exponential in the number of 
> choice points (that is, "@" instances)

Trivially exponential:  at most 2¹ = 2 alternative AST subtrees at some points, because there is only one choice point:  an @-code-is-present on the compiler's command-line versus an @-code-is-elided on the compiler's command line.  Each @-prefixed line is not isomorphic to yet another differently-named macro in the C preprocessor (or M4, p4, etc preprocessors).  For p choicepoints on the compiler command-line, yes you are trivially correct:  “exponential” 2ᵖ but where p is not only constant, but merely p=1, hence GLR's O(pn) asymptotic rate of growth is O(1n) = O(n) and likewise for the at-most twice-as-big AST.  Not @ instances (as counted by lines of code) but @ degrees of freedom on the compiler command-line.  You are not counting the correct things.


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 20:24                                   ` Randy Brukardt
@ 2018-04-10  8:17                                     ` Dmitry A. Kazakov
  0 siblings, 0 replies; 57+ messages in thread
From: Dmitry A. Kazakov @ 2018-04-10  8:17 UTC (permalink / raw)


On 09/04/2018 22:24, Randy Brukardt wrote:
> Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> wrote in message
> news:pag4fd$pn4$1@gioia.aioe.org...
>> On 2018-04-09 18:12, Niklas Holsti wrote:
>>
>>> For example, the text
>>>
>>>   if Foo then
>>>   Bar;
>>> @ else
>>> @ Baz;
>>>   end if;
>>>
>>> would be parsed in four ways, according to the four combinations of
>>> choices (enabled/disabled) for the two @ signs:
>>
>> [...]
>>
>> It is a trivial recursive-descent parser. "else" cannot follow "@",
>> because there is no statement starting with "else". Full stop.
> 
> Of course, that would very much emasulate the feature (probably 25% of the
> @-code in Janus/Ada is to remove various "elsif" and "else" cases).
> [Remember, we designed this code to work on very small memory hosts, so
> anything not critical to the operation of the compiler is removed. One
> reason that I don't use that any more in production code.]
> 
> Another common use that you wouldn't allow is to remove the exception
> handler from some code. (For Janus/Ada, the existence of an exception
> handler has a cost, so if it is not needed for production code, it is best
> completely eliminated.)

I think both cases would be happily handled by optimization:

    if ... then
      ...
    else
      @ Something
    end if;

Since "@ Something" becomes null, the resulting "else null;" is removed by 
the optimizer. The same would happen with

    exception
       when ... =>
          @ Something
          [raise;]

> I think you could come up with a rule that would be less harmful to the
> feature than requiring full statements, especially for composite statements
> like if and case and blocks.

Maybe, but that would require more pages to explain in the RM (:-))

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-09 22:09                                     ` Dan'l Miller
@ 2018-04-10 19:23                                       ` Niklas Holsti
  2018-04-10 19:46                                         ` Dan'l Miller
  0 siblings, 1 reply; 57+ messages in thread
From: Niklas Holsti @ 2018-04-10 19:23 UTC (permalink / raw)


On 18-04-10 01:09 , Dan'l Miller wrote:
> On Monday, April 9, 2018 at 4:17:42 PM UTC-5, Niklas Holsti wrote:
>> If there are 2 alternative AST subtrees at each choice point, the
>> total number of possible complete ASTs (which is what the later
>> stages of an ordinary compiler would have to process) is
>> exponential in the number of choice points (that is, "@"
>> instances)
>
> Trivially exponential:  at most 2¹ = 2 alternative AST subtrees as
> some points, because there is only one choice point:  an
> @-code-is-present on the compiler's command-line versus an
> @-code-is-elided on the compiler's command line.

I call a "choice point" a point in the input sentence where the parser 
has to fork into alternatives. For a GLR parser, this means every 
_occurrence_ of an "@" in the input sentence.

You are ignoring my explicitly stated assumption (which follows if one 
is using _just_ a GLR parser) that the choices of alternative at the 
various choice points are uncorrelated. This makes the number of 
combinations of choices, which is the number of alternative _entire_ 
ASTs (in the commonly accepted meaning of AST as _one_ parse of the 
input sentence) exponential in the number of choice points.

The paper to which you referred avoids that problem by associating 
choice points with conditions depending on preprocessor variables, which 
is a natural thing to do, but which also means passing beyond the 
abilities of _just_ GLR parsing.

> Each @-prefixed line is not isomorphic to yet another
> differently-named macro in the C preprocessor (or M4, p4, etc
> preprocessors).

In plain GLR parsing it is isomorphic, because a GLR parser does not 
create any correlations between the non-deterministic choices at 
different points in the sentence. That would be a context-dependent grammar.

> For p choicepoints on
> the compiler command-line, yes you are trivially correct:
> “exponential” 2ᵖ but where p is not only constant, but merely p=1,
> hence GLR's O(pn) asymptotic rate of growth is O(1n) = O(n) and
> likewise for the at-most twice-as-big AST.

That is true for the "Fork-Merge LR" (FMLR) parsing in the paper you 
referenced, but that is not GLR.

> Not @ instances (as counted by lines of code) but @ degrees of
> freedom on the compiler command-line.  You are not counting the
> correct things.

We are certainly counting different things.

You said that GLR parsing could be used to parse both "@ enabled" and "@ 
disabled" sentences at the same time. I maintain that _just_ a GLR 
parser is not suitable, because the uncorrelated choice of alternative 
parses at each choice point (each "@") does not match the meaning of the 
"@" symbol (enabled everywhere, or disabled everywhere).

The FMLR method, with alternatives correlated by logical conditions, 
works, at least if one fully believes the claims in the paper to which 
you referred (I am perhaps not quite convinced that the paper's 
"ifdef-hoisting" will work and remain relatively local in all cases).

-- 
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-10 19:23                                       ` Niklas Holsti
@ 2018-04-10 19:46                                         ` Dan'l Miller
  2018-04-15  7:50                                           ` Niklas Holsti
  0 siblings, 1 reply; 57+ messages in thread
From: Dan'l Miller @ 2018-04-10 19:46 UTC (permalink / raw)


On Tuesday, April 10, 2018 at 2:23:24 PM UTC-5, Niklas Holsti wrote:
> For a GLR parser,
> 
> _just_ a GLR parser
>
> _just_ GLR parsing.
>
> In plain GLR parsing
>
> That is true for the "Fork-Merge LR" (FMLR) parsing in the paper you 
> referenced, but that is not GLR.
>
> _just_ a GLR 

Which flavor of generalized LR are you taking as the sole seminal reference to definitively brand the official “just GLR” and “plain GLR”?  There are 2 seminal references historically:  Bernard Lang (1974) and Tomita Masaru (1985).  Lang's versus Tomita's 2 approaches to GLR are substantially different.


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-06 23:49               ` Dan'l Miller
@ 2018-04-12 10:21                 ` Marius Amado-Alves
  2018-04-15 13:07                   ` Ada conditional compilation and program variants Niklas Holsti
  0 siblings, 1 reply; 57+ messages in thread
From: Marius Amado-Alves @ 2018-04-12 10:21 UTC (permalink / raw)


The @ sign looks suspiciously like a black hole, swallowing all material around it.

Dmitry is contradictory (!) and confusing (all combinations of @-lines, really?)

I don't know Janus.

Given all these singular factoids I have no choice but to deem this conversation a subtle continuation of the joke.

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-10 19:46                                         ` Dan'l Miller
@ 2018-04-15  7:50                                           ` Niklas Holsti
  2018-04-15 13:31                                             ` Dan'l Miller
  0 siblings, 1 reply; 57+ messages in thread
From: Niklas Holsti @ 2018-04-15  7:50 UTC (permalink / raw)


On 18-04-10 22:46 , Dan'l Miller wrote:
> On Tuesday, April 10, 2018 at 2:23:24 PM UTC-5, Niklas Holsti wrote:
>> For a GLR parser,
>>
>> _just_ a GLR parser
>>
>> _just_ GLR parsing.
>>
>> In plain GLR parsing
>>
>> That is true for the "Fork-Merge LR" (FMLR) parsing in the paper
>> you referenced, but that is not GLR.
>>
>> _just_ a GLR
>
> Which flavor of generalized LR are you taking as the sole seminal
> reference to definitively brand the official “just GLR” and “plain
> GLR”?  There are 2 seminal references historically:  Bernard Lang
> (1974) and Tomita Masaru (1985).  Lang's versus Tomita's 2 approaches
> to GLR are substantially different.

I just went by the Wikipedia article that you referenced 
(https://en.wikipedia.org/wiki/GLR_parser), which does not mention 
context dependency, and the statement in the FMLR paper that their 
method is different from GLR.

Perhaps our disagreements are partly due to my strict interpretation of 
"parsing" as separate from "semantic analysis", while some people, and 
perhaps you, grant a "parser" the right to depend on context, such as on 
information collected into a symbol table. The symbol table (and its 
dependence on configuration-variable values) is an essential part of the 
FMLR processor.

Perhaps we can conclude this debate by agreeing that the FMLR method can 
process C source, with macros and #ifdefs, and check that all variants 
are syntactically valid, without separately processing /all/ the source 
with /all/ the combinations of configuration-variable values. It still 
cannot be called a "one pass" process, because it splits into 
(conceptually) parallel processes when needed (and, depending on some 
delicate choices, the number of such processes can be exponential in the 
number of configuration variables).

-- 
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Ada conditional compilation and program variants
  2018-04-12 10:21                 ` Marius Amado-Alves
@ 2018-04-15 13:07                   ` Niklas Holsti
  2018-05-07  8:41                     ` Jacob Sparre Andersen
  0 siblings, 1 reply; 57+ messages in thread
From: Niklas Holsti @ 2018-04-15 13:07 UTC (permalink / raw)


(I changed the Subject to something more apt. A long post follows, sorry.)

On 18-04-12 13:21 , Marius Amado-Alves wrote:
> The @ sign looks suspiciously like a black hole, swallowing all
> material around it.
>
> Dmitry is contradictory (!) and confusing (all combinations of
> @-lines, really?)
>
> I don't know Janus.
>
> Given all these singular factoids I have no choice but to deem this
> conversation a subtle continuation of the joke.

I don't think so.

Starting from the joke, the thread went on to consider several 
existing compiler features, and serious suggestions, for controlling 
which parts of the Ada source code are used by the compiler, and for 
what purposes, including

- the proposal for "ghost code" in Ada 2020

- other extensions to the "contract" features of Ada

- the inconvenience, in current Ada, of having to comment debugging 
code in and out when the static-Boolean-conditional method does not 
work (eg. to include or exclude a "with" clause); see the sketch after 
this list

- Janus/Ada conditional compilation, with the "@" marker

- the risk that some combinations of the conditions for conditional 
compilation result in a variant of the source code that is syntactically 
or semantically illegal, with an example from Randy for "@"

- Dmitry's suggestions for changes to the Janus/Ada "@" to reduce that 
risk by requiring "@" to control only whole syntactic constructs (ie. to 
work on the grammar level rather than the lexical level)

- Dan'l's suggestions for using GLR parsing, or related fork/merge 
parsing methods, to ensure that all possible variants of the source code 
are syntactically legal, with a reference to an implementation for 
C-with-preprocessor that has successfully done this for the Linux kernel 
code (with an interesting connection to Dmitry's suggestion in that one 
essential part of this implementation is "ifdef-hoisting" where the 
ifdef scopes are expanded to control whole syntactic constructs).
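
To make the third point concrete: the static-Boolean-conditional 
method is, roughly, the following (a minimal sketch, with invented 
names):

    with Ada.Text_IO;   -- needed only by the debugging code

    procedure Step is
       Debug_Mode : constant Boolean := False;   -- True for a debug build
    begin
       if Debug_Mode then
          Ada.Text_IO.Put_Line ("entering Step");
       end if;
       null;   -- the real work goes here
    end Step;

The compiler removes the dead call, but the "with Ada.Text_IO;" cannot 
be made conditional in the same way; it has to stay, or be commented in 
and out by hand, which is the inconvenience meant above.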

It seems to me that a common question for the above points is how to 
manage "variants" of an Ada program (and, as Dan'l commented, "aspects" 
might be included, because they have a similar need to be separated from 
the rest of the program). Variants may be necessary to support different 
target systems, or different compilers, or just to choose which optional 
features (such as debugging or state-consistency checking) should be 
included in a particular build of the program.

The traditional solution is to use a preprocessor to conditionally 
select or transform the source code before the compiler sees it. This 
however means that the basic source code is not Ada, but 
Ada-with-preprocessor, and it also leads to differences between 
implementations, for example Janus/Ada with "@" versus the AdaCore/GNAT 
preprocessor commands. And of course all the old arguments against 
macro-based, text-oriented preprocessors are still valid.
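
For illustration, a debugging statement under the GNAT preprocessor 
looks roughly like this (quoted from memory, so the exact directive 
forms should be checked against the GNAT documentation); the point is 
only that the unit containing it is no longer plain Ada:

    #if Debug_Mode then
       Ada.Text_IO.Put_Line ("entering Step");
    #end if;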

Barring preprocessors, a practical solution, which I often use and which 
I believe is widely used, is to isolate the features and variant code 
into their own packages (or separate subprograms), to provide variant 
bodies for those packages (eg. for different targets, or to include or 
omit debugging), and to guide the Ada compiler to select the desired 
bodies through some kind of search path (eg. ADA_INCLUDE_PATH for GNAT, 
or the GNAT project files for gprbuild). This often works well, but also 
often leads to some amount of duplicated invariant code in the various 
package bodies, because isolating exactly and only the variant code into 
packages would create a mess of very many small packages, possibly 
conflicting with the logical modular structure of the program. 
Furthermore, just as for the preprocessor method, this variant-bodies 
method is not standardized and is therefore supported in different ways 
by different IDEs and compilers.
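
A minimal sketch of the variant-bodies method, with invented unit names 
and an invented directory layout:

    --  debug_support.ads : the invariant declaration, one copy only

    package Debug_Support is
       procedure Trace (Message : in String);
    end Debug_Support;

    --  debug/debug_support.adb : the body used in debugging builds

    with Ada.Text_IO;
    package body Debug_Support is
       procedure Trace (Message : in String) is
       begin
          Ada.Text_IO.Put_Line ("TRACE: " & Message);
       end Trace;
    end Debug_Support;

    --  release/debug_support.adb : the body used in release builds

    package body Debug_Support is
       procedure Trace (Message : in String) is
       begin
          null;   -- no tracing in release builds
       end Trace;
    end Debug_Support;

The build then puts either the debug/ or the release/ directory on the 
compiler's source path (through ADA_INCLUDE_PATH, a project-file 
scenario variable, or whatever the tools provide), so exactly one of 
the two bodies goes into a given build.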

It further seems to me that this thread has identified some desirable 
requirements for an Ada conditional-compilation / variant-support 
feature, including:

1. It should be defined in the Ada standard, to ensure portability 
across compilers and to make it easier for IDEs to support it by eg. 
hiding or colorizing inactive parts of the source code.

2. It should be a part of the Ada language (the grammar), and should 
control whole syntactic constructs, unlike macro-based, text-oriented 
conditional compilation directives or preprocessors.

3. It should allow implementations to use GLR/FMLR-like parsing and 
analysis methods that can process all possible variants "at once" and 
check their legality as far as possible.

The variant-bodies method provides all of these, to some extent. It 
follows point 1 because all the source code is standard Ada. It follows 
point 2 because the boundaries of the variant code are package or 
subprogram boundaries. It follows point 3 because the static conformance 
of each body variant with all package declarations (and thus with any 
variant body of any other package) can be ensured by compiling each body 
variant separately, without having to build the whole program for all 
possible combinations of variant bodies. (This assumes that only bodies 
have variants, and declarations are invariant; unfortunately, 
declarations often need variants, too.)

On the other hand, the variant-bodies method fails to provide some 
aspects of the above requirements: it fails on point 1 because the 
method of choosing a particular variant of a body is not standardized; 
it supports point 2 weakly, because it is limited to variants that are 
complete bodies, and usually requires a dedicated source-code file for 
each variant; and for point 3 it forces implementations of the 
"all-variants-at-once" processors to use compiler/builder-specific 
methods for finding the possible variants of each body.

At present, I don't have a suggestion for a better method (than the 
variant-bodies method), but I think it could be a fruitful extension to 
Ada, especially if it could support all three uses: variants 
(different implementations of a non-optional feature of the program), 
optional features, and aspects (by which I mean the centralized 
specification, at one point in the program, of distributed actions taken 
at several, appropriate points in the program).

In fact, I have one idea that could be part of this Ada extension: 
"package definitions". A package definition would be a new kind of 
compilable unit (but generating no code), usually in its own source-code 
file, which would bear a similar relation to a package declaration as a 
package declaration currently bears to the package body:

- A package declaration declares the things that the corresponding 
package body is required to implement.

- Analogously, a package definition would define, or specify, the things 
that the corresponding package declaration is required to declare.

However, this definition would be on a less specific level (more 
"generic") than the actual declarations, and would therefore allow 
different package declarations (variants) to conform to the same package 
definition.

For example, the definition of package A could require package A to 
declare a type T that is a discrete type, but the definition might not 
require anything more of T. Thus, one variant of package A might declare 
T as an integer range, while another could define T as an enumeration.

The package definition would be, for example:

    package definition A is
       type T is (<>);
       procedure F (Item : in T);
    end A;

and a possible conforming package declaration would be:

    package A is
       type T is (X, Y, Z);
       type S is new String;
       procedure F (Item : in T);
       procedure Bar (Item : in out T);
    end A;
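
and a second variant of the declaration, equally conforming to the 
definition but declaring T as an integer type (the range below is 
arbitrary), could be:

    package A is
       type T is range 1 .. 100;
       procedure F (Item : in T);
    end A;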

The Ada RM is in fact full of package definitions, because the 
declarations of the predefined, standard packages shown in the RM 
contain text like

    subtype Any_Priority is Integer range /implementation-defined/;

which a package definition would state as (for example):

    subtype Any_Priority is Integer range <>;
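
and a conforming package declaration would then pin the range down, 
for example (the bounds below are invented, not those of any actual 
compiler):

    subtype Any_Priority is Integer range 0 .. 31;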

Package definitions would be optional, but if a package definition is 
present then the package declaration must conform to it. This would 
allow the "variant bodies" method to be extended to allow also variant 
declarations, as long as the package definitions are invariant.

An interesting question is this: if package declaration or body B uses 
package A, and package A has a definition, how far can a compiler check 
that B uses A in legal ways if the compiler is allowed to look only at 
the definition of A, but not at the (or a) declaration of A?

If the compiler could check legality using only the package definitions 
of server packages, it would make it easier to ensure that all program 
variants are legal, by two separate and non-combinatorial steps:

- checking each variant package declaration against the (invariant) 
package definition, and

- checking each variant package body against the (invariant) package 
definitions of all other packages (those used by this body).

If checking the legality of how B uses A requires looking at the 
declaration of A (and not only at the definition of A), there is again a 
risk of combinatorial explosion in checking the legality of all 
(complete) program variants when there are variant package declarations.

-- 
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-15  7:50                                           ` Niklas Holsti
@ 2018-04-15 13:31                                             ` Dan'l Miller
  2018-04-15 18:37                                               ` Niklas Holsti
  0 siblings, 1 reply; 57+ messages in thread
From: Dan'l Miller @ 2018-04-15 13:31 UTC (permalink / raw)


On Sunday, April 15, 2018 at 2:50:15 AM UTC-5, Niklas Holsti wrote:
> On 18-04-10 22:46 , Dan'l Miller wrote:
> > On Tuesday, April 10, 2018 at 2:23:24 PM UTC-5, Niklas Holsti wrote:
> >> For a GLR parser,
> >>
> >> _just_ a GLR parser
> >>
> >> _just_ GLR parsing.
> >>
> >> In plain GLR parsing
> >>
> >> That is true for the "Fork-Merge LR" (FMLR) parsing in the paper
> >> you referenced, but that is not GLR.
> >>
> >> _just_ a GLR
> >
> > Which flavor of generalized LR are you taking as the sole seminal
> > reference to definitively brand the official “just GLR” and “plain
> > GLR”?  There are 2 seminal references historically:  Bernard Lang
> > (1974) and Tomita Masaru (1985).  Lang's versus Tomita's 2 approaches
> > to GLR are substantially different.

Clearly Tomita's popularization of GLR in 1985 uses forking and merging of the reduced-productions stack, the so-called graph-structured stack (GSS).  You can clearly see the merging of the branches along the GSS in section 4.6 of this tutorial-survey paper:

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.132.2321&rep=rep1&type=pdf

If all of the branches of the GSS are thus merged, then GLR returns to LR temporarily until the next arbitrary-length look-ahead is encountered.  This is what eliminates the exponential explosion that you keep fearing.  Yes, a trivial exponential explosion would occur for @-code:  from 1 (equals 2 to the power of 0) main-trunk along the LR reduced-productions stack to 2 (equals 2 to the power of 1) when encountering a •contiguous sequence• of @-code, but then the exponential explosion stops at the end of the •contiguous sequence• of @-code as all branches of the GSS are merged back to LR's single main-trunk.
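
(Counted out: for k disjoint •contiguous sequences• of @-code, merging the GSS branches back after each sequence keeps the parse at most 2 stacks wide at any moment, instead of letting it fan out towards 2 to the power of k live stacks.)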


^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Interesting article on ARG work
  2018-04-15 13:31                                             ` Dan'l Miller
@ 2018-04-15 18:37                                               ` Niklas Holsti
  0 siblings, 0 replies; 57+ messages in thread
From: Niklas Holsti @ 2018-04-15 18:37 UTC (permalink / raw)


On 18-04-15 16:31 , Dan'l Miller wrote:
> On Sunday, April 15, 2018 at 2:50:15 AM UTC-5, Niklas Holsti wrote:
>> On 18-04-10 22:46 , Dan'l Miller wrote:
>>> On Tuesday, April 10, 2018 at 2:23:24 PM UTC-5, Niklas Holsti wrote:
>>>> For a GLR parser,
>>>>
>>>> _just_ a GLR parser
>>>>
>>>> _just_ GLR parsing.
>>>>
>>>> In plain GLR parsing
>>>>
>>>> That is true for the "Fork-Merge LR" (FMLR) parsing in the paper
>>>> you referenced, but that is not GLR.
>>>>
>>>> _just_ a GLR
>>>
>>> Which flavor of generalized LR are you taking as the sole seminal
>>> reference to definitively brand the official “just GLR” and “plain
>>> GLR”?  There are 2 seminal references historically:  Bernard Lang
>>> (1974) and Tomita Masaru (1985).  Lang's versus Tomita's 2 approaches
>>> to GLR are substantially different.
>
> Clearly Tomita's popularization of GLR in 1985 uses forking and
> merging of the reduced-productions stack, the so-called graph-structured
> stack (GSS).  You can clearly see the merging of the branches along
> the GSS in section 4.6 of this tutorial-survey paper:
>
> http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.132.2321&rep=rep1&type=pdf
>
> If all of the branches of the GSS are thus merged, then GLR returns
> to LR temporarily until the next arbitrary-length look-ahead is
> encountered.  This is what eliminates the exponential explosion that
> you keep fearing.

You keep misunderstanding my point. I fully grant that GLR parsers 
mostly avoid exponential blow-up of the parsing process. But if the 
parser is not context-dependent, the number of complete parses 
considered during a GLR parse is exponential in the number of places in 
the sentence where the parse is ambiguous / non-deterministic, because 
the choices of parsing actions at each such place are not correlated.

In essence, without context dependency or some other similar mechanism 
to enforce uniform choice, a GLR parser for @ assumes that "@X" can be 
equivalent to "X" at one place, and to "--X" at another. This leads to 
an exponential number of complete parses, but more seriously, it does 
not match the intended semantics of @, so most of those parses are 
infeasible and can fail (I showed examples earlier).
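
Schematically, with "@" marking whole source lines (the real Janus/Ada 
details may differ, and the identifiers are invented):

    @ if Tracing then
    @    Put_Line ("entering Step");
    @ end if;

A parser that can choose per line may keep the first @-line as code but 
treat the last one as a comment; that parse then has an "if" with no 
matching "end if" and fails, although both uniform choices -- all three 
lines kept, or all three dropped -- are legal.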

-- 
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .

^ permalink raw reply	[flat|nested] 57+ messages in thread

* Re: Ada conditional compilation and program variants
  2018-04-15 13:07                   ` Ada conditional compilation and program variants Niklas Holsti
@ 2018-05-07  8:41                     ` Jacob Sparre Andersen
  0 siblings, 0 replies; 57+ messages in thread
From: Jacob Sparre Andersen @ 2018-05-07  8:41 UTC (permalink / raw)


Niklas Holsti wrote:

> An interesting question is this: if package declaration or body B uses
> package A, and package A has a definition, how far can a compiler
> check that B uses A in legal ways if the compiler is allowed to look
> only at the definition of A, but not at the (or a) declaration of A?
>
> If the compiler could check legality using only the package
> definitions of server packages, it would make it easier to ensure that
> all program variants are legal, by two separate and non-combinatorial
> steps:
>
> - checking each variant package declaration against the (invariant)
>   package definition, and
>
> - checking each variant package body against the (invariant) package
>   definitions of all other packages (those used by this body).
>
> If checking the legality of how B uses A requires looking at the
> declaration of A (and not only at the definition of A), there is again
> a risk of combinatorial explosion in checking the legality of all
> (complete) program variants when there are variant package
> declarations.

The only counter-examples I've been able to come up with so far are
run-time errors, such as going beyond the range of a numerical type.

What about enumeration types?  Should it be allowed to declare names for
some, but not all values of an enumeration type?  One could of course
declare a function returning a value of the type in the package
definition, and say that it can be implemented in the package
declaration as an enumeration value. - Can't do that for characters
though, so maybe it is a bad idea.
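
For the non-character case, what I have in mind is something like this 
(the "package definition" syntax is of course hypothetical):

    package definition A is
       type T is (<>);
       function Default return T;
    end A;

    package A is
       type T is (Off, Standby, On);
       function Default return T renames Standby;
    end A;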

Greetings,

Jacob
-- 
"Genes don't matter. It's all physics."

^ permalink raw reply	[flat|nested] 57+ messages in thread

end of thread, other threads:[~2018-05-07  8:41 UTC | newest]

Thread overview: 57+ messages
2018-04-02  3:32 Interesting article on ARG work Randy Brukardt
2018-04-02 14:49 ` Dan'l Miller
2018-04-03 16:34   ` Bojan Bozovic
2018-04-03 22:33     ` Randy Brukardt
2018-04-04  2:12       ` Bojan Bozovic
2018-04-04 15:05       ` Dan'l Miller
2018-04-04 15:30         ` gerdien.de.kruyf
2018-04-04 16:09           ` Dan'l Miller
2018-04-04 22:30         ` Randy Brukardt
2018-04-04 22:43           ` Paul Rubin
2018-04-05  0:44             ` Mehdi Saada
2018-04-05 21:23               ` Randy Brukardt
2018-04-05  2:05           ` Bojan Bozovic
2018-04-05 22:12             ` Randy Brukardt
2018-04-06 13:35               ` Bojan Bozovic
2018-04-07  2:01                 ` Randy Brukardt
2018-04-05  7:21           ` Dmitry A. Kazakov
2018-04-05 22:18             ` Randy Brukardt
2018-04-06  7:30               ` Dmitry A. Kazakov
2018-04-07  2:25                 ` Randy Brukardt
2018-04-07 10:11                   ` Dmitry A. Kazakov
2018-04-07 15:27                     ` Dan'l Miller
2018-04-07 15:59                       ` Dmitry A. Kazakov
2018-04-08  0:14                         ` Dan'l Miller
2018-04-08  7:46                           ` Dmitry A. Kazakov
2018-04-08 19:48                             ` Dan'l Miller
2018-04-08 20:09                               ` Dmitry A. Kazakov
2018-04-09  3:50                                 ` Dan'l Miller
2018-04-09  6:40                                   ` Jan de Kruyf
2018-04-09  7:43                                   ` Dmitry A. Kazakov
2018-04-09 13:40                                     ` Dan'l Miller
2018-04-09 14:13                                       ` Dmitry A. Kazakov
2018-04-09 14:36                                         ` Dan'l Miller
2018-04-09 14:44                                           ` Dmitry A. Kazakov
2018-04-09 15:03                                             ` Dan'l Miller
2018-04-09 16:12                               ` Niklas Holsti
2018-04-09 16:30                                 ` Dmitry A. Kazakov
2018-04-09 16:45                                   ` Niklas Holsti
2018-04-09 17:33                                     ` Dan'l Miller
2018-04-09 19:47                                     ` Dmitry A. Kazakov
2018-04-09 20:24                                   ` Randy Brukardt
2018-04-10  8:17                                     ` Dmitry A. Kazakov
2018-04-09 18:08                                 ` Dan'l Miller
2018-04-09 21:17                                   ` Niklas Holsti
2018-04-09 22:09                                     ` Dan'l Miller
2018-04-10 19:23                                       ` Niklas Holsti
2018-04-10 19:46                                         ` Dan'l Miller
2018-04-15  7:50                                           ` Niklas Holsti
2018-04-15 13:31                                             ` Dan'l Miller
2018-04-15 18:37                                               ` Niklas Holsti
2018-04-09 20:14                       ` Randy Brukardt
2018-04-06 23:49               ` Dan'l Miller
2018-04-12 10:21                 ` Marius Amado-Alves
2018-04-15 13:07                   ` Ada conditional compilation and program variants Niklas Holsti
2018-05-07  8:41                     ` Jacob Sparre Andersen
2018-04-06 13:35 ` Interesting article on ARG work Marius Amado-Alves
2018-04-07  2:15   ` Randy Brukardt
