comp.lang.ada
* Lost in translation (with SPARK user rules)
@ 2010-05-26 10:09 Yannick Duchêne (Hibou57)
  2010-05-26 10:38 ` Phil Thornley
                   ` (3 more replies)
  0 siblings, 4 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-26 10:09 UTC (permalink / raw)


Lost in translation... do you know the movie?

No, I'm joking; this is not about the 2003 movie. This is about SPARK, and
more precisely about translating some expressions into the rules language.

Previously, I successfully added these rules in an *.RLU file:

    inequals(122):  X/Y<1     may_be_deduced_from [ X>=0, Y>0, Y>X ].
    inequals(123):  X/Y<1     may_be_deduced_from [ X<=0, Y<0, Y<X ].
    inequals(124):  X/Y>(-1)  may_be_deduced_from [ X>=0, Y<0, (-Y)>X ].
    inequals(125):  X/Y>(-1)  may_be_deduced_from [ X<=0, Y>0, Y>(-X) ].
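As a quick sanity check of the arithmetic behind these four schemas (this is
plain Python exercising the math, not the SPARK tooling; the grid and names
are mine), one can verify that each rule's side conditions really force the
quotient into the claimed interval:

```python
from fractions import Fraction

# Check each rule's side conditions over a small grid, using exact
# rational arithmetic for the "/" appearing in the rules.
violations = 0
for x in range(-8, 9):
    for y in range(-8, 9):
        if y == 0:
            continue  # every rule's side conditions already exclude Y = 0
        q = Fraction(x, y)
        if x >= 0 and y > 0 and y > x and not q < 1:       # inequals(122)
            violations += 1
        if x <= 0 and y < 0 and y < x and not q < 1:       # inequals(123)
            violations += 1
        if x >= 0 and y < 0 and -y > x and not q > -1:     # inequals(124)
            violations += 1
        if x <= 0 and y > 0 and y > -x and not q > -1:     # inequals(125)
            violations += 1
print("violations:", violations)  # prints: violations: 0
```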


I wanted to do the same for another kind of rule, about bitwise operations.
The rule I wanted to add is something like:
(X and 2 ** N) = 0 may be deduced from ((X / (2 ** N)) and 1) = 0
and the converse.
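The intended equivalence can at least be checked numerically (a Python sketch
of the mathematics only, under the assumption X >= 0 and N >= 0; it says
nothing about how the Simplifier treats the rule):

```python
# For X >= 0 and N >= 0, 2**N has exactly one bit set, so
#   (X and 2**N) = 0          tests bit N of X, and
#   ((X div 2**N) and 1) = 0  tests the same bit after shifting.
mismatches = 0
for x in range(0, 1024):
    for n in range(0, 12):
        lhs = (x & (2 ** n)) == 0
        rhs = ((x // (2 ** n)) & 1) == 0   # "div" is integer division here
        if lhs != rhs:
            mismatches += 1
print("mismatches:", mismatches)  # prints: mismatches: 0
```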

I tried with this:

    bitwise(112): bit__and(X div (2**N), 1)=0 may_be_deduced_from [  
(bit__and(X, 2**N)=0), X>=0, N>=0 ].
    bitwise(113): bit__and(X, 2**N)=0 may_be_deduced_from [ (bit__and(X div  
(2**N), 1)=0), X>=0, N>=0 ].

But this fails, although the Simplifier does not return any syntax error  
messages about this *.RLU file. It fails because it cannot prove  
something like this:

    --# check ((Instance and 2) = 0) -> (((Instance / 2) and 1) = 0);

The latter should be directly deducible from the two previous rules if they  
were OK. So I suppose something is wrong with these rules, and that the  
expression in “[...]” is not interpreted the way it seems to me.

One entertaining thing: I've noticed “and 1” is always replaced by “mod 2”  
in the simplifier's *.SIV output files. As an example, the latter Check  
clause is replaced by:

    C1:    bit__and(instance, 2) = 0 -> instance div 2 mod 2 = 0 .
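That rewriting is sound for non-negative values, and for bit 1 the
implication in C1 is in fact an equivalence; a small Python check (mine,
outside any SPARK tool) of both facts:

```python
# "X and 1" agrees with "X mod 2" for X >= 0, which is why the
# Simplifier can normalise bit__and(X, 1) to X mod 2 in SIV files.
for x in range(0, 4096):
    assert (x & 1) == (x % 2)

# The rewritten conclusion C1: both sides test bit 1 of instance,
# so the implication holds (and even its converse).
c1_holds = all(
    ((instance & 2) == 0) == ((instance // 2) % 2 == 0)
    for instance in range(0, 4096)
)
print("C1 holds on range:", c1_holds)  # prints: C1 holds on range: True
```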



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 10:09 Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
@ 2010-05-26 10:38 ` Phil Thornley
  2010-05-26 10:57   ` Yannick Duchêne (Hibou57)
  2010-05-27 19:53 ` Warren
                   ` (2 subsequent siblings)
  3 siblings, 1 reply; 46+ messages in thread
From: Phil Thornley @ 2010-05-26 10:38 UTC (permalink / raw)


On 26 May, 11:09, Yannick Duchêne (Hibou57) <yannick_duch...@yahoo.fr>
wrote:
[...]
> I tried with this:
>
>     bitwise(112): bit__and(X div (2**N), 1)=0 may_be_deduced_from [  
> (bit__and(X, 2**N)=0), X>=0, N>=0 ].
>     bitwise(113): bit__and(X, 2**N)=0 may_be_deduced_from [ (bit__and(X div  
> (2**N), 1)=0), X>=0, N>=0 ].
>
> But this fails, although the simplifier does not return any syntax  
> error messages about this *.RLU file. It fails because it cannot prove  
> something like this:
>
>     --# check ((Instance and 2) = 0) -> (((Instance / 2) and 1) = 0);
>
> The latter should be directly deduced from the two previous rules if these  
> were OK. So I suppose something is wrong with these rules, and the  
> expression in “[...]” is not interpreted the way it seems to be to me.

I don't think there is anything wrong with these rules, but the
Simplifier will not use a *combination* of user rules to discharge a
proof.  (They are not as 'powerful' as the built-in rules, as they are
not consulted until the Simplifier has tried all its other
strategies.)

For example, section 7.3.1 of the Simplifier User Manual says:
   "For an inference rule of the form
      rulename(1): Goal may_be_deduced_from Conditions.
   the Simplifier will attempt to find a means of instantiating
   all of the wildcards in Goal such that it becomes an exact
   match for an existing undischarged conclusion."

This means that a single inference rule is applied only if it
*directly* matches a conclusion.

Also, since user rules are applied right at the end, you have to use
the form of the conclusion as it appears in the SIV file, so:
> One entertaining thing: I've noticed “and 1” is always replaced by “mod 2”  
> in the simplifier's *.SIV output files. As an example, the latter Check  
> clause is replaced by:
>
>     C1:    bit__and(instance, 2) = 0 -> instance div 2 mod 2 = 0 .

If that is the conclusion left in the SIV file then the user rule to
prove it is:
   user_bitwise(1): bit__and(X, 2) = 0 -> X div 2 mod 2 = 0
may_be_deduced .

(And a minor comment: it is probably not a good idea to use the
Proof Checker rule family names for Simplifier user rules; it just
adds to the confusion about how the rules are used.)

Cheers,

Phil



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 10:38 ` Phil Thornley
@ 2010-05-26 10:57   ` Yannick Duchêne (Hibou57)
  2010-05-26 14:15     ` Pascal Obry
  2010-05-27  8:13     ` Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
  0 siblings, 2 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-26 10:57 UTC (permalink / raw)


Le Wed, 26 May 2010 12:38:52 +0200, Phil Thornley  
<phil.jpthornley@googlemail.com> a écrit:
> I don't think there is anything wrong with these rules, but the
> Simplifier will not use a *combination* of user rules to discharge a
> proof.  (They are not as 'powerful' as the built-in rules as they are
> not consulted until the Simplifier has tried all its other
> strategies).
>
> For example, section 7.3.1 of the Simplifier User Manual says:
>    "For an inference rule of the form
>       rulename(1): Goal may_be_deduced_from Conditions.
>    the Simplifier will attempt to find a means of instantiating
>    all of the wildcards in Goal such that it becomes an exact
>    match for an existing undischarged conclusion."
>
> This means that a single inference rule is applied only if it
> *directly* matches a conclusion.
>
> Also, since user rules are applied right at the end, you have to use
> the form of the conclusion as it appears in the SIV file, so:
>> One entertaining thing: I've noticed “and 1” is always replaced by “mod  
>> 2” in the simplifier's *.SIV output files. As an example, the latter  
>> Check  clause is replaced by:
>>
>>     C1:    bit__and(instance, 2) = 0 -> instance div 2 mod 2 = 0 .
>
> If that is the conclusion left in the SIV file then the user rule to
> prove it is:
>    user_bitwise(1): bit__and(X, 2) = 0 -> X div 2 mod 2 = 0
> may_be_deduced .
I remember this; it was just that I did not suspect the match had to  
be so exact.

> (and a minor comment is that it is probably not a good idea to use the
> Proof Checker rule family names for Simplifier user rules - it just
> adds to the confusion about how the rules are used.)
Oops, sorry, I thought it was better to follow a naming convention that  
was already established.


Overall so far to me.

I don't know what to make of one feeling, and maybe I'm wrong: I feel a  
bit discouraged with SPARK now. I wonder whether this is really the thing I  
had been waiting for so long (I am referring to some of my previous words  
about it in another thread).

Maybe I should give up on it.

If I believed there could be some funding or a market for it, I would  
like to work on another system in this area... but I don't think so  
(I don't think many people expect this kind of thing).

Well, for the time being, I feel I am going to forget about it, at least  
for some time, and go on without it.

-- 
There is even better than a pragma Assert: a SPARK --# check.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 10:57   ` Yannick Duchêne (Hibou57)
@ 2010-05-26 14:15     ` Pascal Obry
  2010-05-26 14:28       ` Dmitry A. Kazakov
                         ` (2 more replies)
  2010-05-27  8:13     ` Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
  1 sibling, 3 replies; 46+ messages in thread
From: Pascal Obry @ 2010-05-26 14:15 UTC (permalink / raw)



Yannick,

> Overall so far to me.
> 
> I don't know what to think about one feeling, may be I'm wrong: I feel a
> bit discouraged with SPARK now. I wonder if this is really this thing I
> was waiting for so long (I am referring to some of my previous words
> about it in another thread).

Maybe you were expecting something that SPARK is not. SPARK is not a
replacement for Ada. SPARK is designed for highly secure software. You
won't create a Web server, an XML parser or even some kind of simulation
and GUI with it. In those application domains you need full Ada (binding
to external libraries, generics, full OO, standard libraries Ada.*,
Interfaces.* and possibly GNAT.*, full text and stream IO...).

I had to create a binding to the OS sockets in SPARK, I can tell you
that it was not easy... Especially as this was my first experience with
SPARK!

For embedded control-command applications, that's another story. I think
that SPARK has something invaluable to offer.

I have also thought that you can mix SPARK and Ada in the same
application. Using SPARK in the critical part, and Ada for the rest...
Don't know how well this would work as I have not gone through this yet.

Just my 2 cents of course!

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|    http://www.obry.net  -  http://v2p.fr.eu.org
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver keys.gnupg.net --recv-key F949BD3B




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 14:15     ` Pascal Obry
@ 2010-05-26 14:28       ` Dmitry A. Kazakov
  2010-05-26 19:28         ` Yannick Duchêne (Hibou57)
  2010-05-26 19:16       ` Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
  2010-05-27 10:11       ` Sockets package in SPARK (Was: Lost in translation (with SPARK user rules)) Jacob Sparre Andersen
  2 siblings, 1 reply; 46+ messages in thread
From: Dmitry A. Kazakov @ 2010-05-26 14:28 UTC (permalink / raw)


On Wed, 26 May 2010 16:15:41 +0200, Pascal Obry wrote:

> I have also thought that you can mix SPARK and Ada in the same
> application. Using SPARK in the critical part, and Ada for the rest...
> Don't know how well this would work as I have not gone through this yet.

Yes. I think this could be a direction in which Ada should evolve. It
should have a modular part equivalent to SPARK, which can be used with
certain compilation units. So that the programmer could choose the level of
safety he is ready to invest into.

It would be nice to be able to start the project at some middle level (a
bit higher than of present Ada, but lower than SPARK), and then gradually
adjust it, as the project evolves.

Another 2 cents.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 14:15     ` Pascal Obry
  2010-05-26 14:28       ` Dmitry A. Kazakov
@ 2010-05-26 19:16       ` Yannick Duchêne (Hibou57)
  2010-05-26 19:32         ` Pascal Obry
  2010-05-26 22:17         ` Peter C. Chapin
  2010-05-27 10:11       ` Sockets package in SPARK (Was: Lost in translation (with SPARK user rules)) Jacob Sparre Andersen
  2 siblings, 2 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-26 19:16 UTC (permalink / raw)


Le Wed, 26 May 2010 16:15:41 +0200, Pascal Obry <pascal@obry.net> a écrit:

Hi Pascal,

> Maybe you were expecting something that SPARK is not. SPARK is not a
> replacement for Ada.
Yes, I understand that; I did not believe otherwise.

> SPARK is designed for highly secure software.
That is where I did not agree with a similar sentence from someone else  
previously. I don't see a reason why only critical applications should  
benefit from verifiability. It would not be a waste to apply standard  
logic to program construction, and I even feel this should be an expected  
minimum.

> You
> won't create a Web server, an XML parser or even some kind of simulation
> and GUI with it. In those application domains you need full Ada (binding
> to external libraries, generics, full OO, standard libraries Ada.*,
> Interfaces.* and possibly GNAT.*, full text and stream IO...).
For IO, there are ways to have a specification and hide the implementation,  
as SPARK_IO does. For simulation, it depends on what is simulated. For a GUI,  
I agree, as it mostly has to be malleable. For an XML parser, it should  
be OK, as that is mainly a kind of state machine. For a web server, I  
don't know (maybe yes for some parts, not for others).

> I had to create a binding to the OS sockets in SPARK, I can tell you
> that it was not easy... Especially as this was my first experience with
> SPARK!
Well, I was trying to prove a personal implementation of a Deflate stream  
(RFC 1951) decoder/encoder.

What problems did you encounter with the OS socket binding? (if this can  
be described without excessive difficulty)

> For embedded control-command application that's another story. I think
> that SPARK has something invaluable to offer.
>
> I have also thought that you can mix SPARK and Ada in the same
> application. Using SPARK in the critical part, and Ada for the rest...
> Don't know how well this would work as I have not gone through this yet.
OK, but what disappointed me is mainly that I could not make it  
learn/use new rules properly (well, plus the trouble probably coming  
with generics too). I'm pretty sure that if that were possible, it  
could be used in many more cases than only the so-called highly critical  
ones.

> Just my 2 cents of course!
No, it was valuable.


I may try again and see what can be done about this rules problem.


-- 
There is even better than a pragma Assert: a SPARK --# check.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 14:28       ` Dmitry A. Kazakov
@ 2010-05-26 19:28         ` Yannick Duchêne (Hibou57)
  2010-05-26 20:14           ` Dmitry A. Kazakov
  0 siblings, 1 reply; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-26 19:28 UTC (permalink / raw)


Le Wed, 26 May 2010 16:28:20 +0200, Dmitry A. Kazakov  
<mailbox@dmitry-kazakov.de> a écrit:
> Yes. I think this could be a direction in which Ada should evolve. It
> should have a modular part equivalent to SPARK, which can be used with
> certain compilation units. So that the programmer could choose the level  
> of
> safety he is ready to invest into.

Please tell me more: do you mean a kind of pragma or compiler option,  
like the ones that exist for runtime checks?

By the way, that's an opportunity for two other ideas. First, why not  
integrate the SPARK language definition into the Ada standard? We could  
have wordings in the Ada reference or the annotated reference stating  
what is allowed or disallowed with SPARK. Second, related to the first:  
why not an ACATS test suite targeting SPARK/Ada compilation capabilities  
besides full Ada? Actually, how can you test a compiler's compliance  
with SPARK? I feel you can do it only for full Ada.

> It would be nice to be able to start the project at some middle level (a
> bit higher than of present Ada, but lower than SPARK), and then gradually
> adjust it, as the project evolves.
Like ensuring it is written in valid SPARK syntax before we know whether  
this part will really be semantically checked or not?

> Another 2 cents.
Re-no, re-was re-valuable


-- 
There is even better than a pragma Assert: a SPARK --# check.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 19:16       ` Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
@ 2010-05-26 19:32         ` Pascal Obry
  2010-05-26 20:56           ` Yannick Duchêne (Hibou57)
  2010-05-26 22:06           ` Peter C. Chapin
  2010-05-26 22:17         ` Peter C. Chapin
  1 sibling, 2 replies; 46+ messages in thread
From: Pascal Obry @ 2010-05-26 19:32 UTC (permalink / raw)


Yannick,

> That is where I was not agree with a previous similar sentence from
> someone else. I don't see a reason why only critical applications should
> benefit of verifiability. This would not be a waste to apply standard
> logic to program construction, and I even feel this should be an
> expected minimum.

Right. I think the response to this is pragmatism and cost. For
non-critical software, do you think it is reasonable to use SPARK? It is
only a matter of trade-off, I think.

> What matters did you encounter with the OS socket binding ? (if this can
> be drawn with no excessive difficulties)

The fact that you are working on a boundary: the secure SPARK world on
one side, and the non-SPARK foreign world of the OS on the other. One
difficult part was that this is IO (stream-like), where data are read
from the same socket but two consecutive reads won't return the same
value. There is a way in SPARK to deal with that, to avoid SPARK
complaining that you have ineffective code... not straightforward!

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|    http://www.obry.net  -  http://v2p.fr.eu.org
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver keys.gnupg.net --recv-key F949BD3B




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 19:28         ` Yannick Duchêne (Hibou57)
@ 2010-05-26 20:14           ` Dmitry A. Kazakov
  2010-05-26 21:14             ` Yannick Duchêne (Hibou57)
  2010-05-26 22:01             ` Peter C. Chapin
  0 siblings, 2 replies; 46+ messages in thread
From: Dmitry A. Kazakov @ 2010-05-26 20:14 UTC (permalink / raw)


On Wed, 26 May 2010 21:28:58 +0200, Yannick Duchêne (Hibou57) wrote:

> Le Wed, 26 May 2010 16:28:20 +0200, Dmitry A. Kazakov  
> <mailbox@dmitry-kazakov.de> a écrit:
>> Yes. I think this could be a direction in which Ada should evolve. It
>> should have a modular part equivalent to SPARK, which can be used with
>> certain compilation units. So that the programmer could choose the level  
>> of
>> safety he is ready to invest into.
> 
> Please, tell more : you mean a kind of pragma or compiler option like the  
> ones there is for runtime checks ?

No runtime checks, but an option to say more in the contract, with
enforced static checks that it indeed holds. If you have no time, no
guts, or the algorithm does not allow certain proofs, you just do not
make promises you cannot keep, and go on.

> By the way, that's an opportunity for two other ideas : why not integrate  
> the SPARK language definition in the Ada standard ? We may have wordings  
> in the Ada reference or the annotated reference, stating that is and that  
> is allowed or disallowed with SPARK.

I think it should be more than just two levels. But yes, each language
construct and each library operation shall have a contract.

> Actually, how can you test an compiler  
> compliance with SPARK ? I feel you can do it only for full Ada.

Likely yes, because there exist legal Ada programs that no Ada
compiler could compile.

>> It would be nice to be able to start the project at some middle level (a
>> bit higher than of present Ada, but lower than SPARK), and then gradually
>> adjust it, as the project evolves.

> Like ensuring it is written in a valid SPARK syntax before we know if this  
> part will really be semantically checked or not ?

Rather by refining the contracts. When you feel that the implementation is
mature, you can add more promises to its contract and see whether they hold
(= are provable). If they don't, you could try to re-implement some parts of
it. When you feel that it takes too much time, or is impossible to prove,
you can drop the idea of doing it formally. You will still have gained a
deeper understanding of how the thing works, and you could document why you
think it is correct, even if that is not formally provable.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 19:32         ` Pascal Obry
@ 2010-05-26 20:56           ` Yannick Duchêne (Hibou57)
  2010-05-26 22:06           ` Peter C. Chapin
  1 sibling, 0 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-26 20:56 UTC (permalink / raw)


Le Wed, 26 May 2010 21:32:02 +0200, Pascal Obry <pascal@obry.net> a écrit:
> Right. I think the response to this is pragmatism and cost. For non
> critical softwares do you think it is reasonable to use SPARK? This is
> only a matter of trade-off I think.
So, it depends on how it is balanced.

If a component of an application is to be executed thousands of times in  
a day or in a week, and especially if it is executed automatically, the  
trade-off should give weight to verifiability, even for applications  
which are not known to be critical in the usual sense.

For others, I would say proofs could help to understand why it works (not  
just how). This could be nice for pedagogy, as a formal comment is always  
more readable than implicit human language (not that I don't like the  
latter, just that it is sometimes too weak). And for maintainability:  
modifying this and that for more efficiency, or simply adding a  
functionality, with the assurance that nothing will be broken on the way.

What kind of trade-off should not take these benefits into account?

(note: I was also thinking of some other reasons, not given here for  
brevity)

> [...] where data are read from the same socket but
> two consecutive reads won't return the same value. There is a way in SPARK
> to deal with that, to avoid SPARK complaining that you have ineffective
> code... not straightforward!
>
> Pascal.
I see what the subject looked like.


I was thinking about something. With my attempted pre/post/check  
annotations in mind, I was reading the SPARK sources, and noticed one  
thing: the postconditions in the SPARK sources are far less detailed than  
what I was attempting with my packages. To be honest, I was not only  
trying to prove there would not be any runtime error; I was also using  
postconditions to express properties and logic. So obviously, this ends  
in much more difficult and complicated things to demonstrate.

This may be a rule of thumb with SPARK: first, try to have the program  
proved free of runtime errors; only afterwards, when that is done, try  
to add more annotations to express logical properties.

Maybe I was somewhere between proof of absence of runtime errors and data  
refinement (I mean what is named « réification » in French). Possibly  
SPARK is not at its best with the latter. After all, stating that a  
procedure or function has such and such a logical property is another way  
of stating that it is the concrete implementation of the abstraction  
which has these same properties; so this is refinement/reification.

Perhaps SPARK can do that... a bit, and perhaps I was wrong to expect it  
could fully do that.

So, this rule of thumb: start with a proof of absence of runtime errors  
first, only later try more, and don't expect that the latter must succeed?

However, being able to add rules which could be applied the same way the  
built-in rules are would really be nice. Though I suspect this would not  
be compatible with some efficiency requirements of the SPARK Simplifier  
(if only I could build the Simplifier from modified sources, but I can't).


-- 
There is even better than a pragma Assert: a SPARK --# check.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 20:14           ` Dmitry A. Kazakov
@ 2010-05-26 21:14             ` Yannick Duchêne (Hibou57)
  2010-05-26 21:15               ` Yannick Duchêne (Hibou57)
  2010-05-26 22:01             ` Peter C. Chapin
  1 sibling, 1 reply; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-26 21:14 UTC (permalink / raw)


Le Wed, 26 May 2010 22:14:48 +0200, Dmitry A. Kazakov  
<mailbox@dmitry-kazakov.de> a écrit:
> No run time checks, but an option to tell more about the contract, with
> enforced static checks, that this indeed hold. If you have no time, no
> guts, or when the algorithm does not allow certain proofs, you just do  
> not
> make promises you cannot keep and go further.
OK, I see: you mean interfacing between Ada and SPARK? Is that the idea?

Indeed, would be nice if Ada compiler could fee SPARK Examiner with  
required condition (provided I've understood your words).

> I think it should be more than just two levels. But yes, each language
> construct and each library operation shall have a contract.
Goes the same way as the above (OK)

>> Actually, how can you test an compiler
>> compliance with SPARK ? I feel you can do it only for full Ada.
>
> Likely yes, because there exist legal Ada programs, such that no Ada
> compiler could compile.
So this could be one more good reason to have a test suite targeting the  
SPARK subset only.

> Rather by refining the contracts. When you feel that the implementation  
> is
> mature, you can add more promises to the contract of and see if they hold
> (=provable). If they don't you could try to re-implement some parts of  
> it.
> When you feel that it takes too much time, is impossible to prove, you  
> can
> drop the idea to do it formally. You will still have a gain of deeper
> understanding how the thing works and could document why do you think it  
> is
> correct, even if that is not formally provable.
This seems to mean something similar to one of my previous messages,  
about the fact that I was perhaps aiming at too much at first. Having  
different levels in mind indeed seems a requirement if one doesn't want  
to be too discouraged.


-- 
There is even better than a pragma Assert: a SPARK --# check.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 21:14             ` Yannick Duchêne (Hibou57)
@ 2010-05-26 21:15               ` Yannick Duchêne (Hibou57)
  0 siblings, 0 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-26 21:15 UTC (permalink / raw)


Le Wed, 26 May 2010 23:14:24 +0200, Yannick Duchêne (Hibou57)  
<yannick_duchene@yahoo.fr> a écrit:
> Indeed, would be nice if Ada compiler could fee SPARK Examiner with  
> required condition
Typo: read “Indeed, would be nice if Ada compilers could *feed*  
Examiner with required conditions”


-- 
There is even better than a pragma Assert: a SPARK --# check.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 20:14           ` Dmitry A. Kazakov
  2010-05-26 21:14             ` Yannick Duchêne (Hibou57)
@ 2010-05-26 22:01             ` Peter C. Chapin
  2010-05-27 12:32               ` Mark Lorenzen
  2010-05-27 18:34               ` Pascal Obry
  1 sibling, 2 replies; 46+ messages in thread
From: Peter C. Chapin @ 2010-05-26 22:01 UTC (permalink / raw)


Dmitry A. Kazakov wrote:

> Rather by refining the contracts. When you feel that the implementation is
> mature, you can add more promises to the contract of and see if they hold
> (=provable). If they don't you could try to re-implement some parts of it.
> When you feel that it takes too much time, is impossible to prove, you can
> drop the idea to do it formally. You will still have a gain of deeper
> understanding how the thing works and could document why do you think it is
> correct, even if that is not formally provable.

I think this is a good way to work with SPARK. I'm developing a SPARK program
now. I started with the specifications, working with them until I felt
comfortable with the overall structure of the program. Refactoring at that
stage was very cheap and I did it several times.

I just finished coding the body of one of the critical packages. The SPARK
Examiner was very helpful at catching silly errors that I might have missed.
In one case I forgot to condition a flag at the end of a procedure and was
told that my code did not agree with my annotations. Nice.

Next I was able to prove the code free of runtime errors. This worked out
quite easily in this case. After changing two declarations to use properly
constrained subtypes (rather than just Natural as I had originally... my
bad), the SPARK Simplifier was able to discharge 96% of the VCs "out of the
box." The remaining ones were not hard to fix. I felt lucky!

Now I hope to encode some useful and interesting information about the
intended functionality of the subprograms in SPARK annotations (#pre, #post,
etc), and see if I can still get the proofs to go through. Somewhere around
now I will also start building my test program. One thing that I like about
SPARK is that you can do a lot of useful things with it without worrying
about stubbing out a zillion supporting packages for purposes of creating a
test program.

I certainly agree that SPARK is not for everything. The code I'm working on
is (potentially) security sensitive, so the extra effort of working with
SPARK seems worthwhile. My test harness, on the other hand, is likely to be
in full Ada.

Peter




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 19:32         ` Pascal Obry
  2010-05-26 20:56           ` Yannick Duchêne (Hibou57)
@ 2010-05-26 22:06           ` Peter C. Chapin
  2010-05-27 18:39             ` Pascal Obry
  1 sibling, 1 reply; 46+ messages in thread
From: Peter C. Chapin @ 2010-05-26 22:06 UTC (permalink / raw)


Pascal Obry wrote:

> The fact that you are working on a boundary. The secure SPARK on one
> side and the OS non-SPARK foreign world. One difficult part was that
> this is IO (stream-like) where data are read from the same socket but
> two consecutive reads won't return the same value. There is a way in SPARK
> to deal with that, to avoid SPARK complaining that you have ineffective
> code... not straightforward!

Is this the sort of thing external own variables are intended to handle?

Peter




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 19:16       ` Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
  2010-05-26 19:32         ` Pascal Obry
@ 2010-05-26 22:17         ` Peter C. Chapin
  1 sibling, 0 replies; 46+ messages in thread
From: Peter C. Chapin @ 2010-05-26 22:17 UTC (permalink / raw)


Yannick Duchêne (Hibou57) wrote:

> That is where I was not agree with a previous similar sentence from  
> someone else. I don't see a reason why only critical applications should  
> benefit of verifiability. This would not be a waste to apply standard  
> logic to program construction, and I even feel this should be an expected  
> minimum.

I think the reason is that it's hard. SPARK's restrictions exist, in part, to
create a language that can be manipulated in a reasonably formal way using
current technology. As soon as you start adding back exceptions, dynamic
memory, recursion, etc., it becomes very hard (beyond the state of the art?)
to say significant things statically in a formal way. Yet those features
*are* very useful for many programs.

I know there has been a lot of work done in the field of static analysis and I
hope the field continues to advance. As it does, SPARK, or something like it,
may be able to take on some of the complicated features it currently avoids.
Also, depending on your needs, other kinds of static analysis might be useful
besides formally proving program properties. So what I'm saying is that it
may be premature to think about incorporating the SPARK sublanguage formally
into the Ada standard. SPARK is a tool... one of many. I look forward to even
better tools in the future!

For example take a look at this:

        http://www.adacore.com/home/products/codepeer/

I believe this tool works with full Ada and it sounds pretty neat. Disclaimer:
I haven't tried it.

Peter





^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 10:57   ` Yannick Duchêne (Hibou57)
  2010-05-26 14:15     ` Pascal Obry
@ 2010-05-27  8:13     ` Yannick Duchêne (Hibou57)
  2010-05-27 10:55       ` Yannick Duchêne (Hibou57)
  1 sibling, 1 reply; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-27  8:13 UTC (permalink / raw)


On Wed, 26 May 2010 12:57:11 +0200, Yannick Duchêne (Hibou57)  
<yannick_duchene@yahoo.fr> wrote:
>> If that is the conclusion left in the SIV file then the user rule to
>> prove it is:
>>    user_bitwise(1): bit__and(X, 2) = 0 -> X div 2 mod 2 = 0
>> may_be_deduced .
> I remember this; it was just that I did not suspect the match had to be  
> so exact.
Finally, there is a work-around: if the Simplifier expects an expression  
which exactly matches the one provided in a user rule, and there is no  
directly matching expression in the source, then just use the Simplifier's  
built-in rules to prove that an available expression is equivalent to the  
one the Simplifier expects to match exactly.

That is, if there is a user rule of the form

    my_rule(1): Conclusion_1 may_be_deduced_from [Expression_1].

and the source unfortunately contains only

    Expression_2

then try something like

    --# check Expression_2 -> Intermediate_Expression(1);
    --# check ...;
    --# check ...;
    --# check Intermediate_Expression(N) -> Expression_1;

and there you have it.
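
As a concrete sketch (rule and variable names invented here, just to show
the shape): suppose a user rule is

    bridge_rule(1): X >= 1 may_be_deduced_from [ X > 0 ].

but the source only establishes, say, (V - 1) >= 0. A bridging chain
could be:

    --# check (V - 1) >= 0 -> V > 0;  -- provable by the built-in rules
    --# check V > 0;                  -- exact match for bridge_rule's premise

after which the Simplifier has V > 0 as a hypothesis in exactly the form
the rule expects.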

-- 
There is even better than a pragma Assert: a SPARK --# check.
--# check C and WhoKnowWhat and YouKnowWho;
--# assert Ada;
--  i.e. forget about previous premises which leads to conclusion
--  and start with new conclusion.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Sockets package in SPARK (Was: Lost in translation (with SPARK user rules))
  2010-05-26 14:15     ` Pascal Obry
  2010-05-26 14:28       ` Dmitry A. Kazakov
  2010-05-26 19:16       ` Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
@ 2010-05-27 10:11       ` Jacob Sparre Andersen
  2010-05-27 18:41         ` Pascal Obry
  2 siblings, 1 reply; 46+ messages in thread
From: Jacob Sparre Andersen @ 2010-05-27 10:11 UTC (permalink / raw)


Pascal Obry wrote:

> I had to create a binding to the OS sockets in SPARK, I can tell you
> that it was not easy...

Is that binding available for download?  (Or is it an internal product?)

Greetings,

Jacob
-- 
Those who can't laugh at themselves leave the job to others.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27  8:13     ` Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
@ 2010-05-27 10:55       ` Yannick Duchêne (Hibou57)
  2010-05-27 11:32         ` Yannick Duchêne (Hibou57)
  2010-05-27 12:22         ` stefan-lucks
  0 siblings, 2 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-27 10:55 UTC (permalink / raw)


On Thu, 27 May 2010 10:13:55 +0200, Yannick Duchêne (Hibou57)  
<yannick_duchene@yahoo.fr> wrote:
> Finally, there is a work-around: if the Simplifier expects an expression  
> which exactly matches the one provided in a user rule, and there is no  
> directly matching expression in the source, then just use the Simplifier's  
> built-in rules to prove that an available expression is equivalent to the  
> one the Simplifier expects to match exactly.
>
> That is, if there is a user rule of the form
>
>     my_rule(1) Conclusion_1 may_be_deduced_from [Expression_1].
>
> and the source unfortunately contains only
>
>     Expression_2
>
> then try something like
>
>     --# check Expression_2 -> Intermediate_Expression(1);
>     --# check ...;
>     --# check ...;
>     --# check Intermediate_Expression(N) -> Expression_1;
>
> and here it is.
>

Unfortunately, I ran into another problem. I was trying this, and used  
intermediate variables to help. But the Simplifier insists on using  
expressions instead of variables, so it cannot find a match.

An example will make this clearer:

Here is an excerpt of a source (T is a modular type):

    ...
    --# check T'Pos (B) >= 0; -- Check #1
    --# check T'Pos (B) <= 1; -- Check #2
    --# check (T'Pos (B) = 0) or (T'Pos (B) = 1); -- Check #3
    ...

Here is a rule in a user rules file:

    my_rule(1): (X=0) or (X=1) may_be_deduced_from [ integer(X), X>=0, X<=1 ].

And the Simplifier fails to prove Check #3; an excerpt of the *.SIV file  
shows why (the conclusion associated to Check #3):

    C1:    bit__and(v, 1) = 0 or bit__and(v, 1) = 1 .

It replaces B with the source expression for the value of B and is thus  
unable to find a match with the rule. By the way, it drops Check #1 and  
Check #2, which do not appear at all in the hypothesis list (possibly  
they seem too obvious to the Simplifier).

I tried using an Assert instead of a Check; this did not change  
anything.

So if the Simplifier requires an exact match to user rules, is there a way  
to keep the Simplifier from defeating any attempt to prove that an  
expression matches a rule?

-- 
There is even better than a pragma Assert: a SPARK --# check.
--# check C and WhoKnowWhat and YouKnowWho;
--# assert Ada;
--  i.e. forget about previous premises which leads to conclusion
--  and start with new conclusion.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27 10:55       ` Yannick Duchêne (Hibou57)
@ 2010-05-27 11:32         ` Yannick Duchêne (Hibou57)
  2010-05-27 11:41           ` Phil Thornley
  2010-05-27 12:22         ` stefan-lucks
  1 sibling, 1 reply; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-27 11:32 UTC (permalink / raw)


On Thu, 27 May 2010 12:55:04 +0200, Yannick Duchêne (Hibou57)  
<yannick_duchene@yahoo.fr> wrote:
> And the Simplifier fails to prove Check #3; an excerpt of the *.SIV file  
> shows why (the conclusion associated to Check #3):
>
>     C1:    bit__and(v, 1) = 0 or bit__and(v, 1) = 1 .

OK, found it. My way of requiring the type to be integer was not correct,  
as in

    my_set(1): (X=0) or (X=1) may_be_deduced_from [ integer(X), X>=0, X<=1 ].

It has to be

    my_set(1): (X=0) or (X=1) may_be_deduced_from [ goal(checktype(X,integer)), X>=0, X<=1 ].

I have checked that

    my_set(1): (X=0) or (X=1) may_be_deduced_from [ goal(integer(X)), X>=0, X<=1 ].

does not work, which leads to another question: what is the difference  
between the two predicates “integer(X)” and “checktype(X,integer)”? The  
documentation is not clear to me about it. About “checktype(EXPRESSION,  
TYPE_IDENTIFIER)”, it says “This predicate may be used to check that an  
expression is of an appropriate type”, and about “integer(EXPRESSION)” it  
says “This built-in Prolog predicate succeeds if and only if its argument  
is instantiated to an integer”. So what is the difference between  
“goal(integer(X))” and “goal(checktype(X,integer))”? Does the latter refer  
to the root type as defined in the source, and the former to something  
else? (And if so, to what?)

Another strange thing is that the file NUMINEQS.RUL does not contain any  
predicate stating that arguments have to be integers; there is just a  
comment in the heading, nothing about it in the rules themselves.

-- 
There is even better than a pragma Assert: a SPARK --# check.
--# check C and WhoKnowWhat and YouKnowWho;
--# assert Ada;
--  i.e. forget about previous premises which leads to conclusion
--  and start with new conclusion as premise.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27 12:22         ` stefan-lucks
@ 2010-05-27 11:37           ` Yannick Duchêne (Hibou57)
  2010-05-27 11:57           ` Phil Thornley
  1 sibling, 0 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-27 11:37 UTC (permalink / raw)


On Thu, 27 May 2010 14:22:26 +0200, <stefan-lucks@see-the.signature>  
wrote:
> Wouldn't it be much simpler to use the proof checker, instead of the
> simplifier?
>
> Stefan
>
I don't like the Checker: it is an external tool (and not a handy one),  
and to me, SPARK is a language, not a tool. Furthermore, I really prefer  
to write proofs in the source, because that is what I expect of something  
like program proof (I don't mind adding a user rules file, although I  
would like a way to avoid copying it everywhere it is needed, and to have  
it loaded automatically every time).

-- 
There is even better than a pragma Assert: a SPARK --# check.
--# check C and WhoKnowWhat and YouKnowWho;
--# assert Ada;
--  i.e. forget about previous premises which leads to conclusion
--  and start with new conclusion as premise.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27 11:32         ` Yannick Duchêne (Hibou57)
@ 2010-05-27 11:41           ` Phil Thornley
  2010-05-27 12:42             ` Yannick Duchêne (Hibou57)
  0 siblings, 1 reply; 46+ messages in thread
From: Phil Thornley @ 2010-05-27 11:41 UTC (permalink / raw)


On 27 May, 12:32, Yannick Duchêne (Hibou57) <yannick_duch...@yahoo.fr>
wrote:
[...]
> Ok, found. I was not correct with the way to require the type to be  
> integer as done in
>
>     my_set(1): (X=0) or (X=1) may_be_deduced_from [ integer(X), X>=0, X<=1  
> ].
>
> It has to be
>
>     my_set(1): (X=0) or (X=1) may_be_deduced_from [  
> goal(checktype(X,integer)), X>=0, X<=1 ].
>
> I've check that
>
>     my_set(1): (X=0) or (X=1) may_be_deduced_from [ goal(integer(X)), X>=0,  
> X<=1 ].
>
> does not work, which lead to another question : what is the difference  
> between the two predicates “integer(X)” and “checktype(X,integer)” ? The  
> documentation is not clear to me about it. About “checktype(EXPRESSION,  
> TYPE_IDENTIFIER)”, it says “This predicate may be used to check that an  
> expression is of an appropriate type” and about “integer(EXPRESSION)” is  
> says This built-in Prolog predicate succeeds if and only if its argument  
> is instantiated to an integer. So what's the difference between  
> “goal(integer(X))” and “goal(checktype(X,integer))”. Does the latter  
> refers to root type as defined in source and the other to something else ?  
> (so what?)

It's explained in the Simplifier manual section 7.2.3:
goal(integer(X)) requires X to *be* an integer, not just of integer
type (so goal(integer(42)) succeeds).

If you always use goal(checktype(X, integer)) then it works for
anything that has an integer type.
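
For instance (rule names invented for illustration):

    lit_only(1): X + 0 = X may_be_deduced_from [ goal(integer(X)) ].
    any_int(1):  X + 0 = X may_be_deduced_from [ goal(checktype(X, integer)) ].

lit_only(1) only fires when X is instantiated to an actual integer
literal such as 42, whereas any_int(1) fires for any expression of an
integer type, e.g. a variable v of type Integer.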

>
> Another strange thing is that the file NUMINEQS.RUL does not contains any  
> predicate stating arguments have to be integers and there is just a  
> comment in heading, nothing in rules about it.
This is the documentation of a Checker rule file, so it is not relevant
to the Simplifier.  In the actual rule file used by the Checker, the
comment that you mention will be included in what the Checker sees, and
so will be effective in restricting the application of the rule.

Cheers,

Phil



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27 12:22         ` stefan-lucks
  2010-05-27 11:37           ` Yannick Duchêne (Hibou57)
@ 2010-05-27 11:57           ` Phil Thornley
  2010-05-27 12:36             ` Yannick Duchêne (Hibou57)
  1 sibling, 1 reply; 46+ messages in thread
From: Phil Thornley @ 2010-05-27 11:57 UTC (permalink / raw)


On 27 May, 13:22, stefan-lu...@see-the.signature wrote:
[..]
> Wouldn't it be much simpler to use the proof checker, instead of the
> simplifier?

The main reason that I have for avoiding the Proof Checker for VCs
left after simplification is that the proofs need to be repeated every
time the VCs are generated - which is OK except that small and
irrelevant changes to the code often break the proof scripts that you
have to keep in order to redo the proofs each time.
(And the Proof Checker provides no support for the maintenance of
existing scripts.)

So my approach is now to create user rules so that the Simplifier
proves 100% of the VCs.  However, this sometimes requires quite complex
user rules that are difficult to validate manually, so I use the Proof
Checker to validate those rules by creating VCs that correspond to the
rule and proving those.
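
For instance (a hypothetical sketch, names invented): to validate a
user rule such as

    halves(1): X div 2 <= Y div 2 may_be_deduced_from [ X <= Y, X >= 0 ].

write a tiny subprogram containing nothing but a matching check
annotation, e.g.

    --# check ((A <= B) and (A >= 0)) -> (A / 2 <= B / 2);

run the Examiner to generate the corresponding VC, and prove that VC
interactively in the Proof Checker. The rule is then validated once and
can be trusted every time the Simplifier applies it afterwards.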

There are a couple of small examples of this in the Tokeneer code
distributed with SPARK GPL 8.1.1 - see code/core/tismain.rlu and code/
core/tismain/mainloopbody.rlu.

Cheers,

Phil



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27 10:55       ` Yannick Duchêne (Hibou57)
  2010-05-27 11:32         ` Yannick Duchêne (Hibou57)
@ 2010-05-27 12:22         ` stefan-lucks
  2010-05-27 11:37           ` Yannick Duchêne (Hibou57)
  2010-05-27 11:57           ` Phil Thornley
  1 sibling, 2 replies; 46+ messages in thread
From: stefan-lucks @ 2010-05-27 12:22 UTC (permalink / raw)


[-- Attachment #1: Type: TEXT/PLAIN, Size: 1088 bytes --]

On Thu, 27 May 2010, Yannick Duchêne (Hibou57) wrote:

> On Thu, 27 May 2010 10:13:55 +0200, Yannick Duchêne (Hibou57)
> <yannick_duchene@yahoo.fr> wrote:
> >
> >That is, if there a user rule of the form
> >
> >    my_rule(1) Conclusion_1 may_be_deduced_from [Expression_1].
> >
> >and the source unfortunately contains only
> >
> >    Expression_2
> >
> >then try something like
> >
> >    --# check Expression_2 -> Intermediate_Expression(1);
> >    --# check ...;
> >    --# check ...;
> >    --# check Intermediate_Expression(N) -> Expression_1;
> >
> >and here it is.
> 
> Unfortunately, I came into another trouble. I was trying this, and used
> intermediate variables to help. But Simplifier insist on using expressions
> instead of variables, so it can't find a match.

Wouldn't it be much simpler to use the proof checker, instead of the 
simplifier?

Stefan

-- 
------ Stefan Lucks   --  Bauhaus-University Weimar  --   Germany  ------
               Stefan dot Lucks at uni minus weimar dot de
------  I  love  the  taste  of  Cryptanalysis  in  the  morning!  ------

^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 22:01             ` Peter C. Chapin
@ 2010-05-27 12:32               ` Mark Lorenzen
  2010-05-27 18:34               ` Pascal Obry
  1 sibling, 0 replies; 46+ messages in thread
From: Mark Lorenzen @ 2010-05-27 12:32 UTC (permalink / raw)


On 27 May, 00:01, "Peter C. Chapin" <pcc482...@gmail.com> wrote:
>
> Next I was able to prove the code free of runtime errors. This worked out
> quite easily in this case. After changing two declarations to use properly
> constrained subtypes (rather than just Natural as I had originally... my
> bad), the SPARK simplifier was able to discharge 96% of the VCs "out of the
> box." The remaining ones were not hard to fix. I felt lucky!
>
> Now I hope to encode some useful and interesting information about the
> intended functionality of the subprograms in SPARK annotations (#pre, #post,
> etc), and see if I can still get the proofs to go through. Somewhere around
> now I will also start building my test program. One thing that I like about
> SPARK is that you can do a lot of useful things with it without worrying
> about stubbing out a zillion supporting packages for purposes of creating a
> test program.

I would like to suggest that you always use the mandatory annotations
(of course) *and also* the #pre annotation. The use of preconditions
can drastically improve the Simplifier's ability to discharge VCs. You
already discovered that effect when you introduced constrained
subtypes. Furthermore, you don't have to think about ways to make your
subprograms total by using "nil" values or error flags, which also
relieves the calling subprogram from checking for such "nil" values
and error flags.

Then, if you are bold, you can start to think about writing
postconditions. *However*, if the criticality of your program
*requires* the use of postconditions, I suggest that you introduce
them from the beginning, as their introduction may require some
reworking (often simplification) of your code.
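
To sketch the idea (hypothetical declarations, not from this thread):
instead of a lookup returning a "nil" value on a bad index, a
precondition makes the subprogram total over its legal inputs:

    procedure Get_Item (T : in Table; I : in Index; V : out Value);
    --# derives V from T, I;
    --# pre (I >= T_First) and (I <= T_Last);

With the #pre in place, the Simplifier can use it as a hypothesis in
every VC of the body, and each caller must prove it at the call site
instead of checking an error flag afterwards.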

- Mark L



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27 11:57           ` Phil Thornley
@ 2010-05-27 12:36             ` Yannick Duchêne (Hibou57)
  2010-05-27 13:38               ` Phil Thornley
  0 siblings, 1 reply; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-27 12:36 UTC (permalink / raw)


On Thu, 27 May 2010 13:57:58 +0200, Phil Thornley  
<phil.jpthornley@googlemail.com> wrote:
> So my approach is now to create user rules so that the Simplfier
> proves 100% of the VCs.  However this sometimes requires quite complex
> user rules that are difficult validate manually, so I use the Proof
> Checker to validate those rules by creating VCs that correspond to the
> rule and proving those.
This is close to my personal wish too, except that I see a slight  
difference: I prefer to write very simple user rules, easily proved (as  
easy to prove as with a truth table, say), and to write Check-based  
demonstrations relying on these rules. This is because I feel it is safer  
(it enforces soundness) and because it leaves more in the source and less  
in external documents (it does not break source navigability and layout  
logic).

-- 
There is even better than a pragma Assert: a SPARK --# check.
--# check C and WhoKnowWhat and YouKnowWho;
--# assert Ada;
--  i.e. forget about previous premises which leads to conclusion
--  and start with new conclusion as premise.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27 11:41           ` Phil Thornley
@ 2010-05-27 12:42             ` Yannick Duchêne (Hibou57)
  0 siblings, 0 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-27 12:42 UTC (permalink / raw)


On Thu, 27 May 2010 13:41:08 +0200, Phil Thornley  
<phil.jpthornley@googlemail.com> wrote:
> It's explained in the Simplifier manual section 7.2.3:
> goal(integer(X)) requires X to *be* an integer, not just of integer
> type (so goal(integer(42)) succeeds).
Thanks for the reference (maybe I tend to become lazy when I am facing  
scattered documentation, oops).

-- 
There is even better than a pragma Assert: a SPARK --# check.
--# check C and WhoKnowWhat and YouKnowWho;
--# assert Ada;
--  i.e. forget about previous premises which leads to conclusion
--  and start with new conclusion as premise.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27 12:36             ` Yannick Duchêne (Hibou57)
@ 2010-05-27 13:38               ` Phil Thornley
  2010-06-03  2:44                 ` Yannick Duchêne (Hibou57)
  0 siblings, 1 reply; 46+ messages in thread
From: Phil Thornley @ 2010-05-27 13:38 UTC (permalink / raw)


On 27 May, 13:36, Yannick Duchêne (Hibou57) <yannick_duch...@yahoo.fr>
wrote:
> On Thu, 27 May 2010 13:57:58 +0200, Phil Thornley  
> <phil.jpthorn...@googlemail.com> wrote:
> > So my approach is now to create user rules so that the Simplifier
> > proves 100% of the VCs.  However this sometimes requires quite complex
> > user rules that are difficult validate manually, so I use the Proof
> > Checker to validate those rules by creating VCs that correspond to the
> > rule and proving those.
>
> This is close to my personal wish too, except that I see a slight  
> difference: I prefer to write very simple user rules, easily proved (as  
> easy to prove as with a truth table, say), and to write Check-based  
> demonstrations relying on these rules. This is because I feel it is safer  
> (it enforces soundness) and because it leaves more in the source and less  
> in external documents (it does not break source navigability and layout  
> logic).

I very much agree with this approach - adding a sequence of check
annotations so that the rules required are easy to validate. However
you have to balance this against the size of the annotations required.

For example, a few years ago I was trying to prove the correctness of
code for a red-black tree. The single rotation procedure was four
lines of code, but it took about 60 lines of check annotations, plus a
bunch of quite complex rules, to prove that the tree stayed a valid
binary tree (and I never got round to proving the colour invariants).

Cheers,

Phil



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 22:01             ` Peter C. Chapin
  2010-05-27 12:32               ` Mark Lorenzen
@ 2010-05-27 18:34               ` Pascal Obry
  2010-05-27 19:18                 ` Yannick Duchêne (Hibou57)
                                   ` (2 more replies)
  1 sibling, 3 replies; 46+ messages in thread
From: Pascal Obry @ 2010-05-27 18:34 UTC (permalink / raw)


Peter,

> I do agree certainly that SPARK is not for everything. The code I'm working on
> is (potentially) security sensitive so the extra effort of working with SPARK
> seems worthwhile. My test harness, on the other hand, is likely to be in full
> Ada.

Just curious, why do you feel that you still need to write tests after
having run successfully the SPARK proof checker?

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|    http://www.obry.net  -  http://v2p.fr.eu.org
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver keys.gnupg.net --recv-key F949BD3B




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 22:06           ` Peter C. Chapin
@ 2010-05-27 18:39             ` Pascal Obry
  0 siblings, 0 replies; 46+ messages in thread
From: Pascal Obry @ 2010-05-27 18:39 UTC (permalink / raw)


Peter,

> Is this the sort of thing external own variables are intended to handle?

Yes, that's what I thought at first. But for some reason (I don't
remember now) this was not working in this case (sorry, I really have a
bad memory!). I had designed 3 or 4 different specs and had help from the
SPARK team before converging on a working one.

Pascal.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|    http://www.obry.net  -  http://v2p.fr.eu.org
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver keys.gnupg.net --recv-key F949BD3B




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Sockets package in SPARK (Was: Lost in translation (with SPARK user rules))
  2010-05-27 10:11       ` Sockets package in SPARK (Was: Lost in translation (with SPARK user rules)) Jacob Sparre Andersen
@ 2010-05-27 18:41         ` Pascal Obry
  2010-05-27 19:20           ` Yannick Duchêne (Hibou57)
  0 siblings, 1 reply; 46+ messages in thread
From: Pascal Obry @ 2010-05-27 18:41 UTC (permalink / raw)


Jacob,

> Is that binding available for download?  (Or is it an internal product?)

Internal sorry.

-- 

--|------------------------------------------------------
--| Pascal Obry                           Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--|    http://www.obry.net  -  http://v2p.fr.eu.org
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver keys.gnupg.net --recv-key F949BD3B




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27 18:34               ` Pascal Obry
@ 2010-05-27 19:18                 ` Yannick Duchêne (Hibou57)
  2010-05-28  9:39                 ` Maciej Sobczak
  2010-05-28 11:57                 ` SPARK and testing. Was: " Peter C. Chapin
  2 siblings, 0 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-27 19:18 UTC (permalink / raw)


On Thu, 27 May 2010 20:34:52 +0200, Pascal Obry <pascal@obry.net> wrote:

> Peter,
>
>> I do agree certainly that SPARK is not for everything. The code I'm  
>> working on
>> is (potentially) security sensitive so the extra effort of working with  
>> SPARK
>> seems worthwhile. My test harness, on the other hand, is likely to be  
>> in full
>> Ada.
>
> Just curious, why do you feel that you still need to write tests after
> having run successfully the SPARK proof checker?
>
> Pascal.
Good question, Pascal.

Maybe because it's a tradition, maybe because some inner voices tell  
you you're bad if you don't run tests, or maybe due to a lack of  
understanding that tests are what you do when there is no way to prove  
anything (that is, tests would be a fall-back).

Peter, I'm not saying that's you; I'm just saying that testing is so  
deeply ingrained in computer culture that doing without it would be  
somewhat frightening, just like the first time one learns to swim or to  
ride a bicycle (wow, do I really have to stand up on this thing which is  
not even able to stand up by itself?).

But maybe there is another reason: logic.

Here is a question I have had in mind since I recently discovered (well,  
really started with) SPARK: one may prove something while not really  
knowing what one is proving. You may prove some specifications (I mean  
pre/post, not runtime-error safety), but what if these specifications do  
not express what you supposed they expressed?

One link of the chain may still be a source of trouble: the one from a  
thought to a specification.

I agree testing is at least a bit needed for that. However, it is not to  
be conducted like the other kind of testing, which is very different: the  
kind which is done when nothing else can be done (or nobody wanted to do  
anything else).


-- 
There is even better than a pragma Assert: a SPARK --# check.
--# check C and WhoKnowWhat and YouKnowWho;
--# assert Ada;
--  i.e. forget about previous premises which leads to conclusion
--  and start with new conclusion as premise.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Sockets package in SPARK (Was: Lost in translation (with SPARK user rules))
  2010-05-27 18:41         ` Pascal Obry
@ 2010-05-27 19:20           ` Yannick Duchêne (Hibou57)
  2010-05-28  7:43             ` Sockets package in SPARK Jacob Sparre Andersen
  0 siblings, 1 reply; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-27 19:20 UTC (permalink / raw)


On Thu, 27 May 2010 20:41:22 +0200, Pascal Obry <pascal@obry.net> wrote:
>> Is that binding available for download?  (Or is it an internal product?)
>
> Internal sorry.
>
Jacob, if you have some needs in this area, feel free to ask me (we  
could discuss it via e-mail if you wish).

-- 
There is even better than a pragma Assert: a SPARK --# check.
--# check C and WhoKnowWhat and YouKnowWho;
--# assert Ada;
--  i.e. forget about previous premises which leads to conclusion
--  and start with new conclusion as premise.



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-26 10:09 Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
  2010-05-26 10:38 ` Phil Thornley
@ 2010-05-27 19:53 ` Warren
  2010-05-29 23:03 ` Yannick Duchêne (Hibou57)
  2010-06-01  5:42 ` Yannick Duchêne (Hibou57)
  3 siblings, 0 replies; 46+ messages in thread
From: Warren @ 2010-05-27 19:53 UTC (permalink / raw)


Yannick Duchêne (Hibou57) expounded in
news:op.vda45rsfule2fv@garhos: 

> Lost in translation... do you know the movie ?
> 
> No, I'm joking, this is not about this 2002 movie, that's about SPARK,
> and  exactly, about translating some expressions in the rules
> language. 

That movie sucked and was boring. I'm not joking. ;-)

Warren



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Sockets package in SPARK
  2010-05-27 19:20           ` Yannick Duchêne (Hibou57)
@ 2010-05-28  7:43             ` Jacob Sparre Andersen
  0 siblings, 0 replies; 46+ messages in thread
From: Jacob Sparre Andersen @ 2010-05-28  7:43 UTC (permalink / raw)


Yannick Duchêne wrote:
> On Thu, 27 May 2010 20:41:22 +0200, Pascal Obry <pascal@obry.net> wrote:

>>> Is that binding available for download?  (Or is it an internal
>>> product?)
>>
>> Internal sorry.
>
> Jacob, if you have some needs in this area, feel free to ask me (could
> negotiate via e-mail if you wish).

I just happened to have considered doing what Pascal has already done,
not so much with a specific need in mind as an interesting challenge
which might deliver a useful result to the Ada and SPARK community.

Greetings,

Jacob
-- 
"How may I be honest with you today?"



^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Lost in translation (with SPARK user rules)
  2010-05-27 18:34               ` Pascal Obry
  2010-05-27 19:18                 ` Yannick Duchêne (Hibou57)
@ 2010-05-28  9:39                 ` Maciej Sobczak
  2010-05-28 11:57                 ` SPARK and testing. Was: " Peter C. Chapin
  2 siblings, 0 replies; 46+ messages in thread
From: Maciej Sobczak @ 2010-05-28  9:39 UTC (permalink / raw)


On 27 May, 20:34, Pascal Obry <pas...@obry.net> wrote:

> Just curious, why do you feel that you still need to write tests after
> having run successfully the SPARK proof checker?

Because the SPARK proof checker does not guarantee the lack of
infinite loops in the program?

--
Maciej Sobczak



^ permalink raw reply	[flat|nested] 46+ messages in thread

* SPARK and testing. Was: Lost in translation (with SPARK user rules)
  2010-05-27 18:34               ` Pascal Obry
  2010-05-27 19:18                 ` Yannick Duchêne (Hibou57)
  2010-05-28  9:39                 ` Maciej Sobczak
@ 2010-05-28 11:57                 ` Peter C. Chapin
  2010-05-28 12:59                   ` SPARK and testing Peter C. Chapin
  2 siblings, 1 reply; 46+ messages in thread
From: Peter C. Chapin @ 2010-05-28 11:57 UTC (permalink / raw)


Pascal Obry wrote:

> Just curious, why do you feel that you still need to write tests after
> having run successfully the SPARK proof checker?

Despite the proofs, testing is still essential. At least that is my view. Here
is why:

1. The proofs show that the subprograms obey the annotations that I provide.
However, there is nothing that I'm doing that formally links those
annotations to the actual intended purpose of my code. My code might be
correctly implemented (proved) but perhaps it correctly implements the wrong
thing. I realize there are formal methods that address this issue, but I'm
not using them.

2. The proofs only show that the source code is, in some sense, correct.
However, they say nothing about the correctness of the code generated by the
compiler or about how that code will interact with the operating system or
hardware. Testing isn't as exhaustive as proof but it is a "top to bottom"
method of checking one's code. If a test succeeds, the entire execution stack
worked properly... for that case at least. My code tends to interact with
hardware in a very low level way. What if I misunderstood what I read in the
hardware data sheets? SPARK can't help me with that misunderstanding.

3. Some of what I want to test are "non-functional" requirements such as
execution speed and memory consumption. There are also some usability issues
to consider. It seems like these things are outside SPARK's domain.

4. I might get lazy or run out of time and not complete all the proofs. After
a while testing becomes the easier path. I could believe that the rate of bug
elimination per unit of development time might improve by switching over to
testing before every single VC has been discharged. Of course this will
depend on how experienced one is with the tools (and with testing), so I'm
not sure if this is really true or not.

In summary I've concluded that no matter how nifty it is to prove one's code
using a tool like SPARK, testing remains a necessary step in the production
of quality software. SPARK is great but it has a limited scope of
applicability; there are important things that SPARK has nothing to say
about. So I write test programs.

Recently a colleague of mine pointed me to the following article about recent
radiation therapy machine accidents:

http://www.nytimes.com/2010/01/24/health/24radiation.html?pagewanted=all

He commented that SPARK might have helped to avoid those accidents. Based on
my understanding of the article, I agree that SPARK might have been helpful
to a degree. However, some of the problems that occurred were really related
to poor user interface decisions and poor hospital procedures. I don't see
how SPARK could have helped with that. However, appropriate user testing
might have caught some of the problems.

It seems to me that SPARK is all about building a correct *implementation*,
but it says nothing about the correctness of the design.

Peter





* Re: SPARK and testing.
  2010-05-28 11:57                 ` SPARK and testing. Was: " Peter C. Chapin
@ 2010-05-28 12:59                   ` Peter C. Chapin
  2010-05-28 23:06                     ` Yannick Duchêne (Hibou57)
  0 siblings, 1 reply; 46+ messages in thread
From: Peter C. Chapin @ 2010-05-28 12:59 UTC (permalink / raw)


Peter C. Chapin wrote:

> 4. I might get lazy or run out of time and not complete all the proofs. 

I can elaborate on this a little more. In my case the program I'm working on
now needs to compute the minimum model of a collection of Datalog rules
(Datalog is a simple logic language that has been discussed as a possible
database query language). I don't know how to express such a complex, high
level characteristic as a SPARK '--#post' annotation.

What I need to say is that the information in an array Model_Area (an array of
tuples) is the minimum model implied by the Datalog rules in a different
array Storage_Area. It is well beyond my skills with SPARK to say such a
thing. I'm not saying that it is beyond SPARK's ability to express that
relationship, but for me it is just easier to explore the matter with a test
program.
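As an aside, the minimum model itself is just the least fixpoint of rule application. A rough Python sketch of the naive bottom-up computation (all names invented, and nothing to do with the actual program discussed here), using transitive closure as the example rule set:

```python
# Hypothetical sketch: the minimum model of a set of Datalog rules is
# the least fixpoint of applying every rule to the facts derived so
# far, starting from the base facts.

def minimum_model(facts, rules):
    """facts: a set of tuples; rules: functions mapping a fact set to new facts."""
    model = set(facts)
    while True:
        new = set()
        for rule in rules:
            new |= rule(model)
        if new <= model:          # nothing new derived: least fixpoint reached
            return model
        model |= new

# Example: edge facts, plus 'path' as the transitive closure of 'edge'.
edges = {('edge', 'a', 'b'), ('edge', 'b', 'c')}

def path_base(m):
    # path(X, Y) :- edge(X, Y).
    return {('path', x, y) for (p, x, y) in m if p == 'edge'}

def path_trans(m):
    # path(X, Z) :- path(X, Y), path(Y, Z).
    paths = {(x, y) for (p, x, y) in m if p == 'path'}
    return {('path', x, z) for (x, y) in paths for (y2, z) in paths if y == y2}

model = minimum_model(edges, [path_base, path_trans])
assert ('path', 'a', 'c') in model
```

Of course, stating in a --#post annotation that an array holds exactly this fixpoint is quite another matter than computing it.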

Peter





* Re: SPARK and testing.
  2010-05-28 12:59                   ` SPARK and testing Peter C. Chapin
@ 2010-05-28 23:06                     ` Yannick Duchêne (Hibou57)
  0 siblings, 0 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-28 23:06 UTC (permalink / raw)


Le Fri, 28 May 2010 14:59:38 +0200, Peter C. Chapin <pcc482719@gmail.com>  
a écrit:
> I can elaborate on this a little more. In my case the program I'm  
> working on
> now needs to compute the minimum model of a collection of Datalog rules
> (Datalog is a simple logic language that has been discussed as a possible
> database query language). I don't know how to express such a complex,  
> high
> level characteristic as a SPARK '--#post' annotation.
>
> What I need to say is that the information in an array Model_Area (an  
> array of
> tuples) is the minimum model implied by the Datalog rules in a different
> array Storage_Area. It is well beyond my skills with SPARK to say such a
> thing. I'm not saying that it is beyond SPARK's ability to express that
> relationship, but for me it is just easier to explore the matter with a  
> test
> program.
I believe there are indeed some reification capabilities in SPARK. It is  
just not that easy. I don't believe it would be any easier with other  
methods like B or VDM.

As long as you can express an abstract data type property as some pre/post  
annotations, you can do it in SPARK. The question is how difficult it is.  
And what I see written about SPARK mostly talks about avoiding runtime  
errors, and not so much about proving an ADT has this or that property.  
Which does not mean it is not possible.

-- 
There is even better than a pragma Assert: a SPARK --# check.
--# check C and WhoKnowWhat and YouKnowWho;
--# assert Ada;
--  i.e. forget about previous premises which leads to conclusion
--  and start with new conclusion as premise.




* Re: Lost in translation (with SPARK user rules)
  2010-05-26 10:09 Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
  2010-05-26 10:38 ` Phil Thornley
  2010-05-27 19:53 ` Warren
@ 2010-05-29 23:03 ` Yannick Duchêne (Hibou57)
  2010-05-30  6:55   ` Yannick Duchêne (Hibou57)
  2010-05-30  9:26   ` Phil Thornley
  2010-06-01  5:42 ` Yannick Duchêne (Hibou57)
  3 siblings, 2 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-29 23:03 UTC (permalink / raw)


Le Wed, 26 May 2010 12:09:17 +0200, Yannick Duchêne (Hibou57)  
<yannick_duchene@yahoo.fr> a écrit:
> Lost in translation... do you know the movie ?

> No, I'm joking, this is not about this 2002 movie, that's about SPARK,  
> and exactly,
> about translating some expressions in the rules language.
> [...]

In this area, the Simplifier sometimes shows a slight capacity to be  
annoying.

Let a Check clause be “--# check ((2 ** Count) < (2 ** 8)) -> (Count < 8);”
Let a user rule be “my_rule: ((X ** Y) < (X ** Z)) -> (Y < Z)  
may_be_deduced_from [some-obvious-conditions].”

There should be an exact match, shouldn't there?

Well, as you may guess: it fails to prove the Check clause. Now, if you  
would like to know why, here it is: it rewrites this as “2 ** count < 256 ->  
count < 8”, ... it turns “(2 ** 8)” into “256”, so that it can no longer  
see an exact match with the user rule.

I've checked in the *.SLG file: this is indeed due to simplification, and  
not to some hypothetical unrelated error.
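For what it's worth, the inequality the user rule expresses is easy to sanity-check numerically. A small Python sketch (a brute-force check over a finite range, not a proof, and nothing to do with the Simplifier):

```python
# Brute-force sanity check (not a proof) of the property behind the
# user rule: for an integer base of at least 2,
# base ** y < base ** z implies y < z.
def rule_holds(base, limit=12):
    return all(y < z
               for y in range(limit)
               for z in range(limit)
               if base ** y < base ** z)

assert rule_holds(2)                           # the base used in the Check clause
assert all(rule_holds(b) for b in range(2, 8)) # holds for any base >= 2
```

Note that for X = 0 the implication actually fails (0 ** 1 < 0 ** 0 yet 1 > 0), which is presumably the kind of case the elided side-conditions have to rule out.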






* Re: Lost in translation (with SPARK user rules)
  2010-05-29 23:03 ` Yannick Duchêne (Hibou57)
@ 2010-05-30  6:55   ` Yannick Duchêne (Hibou57)
  2010-05-30  9:30     ` Phil Thornley
  2010-05-30  9:26   ` Phil Thornley
  1 sibling, 1 reply; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-30  6:55 UTC (permalink / raw)


Le Sun, 30 May 2010 01:03:31 +0200, Yannick Duchêne (Hibou57)  
<yannick_duchene@yahoo.fr> a écrit:
> Let a Check clause be “--# check ((2 ** Count) < (2 ** 8)) -> (Count <  
> 8);”
> Let a user rule be “my_rule: ((X ** Y) < (X ** Z)) -> (Y < Z)  
> may_be_deduced_from [some-obvious-conditions].”
>
> There should be an exact match, isn't it ?
>
> Well, I guess you guess: it fails to prove the Check clause. Now, if you  
> would like to know why, here is: it replaces this by “2 ** count < 256  
> -> count < 8”, ... it turns “(2 ** 8)” into “256”, so that it can't see  
> any more an exact match with the user rule

I didn't want to add rules for every constant, like 256 and so on.  
Fortunately, there is a somewhat cleaner workaround, which still  
involves a constant, but a nice one: 2.

If “my_rule(1): ((X ** Y) < (X ** Z)) -> (Y < Z) may_be_deduced_from  
[...].” is completed with another rule like “my_rule(2): ((2 ** Y) < (2  
** Z)) -> (Y < Z) may_be_deduced_from [...].”, then it no longer changes  
2 ** 8 into 256, and it sees a match.

I don't know what it does in the background, nor why it preserves the  
expression when the rule to match contains at least one constant;  
however, what is nice with this workaround is that it will work with all  
powers of two.

I would still like to know the details: why, when X is replaced by the  
literal 2, does it not match the same way?

Important to notice anyway.





* Re: Lost in translation (with SPARK user rules)
  2010-05-29 23:03 ` Yannick Duchêne (Hibou57)
  2010-05-30  6:55   ` Yannick Duchêne (Hibou57)
@ 2010-05-30  9:26   ` Phil Thornley
  2010-05-30  9:57     ` Yannick Duchêne (Hibou57)
  1 sibling, 1 reply; 46+ messages in thread
From: Phil Thornley @ 2010-05-30  9:26 UTC (permalink / raw)


On 30 May, 00:03, Yannick Duchêne (Hibou57) <yannick_duch...@yahoo.fr>
wrote:
> In this area, the Simplifier sometimes shows slight capacities to be  
> boring.
>
> Let a Check clause be “--# check ((2 ** Count) < (2 ** 8)) -> (Count < 8);”
> Let a user rule be “my_rule: ((X ** Y) < (X ** Z)) -> (Y < Z)  
> may_be_deduced_from [some-obvious-conditions].”
>
> There should be an exact match, isn't it ?
>
> Well, I guess you guess: it fails to prove the Check clause. Now, if you  
> would like to know why, here is: it replaces this by “2 ** count < 256 ->  
> count < 8”, ... it turns “(2 ** 8)” into “256”, so that it can't see any  
> more an exact match with the user rule
>
> I've check in the *.SLG file, this is indeed due to simplification, this  
> is not due to an hypothetical related error.

The full description of all the simplification steps is in section 3
of the Simplifier User Manual, so if you read this and follow through
the steps in a simplification log file then you should get a better
understanding of how the tool works - and so how to write user rules
that are effective.

Application of user rules is the final step (out of about a dozen)
that the Simplifier uses, so at that point a conclusion is likely to
have been changed from the form of the annotation that generated it.

In the case you describe here Count must be defined as a constant with
value 8, and that replacement is the first step that the Simplifier
takes.  Then it will convert 2**8 to 256 in a subsequent step. (A
basic strategy of the Simplifier is to replace any expression with a
simpler version wherever possible - perhaps that's why it's called the
Simplifier rather than the 'Prover' :-).
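The effect is easy to illustrate with a toy term rewriter (a Python sketch with invented names, nothing to do with the actual Simplifier's internals): once folding has turned 2**8 into the literal 256, a purely syntactic X ** Y pattern no longer has anything to match.

```python
# Toy illustration: syntactic pattern matching fails once constant
# folding has rewritten a subterm. Terms are nested tuples
# ('**', base, exponent); integers are literals.

def fold(term):
    """Recursively evaluate fully-constant subterms, e.g. 2 ** 8 -> 256."""
    if isinstance(term, int):
        return term
    op, a, b = term
    a, b = fold(a), fold(b)
    if op == '**' and isinstance(a, int) and isinstance(b, int):
        return a ** b
    return (op, a, b)

def matches_pow_pattern(term):
    """Does the term still have the shape ('**', _, _), i.e. X ** Y?"""
    return isinstance(term, tuple) and term[0] == '**'

t = ('**', 2, 8)
assert matches_pow_pattern(t)             # before folding: X ** Y matches
assert fold(t) == 256                     # after folding: a bare literal
assert not matches_pow_pattern(fold(t))   # the X ** Y pattern no longer applies
```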

Section 3.12 describes how the Simplifier uses user rules - and it is
clear from this description that the rule needs to match expressions
in the VC as they are at the end of the process, not as in the
original VC.

You can see why this is sensible (even if it has some drawbacks*).
Generally the way that a user rule is written is in response to an
unproven conclusion in a simplified VC, so the author of the rule uses
the final form of the unproven VC to write the rule. Therefore the
rule can only be reliably applied at the point where the VC has
reached its final form.

I guess it might be possible for the Simplifier to try all the user
rules at each stage of the process - but the developers of the tool
have been concentrating heavily on making simplification as quick as
possible (quite rightly in my view) and would not want to do anything
that might slow it down without a very strong argument for the
advantages of doing that.

* The main drawback that I see is that some user rules define
replacements that must be made for the VC to be provable, but at
present these are made too late for them to be effective, since the
Simplifier has already exhausted all its other strategies.
It would be nice to have replacements defined in user rules applied in
the first simplification stage; how about:
my_rewrite(1): A must_be_replaced_by B if [...] .

(Of course nothing is ever simple, for example what happens if one
rewrite rule requires another to be applied before its sideconditions
are satisfied - are all such rules tried repeatedly until none are
used, or is it a once-only pass? etc...)

Cheers,

Phil




* Re: Lost in translation (with SPARK user rules)
  2010-05-30  6:55   ` Yannick Duchêne (Hibou57)
@ 2010-05-30  9:30     ` Phil Thornley
  2010-05-30  9:46       ` Yannick Duchêne (Hibou57)
  0 siblings, 1 reply; 46+ messages in thread
From: Phil Thornley @ 2010-05-30  9:30 UTC (permalink / raw)


On 30 May, 07:55, Yannick Duchêne (Hibou57) <yannick_duch...@yahoo.fr>
wrote:
[...]
> I still would like to know about the details : why when X is replaced by  
> the literal 2, it does not match the same way ?

I would guess that the Simplifier manages to match 2**Y to 256 (by
instantiating Y with 8), but not X**Y.

Cheers,

Phil




* Re: Lost in translation (with SPARK user rules)
  2010-05-30  9:30     ` Phil Thornley
@ 2010-05-30  9:46       ` Yannick Duchêne (Hibou57)
  0 siblings, 0 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-30  9:46 UTC (permalink / raw)


Le Sun, 30 May 2010 11:30:29 +0200, Phil Thornley  
<phil.jpthornley@googlemail.com> a écrit:
>
> I would guess that the Simplifier manages to match 2**Y to 256 (by
> instantiating Y with 8), but not X**Y.
>
> Cheers,
>
> Phil
That would be a bit more than just matching, then; that is a seed of solving.





* Re: Lost in translation (with SPARK user rules)
  2010-05-30  9:26   ` Phil Thornley
@ 2010-05-30  9:57     ` Yannick Duchêne (Hibou57)
  0 siblings, 0 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-05-30  9:57 UTC (permalink / raw)


Le Sun, 30 May 2010 11:26:12 +0200, Phil Thornley  
<phil.jpthornley@googlemail.com> a écrit:
> You can see why this is sensible (even if it has some drawbacks*).
> Generally the way that a user rule is written is in response to an
> unproven conclusion in a simplified VC, so the author of the rule uses
> the final form of the unproven VC to write the rule. Therefore the
> rule can only be reliably applied at the point where the VC has
> reached its final form.
Yes, I know. The trouble with me is that I try to find the best way to use  
these rules as general rules, and precisely to avoid writing rules which  
would only match expressions in a given unproved VC.

The technique is evolving; it works better than at my very first attempts.  
Maybe I will write some French pages about it some day.

> I guess it might be possible for the Simplifier to try all the user
> rules at each stage of the process - but the developers of the tool
> have been concentrating heavily on making simplification as quick as
> possible (quite rightly in my view) and would not want to do anything
> that might slow it down without a very strong argument for the
> advantages of doing that.
I guess this is important, as I've noticed the simplifier is easily slowed  
down as the number of checks increases (or maybe this is the effect of my  
user rules, which, I suppose, are not handled as efficiently as the  
built-in ones).

> (Of course nothing is ever simple, for example what happens if one
> rewrite rule requires another to be applied before its sideconditions
> are satisfied - are all such rules tried repeatedly until none are
> used, or is it a once-only pass? etc...)
That is why, except in one single case (a rewrite rule involving modular  
types), my own rule is not to use rewrite rules at all, because they can  
easily become a one-way trap. Inference rules are more flexible and do  
not get stuck.





* Re: Lost in translation (with SPARK user rules)
  2010-05-26 10:09 Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
                   ` (2 preceding siblings ...)
  2010-05-29 23:03 ` Yannick Duchêne (Hibou57)
@ 2010-06-01  5:42 ` Yannick Duchêne (Hibou57)
  3 siblings, 0 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-06-01  5:42 UTC (permalink / raw)


A little test I did just a few seconds ago.

Let a rule be of the form:

    C1 may_be_deduced_from [C2, V].

Which is so that this one is valid too:

    C2 may_be_deduced_from [C1, V].

So finally, C1 and C2 can each be either the conclusion or a condition.

This may *logically* (not by pattern) match three other things:

    C1  -> C2 may_be_deduced_from [V].
    C2  -> C1 may_be_deduced_from [V].
    C1 <-> C2 may_be_deduced_from [V].

Although only one would be required, which is

    C1 <-> C2 may_be_deduced_from [V].

Instead one needs to have (and write) all of:

    C1 may_be_deduced_from [C2, V].
    C2 may_be_deduced_from [C1, V].
    C1  -> C2 may_be_deduced_from [V].
    C2  -> C1 may_be_deduced_from [V].
    C1 <-> C2 may_be_deduced_from [V].

Five rules instead of one, where one logical rule would be sufficient.

I wanted to get rid of this with these generic rules, which I hoped would  
help solve this recurring trick:

    A may_be_deduced_from [B, B -> A].
    A may_be_deduced_from [B, B <-> A].
    B may_be_deduced_from [A, B <-> A].
    B -> A may_be_deduced_from [B <-> A].
    A -> B may_be_deduced_from [B <-> A].

I thought this would lead to exact matches, and I would only have to write  
rules of the form

    C1 <-> C2 may_be_deduced_from [V].
    ... <-> ... may_be_deduced_from [...].

or of the form

    C1 -> C2 may_be_deduced_from [V].
    ... -> ... may_be_deduced_from [...].

and expected I would have to write only this, avoiding duplication of  
multiple equivalent forms (I expected these equivalent forms would be  
deduced via the above generic rules). Unfortunately, although it looks  
like there should be exact matches after instantiation, this does not  
work (the rules do not match in the Simplifier).
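For what it's worth, the five generic rules above are indeed propositional tautologies; a quick truth-table check (a Python sketch, which says nothing about why the Simplifier fails to apply them) confirms it:

```python
# Truth-table check (not a Simplifier run) that each of the five
# generic rule forms is a propositional tautology: in every case the
# conjunction of the conditions entails the conclusion.
from itertools import product

implies = lambda p, q: (not p) or q
iff     = lambda p, q: p == q

for a, b in product([False, True], repeat=2):
    # A may_be_deduced_from [B, B -> A]
    assert implies(b and implies(b, a), a)
    # A may_be_deduced_from [B, B <-> A]
    assert implies(b and iff(b, a), a)
    # B may_be_deduced_from [A, B <-> A]
    assert implies(a and iff(b, a), b)
    # B -> A may_be_deduced_from [B <-> A]
    assert implies(iff(b, a), implies(b, a))
    # A -> B may_be_deduced_from [B <-> A]
    assert implies(iff(b, a), implies(a, b))
```

So the failure is purely a matter of pattern matching, not of logic.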

Now I really see why there is the restriction that user rules are defined  
on a per-package or per-subprogram basis (while I still don't enjoy it):  
this test clearly demonstrates there is no way to write general rules;  
they are always specific to a VC, or a set of VCs, for a given package or  
subprogram. Otherwise, all logical equivalences need to be written  
explicitly (which in turn requires great care when one wants to review  
rules, as duplication -- I know it from experience -- does not help  
safety and makes it easy to miss weaknesses).

Note: SPARK is still useful ;) With these words, I did not mean to say it  
is not good.






* Re: Lost in translation (with SPARK user rules)
  2010-05-27 13:38               ` Phil Thornley
@ 2010-06-03  2:44                 ` Yannick Duchêne (Hibou57)
  0 siblings, 0 replies; 46+ messages in thread
From: Yannick Duchêne (Hibou57) @ 2010-06-03  2:44 UTC (permalink / raw)


Le Thu, 27 May 2010 15:38:27 +0200, Phil Thornley  
<phil.jpthornley@googlemail.com> a écrit:
> I very much agree with this approach - adding a sequence of check
> annotations so that the rules required are easy to validate. However
> you have to balance this against the size of the annotations required.
>
> For example, a few years ago I was trying to prove the correctness of
> code for a red-black tree, the single rotation procedure was four
> lines of code, but it took about 60 lines of check annotations plus a
> bunch of quite complex rules, to prove that the tree stayed a valid
> binary tree (and I never got round to proving the colour invariants).
At least, I've noticed that while user rules can have an impact on the  
Simplifier's speed, this is not the most important factor. Check (and so  
assert also) seems to have the most significant impact. What is worth  
noting: a long and explicit proof does not imply a slower one (the  
Simplifier is mostly slow when it fails to prove something, and faster  
when it succeeds).





end of thread, other threads:[~2010-06-03  2:44 UTC | newest]

Thread overview: 46+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2010-05-26 10:09 Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
2010-05-26 10:38 ` Phil Thornley
2010-05-26 10:57   ` Yannick Duchêne (Hibou57)
2010-05-26 14:15     ` Pascal Obry
2010-05-26 14:28       ` Dmitry A. Kazakov
2010-05-26 19:28         ` Yannick Duchêne (Hibou57)
2010-05-26 20:14           ` Dmitry A. Kazakov
2010-05-26 21:14             ` Yannick Duchêne (Hibou57)
2010-05-26 21:15               ` Yannick Duchêne (Hibou57)
2010-05-26 22:01             ` Peter C. Chapin
2010-05-27 12:32               ` Mark Lorenzen
2010-05-27 18:34               ` Pascal Obry
2010-05-27 19:18                 ` Yannick Duchêne (Hibou57)
2010-05-28  9:39                 ` Maciej Sobczak
2010-05-28 11:57                 ` SPARK and testing. Was: " Peter C. Chapin
2010-05-28 12:59                   ` SPARK and testing Peter C. Chapin
2010-05-28 23:06                     ` Yannick Duchêne (Hibou57)
2010-05-26 19:16       ` Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
2010-05-26 19:32         ` Pascal Obry
2010-05-26 20:56           ` Yannick Duchêne (Hibou57)
2010-05-26 22:06           ` Peter C. Chapin
2010-05-27 18:39             ` Pascal Obry
2010-05-26 22:17         ` Peter C. Chapin
2010-05-27 10:11       ` Sockets package in SPARK (Was: Lost in translation (with SPARK user rules)) Jacob Sparre Andersen
2010-05-27 18:41         ` Pascal Obry
2010-05-27 19:20           ` Yannick Duchêne (Hibou57)
2010-05-28  7:43             ` Sockets package in SPARK Jacob Sparre Andersen
2010-05-27  8:13     ` Lost in translation (with SPARK user rules) Yannick Duchêne (Hibou57)
2010-05-27 10:55       ` Yannick Duchêne (Hibou57)
2010-05-27 11:32         ` Yannick Duchêne (Hibou57)
2010-05-27 11:41           ` Phil Thornley
2010-05-27 12:42             ` Yannick Duchêne (Hibou57)
2010-05-27 12:22         ` stefan-lucks
2010-05-27 11:37           ` Yannick Duchêne (Hibou57)
2010-05-27 11:57           ` Phil Thornley
2010-05-27 12:36             ` Yannick Duchêne (Hibou57)
2010-05-27 13:38               ` Phil Thornley
2010-06-03  2:44                 ` Yannick Duchêne (Hibou57)
2010-05-27 19:53 ` Warren
2010-05-29 23:03 ` Yannick Duchêne (Hibou57)
2010-05-30  6:55   ` Yannick Duchêne (Hibou57)
2010-05-30  9:30     ` Phil Thornley
2010-05-30  9:46       ` Yannick Duchêne (Hibou57)
2010-05-30  9:26   ` Phil Thornley
2010-05-30  9:57     ` Yannick Duchêne (Hibou57)
2010-06-01  5:42 ` Yannick Duchêne (Hibou57)

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox