comp.lang.ada
* [Spark] Converting Arrays
@ 2003-03-10 17:06 Lutz Donnerhacke
  2003-03-10 20:03 ` James S. Rogers
  2003-03-11 10:14 ` Rod Chapman
  0 siblings, 2 replies; 18+ messages in thread
From: Lutz Donnerhacke @ 2003-03-10 17:06 UTC (permalink / raw)


I have run into a difficult problem (for me):
$ cat test.ada <<END
package test is
   type Fullpath is array(Positive range <>) of Character; 
   procedure To_Fullpath (s : String; path : out Fullpath);    
   --# derives path from s;       
end test;

package body test is
   procedure To_Fullpath (s : String; path : out Fullpath) is 
   begin    
      path := Fullpath'(others => ASCII.NUL);       
      for i in Positive range 1 .. s'Length loop             
        path (path'First + i) := s (s'First + i);        
      end loop;
   end To_Fullpath;
end test;
END
$ spark test.ada
          *******************************************************
       SPARK95 Examiner with VC and RTC Generator Release 6.3 / 11.02
                           Demonstration Version
          *******************************************************


                       DATE : 10-MAR-2003 18:05:17.49


           Examining the specification of package test ...

           Examining the body of package test ...

  10        path := Fullpath'(others => ASCII.NUL);
            ^
***   Semantic Error    : 39: Illegal use of unconstrained type.
                    ^
***   Semantic Error    : 39: Illegal use of unconstrained type.
                    ^
***   Semantic Error    : 39: Illegal use of unconstrained type [Barnes 6.7].

           Generating listing file test.lst ...

           Generating report file ...


-----------End of SPARK Examination--------------------------------


Any idea how to write this correctly?




* Re: [Spark] Converting Arrays
  2003-03-10 17:06 [Spark] Converting Arrays Lutz Donnerhacke
@ 2003-03-10 20:03 ` James S. Rogers
  2003-03-10 22:33   ` Lutz Donnerhacke
  2003-03-11 10:14 ` Rod Chapman
  1 sibling, 1 reply; 18+ messages in thread
From: James S. Rogers @ 2003-03-10 20:03 UTC (permalink / raw)


"Lutz Donnerhacke" <lutz@iks-jena.de> wrote in message
news:slrnb6phge.1gd.lutz@taranis.iks-jena.de...
> I have run into a difficult problem (for me):
> $ cat test.ada <<END
> package test is
>    type Fullpath is array(Positive range <>) of Character;
>    procedure To_Fullpath (s : String; path : out Fullpath);
>    --# derives path from s;
> end test;
>
> package body test is
>    procedure To_Fullpath (s : String; path : out Fullpath) is
>    begin
>       path := Fullpath'(others => ASCII.NUL);
>       for i in Positive range 1 .. s'Length loop
>         path (path'First + i) := s (s'First + i);
>       end loop;
>    end To_Fullpath;
> end test;

change the initialization of "path" as follows:

path := (others => ASCII.NUL);

Note that another problem looms in your code.
The "for" loop will attempt to access one element
beyond the end of string "s".
It appears that you are trying to copy s into path
while skipping the first element of s. This can be done
more directly and correctly with array slices:

if s'Length <= path'Length then
   path (path'First .. path'First + s'Length - 2) :=
      Fullpath (s (s'First + 1 .. s'Last));
end if;

You will need to decide how to handle the case when
path is shorter than s.
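
For example (a sketch in plain Ada, not SPARK; silent truncation is just
one possible policy):

if s'Length <= path'Length then
   path (path'First .. path'First + s'Length - 2) :=
      Fullpath (s (s'First + 1 .. s'Last));
else
   -- Truncate: copy only as many characters of s as path can hold.
   path := Fullpath (s (s'First + 1 .. s'First + path'Length));
end if;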

Jim Rogers






* Re: [Spark] Converting Arrays
  2003-03-10 20:03 ` James S. Rogers
@ 2003-03-10 22:33   ` Lutz Donnerhacke
  0 siblings, 0 replies; 18+ messages in thread
From: Lutz Donnerhacke @ 2003-03-10 22:33 UTC (permalink / raw)


* James S. Rogers wrote:
> "Lutz Donnerhacke" <lutz@iks-jena.de> wrote in message
>> I have run into a difficult problem (for me):
> 
> change the initialization of "path" as follows:
> 
> path := (others => ASCII.NUL);

path := (1 .. s'Length => s, others => ASCII.NUL);
would be the appropriate Ada construct.

But I am not talking about Ada. I am talking about the SPARK subset of Ada,
which refuses such a construct because:
  - it is an untyped aggregate (illegal in SPARK).
  - it contains non-static components (slicing is illegal in SPARK).
  - it assigns to non-statically known bounds (illegal in SPARK).

> Note that another problem looms in your code.
> The "for" loop will attempt to access one element
> beyond the end of string "s".

Yep. SPARK would find this later. But currently it refuses the construct as
a whole. What does not compile is already erroneous. ;-)

> It appears that you are trying to copy s into path
> while skipping the first element of s. This can be done
> more directly and correctly with array slices:

Slicing is illegal in SPARK.

A possible solution is to assign the values in two loops. But SPARK refuses
this construct, too, because the data flow analysis handles arrays as single
variables and therefore complains about uninitialized values, reuse of
uninitialized values, and contradictions with the declared information flow.
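
For reference, the two-loop variant I mean is roughly this (a sketch,
assuming s'Length <= path'Length):

   for i in Positive range 1 .. s'Length loop
      path (path'First + i - 1) := s (s'First + i - 1);
   end loop;
   for i in Positive range s'Length + 1 .. path'Length loop
      path (path'First + i - 1) := ASCII.NUL;
   end loop;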

> You will need to decide how to handle the case when
> path is shorter than s.

Very simple: I have a precondition on the specification claiming that s is
really shorter than path, so on every call to this procedure SPARK proves
that it will not fail due to unexpected parameters.
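
I.e. something like:
  --# pre s'Length < path'Length;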

I fear that I will have to use a constrained array type for out parameters. :-(




* Re: [Spark] Converting Arrays
  2003-03-10 17:06 [Spark] Converting Arrays Lutz Donnerhacke
  2003-03-10 20:03 ` James S. Rogers
@ 2003-03-11 10:14 ` Rod Chapman
  2003-03-11 10:51   ` Lutz Donnerhacke
  2003-03-11 10:52   ` Lutz Donnerhacke
  1 sibling, 2 replies; 18+ messages in thread
From: Rod Chapman @ 2003-03-11 10:14 UTC (permalink / raw)


"Out" mode unconstrained arrays are indeed an interesting part
of SPARK.  They have the rather unusual property of having a value
which is not well-defined in such a body, but having attributes
(e.g. 'First and 'Last) which do have well-defined values.

A common trick in this case is to use a local,
hidden procedure to initialise the out parameter:

package body test is

   procedure To_Fullpath (s : String; path : out Fullpath) is

      procedure Init_Path
      --# global out Path;
      --# derives Path from ;
      is
        --# hide Init_Path;
      begin
         Path := (others => ASCII.NUL); -- note Ada95 not SPARK!
      end Init_Path; -- Warning 10 expected here - hidden body.

   begin    
      Init_Path;       
      -- ... as before ...
   end To_Fullpath;
end test;

This is a fine example of "moving the SPARK boundary" for perfectly
good reasons.  You can even Inline such a local procedure if efficiency
is an issue.
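
E.g. (a sketch):

   pragma Inline (Init_Path);

in the declarative part of To_Fullpath.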
 - Rod, SPARK Team




* Re: [Spark] Converting Arrays
  2003-03-11 10:14 ` Rod Chapman
@ 2003-03-11 10:51   ` Lutz Donnerhacke
  2003-03-11 10:52   ` Lutz Donnerhacke
  1 sibling, 0 replies; 18+ messages in thread
From: Lutz Donnerhacke @ 2003-03-11 10:51 UTC (permalink / raw)


* Rod Chapman wrote:
> A common trick in this case is to use a local,
> hidden procedure to initialise the out parameter:

Cool idea. I read the book yesterday evening till midnight and came to the
conclusion that I would have to replace Fullpath with a large constrained
array. This was obviously wrong. You can always switch off SPARK checks
when running into trouble. :-/

But how can I write the required preconditions?

          *******************************************************
                           Listing of SPARK Text
       SPARK95 Examiner with VC and RTC Generator Release 6.3 / 11.02
                           Demonstration Version
          *******************************************************


                       DATE : 11-MAR-2003 11:49:13.68

Line
   1  with Kernel.Linux_i86;
   2  --# inherit Kernel, Kernel.Linux_i86;
   3  
   4  package Kernel.IO is
   5     pragma Pure;
   6  
   7     subtype long is Linux_i86.long;
   8  
   9     type Fullpath is array(Positive range <>) of Character;
  10     procedure To_Fullpath (s : String; path : out Fullpath);
  11     --# derives path from s;
  12     --# pre s'Length < path'Length;
                            ^1
*** ( 1) Semantic Error:322: Only imports may be referenced in pre-conditions
         or return expressions.

  13     pragma Inline (To_Fullpath);
  14     pragma Inline_Always (To_Fullpath);
[...]
5 summarised warning(s), comprising:
     5 pragma(s)*
(*Note: the above warnings may affect the validity of the analysis.)







* Re: [Spark] Converting Arrays
  2003-03-11 10:14 ` Rod Chapman
  2003-03-11 10:51   ` Lutz Donnerhacke
@ 2003-03-11 10:52   ` Lutz Donnerhacke
  2003-03-11 20:46     ` JP Thornley
  2003-03-12  9:43     ` Rod Chapman
  1 sibling, 2 replies; 18+ messages in thread
From: Lutz Donnerhacke @ 2003-03-11 10:52 UTC (permalink / raw)


* Rod Chapman wrote:
> A common trick in this case is to use a local,
> hidden procedure to initialise the out parameter:

Cool idea. I read the book yesterday evening till midnight and came to the
conclusion that I would have to replace Fullpath with a large constrained
array. This was obviously wrong. You can always switch off SPARK checks
when running into trouble. :-/

But how can I write the required preconditions?

          *******************************************************
                           Listing of SPARK Text
       SPARK95 Examiner with VC and RTC Generator Release 6.3 / 11.02
                           Demonstration Version
          *******************************************************


                       DATE : 11-MAR-2003 11:49:13.68

Line
   1  with Kernel.Linux_i86;
   2  --# inherit Kernel, Kernel.Linux_i86;
   3  
   4  package Kernel.IO is
   5     pragma Pure;
   6  
   7     subtype long is Linux_i86.long;
   8  
   9     type Fullpath is array(Positive range <>) of Character;
  10     procedure To_Fullpath (s : String; path : out Fullpath);
  11     --# derives path from s;
  12     --# pre s'Length < path'Length;
                            ^1
*** ( 1) Semantic Error:322: Only imports may be referenced in pre-conditions
         or return expressions.

  13     pragma Inline (To_Fullpath);
  14     pragma Inline_Always (To_Fullpath);
[...]
5 summarised warning(s), comprising:
     5 pragma(s)*
(*Note: the above warnings may affect the validity of the analysis.)







* Re: [Spark] Converting Arrays
  2003-03-11 10:52   ` Lutz Donnerhacke
@ 2003-03-11 20:46     ` JP Thornley
  2003-03-12  8:43       ` Phil Thornley
  2003-03-12  9:43     ` Rod Chapman
  1 sibling, 1 reply; 18+ messages in thread
From: JP Thornley @ 2003-03-11 20:46 UTC (permalink / raw)


In article <slrnb6rfur.nq.lutz@taranis.iks-jena.de>, Lutz Donnerhacke 
<lutz@iks-jena.de> writes
[snip]
>
>But how can I write the required preconditions?
>
>          *******************************************************
>                           Listing of SPARK Text
>       SPARK95 Examiner with VC and RTC Generator Release 6.3 / 11.02
>                           Demonstration Version
>          *******************************************************
>
>
>                       DATE : 11-MAR-2003 11:49:13.68
>
>Line
>   1  with Kernel.Linux_i86;
>   2  --# inherit Kernel, Kernel.Linux_i86;
>   3
>   4  package Kernel.IO is
>   5     pragma Pure;
>   6
>   7     subtype long is Linux_i86.long;
>   8
>   9     type Fullpath is array(Positive range <>) of Character;
>  10     procedure To_Fullpath (s : String; path : out Fullpath);
>  11     --# derives path from s;
>  12     --# pre s'Length < path'Length;
>                            ^1
>*** ( 1) Semantic Error:322: Only imports may be referenced in pre-conditions
>         or return expressions.
>
>  13     pragma Inline (To_Fullpath);
>  14     pragma Inline_Always (To_Fullpath);
>[...]
>5 summarised warning(s), comprising:
>     5 pragma(s)*
>(*Note: the above warnings may affect the validity of the analysis.)
>
>
>

Since you can't write it directly, the only way is probably to
declare a proof function:
    --# function Length_Of_Path return Integer;

then the precondition can be
    --# pre s'Length < Length_Of_Path;

You can then write the proof rule for Length_Of_Path:
    to_fullpath_rules(100): length_of_path may_be_replaced_by
                            path__last - path__first + 1 .

Append this to the generated rule file before running the Simplifier.
(The rule number needs to be larger than the highest rule number 
generated by the Examiner).
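
E.g. (a sketch; the actual rule-file name comes from the unit being
analysed, so to_fullpath.rls is an assumption here):

$ cat >> to_fullpath.rls <<END
to_fullpath_rules(100): length_of_path may_be_replaced_by
                        path__last - path__first + 1 .
END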

Cheers,

Phil

-- 
JP Thornley




* Re: [Spark] Converting Arrays
  2003-03-11 20:46     ` JP Thornley
@ 2003-03-12  8:43       ` Phil Thornley
  2003-03-12 11:57         ` Lutz Donnerhacke
  0 siblings, 1 reply; 18+ messages in thread
From: Phil Thornley @ 2003-03-12  8:43 UTC (permalink / raw)


"JP Thornley" <jpt@diphi.demon.co.uk> wrote in message
news:ON9GtiBcskb+IwAh@diphi.demon.co.uk...

> You can then write the proof rule for Length_Of_Path:
>     to_fullpath_rules(100): length_of_path may_be_replaced_by
>                             path__last - path__first + 1 .
>
> Append this to the generated rule file before running the Simplifier.
> (The rule number needs to be larger than the highest rule number
> generated by the Examiner).

Perhaps I should add that this is a slightly quick and dirty way of
getting the proof rule in. It has the advantage that the Simplifier will
be able to use it, but the disadvantage that the rule has to be created
for every subprogram that calls To_Fullpath as well as for To_Fullpath
itself (and I'm sure that no-one would ever contemplate creating a
different rule for different situations :-).

The more 'correct' way is to write a new rule family. Create a file
called (say) test.rls with contents:
   rule_family test:
      X requires [X:any].
   test_rule(1): length_of_path may_be_replaced_by
                            path__last - path__first + 1 .

(Warning - the above is as written, and unchecked.)

Then 'consult' this file in the Checker and use the rule in a 'replace'
command.

The advantage of this is that the rule is only written once, so there
can't be any mistakes made in duplicating it, but has the disadvantage
that the Simplifier won't be able to do anything with 'length_of_path'
and you end up with more VCs to prove in the Checker.

Cheers,

Phil

--
Phil Thornley
BAE SYSTEMS






* Re: [Spark] Converting Arrays
  2003-03-11 10:52   ` Lutz Donnerhacke
  2003-03-11 20:46     ` JP Thornley
@ 2003-03-12  9:43     ` Rod Chapman
  2003-03-12 10:15       ` Lutz Donnerhacke
  1 sibling, 1 reply; 18+ messages in thread
From: Rod Chapman @ 2003-03-12  9:43 UTC (permalink / raw)


Lutz Donnerhacke <lutz@iks-jena.de> wrote in message news:<slrnb6rfur.nq.lutz@taranis.iks-jena.de>...
> * Rod Chapman wrote:
> > A common trick in this case is to use a local,
> > hidden procedure to initialise the out parameter:
> 
> Cool idea. I read the book yesterday evening till midnight and came to the
> conclusion that I would have to replace Fullpath with a large constrained
> array. This was obviously wrong. You can always switch off SPARK checks
> when running into trouble. :-/
> 
> But how can I write the required preconditions?
> 

You can pass the Length of the Path parameter as an additional "in"
parameter. E.g.

  procedure To_Fullpath (S           : in     String;
                         Path        :    out Fullpath;
                         Path_Length : in     Natural);
  --# derives Path from S, Path_Length;
  --# pre S'Length < Path_Length;

Then a caller simply does:

  To_Fullpath (LS, P, P'Length);

or similar.
 - Rod






* Re: [Spark] Converting Arrays
  2003-03-12  9:43     ` Rod Chapman
@ 2003-03-12 10:15       ` Lutz Donnerhacke
  0 siblings, 0 replies; 18+ messages in thread
From: Lutz Donnerhacke @ 2003-03-12 10:15 UTC (permalink / raw)


* Rod Chapman wrote:
> You can pass the Length of the Path parameter as an additional "in"
> parameter. E.g.
> 
>   procedure To_Fullpath (S           : in     String;
>                          Path        :    out Fullpath;
>                          Path_Length : in     Natural);
>   --# derives Path from S, Path_Length;
>   --# pre S'Length < Path_Length;
> 
> Then a caller simply does:
> 
>   To_Fullpath (LS, P, P'Length);

I do not want to write C. ;-)




* Re: [Spark] Converting Arrays
  2003-03-12  8:43       ` Phil Thornley
@ 2003-03-12 11:57         ` Lutz Donnerhacke
  2003-03-12 18:46           ` JP Thornley
  0 siblings, 1 reply; 18+ messages in thread
From: Lutz Donnerhacke @ 2003-03-12 11:57 UTC (permalink / raw)


* Phil Thornley wrote:
> "JP Thornley" <jpt@diphi.demon.co.uk> wrote in message
>> You can then write the proof rule for Length_Of_Path:
>>     to_fullpath_rules(100): length_of_path may_be_replaced_by
>>                             path__last - path__first + 1 .
>>
>> Append this to the generated rule file before running the Simplifier.
>> (The rule number needs to be larger than the highest rule number
>> generated by the Examiner).

Good idea. To make this manageable, the rule files must be regenerated by a
Perl script.

> Perhaps I should add that this is a slightly quick and dirty way of
> getting the proof rule in. It has the advantage that the Simplifier will
> be able to use it,

There are quite a lot of other problems which can be solved in this way,
e.g. transferring user-defined ranges from one file to another in order to
check them there.

> but the disadvantage that the rule has to be created for every subprogram
> that calls To_Fullpath as well as for To_Fullpath itself (and I'm sure
> that no-one would ever contemplate creating a different rule for
> different situations :-).

Of course. And you have to be very familiar with the verification language.

> The more 'correct' way is to write a new rule family. Create a file
> called (say) test.rls with contents:
>    rule_family test:
>       X requires [X:any].
>    test_rule(1): length_of_path may_be_replaced_by
>                             path__last - path__first + 1 .
> 
> (Warning - the above is as written, and unchecked.)

I'd like to see it working in any way; currently it's boring.

> Then 'consult' this file in the Checker and use the rule in a 'replace'
> command.

I do not have the Checker yet.

> The advantage of this is that the rule is only written once, so there
> can't be any mistakes made in duplicating it, but has the disadvantage
> that the Simplifier won't be able to do anything with 'length_of_path'
> and you end up with more VCs to prove in the Checker.

length_of_xxx is a bad concept anyway, because it has to be generated for
every unconstrained out parameter separately. (Unless you name them uniquely.)

My tests result in the following program:
cat > test.ads <<END
package test is
   --# function Length_Of_Dest return Integer;
   procedure Test_It (src : String; dest : out String);
   --# derives dest from src;
   --# pre src'Length = dest'Length;
end test;
END
cat > test.adb <<END
package body test is
   procedure Test_It (src : String; dest : out String) is
      --# hide Test_It;
   begin
      dest := src;
   end Test_It;
end test;
END
cat > main.adb <<END
with test, Spark_IO;
--# inherit test, Spark_IO;
--# main_program;
procedure main
  --# global in out Spark_IO.Outputs;
  --# derives Spark_IO.Outputs from Spark_IO.Outputs;
  is
   subtype Test_Len is Positive range 1 .. 5;
   subtype Test_String is String(Test_Len);

   my : Test_String;
begin
   test.Test_It("Hallo", my);
   Spark_IO.Put_Line(Spark_IO.Standard_Output, my, 0);
end main;
END
cat > main.idx <<END
Spark_IO specification is in .../spark_io.ads
Spark_IO body          is in .../spark_io.adb
test     specification is in test.ads
test     body          is in test.adb
END


Running "spark -i=main.idx -e main.adb" results in the expected error
message, but generates the right VC!
$ cat main.vcg
          *******************************************************
                      Semantic Analysis of SPARK Text
       SPARK95 Examiner with VC and RTC Generator Release 6.3 / 11.02
                           Demonstration Version
          *******************************************************


                       DATE : 12-MAR-2003 12:32:31.79

                               procedure main




For path(s) from start to precondition check associated with statement of line 14:

procedure_main_1.
H1:    true .
        ->
C1:    test_it__src__last - test_it__src__first + 1 = 
           test_it__dest__last - test_it__dest__first + 1 .
 

For path(s) from start to run-time check associated with statement of line 15:

procedure_main_2.
H1:    true .
H2:    test_it__src__last - test_it__src__first + 1 = 
           test_it__dest__last - test_it__dest__first + 1 .
H3:     for_all (i___1: integer, ((i___1 >= positive__first) and (
           i___1 <= positive__last)) -> ((element(my__1, [
           i___1]) >= character__first) and (element(my__1, [
           i___1]) <= character__last))) .
        ->
C1:    0 >= natural__first .
C2:    0 <= natural__last .
 

For path(s) from start to finish:

procedure_main_3.
*** true .          /* trivially true VC removed by Examiner */

Trying the other variant

cat > test.ads <<END
package test is
   --# function Length_Of_Dest return Integer;
   procedure Test_It (src : String; dest : out String);
   --# derives dest from src;
   --# pre src'Length = Length_Of_Dest;
end test;
END

results in
$ cat main.vcg
          *******************************************************
                      Semantic Analysis of SPARK Text
       SPARK95 Examiner with VC and RTC Generator Release 6.3 / 11.02
                           Demonstration Version
          *******************************************************


                       DATE : 12-MAR-2003 12:43:31.25

                               procedure main




For path(s) from start to precondition check associated with statement of line 14:

procedure_main_1.
H1:    true .
        ->
C1:    test_it__src__last - test_it__src__first + 1 = 
           test__length_of_dest .
 

For path(s) from start to run-time check associated with statement of line 15:

procedure_main_2.
H1:    true .
H2:    test_it__src__last - test_it__src__first + 1 = 
           test__length_of_dest .
H3:     for_all (i___1: integer, ((i___1 >= positive__first) and (
           i___1 <= positive__last)) -> ((element(my__1, [
           i___1]) >= character__first) and (element(my__1, [
           i___1]) <= character__last))) .
        ->
C1:    0 >= natural__first .
C2:    0 <= natural__last .
 

For path(s) from start to finish:

procedure_main_3.
*** true .          /* trivially true VC removed by Examiner */


Adding a rule to main.rls:
$ echo >> main.rls "
main_rules(100): test__length_of_dest may_be_replaced_by test_it__dest__last - test_it__dest__first + 1.
main_rules(101): test__length_of_dest may_be_replaced_by my__last - my__first + 1.
"
$ sparksimp
$ diff -u main.siv*
--- main.siv	2003-03-12 12:47:03.000000000 +0100
+++ main.siv-	2003-03-12 12:43:44.000000000 +0100
@@ -5,7 +5,7 @@
           *******************************************************
 
 
-      CREATED 12-MAR-2003, 12:43:31  SIMPLIFIED 12-MAR-2003, 12:47:00
+      CREATED 12-MAR-2003, 12:43:31  SIMPLIFIED 12-MAR-2003, 12:43:41
                (Simplified by SPADE Simplifier, Demo Version)
 
                                procedure main
@@ -19,15 +19,13 @@
 procedure_main_1.
 H1:    true .
        ->
-C1:    test_it__src__last - test_it__src__first + 1 = test_it__dest__last - 
-          test_it__dest__first + 1 .
+C1:    test_it__src__last - test_it__src__first + 1 = test__length_of_dest .
 
[...]

Which is correct, but does not solve the problem.
How do I do this correctly?

I believe there is a much more difficult problem in this area, because the
simpler assertion:
  --# pre src'Length = 5;
results in
  C1:    test_it__src__last - test_it__src__first + 1 = 5 .     (vcg)
and
  C1:    test_it__src__last - test_it__src__first = 4 .         (siv)

IMHO the substitution of the formal parameters by the actuals from the
calling environment is missing.




* Re: [Spark] Converting Arrays
  2003-03-12 11:57         ` Lutz Donnerhacke
@ 2003-03-12 18:46           ` JP Thornley
  2003-03-13 10:14             ` Lutz Donnerhacke
  0 siblings, 1 reply; 18+ messages in thread
From: JP Thornley @ 2003-03-12 18:46 UTC (permalink / raw)


In article <slrnb6u84d.nt.lutz@taranis.iks-jena.de>, Lutz Donnerhacke 
<lutz@iks-jena.de> writes
>main_rules(100): test__length_of_dest may_be_replaced_by test_it__dest__last - test_it__dest__first + 1.
>main_rules(101): test__length_of_dest may_be_replaced_by my__last - my__first + 1.

Note that here you have given two different replacements for the same 
item. It is not obvious what the Simplifier will do with these.

The restriction that an out mode parameter can't be referenced in a 
precondition seems to be the major cause of the difficulty, and I have 
(reluctantly) come to the conclusion that the way to avoid explicit 
rules for each instance of the output parameter is to pass the length as 
an additional parameter, as Rod suggests.

Cheers,

Phil

-- 
JP Thornley




* Re: [Spark] Converting Arrays
@ 2003-03-13  5:55 Grein, Christoph
  2003-03-13  9:47 ` Peter Amey
  0 siblings, 1 reply; 18+ messages in thread
From: Grein, Christoph @ 2003-03-13  5:55 UTC (permalink / raw)
  To: comp.lang.ada

> The restriction that an out mode parameter can't be referenced in a 
> precondition seems to be the major cause of the difficulty, and I have 
> (reluctantly) come to the conclusion that the way to avoid explicit 
> rules for each instance of the output parameter is to pass the length as 
> an additional parameter, as Rod suggests.

I've never used SPARK myself, but I've read the PDF excerpt of Barnes' book with 
interest.

I'm wondering whether it wouldn't be feasible to change the SPARK rules in
such a way as to allow references to attributes of out parameters in
preconditions.
(Even in Ada 83, where reading of out parameters was forbidden, reading of their 
attributes and discriminants was allowed - and for a good reason.)
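
E.g. (a sketch with a hypothetical Clear procedure; only attributes of the
out parameter are read, never its value):

procedure Clear (D : out String) is
begin
   for I in D'First .. D'Last loop  -- the attributes are readable
      D (I) := ' ';
   end loop;
end Clear;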




* Re: [Spark] Converting Arrays
  2003-03-13  5:55 Grein, Christoph
@ 2003-03-13  9:47 ` Peter Amey
  2003-03-13 10:15   ` Lutz Donnerhacke
  0 siblings, 1 reply; 18+ messages in thread
From: Peter Amey @ 2003-03-13  9:47 UTC (permalink / raw)




"Grein, Christoph" wrote:
> 
> > The restriction that an out mode parameter can't be referenced in a
> > precondition seems to be the major cause of the difficulty, and I have
> > (reluctantly) come to the conclusion that the way to avoid explicit
> > rules for each instance of the output parameter is to pass the length as
> > an additional parameter, as Rod suggests.
> 
> I've never used SPARK myself, but I've read the PDF excerpt of Barnes' book with
> interest.
> 
> I'm wondering whether it wouldn't be feasible to change the SPARK rules in
> such a way as to allow references to attributes of out parameters in
> preconditions.
> (Even in Ada 83, where reading of out parameters was forbidden, reading of their
> attributes and discriminants was allowed - and for a good reason.)

That specific topic is the subject of an open "performance report" and
we have been thinking about it for quite a while.  FWIW, it is not
something that has been identified as a significant problem for most
SPARK users; however, that is no reason to ignore it for ever!

Peter




* Re: [Spark] Converting Arrays
  2003-03-12 18:46           ` JP Thornley
@ 2003-03-13 10:14             ` Lutz Donnerhacke
  0 siblings, 0 replies; 18+ messages in thread
From: Lutz Donnerhacke @ 2003-03-13 10:14 UTC (permalink / raw)


* JP Thornley wrote:
>* Lutz Donnerhacke <lutz@iks-jena.de> writes
>>main_rules(100): test__length_of_dest may_be_replaced_by
>>                 test_it__dest__last - test_it__dest__first + 1.
>>main_rules(101): test__length_of_dest may_be_replaced_by
>>                 my__last - my__first + 1.
>
> Note that here you have given two different replacements for the same 
> item. It is not obvious what the Simplifier will do with these.

I did this after some testing.

> The restriction that an out mode parameter can't be referenced in a
> precondition seems to be the major cause of the difficulty,

The constraints of an unconstrained out parameter are in fact in parameters.
SPARK does not reflect this. => Problem.

> and I have (reluctantly) come to the conclusion that the way to avoid
> explicit rules for each instance of the output parameter is to pass the
> length as an additional parameter, as Rod suggests.

This is not really acceptable, because it separates information crucial for
proving exception freedom from the objects themselves. This renders the
whole proof useless.

But even this much simpler example does not work:
with Spark_IO;
--# inherit Spark_IO;

--# main_program;
procedure main
  --# global in out Spark_IO.Outputs;
  --# derives Spark_IO.Outputs from Spark_IO.Outputs;
  is
   subtype My_Index is Positive range 1 .. 5;
   subtype My_String is String(My_Index);
   my : My_String;
   world : constant My_String := "Hello";
   
   procedure Copy(s : String; d : out String)
     --# derives d from s;
     --# pre s'Length = My_String'Length;
     is
      --# hide Copy;
   begin
      d := s;
   end Copy;
begin
   Copy(world, my);
   Spark_IO.Put_Line(Spark_IO.Standard_Output, my, 0);
end main;

It does not matter how the parameters of Copy are given in the main program.
The Examiner does not generate the correct rules in the calling environment.
It fails to substitute the actual parameters for the formal ones. :-(

Of course the simple change to constrained arrays works like a charm:
with Spark_IO;
--# inherit Spark_IO;

--# main_program;
procedure main
  --# global in out Spark_IO.Outputs;
  --# derives Spark_IO.Outputs from Spark_IO.Outputs;
  is
   subtype My_Index is Positive range 1 .. 5;
   subtype My_String is String(My_Index);
   my : My_String;
   world : constant My_String := "Hello";
   
   procedure Copy(s : My_String; d : out My_String)
     --# derives d from s;
     --# pre s'Length = My_String'Length;
     is
   begin
      d := s;
   end Copy;
begin
   Copy(world, my);
   Spark_IO.Put_Line(Spark_IO.Standard_Output, my, 0);
end main;




* Re: [Spark] Converting Arrays
  2003-03-13  9:47 ` Peter Amey
@ 2003-03-13 10:15   ` Lutz Donnerhacke
  2003-03-21 15:05     ` Peter Amey
  0 siblings, 1 reply; 18+ messages in thread
From: Lutz Donnerhacke @ 2003-03-13 10:15 UTC (permalink / raw)


* Peter Amey wrote:
> That specific topic is the subject of an open "performance report" and
> we have been thinking about it for quite a while.  FWIW, it is not
> something that has been identified as a significant problem for most
> SPARK users; however, that is no reason to ignore it for ever!

Thank you. Sorry for coming across this so early.




* Re: [Spark] Converting Arrays
  2003-03-13 10:15   ` Lutz Donnerhacke
@ 2003-03-21 15:05     ` Peter Amey
  2003-03-21 15:17       ` Lutz Donnerhacke
  0 siblings, 1 reply; 18+ messages in thread
From: Peter Amey @ 2003-03-21 15:05 UTC (permalink / raw)




Lutz Donnerhacke wrote:
> 
> * Peter Amey wrote:
> > That specific topic is the subject of an open "performance report" and
> > we have been thinking about it for quite a while.  FWIW, it is not
> > something that has been identified as a significant problem for most
> > SPARK users; however, that is no reason to ignore it for ever!
> 
> Thank you. Sorry for caming across this so early.

Ok, I feel nagged and have finally implemented something to solve the
problem you described.  The next release of the Examiner will allow
exported, formal parameters of unconstrained types to appear in
preconditions.  The idea will be to prove that the constraint provided
by the actual parameter in the calling environment is strong enough. 
The only useful preconditions that can be written will involve
attributes of the parameter.

e.g.

   procedure Copy (S : in     String;
                   D :    out String);
   --# derives D from S;
   --# pre D'Last >= S'Last;

is now provable for Copy ("AB", X); and Copy ("ABC", X);  but not
provable for Copy ("ABCD", X); 
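
(assuming, say, X : String (1 .. 3), so that X'Last = 3)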

regards

Peter and the SPARK team




* Re: [Spark] Converting Arrays
  2003-03-21 15:05     ` Peter Amey
@ 2003-03-21 15:17       ` Lutz Donnerhacke
  0 siblings, 0 replies; 18+ messages in thread
From: Lutz Donnerhacke @ 2003-03-21 15:17 UTC (permalink / raw)


* Peter Amey wrote:
> The next release of the Examiner will allow exported, formal parameters
> of unconstrained types to appear in preconditions.

That's great, thank you. Currently a valid workaround is to declare the
parameter "in out" and initialize it outside the procedure, if necessary.




end of thread

Thread overview: 18+ messages
2003-03-10 17:06 [Spark] Converting Arrays Lutz Donnerhacke
2003-03-10 20:03 ` James S. Rogers
2003-03-10 22:33   ` Lutz Donnerhacke
2003-03-11 10:14 ` Rod Chapman
2003-03-11 10:51   ` Lutz Donnerhacke
2003-03-11 10:52   ` Lutz Donnerhacke
2003-03-11 20:46     ` JP Thornley
2003-03-12  8:43       ` Phil Thornley
2003-03-12 11:57         ` Lutz Donnerhacke
2003-03-12 18:46           ` JP Thornley
2003-03-13 10:14             ` Lutz Donnerhacke
2003-03-12  9:43     ` Rod Chapman
2003-03-12 10:15       ` Lutz Donnerhacke
  -- strict thread matches above, loose matches on Subject: below --
2003-03-13  5:55 Grein, Christoph
2003-03-13  9:47 ` Peter Amey
2003-03-13 10:15   ` Lutz Donnerhacke
2003-03-21 15:05     ` Peter Amey
2003-03-21 15:17       ` Lutz Donnerhacke
