* Spark_IO: improving with postconditions?
@ 2009-08-26 8:57 Maciej Sobczak
From: Maciej Sobczak @ 2009-08-26 8:57 UTC
The Spark_IO package defines several procedures that "return" string
values by filling the given String object and the end-of-sequence
index value. This end-of-sequence parameter is consistently defined
as:
Stop : out Natural
One example of such a procedure is:
procedure Get_Line (File : in File_Type;
Item : out String;
Stop : out Natural);
--# global in out Inputs;
--# derives Inputs,
--# Item,
--# Stop from Inputs, File;
--# declare delay;
Wouldn't it be good to enhance this specification with additional:
--# post Stop <= Item'Last;
It would save the caller one redundant check whenever the Stop value
is used again as an index into the returned string.
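To make the saving concrete, here is a minimal caller sketch. Echo_Line is hypothetical, and the Put_Char and Standard_Output names are assumptions about the Spark_IO spec; without the proposed postcondition, the upper-bound half of the membership test below is what the caller must prove separately before indexing with Stop:

```ada
with Spark_IO;

procedure Echo_Line (File : in Spark_IO.File_Type) is
   Line : String (1 .. 80);
   Stop : Natural;
begin
   Spark_IO.Get_Line (File, Line, Stop);
   --  The "Stop <= Line'Last" half of this test is the redundant
   --  check that "--# post Stop <= Item'Last" would discharge;
   --  with it, "Stop >= Line'First" alone would suffice.
   if Stop in Line'Range then
      Spark_IO.Put_Char (Spark_IO.Standard_Output, Line (Stop));
   end if;
end Echo_Line;
```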
--
Maciej Sobczak * www.msobczak.com * www.inspirel.com
Database Access Library for Ada: www.inspirel.com/soci-ada
* Re: Spark_IO: improving with postconditions?
From: Georg Bauhaus @ 2009-08-26 9:23 UTC
Maciej Sobczak wrote:
> procedure Get_Line (File : in File_Type;
> Item : out String;
> Stop : out Natural);
> --# global in out Inputs;
> --# derives Inputs,
> --# Item,
> --# Stop from Inputs, File;
> --# declare delay;
>
> Wouldn't it be good to enhance this specification with additional:
>
> --# post Stop <= Item'Last;
Will Spark accept this String?
X : String (-1 .. -2);
If so, another precondition might be needed: Item'Last in Natural.
That is, if Get_Line (or, for that matter, a program) can do
anything useful with empty string buffers at all. Is that desirable?
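As a side note, a null string buffer is itself legal Ada. A sketch of what the proposed postcondition would imply for one (Null_Buffer_Demo is hypothetical, and the Standard_Input name is an assumption about the Spark_IO spec):

```ada
with Spark_IO;

procedure Null_Buffer_Demo is
   --  A null buffer: Empty'First = 1, Empty'Last = 0, length 0.
   Empty : String (1 .. 0);
   Stop  : Natural;
begin
   Spark_IO.Get_Line (Spark_IO.Standard_Input, Empty, Stop);
   --  Under "--# post Stop <= Item'Last" this call would force
   --  Stop <= 0, hence Stop = 0: nothing usable was read.
end Null_Buffer_Demo;
```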
* Re: Spark_IO: improving with postconditions?
From: xavier grave @ 2009-08-26 14:47 UTC
Georg Bauhaus wrote:
> Maciej Sobczak wrote:
>
>> procedure Get_Line (File : in File_Type;
>> Item : out String;
>> Stop : out Natural);
>> --# global in out Inputs;
>> --# derives Inputs,
>> --# Item,
>> --# Stop from Inputs, File;
>> --# declare delay;
>>
>> Wouldn't it be good to enhance this specification with additional:
>>
>> --# post Stop <= Item'Last;
>
> Will Spark accept this String?
>
> X : String (-1 .. -2);
In Ada, String is declared as follows:

type String is array (Positive range <>) of Character;

So I think SPARK wouldn't accept your String declaration.
* Re: Spark_IO: improving with postconditions?
From: Georg Bauhaus @ 2009-08-26 15:47 UTC
xavier grave wrote:
> Georg Bauhaus wrote:
>> Maciej Sobczak wrote:
>
>>> procedure Get_Line (File : in File_Type;
>>> Item : out String;
>>> Stop : out Natural);
>>> --# global in out Inputs;
>>> --# derives Inputs,
>>> --# Item,
>>> --# Stop from Inputs, File;
>>> --# declare delay;
>>>
>>> Wouldn't it be good to enhance this specification with additional:
>>>
>>> --# post Stop <= Item'Last;
>> Will Spark accept this String?
>
>> X : String (-1 .. -2);
>
> In Ada the String declaration is the following :
> type String is array(Positive range <>) of Character;
>
> So I think Spark wouldn't accept your String declaration.
7p6 doesn't accept it, and its two messages refer to two different
things. (It is all logical; I had already forgotten this nice
property of SPARK.) Say,

subtype Null_Range is Positive range 1 .. 0;
subtype XString is String (Null_Range);
X : XString;

A null range is not accepted (it is, in Ada), and String subtypes
must have a lower index bound of 1 (by rule 3.6.3). So range
-1 .. -2, which again is valid Ada, is of an anonymous type in
SPARK, and hence not valid. It's just that the second message is
confusing without the first:
   5  subtype Null_Range is Positive range 1 .. 0;
         ^
*** Semantic Error :409: Empty range specified.

   6  subtype XString is String(Null_Range);
         ^
*** Semantic Error :417: String subtypes must have a lower index bound of 1.
My initial comment is really meaningless.