comp.lang.ada
* [Spark] Arrays of Strings
@ 2003-04-08 12:02 Lutz Donnerhacke
  2003-04-08 18:08 ` Martin Krischik
                   ` (3 more replies)
  0 siblings, 4 replies; 42+ messages in thread
From: Lutz Donnerhacke @ 2003-04-08 12:02 UTC (permalink / raw)


In order to implement a thin execve binding, I wonder how to emulate the
C type "char const * const x []". Any bright ideas?




* Re: [Spark] Arrays of Strings
  2003-04-08 12:02 [Spark] Arrays of Strings Lutz Donnerhacke
@ 2003-04-08 18:08 ` Martin Krischik
  2003-04-09  9:23   ` Lutz Donnerhacke
  2003-04-09  7:50 ` Eric G. Miller
                   ` (2 subsequent siblings)
  3 siblings, 1 reply; 42+ messages in thread
From: Martin Krischik @ 2003-04-08 18:08 UTC (permalink / raw)


On Tue, 08 Apr 2003 12:02:09 +0000, Lutz Donnerhacke wrote:

> In order to implement a thin execve binding, I wonder how to emulate the
> C type "char const * const x []". Any bright ideas?

That depends on whether you need a 1-to-1 emulation or a more high-level
OO approach.

With Regards

Martin

-- 
Martin Krischik
mailto://Martin@krischik.com
http://www.ada.krischik.com





* Re: [Spark] Arrays of Strings
  2003-04-08 12:02 [Spark] Arrays of Strings Lutz Donnerhacke
  2003-04-08 18:08 ` Martin Krischik
@ 2003-04-09  7:50 ` Eric G. Miller
  2003-04-09  8:10   ` Lutz Donnerhacke
  2003-04-09 18:23   ` Matthew Heaney
  2003-04-09 17:42 ` Matthew Heaney
  2003-04-11 12:32 ` Rod Chapman
  3 siblings, 2 replies; 42+ messages in thread
From: Eric G. Miller @ 2003-04-09  7:50 UTC (permalink / raw)


In <slrnb95ehu.ob.lutz@taranis.iks-jena.de>, Lutz Donnerhacke wrote:

> In order to implement a thin execve binding, I wonder how to emulate the
> C type "char const * const x []". Any bright ideas?

Errm, my local definition of execve looks like:

int execve(const char *filename, 
           char *const argv [], char *const envp[]);

What'd the const pointer be for?  Even if your definition had such, that'd
be a burden upon the C function, but shouldn't much affect the Ada side of
things.

Try the following out (rename "/tmp/hello" as appropriate)...

-- executor.ads
with Interfaces.C.Strings;

package Executor is

   package C renames Interfaces.C;

   function Execve (Filename : C.Strings.Chars_Ptr;
                    Argv     : C.Strings.Chars_Ptr_Array;
                    Envp     : C.Strings.Chars_Ptr_Array) return C.Int;

   pragma Import (Convention => C,
                  Entity => Execve,
                  External_Name => "execve");

end Executor;


-- try_exec.adb
with Ada.Text_Io;
use Ada.Text_Io;
with Executor;
with Interfaces.C.Strings;

procedure Try_Exec
is
   package C renames Interfaces.C;

   Argv : C.Strings.Chars_Ptr_Array :=
     (C.Strings.New_String("John"),
      C.Strings.New_String("Bill"),
      C.Strings.New_String("Bob"),
      C.Strings.Null_Ptr);

   -- GNAT says there can't only be one??
   Envp : C.Strings.Chars_Ptr_Array :=
     (C.Strings.Null_Ptr, C.Strings.Null_Ptr);

   FileName : C.Strings.Chars_Ptr :=
     C.Strings.New_String("/tmp/hello");

   Status : C.Int;

begin

   Status := Executor.Execve(FileName => FileName,
                    Argv => Argv, Envp => Envp);

   -- Got here?
   Put_Line ("That didn't work! Got " & C.Int'Image(Status));

end Try_Exec;

-- hello.c
#include <stdio.h>

int main (int argc, char *argv[])
{
  int i;

  for (i = 0; i < argc; ++i) {
    printf ("Hello %s!\n", argv[i]);
  }

  return 0;
}

-- 
echo ">gra.fcw@2ztr< eryyvZ .T pveR" | rot13 | reverse





* Re: [Spark] Arrays of Strings
  2003-04-09  7:50 ` Eric G. Miller
@ 2003-04-09  8:10   ` Lutz Donnerhacke
  2003-04-09 18:23   ` Matthew Heaney
  1 sibling, 0 replies; 42+ messages in thread
From: Lutz Donnerhacke @ 2003-04-09  8:10 UTC (permalink / raw)


* Eric G. Miller wrote:
> In <slrnb95ehu.ob.lutz@taranis.iks-jena.de>, Lutz Donnerhacke wrote:
> 
>> In order to implement a thin execve binding, I wonder how to emulate the
>> C type "char const * const x []". Any bright ideas?
> 
> Errm, my local definition of execve looks like:
> 
> int execve(const char *filename, 
>            char *const argv [], char *const envp[]);
> 
> What'd the const pointer be for?  Even if your definition had such, that'd
> be a burden upon the C function, but shouldn't much affect the Ada side of
> things.

"char const * const []" is an array of constant pointers to constant chars.
The modifier const is a postfix modifier. For the first element an
equivalent prefix notation exists for syntactic shugar.

> Try the following out (rename "/tmp/hello" as appropriate)...
> 
> with Ada.Text_Io;
> with Interfaces.C.Strings;

The use of these packages is not possible in Spark. I need an emulation or
a shadow of the package Interfaces.C.Strings.

> use Ada.Text_Io;

"use package" is not allowed in Spark.

>    Argv : C.Strings.Chars_Ptr_Array :=
>      (C.Strings.New_String("John"),
>       C.Strings.New_String("Bill"),
>       C.Strings.New_String("Bob"),
>       C.Strings.Null_Ptr);

This involves dynamic memory, which should be avoided as much as possible
in Spark. Language-based dynamic memory management is not supported in Spark.

I hope it's clear what I expect:

package Interfaces is
end Interfaces;

package Interfaces.C is
   type int is range Integer'First .. Integer'Last; -- has to be correct.
end Interfaces.C;

package Interfaces.C.Strings is
   type Chars_Ptr is private;
   type Chars_Ptr_Array is array (Positive range <>) of Chars_Ptr;
private
   type Chars_Ptr is mod 2**32; -- has to be correct.
end Interfaces.C.Strings;


with Interfaces.C.Strings;
--# inherit Interfaces.C.Strings, Interfaces.C;

package Executor is                                                           
   function Execve (
     Filename : Interfaces.C.Strings.Chars_Ptr;
     Argv     : Interfaces.C.Strings.Chars_Ptr_Array;
     Envp     : Interfaces.C.Strings.Chars_Ptr_Array)
     return Interfaces.C.Int;
   
   pragma Import (Convention => C,                                            
     Entity => Execve,                                           
     External_Name => "execve");                                 
end Executor;                                                                 





* Re: [Spark] Arrays of Strings
  2003-04-08 18:08 ` Martin Krischik
@ 2003-04-09  9:23   ` Lutz Donnerhacke
  2003-04-09 12:38     ` Hyman Rosen
  0 siblings, 1 reply; 42+ messages in thread
From: Lutz Donnerhacke @ 2003-04-09  9:23 UTC (permalink / raw)


* Martin Krischik wrote:
> On Tue, 08 Apr 2003 12:02:09 +0000, Lutz Donnerhacke wrote:
>> In order to implement a thin execve binding, I wonder how to emulate the
>> C type "char const * const x []". Any bright ideas?
>
> Depends if you need a 1 to 1 emulation or you want a more high level OO
> aproach.

The emulation should be native Spark, to be used in an inlined syscall.
The syscall interface is so thin that the inlined code prepares the
registers and switches directly to the kernel.

Example:
pragma No_Run_Time;

with Kernel.Processes, Kernel.IO, Kernel.Linux_i86, Addon;
use type Kernel.IO.long;
--# inherit Kernel, Kernel.Processes, Kernel.IO;

--# main_program;
procedure test_kernel
  --# global in out Kernel.State;
  --# derives Kernel.State from Kernel.State;
  is
   pragma Suppress(All_Checks);

   subtype Buffer_Index is Positive range 1 .. 10;
   subtype Buffer is String(Buffer_Index);
   outbuf : Buffer;
   outlast : Natural;
   outwritten : Natural;
   exitcode : Kernel.Processes.exit_code_t;
   
   child : Kernel.Processes.child_data_t;
   res : Kernel.Linux_i86.syscall_result;
begin
   Kernel.Processes.fork(child, res);
   if res.ok then
      if child.child then
	 outbuf := Buffer'(
	   1 => 'C', 2 => 'h', 3 => 'i', 4 => 'l', 5 => 'd',
	   others => ' ');
	 outlast := 5;
      else
	 outbuf := Buffer'(
	   1 => 'M', 2 => 'a', 3 => 's', 4 => 't', 5 => 'e', 6 => 'r',
	   others => ' ');
	 outlast := 6;
      end if;
   else
      outbuf := Buffer'(
	1 => 'E', 2 => 'r', 3 => 'r', 4 => 'o', 5 => 'r',
	others => ' ');
      outlast := 5;
   end if;
   
   outlast := outlast + 1;
   outbuf(outlast) := ASCII.LF;

   Kernel.IO.write(Kernel.IO.stdout, outbuf, outlast, outwritten, res);
   if not res.ok or else outwritten /= outlast then
      exitcode := 1;
   else
      exitcode := 0;
   end if;
   
   Kernel.Processes.sysexit(exitcode);
end test_kernel;

00000000 <_ada_test_kernel>:
   0:	55                   	push   %ebp
   1:	b8 02 00 00 00       	mov    $0x2,%eax    <= the fork call.
   6:	57                   	push   %edi
   7:	56                   	push   %esi
   8:	53                   	push   %ebx
   9:	83 ec 54             	sub    $0x54,%esp
   c:	cd 80                	int    $0x80        <= the fork call.
   e:	31 db                	xor    %ebx,%ebx
  10:	89 c1                	mov    %eax,%ecx    <= return code for fork
  12:	83 c0 7c             	add    $0x7c,%eax
  15:	89 4c 24 28          	mov    %ecx,0x28(%esp,1)
  19:	83 f8 7b             	cmp    $0x7b,%eax
  1c:	89 d0                	mov    %edx,%eax
  1e:	0f 96 c3             	setbe  %bl          <= set the error bit
  21:	31 f6                	xor    %esi,%esi    <= the if construct
  23:	30 c0                	xor    %al,%al
  25:	89 c2                	mov    %eax,%edx
  27:	09 da                	or     %ebx,%edx
  29:	31 db                	xor    %ebx,%ebx
  2b:	89 54 24 24          	mov    %edx,0x24(%esp,1)
  2f:	84 d2                	test   %dl,%dl
  31:	74 10                	je     43 <_ada_test_kernel+0x43>
 ...




* Re: [Spark] Arrays of Strings
  2003-04-09  9:23   ` Lutz Donnerhacke
@ 2003-04-09 12:38     ` Hyman Rosen
  2003-04-09 12:47       ` Vinzent Hoefler
  0 siblings, 1 reply; 42+ messages in thread
From: Hyman Rosen @ 2003-04-09 12:38 UTC (permalink / raw)


Lutz Donnerhacke wrote:
>    pragma Suppress(All_Checks);
>    subtype Buffer_Index is Positive range 1 .. 10;
>    subtype Buffer is String(Buffer_Index);
>    outbuf : Buffer;
>    outlast : Natural;
>       outbuf := Buffer'(
> 	1 => 'E', 2 => 'r', 3 => 'r', 4 => 'o', 5 => 'r',
> 	others => ' ');
>       outlast := 5;
>    end if;
>    
>    outlast := outlast + 1;
>    outbuf(outlast) := ASCII.LF;

My God, but this code is cringe-inducing! Is there really no way
to fill the buffer except with this character indexing stuff?
Not only that, but all checks are suppressed, and a type Natural
variable is being used to index and write into a fixed size buffer.
You are just begging for a buffer overflow error here.

This is what the reliable Spark subset of Ada gives you? I always
figure people are fooling themselves when they claim their languages
give them reliability, but I expected a little more from Ada!





* Re: [Spark] Arrays of Strings
  2003-04-09 12:38     ` Hyman Rosen
@ 2003-04-09 12:47       ` Vinzent Hoefler
  2003-04-09 14:27         ` Hyman Rosen
  2003-04-09 17:21         ` Hyman Rosen
  0 siblings, 2 replies; 42+ messages in thread
From: Vinzent Hoefler @ 2003-04-09 12:47 UTC (permalink / raw)


Hyman Rosen <hyrosen@mail.com> wrote:

>Not only that, but all checks are suppressed, and a type Natural
>variable is being used to index and write into a fixed size buffer.
>You are just begging for a buffer overflow error here.

You might want to read a little bit more about SPARK.

>This is what the reliable Spark subset of Ada gives you?

The reliability is not in the subset, as in MISRA-C; it is in the
static analysis. I think it is called *proof*.


Vinzent.

-- 
Parents strongly cautioned  --  this  posting  is  intended for mature
audiences  over  18.  It  may  contain some material that many parents
would not find suitable for children and may include intense violence,
sexual situations, coarse language and suggestive dialogue.




* Re: [Spark] Arrays of Strings
  2003-04-09 12:47       ` Vinzent Hoefler
@ 2003-04-09 14:27         ` Hyman Rosen
  2003-04-09 15:13           ` Vinzent Hoefler
  2003-04-09 17:21         ` Hyman Rosen
  1 sibling, 1 reply; 42+ messages in thread
From: Hyman Rosen @ 2003-04-09 14:27 UTC (permalink / raw)


Vinzent Hoefler wrote:
> The reliability is not in the subset, as in MISRA-C; it is in the
> static analysis. I think it is called *proof*.

I'll accept that, but the code *looks* hideous.





* Re: [Spark] Arrays of Strings
  2003-04-09 14:27         ` Hyman Rosen
@ 2003-04-09 15:13           ` Vinzent Hoefler
  0 siblings, 0 replies; 42+ messages in thread
From: Vinzent Hoefler @ 2003-04-09 15:13 UTC (permalink / raw)


Hyman Rosen <hyrosen@mail.com> wrote:

>Vinzent Hoefler wrote:
>> The reliability is not in the subset, as in MISRA-C; it is in the
>> static analysis. I think it is called *proof*.
>
>I'll accept that, but the code *looks* hideous.

Perhaps. I'd guess

|Buffer'("Error     ")

*might* work too, but then it makes it a little bit harder to count
the lengths correctly again. So, as far as I can see, both would have
slight drawbacks. Depends.
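
A minimal sketch of both styles, assuming the Buffer declarations from
Lutz's post; the literal variant has to be padded by hand to exactly
Buffer'Length characters:

procedure Fill_Demo is
   subtype Buffer_Index is Positive range 1 .. 10;
   subtype Buffer is String (Buffer_Index);
   Outbuf : Buffer;
begin
   -- Named aggregate, as in the original post:
   Outbuf := Buffer'(1 => 'E', 2 => 'r', 3 => 'r', 4 => 'o', 5 => 'r',
                     others => ' ');
   -- Qualified string literal, as suggested above; the five trailing
   -- blanks have to be counted by hand:
   Outbuf := Buffer'("Error     ");
end Fill_Demo;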


Vinzent.

-- 
Parents strongly cautioned  --  this  posting  is  intended for mature
audiences  over  18.  It  may  contain some material that many parents
would not find suitable for children and may include intense violence,
sexual situations, coarse language and suggestive dialogue.




* Re: [Spark] Arrays of Strings
  2003-04-09 12:47       ` Vinzent Hoefler
  2003-04-09 14:27         ` Hyman Rosen
@ 2003-04-09 17:21         ` Hyman Rosen
  2003-04-09 18:41           ` Vinzent Hoefler
                             ` (2 more replies)
  1 sibling, 3 replies; 42+ messages in thread
From: Hyman Rosen @ 2003-04-09 17:21 UTC (permalink / raw)


Vinzent Hoefler wrote:
> The reliability is not in the subset, as in MISRA-C; it is in the
> static analysis. I think it is called *proof*.

I am starting to be a little disturbed now that I've thought about
this a little more. You seem to be telling me that it's OK to have
variables declared loosely (Natural instead of the array range type)
because a program verifier will notice problems regardless. To me,
this seems contrary to the design of Ada, which emphasizes saying
what you mean using the type system. I've been told here frequently
that Ada's style lends itself to avoiding buffer overflows because
you declare variables that loop over array ranges, and so there is
never an opportunity to go off the end.
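
A minimal sketch of that idiom (a hypothetical Fill procedure, not from
any of the posts here):

procedure Fill (Buf : out String) is
begin
   for I in Buf'Range loop
      -- I ranges exactly over Buf's bounds, so the index
      -- can never go off the end of the array.
      Buf (I) := ' ';
   end loop;
end Fill;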

I find that the posted code looks very much like something you would
see in C (except for that awful buffer setting stuff).





* Re: [Spark] Arrays of Strings
  2003-04-08 12:02 [Spark] Arrays of Strings Lutz Donnerhacke
  2003-04-08 18:08 ` Martin Krischik
  2003-04-09  7:50 ` Eric G. Miller
@ 2003-04-09 17:42 ` Matthew Heaney
  2003-04-09 21:06   ` Randy Brukardt
  2003-04-10  8:23   ` Lutz Donnerhacke
  2003-04-11 12:32 ` Rod Chapman
  3 siblings, 2 replies; 42+ messages in thread
From: Matthew Heaney @ 2003-04-09 17:42 UTC (permalink / raw)


Lutz Donnerhacke <lutz@iks-jena.de> wrote in message news:<slrnb95ehu.ob.lutz@taranis.iks-jena.de>...
> In order to implement a thin execve binding, I wonder how to emulate the
> C type "char const * const x []". Any bright ideas?

Realize that an array in C "decays" to a pointer to the first element,
so x has the type

  const char* const*

If you want to use the char* type in Interfaces.C.Strings, then your
problem reduces to:

  type chars_ptr_access is access constant C.Strings.chars_ptr;
  for chars_ptr_access'Storage_Size use 0;
  pragma Convention (C, chars_ptr_access);

Now just declare an array of aliased chars_ptr, e.g.

  type chars_ptr_array is array (Natural range <>) of aliased chars_ptr;
  pragma Convention (C, chars_ptr_array);

  argv_array : constant chars_ptr_array := ...;
  argv_ptr : constant chars_ptr_access := argv_array (0)'Access;

  execv(filename, argv_ptr);


Interestingly, the chars_ptr_array declared in RM95 B.3.1 (6) does not
declare its component subtype as aliased, nor is the array marked as
being C-compatible.  This seems wrong to me, which is why I declared
the chars_ptr_array type myself, as above.




* Re: [Spark] Arrays of Strings
  2003-04-09  7:50 ` Eric G. Miller
  2003-04-09  8:10   ` Lutz Donnerhacke
@ 2003-04-09 18:23   ` Matthew Heaney
  1 sibling, 0 replies; 42+ messages in thread
From: Matthew Heaney @ 2003-04-09 18:23 UTC (permalink / raw)


"Eric G. Miller" <egm2@Jay-Pee-eSs.net> wrote in message news:<pan.2003.04.09.07.50.03.727731@Jay-Pee-eSs.net>...
> 
> Try the following out (rename "/tmp/hello" as appropriate)...
> 
> -- executor.ads
> with Interfaces.C.Strings;
> 
> package Executor is
> 
>    package C renames Interfaces.C;
> 
>    function Execve (Filename : C.Strings.Chars_Ptr;
>                     Argv     : C.Strings.Chars_Ptr_Array;
>                     Envp     : C.Strings.Chars_Ptr_Array) return C.Int;
> 
>    pragma Import (Convention => C,
>                   Entity => Execve,
>                   External_Name => "execve");
> 

The problem is that C.Strings.chars_ptr_array isn't necessarily
C-compatible, because the RM doesn't promise that it is.

The other issue is that chars_ptr_array is unconstrained, which means
it'll try to pass dope over the interface, which is basically
meaningless.  Instead of passing an unconstrained array type, pass an
array access type.  A component pointer (here, an access type that
designates type chars_ptr) would work too, as I showed in my previous
post.
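
A sketch of what such a binding could look like (hypothetical package
name; execve's exact const-ness is glossed over):

with Interfaces.C.Strings;

package Execve_Binding is

   package C renames Interfaces.C;

   -- An unconstrained array parameter would pass Ada "dope"
   -- (bounds information) that the C side knows nothing about,
   -- so pass a C-compatible pointer to the first element instead.
   type chars_ptr_access is access constant C.Strings.chars_ptr;
   pragma Convention (C, chars_ptr_access);

   function Execve
     (Filename : C.Strings.chars_ptr;
      Argv     : chars_ptr_access;
      Envp     : chars_ptr_access) return C.int;
   pragma Import (C, Execve, "execve");

end Execve_Binding;

The caller then declares an array of aliased chars_ptr values,
terminated by Null_Ptr, and passes Argv_Array (Argv_Array'First)'Access.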




* Re: [Spark] Arrays of Strings
  2003-04-09 17:21         ` Hyman Rosen
@ 2003-04-09 18:41           ` Vinzent Hoefler
  2003-04-09 21:04           ` Randy Brukardt
  2003-04-10 23:21           ` John R. Strohm
  2 siblings, 0 replies; 42+ messages in thread
From: Vinzent Hoefler @ 2003-04-09 18:41 UTC (permalink / raw)


Hyman Rosen <hyrosen@mail.com> wrote:

>Vinzent Hoefler wrote:
>> The reliability is not in the subset, as in MISRA-C; it is in the
>> static analysis. I think it is called *proof*.
>
>I am starting to be a little disturbed now that I've thought about
>this a little more.

Thinking never hurt anybody. :)

>You seem to be telling me that it's OK to have
>variables declared loosely (Natural instead of the array range type)

Yes, I noticed that in Lutz' code later myself. My quoting was a
little bit misleading (even to myself), I guess.

But still it had nothing to do with your comments about "subset", I
was thinking more about the "character indexing stuff" you complained
about.

>becuase a program verifier will notice problems regardless.

Although this *might* be true sometimes, it was definitely not my
intention to say such a stupid thing.

But sometimes a restricted subset might lead to code that could be
expressed (far) more easily with the original superset of the
language. That was what I was trying to say.

>I've been told here frequently
>that Ada's style lends itself to avoiding buffer overflows because
>you declare variables that loop over array ranges, and so there is
>never an opportunity to go off the end.

This doesn't apply to the code here, because the assignments are quite
static anyway, but generally I'd say, yes, this is right.

Call it experience. :-)

>I find that the posted code looks very much like something you would
>see in C (except for that awful buffer setting stuff).

Well, it's still code interfacing to a C-kernel. ;->

And it was called test_...


Vinzent.

-- 
Parents strongly cautioned  --  this  posting  is  intended for mature
audiences  over  18.  It  may  contain some material that many parents
would not find suitable for children and may include intense violence,
sexual situations, coarse language and suggestive dialogue.




* Re: [Spark] Arrays of Strings
  2003-04-09 17:21         ` Hyman Rosen
  2003-04-09 18:41           ` Vinzent Hoefler
@ 2003-04-09 21:04           ` Randy Brukardt
  2003-04-10 23:21           ` John R. Strohm
  2 siblings, 0 replies; 42+ messages in thread
From: Randy Brukardt @ 2003-04-09 21:04 UTC (permalink / raw)


Hyman Rosen wrote in message <1049908902.143649@master.nyc.kbcfp.com>...
>I find that the posted code looks very much like something you would
>see in C (except for that awful buffer setting stuff).


Ada code intended to interface directly to C often tends to look like
C - that is, it's ugly, range checking has to be turned off, and so on.
That's why thick bindings like Claw are so important - they hide that
cruft from view so that the majority of your program actually can
benefit from Ada. If you let the C interfacing stuff spread across your
program, you have gained nothing by using Ada.

            Randy.






* Re: [Spark] Arrays of Strings
  2003-04-09 17:42 ` Matthew Heaney
@ 2003-04-09 21:06   ` Randy Brukardt
  2003-04-10  8:23   ` Lutz Donnerhacke
  1 sibling, 0 replies; 42+ messages in thread
From: Randy Brukardt @ 2003-04-09 21:06 UTC (permalink / raw)


Matthew Heaney wrote in message
<1ec946d1.0304090942.3106b4e4@posting.google.com>...
>Interestingly, the chars_ptr_array declared in RM95 B.3.1 (6) does not
>declare its component subtype as aliased, nor is the array marked as
>being C-compatible.  This seems wrong to me, which is why I declared
>the chars_ptr_array type myself, as above.

This has come up before. See AI-276. (Personally, I think that type
ought to be deleted, as it has nothing to do with the Ada Strings
package - having no operations - and it is easily written by any
programmer that needs it. But that's too radical.)

         Randy.






* Re: [Spark] Arrays of Strings
  2003-04-09 17:42 ` Matthew Heaney
  2003-04-09 21:06   ` Randy Brukardt
@ 2003-04-10  8:23   ` Lutz Donnerhacke
  2003-04-10 14:09     ` Matthew Heaney
  1 sibling, 1 reply; 42+ messages in thread
From: Lutz Donnerhacke @ 2003-04-10  8:23 UTC (permalink / raw)


* Matthew Heaney wrote:
> Lutz Donnerhacke <lutz@iks-jena.de> wrote in message news:<slrnb95ehu.ob.lutz@taranis.iks-jena.de>...
>> In order to implement a thin execve binding, I wonder how to emulate the
>> C type "char const * const x []". Any bright ideas?
>
> Realize that an array in C "decays" to a pointer to the first element,
> so x has the type
> 
>   const char* const*

Known.

> If you want to use the char* type in Interfaces.C.Strings, then your
> problem reduces to:
> 
>   type chars_ptr_access is access constant C.Strings.chars_ptr;
>   for chars_ptr_access'Storage_Size use 0;
>   pragma Convention (C, chars_ptr_access);

Ack. And Spark does not allow access types. So my question is still open.





* Re: [Spark] Arrays of Strings
  2003-04-10  8:23   ` Lutz Donnerhacke
@ 2003-04-10 14:09     ` Matthew Heaney
  2003-04-10 14:48       ` Hyman Rosen
  2003-04-10 15:02       ` Lutz Donnerhacke
  0 siblings, 2 replies; 42+ messages in thread
From: Matthew Heaney @ 2003-04-10 14:09 UTC (permalink / raw)


Lutz Donnerhacke <lutz@iks-jena.de> wrote in message news:<slrnb9aagi.p2.lutz@taranis.iks-jena.de>...
> * Matthew Heaney wrote:
> 
> > If you want to use the char* type in Interfaces.C.Strings, then your
> > problem reduces to:
> > 
> >   type chars_ptr_access is access constant C.Strings.chars_ptr;
> >   for chars_ptr_access'Storage_Size use 0;
> >   pragma Convention (C, chars_ptr_access);
> 
> Ack. And Spark does not allow access types. So my question is still open.

Doesn't Spark have a #hide clause, to turn off Spark checking?  After
all, you're calling a C library function, so what difference does
Spark make?

You could simply pass type System.Address.

  execv(filename, argv_array (argv_array'First)'Address);

Or you could use an access parameter:

  execv(filename, argv_array (argv_array'First)'Access);
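
For the System.Address variant, a minimal sketch of the corresponding
import (hypothetical names, Ada side only):

with System;
with Interfaces.C.Strings;

package Executor is

   package C renames Interfaces.C;

   -- Argv is the address of the first element of a
   -- null-pointer-terminated array of chars_ptr.
   function Execve
     (Filename : C.Strings.chars_ptr;
      Argv     : System.Address) return C.int;
   pragma Import (C, Execve, "execve");

end Executor;

It would then be called with argv_array (argv_array'First)'Address as
the actual parameter, as shown above.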




* Re: [Spark] Arrays of Strings
  2003-04-10 14:09     ` Matthew Heaney
@ 2003-04-10 14:48       ` Hyman Rosen
  2003-04-11  6:20         ` Chad R. Meiners
                           ` (2 more replies)
  2003-04-10 15:02       ` Lutz Donnerhacke
  1 sibling, 3 replies; 42+ messages in thread
From: Hyman Rosen @ 2003-04-10 14:48 UTC (permalink / raw)


Matthew Heaney wrote:
> Doesn't Spark have a #hide clause, to turn off Spark checking?  After
> all, you're calling a C library function, so what difference does
> Spark make?

Getting rid of features that are presumed to cause problems is a bit of
hubris that language designers always seem to fall victim to. Ada itself
had a huge problem because the designers thought that function pointers
could be eliminated. Spark gets rid of all pointers, Java gets rid of
templates, and so on and so on.

Then everyone who uses these languages has to figure out how to work
around the lack of the feature they need, essentially duplicating it in
some kludgy way. Meanwhile the language designers have their heads in the
sand and their noses in the air while they pat themselves on the back (:-)
in self-congratulation on how they have created perfection.





* Re: [Spark] Arrays of Strings
  2003-04-10 14:09     ` Matthew Heaney
  2003-04-10 14:48       ` Hyman Rosen
@ 2003-04-10 15:02       ` Lutz Donnerhacke
  2003-04-10 15:50         ` Hyman Rosen
  2003-04-11  6:28         ` Chad R. Meiners
  1 sibling, 2 replies; 42+ messages in thread
From: Lutz Donnerhacke @ 2003-04-10 15:02 UTC (permalink / raw)


* Matthew Heaney wrote:
> Lutz Donnerhacke <lutz@iks-jena.de> wrote in message news:<slrnb9aagi.p2.lutz@taranis.iks-jena.de>...
>> * Matthew Heaney wrote:
>> > If you want to use the char* type in Interfaces.C.Strings, then your
>> > problem reduces to:
>> >
>> >   type chars_ptr_access is access constant C.Strings.chars_ptr;
>> >   for chars_ptr_access'Storage_Size use 0;
>> >   pragma Convention (C, chars_ptr_access);
>>
>> Ack. And Spark does not allow access types. So my question is still open.
>
> Doesn't Spark have a #hide clause, to turn off Spark checking?  After
> all, you're calling a C library function, so what difference does
> Spark make?

Right, but I'm looking for the Spark frontend of this hide clause.

> You could simply pass type System.Address.

Of course. I can pass any private type, but how do I initialize such a thing?

>   execv(filename, argv_array (argv_array'First)'Address);

This call has to occur in normal Spark.
Which type does argv_array have? array (Positive range <>) of System.Address?

And the Examiner says:
***        Semantic Error    : 54: Illegal attribute: Address.

> Or you could use an access parameter:
>
>   execv(filename, argv_array (argv_array'First)'Access);

How do I do this in a Spark frontend? Of course, I can hide the whole
program, but this is not really a solution.




* Re: [Spark] Arrays of Strings
  2003-04-10 15:02       ` Lutz Donnerhacke
@ 2003-04-10 15:50         ` Hyman Rosen
  2003-04-10 18:32           ` Randy Brukardt
  2003-04-11  6:28         ` Chad R. Meiners
  1 sibling, 1 reply; 42+ messages in thread
From: Hyman Rosen @ 2003-04-10 15:50 UTC (permalink / raw)


Lutz Donnerhacke wrote:
> How do I do this in a Spark frontend? Of course, I can hide the whole
> program, but this is not really a solution.

"Doctor, it hurts when I do that."
"Then don't do that!"

Why have you chosen to work in a language which deliberately
prevents you from expressing what you explicitly need to do?

This whole thing has an air of the naughty Victorians about it.
No one is willing to talk about what they're doing, but they're
all doing it. The program stands there full of hypocrisy, saying
"Pointers? Who me? I'm in Spark! No pointers here!" while dealing
in pointers with its shady C-based friends.





* Re: [Spark] Arrays of Strings
  2003-04-10 15:50         ` Hyman Rosen
@ 2003-04-10 18:32           ` Randy Brukardt
  0 siblings, 0 replies; 42+ messages in thread
From: Randy Brukardt @ 2003-04-10 18:32 UTC (permalink / raw)


Hyman Rosen wrote in message <1049989859.683662@master.nyc.kbcfp.com>...
>Lutz Donnerhacke wrote:
>> How do I do this in a Spark frontend? Of course, I can hide the whole
>> program, but this is not really a solution.
>
>"Doctor, it hurts when I do that."
>"Then don't do that!"
>
>Why have you chosen to work in a language which deliberately
>prevents you from expressing what you explicitly need to do?


I hate to do it, but I have to agree with Hyman here.

There is no way to make the code directly interfacing with C any safer
than the underlying C, and that is never going to be anywhere near the
standards of Spark.

What has to be done in cases like this is to write an appropriate thick
wrapper that IS expressible in Spark, and then the ugly interfacing code
has to be in 'real' Ada with the Spark checking turned off.

I don't use Spark, but the Prog_Call function of Janus/Ada (which of
course calls one of the exec functions) has an interface like:

      function Prog_Call (Program_Name : in String;
                          Command      : in String;
                          Wait         : Boolean := True);

and all of the ugly pointer stuff needed to implement that interface is
hidden in the body of the package.

Abstracting the environment variable stuff will be hard (if you actually
need that; it's rare -- Prog_Call just passes null, which is a copy of
the caller's for Windows, I forget what Unix does with that), but I
don't think you have any choice in order to make proper Spark.

The overhead of such a wrapper can be reduced by in-lining and the like,
but it can hardly matter: exec functions are a lot slower than large
volumes of Ada code, and we're certainly not talking about that.

                 Randy.







* Re: [Spark] Arrays of Strings
  2003-04-09 17:21         ` Hyman Rosen
  2003-04-09 18:41           ` Vinzent Hoefler
  2003-04-09 21:04           ` Randy Brukardt
@ 2003-04-10 23:21           ` John R. Strohm
  2003-04-11 12:19             ` Hyman Rosen
  2 siblings, 1 reply; 42+ messages in thread
From: John R. Strohm @ 2003-04-10 23:21 UTC (permalink / raw)


"Hyman Rosen" <hyrosen@mail.com> wrote in message
news:1049908902.143649@master.nyc.kbcfp.com...
> Vinzent Hoefler wrote:
> > The reliability is not in the subset, as in MISRA-C; it is in the
> > static analysis. I think it is called *proof*.
>
> I am starting to be a little disturbed now that I've thought about
> this a little more. You seem to be telling me that it's OK to have
> variables declared loosely (Natural instead of the array range type)
> because a program verifier will notice problems regardless. To me,
> this seems contrary to the design of Ada, which emphasizes saying
> what you mean using the type system. I've been told here frequently
> that Ada's style lends itself to avoiding buffer overflows because
> you declare variables that loop over array ranges, and so there is
> never an opportunity to go off the end.
>
> I find that the posted code looks very much like something you would
> see in C (except for that awful buffer setting stuff).

Disclaimer: This is not a discussion of SPARK as such, but instead of formal
verification in the general sense.

When you declare an array to be of a given size, you create a proof
obligation that the subscript expression will always be in bounds.
Satisfying this proof obligation is what you do during verification.

Consider

procedure Main is
   X : array (1 .. 10) of Float;
begin
   for I in 2 .. 4 loop
      X (2 * I + 1) := 0.0;
   end loop;
end Main;

The array X is declared to be (1 .. 10).  This generates an obligation that
for every subscript <foo> of X, 1 <= <foo> and <foo> <= 10.

In this particular case, the verifier must satisfy

for all i:integer, i in 2..4
  --> (implies)
2*i+1 in 1..10

or

for all i:integer, i in 2..4
  -->
2*i in 0..9

or

i in 2..4
  -->
i in 0..4

which is trivially true if your verifier knows how to deal with this kind of
inequality processing.  (That capability has existed since the late 1970s.)
And even if your verifier doesn't understand inequality intervals, it can
exhaustively enumerate all values of i that satisfy the hypothesis and check
that they satisfy the conclusion.  (Some model checkers do that kind of
thing.)

What normally happens is that you will generate a proof obligation that you
may not be able to prove.  This is normal.  Usually, you can't DISPROVE a
proof obligation, you can only fail to find a proof for it.  (There exist
mechanical procedures for finding proofs of theorems, that WILL succeed if a
proof does in fact exist.  They may take a while.)  There are typically two
reasons for this: 1) you haven't brought enough information into context, 2)
you've uncovered a bug.  (There is a third possibility: you've uncovered an
instance of Gödel's Theorem, but in practice that doesn't seem to happen.)

Consider

procedure Main is
   X : array (1 .. 10) of Float;
begin
   for I in 2 .. 4 loop
      X (2 * I + 3) := 0.0;
   end loop;
end Main;

The proof obligation will become

i in 2..4 --> 2*i+3 in 1..10
i in 2..4 --> 2*i in -2..7
i in 2..4 --> i in -1..3   *TILT*

At this point, your theorem prover will do something roughly analogous to
throwing up its hands and saying "Beats me, daddy-O".  (With apologies to
Maynard G. Krebs...)  You are expected to look at the problem, look at the
code, and say "DOH!" and fix the obvious error.

Yes, it would be a bit more self-documenting if I were declared to be of
subtype Integer range 1 .. 10, but it doesn't actually matter for
verification.

OK, but what if the subscript is coming from somewhere else?

Simple.  Consider

procedure Foo (I : in Integer) is
   X : array (1 .. 10) of Float;
begin
   X (I) := 42.0;
end Foo;

Again, a proof obligation is generated, to prove that i is in 1..10.  The
proof obligation must then be propagated to all of the callers of foo, and
proved individually.  Whether that obligation is propagated automatically by
the verification environment, or manually by the programmer adding an entry
specification annotation, is implementation-dependent.  Obviously, either
way will work.

Made-up example:

procedure Foo (I : in Integer) is
   X : array (1 .. 10) of Float;
   entryspec I in 1 .. 10;   -- made-up syntax, as noted above
begin
   X (I) := 42.0;
end Foo;

The use of an entry specification generally simplifies the verification of
foo, in that it tells the verifier what it is allowed to assume.  At the
same time, it explicitly generates a proof obligation on the callers of foo.
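
In SPARK itself, such an entry specification would be a precondition
annotation; a rough sketch, modulo the exact annotation syntax:

subtype Index is Integer range 1 .. 10;
type Vec is array (Index) of Float;

procedure Foo (X : in out Vec; I : in Integer);
--# derives X from X, I;
--# pre I >= 1 and I <= 10;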






* Re: [Spark] Arrays of Strings
  2003-04-10 14:48       ` Hyman Rosen
@ 2003-04-11  6:20         ` Chad R. Meiners
  2003-04-11 12:31           ` Hyman Rosen
  2003-04-11  7:35         ` Phil Thornley
  2003-04-12 23:51         ` Robert A Duff
  2 siblings, 1 reply; 42+ messages in thread
From: Chad R. Meiners @ 2003-04-11  6:20 UTC (permalink / raw)



"Hyman Rosen" <hyrosen@mail.com> wrote in message
news:1049986095.779228@master.nyc.kbcfp.com...
> Getting rid of features that are presumed to cause problems is a bit of
> hubris that language designers always seem to fall victim to. Ada itself
> had a huge problem because the designers thought that function pointers
> could be eliminated. Spark gets rid of all pointers, Java gets rid of
> templates, and so on and so on.

You are missing the point of Spark which is to provide a (super)subset of
Ada in which formal proofs about the implementation are possible.  Lack of
pointers was not a case of hubris but a deliberate and sound decision in
Spark's case.  Formal validation and verification methodologies require
unambiguous specifications and implementations.  Spark is intended to
satisfy such formal needs.

> Then everyone who uses these languages has to figure out how to work
> around the lack of the feature they need, essentially duplicating it in
> some kludgy way. Meanwhile the language designers have their heads in the
> sand and their noses in the air while they pat themselves on the back (:-)
> in self-congratulation on how they have created perfection.

One should know something about a language before one chooses to
overgeneralize ;)






* Re: [Spark] Arrays of Strings
  2003-04-10 15:02       ` Lutz Donnerhacke
  2003-04-10 15:50         ` Hyman Rosen
@ 2003-04-11  6:28         ` Chad R. Meiners
  2003-04-11  8:11           ` Lutz Donnerhacke
  1 sibling, 1 reply; 42+ messages in thread
From: Chad R. Meiners @ 2003-04-11  6:28 UTC (permalink / raw)



"Lutz Donnerhacke" <lutz@iks-jena.de> wrote in message
news:slrnb9b1r8.p2.lutz@taranis.iks-jena.de...


> Right, but I'm looking for the Spark frontend of this hide clause.
>
> How do I do this in a Spark frontend? Of course, I can hide the whole
> program, but this is not really a solution.

Couldn't you just create an inlined procedure for each literal call to
execv?  This way you could just hide the execv call in the body while
keeping the rest of your code in Spark.






* Re: [Spark] Arrays of Strings
  2003-04-10 14:48       ` Hyman Rosen
  2003-04-11  6:20         ` Chad R. Meiners
@ 2003-04-11  7:35         ` Phil Thornley
  2003-04-11 12:05           ` Marin David Condic
  2003-04-12 23:51         ` Robert A Duff
  2 siblings, 1 reply; 42+ messages in thread
From: Phil Thornley @ 2003-04-11  7:35 UTC (permalink / raw)


"Hyman Rosen" <hyrosen@mail.com> wrote in message
news:1049986095.779228@master.nyc.kbcfp.com...
> Getting rid of features that are presumed to cause problems is a bit of
> hubris that language designers always seem to fall victim to. Ada itself
> had a huge problem because the designers thought that function pointers
> could be eliminated. Spark gets rid of all pointers, Java gets rid of
> templates, and so on and so on.
>
> Then everyone who uses these languages has to figure out how to work
> around the lack of the feature they need, essentially duplicating it in
> some kludgy way. [abuse snipped]

There must be very few SPARK based systems that don't have to include
some non-SPARK code. (One whole chapter of the SPARK book is about
interfacing to such code.)

But to argue that this makes languages such as SPARK wholly useless is
specious.

If I am writing the code for a safety-critical system then all of that
code has to be brought to the same integrity level. If 99% of that is
SPARK code then the effort to validate that code is minimised and I am
quite content to put a lot of manual analysis (per line of code) into
bringing the remaining 1% up to the same integrity level.

Phil Thornley

--
Phil Thornley
BAE SYSTEMS







* Re: [Spark] Arrays of Strings
  2003-04-11  6:28         ` Chad R. Meiners
@ 2003-04-11  8:11           ` Lutz Donnerhacke
  0 siblings, 0 replies; 42+ messages in thread
From: Lutz Donnerhacke @ 2003-04-11  8:11 UTC (permalink / raw)


* Chad R. Meiners wrote:
> "Lutz Donnerhacke" <lutz@iks-jena.de> wrote in message
>> Right, but I'm looking for the Spark frontend of this hide clause.
> 
> Couldn't you just create an inlined procedure for each literal call to
> execv?

execv(cmd, arg0);
execv(cmd, arg0, arg1);
execv(cmd, arg0, arg1, arg2);
...

Yep, this is possible, but looks insane.

I'm working on a solution like:

procedure t is 
  argsstring : constant C_String_Collection :=  -- limited type 
    "/usr/bin/sh" & ASCII.NUL & "-c" & ASCII.NUL & "xxx" & ASCII.NUL;   
  args : C_String_Array;                  -- contains address of collection
  envstring : Enviroment_Collection(1 .. 400); 
  env : Enviroment;   
begin     
  Array_From_Collection(argsstring, args);     

  Bind_Enviroment(envstring, env);        
  Add_Enviroment("HOME", "/var/home/lutz");          
  Add_Enviroment("USER", "lutz");            
  
  execve(To_Fullpath("/usr/bin/sh"), args, env);               
end t;

* Re: [Spark] Arrays of Strings
  2003-04-11  7:35         ` Phil Thornley
@ 2003-04-11 12:05           ` Marin David Condic
  2003-04-11 13:19             ` John R. Strohm
  2003-04-11 18:47             ` Chad R. Meiners
  0 siblings, 2 replies; 42+ messages in thread
From: Marin David Condic @ 2003-04-11 12:05 UTC (permalink / raw)


I can agree with Hyman's point to some extent. Trying to eliminate all
possible sources of errors via language design is not going to work and only
succeeds in making the language less useful in solving problems. (Sort of
like trying to teach a pig to dance - it doesn't work and only succeeds in
annoying the pig. :-) There isn't really any substitute for solid design and
rigorous testing if you want a stable, reliable end product. A "safe"
language can lead developers to find ugly, difficult ways around the safety
features in order to get what the designers wanted out of the system.

OTOH, there is some value in something like SPARK wherein one can substitute
formal proof of correctness for formal verification by testing. Verification
can be very time consuming. Some features can make it difficult or
impossible to test an end product and have a high level of confidence
that the end product will not break. To the extent that the formal proof is
faster and more dependable than formal verification, it gets you a better
product in a less costly way - hence your point about the value of SPARK for
a large percentage of the code and other verification methods for the
smaller percentage.

Its all a balancing act. I believe that the safety features of Ada are a
good thing and hence I agree that SPARK can be a good thing too. But a
desire for absolute safety can end up hamstringing a language and you'll
never get the absolute safety anyway. One would hope that language designers
would keep this in mind and look to the target audience to see what features
are useful as well as what features are safe.

MDC
--
======================================================================
Marin David Condic
I work for: http://www.belcan.com/
My project is: http://www.jsf.mil/

Send Replies To: m c o n d i c @ a c m . o r g

    "Going cold turkey isn't as delicious as it sounds."
        -- H. Simpson
======================================================================

Phil Thornley <phil.thornley@baesystems.com> wrote in message
news:3e966fc9$1@baen1673807.greenlnk.net...
>
> There must be very few SPARK based systems that don't have to include
> some non-SPARK code. (One whole chapter of the SPARK book is about
> interfacing to such code.)
>
> But to argue that this makes languages such as SPARK wholly useless is
> specious.
>
> If I am writing the code for a safety-critical system then all of that
> code has to be brought to the same integrity level. If 99% of that is
> SPARK code then the effort to validate that code is minimised and I am
> quite content to put a lot of manual analysis (per line of code) into
> bringing the remaining 1% up to the same integrity level.
>







* Re: [Spark] Arrays of Strings
  2003-04-10 23:21           ` John R. Strohm
@ 2003-04-11 12:19             ` Hyman Rosen
  2003-04-11 13:14               ` John R. Strohm
  0 siblings, 1 reply; 42+ messages in thread
From: Hyman Rosen @ 2003-04-11 12:19 UTC (permalink / raw)


John R. Strohm wrote:
> When you declare an array to be of a given size, you create a proof
> obligation that the subscript expression will always be in bounds.
> Satisfying this proof obligation is what you do during verification.


Sure, I understand what program proof and verification are about.

What I'm suggesting is that there is a tension between normal Ada
usage, which is to declare your types "snugly" so that they hold
only the ranges you expect them to have, and programs which will
always be verified before they are permitted to run.





* Re: [Spark] Arrays of Strings
  2003-04-11  6:20         ` Chad R. Meiners
@ 2003-04-11 12:31           ` Hyman Rosen
  2003-04-11 18:27             ` Chad R. Meiners
  0 siblings, 1 reply; 42+ messages in thread
From: Hyman Rosen @ 2003-04-11 12:31 UTC (permalink / raw)


Chad R. Meiners wrote:
> You are missing the point of Spark which is to provide a (super)subset of
> Ada in which formal proofs about the implementation are possible.  Lack of
> pointers was not a case of hubris but a deliberate and sound decision in
> Spark's case.  Formal validation and verification methodologies require
> unambiguous specifications and implementations.  Spark is intended to
> satisfy such formal needs.

I think you are demonstrating *my* point. Despite the intentions of Spark,
the OP needs to generate and pass a pointer, and he's desperately trying
to work around the constraints of his programming language. It's all good
and well that you can verify programs written in Spark, but it's not of
much use if programs written in Spark can't do anything!

>>Then everyone who uses these languages has to figure out how to work
>>around the lack of the feature they need, essentially duplicating it in
>>some kludgy way. Meanwhile the language designers have their heads in the
>>sand and their noses in the air while they pat themselves on the back (:-)
>>in self-congratulation on how they have created perfection.
> 
> Of one should know something about a language before one chooses to
> overgeneralize ;)

Do you remember how, in the days of Ada83, it would be suggested that
people use task types and pointers to them to work around the lack of
procedure pointers? Overgeneralize, my foot!





* Re: [Spark] Arrays of Strings
  2003-04-08 12:02 [Spark] Arrays of Strings Lutz Donnerhacke
                   ` (2 preceding siblings ...)
  2003-04-09 17:42 ` Matthew Heaney
@ 2003-04-11 12:32 ` Rod Chapman
  2003-04-11 14:50   ` Peter Amey
  2003-04-11 18:41   ` Matthew Heaney
  3 siblings, 2 replies; 42+ messages in thread
From: Rod Chapman @ 2003-04-11 12:32 UTC (permalink / raw)


Lutz Donnerhacke <lutz@iks-jena.de> wrote in message news:<slrnb95ehu.ob.lutz@taranis.iks-jena.de>...
> In order to implement a thin execve binding, I wonder how to emulate the
> C type "char const * const x []". Any bright ideas?


Lutz and others,
  Here are some thoughts and a possible solution to this problem.
The execve() API is indeed a complex one for SPARK, since the
type "pointer to unconstrained array of pointer to null terminated
C String" is a tricky one for SPARK to model.

 The trick with bindings like this is to come up with a package
specification which is pure SPARK, and presents a clean,
elegant model to the outside world.  No details of the
C type model, or the "glue" needed to bind the two languages
together should be allowed to leak into the package spec!
All the glue and C-related stuff should be hidden in the body.

I'm also working on the assumption that Lutz wants a binding
that is as "thin" as possible, and there are good reasons
to support this approach.

Let's look at how GNAT.OS_Lib does it... a few fragments:

   type String_Access is access all String;
   type String_List is array (Positive range <>) of String_Access;
   subtype Argument_List is String_List;

   procedure Spawn
     (Program_Name : String;
      Args         : Argument_List;
      Success      : out Boolean);

OK - reasonably elegant, but unfortunately not much use
in this context, since SPARK does not allow explicit declaration
of access types.

The trick here (thanks to Tucker Taft, who just thought of this
during lunch...) is to change the abstraction a bit to make the
"Args" parameter a legal SPARK type...

I suggest using a single String parameter, where distinct arguments
are separated and terminated by a special character.  ASCII.Nul
is probably a good choice!

So...the SPARK-friendly package spec looks something like:

  package Kernel
  --# own State;
  --# initializes State;
  is
     -- "Args" is a list of arguments, each of which is
     -- terminated by ASCII.Nul
     --
     -- For Example
     --   Kernel.Spawn (Program_Name => "gcc",
     --                 Args         => "-c" & ASCII.Nul &
     --                                 "-g" & ASCII.Nul &
     --                                 "spark.adb" & ASCII.Nul,
     --                 Success      => Success);

     procedure Spawn (Program_Name : in     String;
                      Args         : in     String;
                      Success      :    out Boolean);
     --# global in out State;
     --# derives Success from State, Program_Name, Args &
     --#         State from *, Program_Name, Args;

  end Kernel;

This meets our requirements - perfectly legal SPARK, and none of the
C-related glue is visible.

The body is Ada95 (not SPARK), and uses the facilities of Annex B and
Interfaces to do its stuff.  I've left out a few details but it
goes something like:

   with Interfaces;
   with Interfaces.C;
   with Interfaces.C.Strings;
   use type Interfaces.C.Char;
   use type Interfaces.C.Size_T;
   use type Interfaces.C.Int;
   package body Kernel
   is

     package C renames Interfaces.C;

     -- Count the number of Nul-terminated arguments in a given String...
     function Number_of_Args (Args : in String) return C.Size_T
     is
     begin
       return 2;
     end Number_of_Args;

     procedure Spawn (Program_Name : in     String;
                      Args         : in     String;
                      Success      :    out Boolean)
     is
        subtype Args_Count is C.size_t range 1 .. Number_Of_Args (Args);
        type Args_Ptr_Vector is
          array (Args_Count) of aliased C.Strings.Chars_Ptr;
        Args_Ptrs      : Args_Ptr_Vector;

        subtype C_Args_Index is
          C.size_t range C.Size_T (Args'First) .. C.Size_T (Args'Last);
        subtype C_Args_Type  is C.char_array (C_Args_Index);
        C_Args         : C_Args_Type := C.To_C (Args, False);

        C_Program_Name : C.Strings.chars_ptr;
        Ret            : C.Int;

        -- Bind to C here - I've missed out envp here for simplicity...
        -- Argv is an access parameter so that the 'Access of the first
        -- element of Args_Ptrs can be passed as a C pointer.
        function Execve (Filename : in C.Strings.Chars_Ptr;
                         Argv     : access C.Strings.Chars_Ptr)
                         return C.Int;
        pragma Import (C, Execve);

        -- Sets the elements of Args_Ptrs to be the 'Access
        -- of each of the N arguments contained in C_Args...
        procedure Set_Args_Ptrs
        --# global out Args_Ptrs;
        --#        in  C_Args;
        is
        begin
           -- proof left to reader... :-)
           null;
        end Set_Args_Ptrs;

     begin
        -- C_Args is already initialized from Args at its declaration.
        Set_Args_Ptrs;

        C_Program_Name := C.Strings.New_String (Program_Name);

        Ret := Execve (C_Program_Name, Args_Ptrs (1)'Access);

        C.Strings.Free (C_Program_Name);

        Success := (Ret = 0); -- or something like that...
     end Spawn;
   end Kernel;
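
One hedged sketch of a body for the Number_of_Args stub above, assuming
(as the spec's comment says) that every argument, including the last,
is terminated by ASCII.Nul:

     -- Each argument is terminated by exactly one Nul character,
     -- so counting Nuls in Args counts the arguments.
     function Number_of_Args (Args : in String) return C.Size_T
     is
        Count : C.Size_T := 0;
     begin
        for I in Args'Range loop
           if Args (I) = ASCII.NUL then
              Count := Count + 1;
           end if;
        end loop;
        return Count;
     end Number_of_Args;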




* Re: [Spark] Arrays of Strings
  2003-04-11 12:19             ` Hyman Rosen
@ 2003-04-11 13:14               ` John R. Strohm
  0 siblings, 0 replies; 42+ messages in thread
From: John R. Strohm @ 2003-04-11 13:14 UTC (permalink / raw)


"Hyman Rosen" <hyrosen@mail.com> wrote in message
news:1050063540.619475@master.nyc.kbcfp.com...
> John R. Strohm wrote:
> > When you declare an array to be of a given size, you create a proof
> > obligation that the subscript expression will always be in bounds.
> > Satisfying this proof obligation is what you do during verification.
>
>
> Sure, I understand what program proof and verification are about.
>
> What I'm suggesting is that there is a tension between normal Ada
> usage, which is to declare your types "snugly" so that they hold
> only the ranges you expect them to have, and programs which will
> always be verified before they are permitted to run.

Nope.

Declaring the types "snugly" is a good idea.  It gives more information to
the compiler and to the verification system.  It may OR MAY NOT make the
verification easier.
The point is that it is not mandatory: the verification system can figure
out the actual constraints.

Declaring the types "snugly" may make the verification harder.  Every time a
snugly-declared variable is assigned, a proof obligation is generated, to
prove that the variable will be in bounds.  Some of these obligations will
be trivial; some may not be.
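
A small made-up example of the kind of obligation a snug declaration
generates:

procedure Snug_Demo (J : in Integer) is
   subtype Index is Integer range 1 .. 10;
   I : Index;
begin
   -- Proof obligation: J + 1 in 1 .. 10, which cannot be
   -- discharged without knowing more about J.
   I := J + 1;
end Snug_Demo;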






* Re: [Spark] Arrays of Strings
  2003-04-11 12:05           ` Marin David Condic
@ 2003-04-11 13:19             ` John R. Strohm
  2003-04-12 23:09               ` Robert A Duff
  2003-04-11 18:47             ` Chad R. Meiners
  1 sibling, 1 reply; 42+ messages in thread
From: John R. Strohm @ 2003-04-11 13:19 UTC (permalink / raw)


"Marin David Condic" <mcondic.auntie.spam@acm.org> wrote in message
news:b76b3a$slo$1@slb9.atl.mindspring.net...
> I can agree with Hyman's point to some extent. Trying to eliminate all
> possible sources of errors via language design is not going to work and
> only succeeds in making the language less useful in solving problems.
> (Sort of like trying to teach a pig to dance - it doesn't work and only
> succeeds in annoying the pig. :-) There isn't really any substitute for
> solid design and rigorous testing if you want a stable, reliable end
> product. A "safe" language can lead developers to find ugly, difficult
> ways around the safety features in order to get what the designers
> wanted out of the system.

"Testing cannot be used to show the absence of errors, only their
presence."  --Edsger W. Dijkstra

Yes, you can force some ugly and difficult workarounds for lack of e.g.
pointers and GOTO statements.

HOWEVER, no matter how ugly and difficult those workarounds may be, they are
known to be SAFE.  The simple, easy-to-use constructs they replaced were NOT
SAFE.

Secondly, the presence of those ugly, difficult workarounds draws the
maintenance programmer's eye, alerting him that there are demons and tygers
lurking hereabouts.  This is PRECISELY why Ada made the GOTO label syntax
INCREDIBLY ugly: they were required by the customer to provide a GOTO
statement, but they could, and did, warn against its use.






* Re: [Spark] Arrays of Strings
  2003-04-11 12:32 ` Rod Chapman
@ 2003-04-11 14:50   ` Peter Amey
  2003-04-11 18:41   ` Matthew Heaney
  1 sibling, 0 replies; 42+ messages in thread
From: Peter Amey @ 2003-04-11 14:50 UTC (permalink / raw)




Rod Chapman wrote:
> Lutz Donnerhacke <lutz@iks-jena.de> wrote in message news:<slrnb95ehu.ob.lutz@taranis.iks-jena.de>...
> 
[Rod's solution snipped]

The crucial thing about all this is that the nasty details of 
interfacing to the dubious C interface are all constrained to a thin
layer in the package body.  Outside this interface we have a nice clean 
link to the rather purer world of SPARK.

I tend to see this in the same light as Ada's view of things like
unchecked conversion: it's allowed but we have to be open and explicit 
about what we are doing.

Peter








* Re: [Spark] Arrays of Strings
  2003-04-11 12:31           ` Hyman Rosen
@ 2003-04-11 18:27             ` Chad R. Meiners
  0 siblings, 0 replies; 42+ messages in thread
From: Chad R. Meiners @ 2003-04-11 18:27 UTC (permalink / raw)



"Hyman Rosen" <hyrosen@mail.com> wrote in message
news:1050064265.685430@master.nyc.kbcfp.com...

> I think you are demonstrating *my* point.

No, I am not.  You are not listening to the argument or questioning your
own opinion.

> Despite the intentions of Spark,
> the OP needs to generate and pass a pointer, and he's desperately trying
> to work around the constraints of his programming language.

As Rod Chapman and Peter Amey point out, the data type the OP is worrying
about needs to be modeled in a Spark-friendly way that can be used to
accomplish the actual call in the non-Spark body.  As for trying desperately
to work around the constraints of Spark, I also believe that you
misunderstood the OP.  It is my opinion he was trying to get advice on how
to work with the language constraint so that he could separate the provable
part of the program from the part left unproven.  So the OP might have been
frustrated, but learning usually involves some frustration due to
misunderstandings or lack of knowledge.

> It's all good
> and well that you can verify programs written in Spark, but it's not of
> much use if programs written in Spark can't do anything!

This is a ludicrous argument.  Of course you can write useful programs in
Spark (it has been done, with published case studies; see www.sparkada.com).
The nice point about Spark is that you can apply it only where necessary and
interface it cleanly to the rest of your Ada code base (thus you can also
interface to other programming languages as well).

> Do you remember how, in the days of Ada83, it would be suggested that
> people use task types and pointers to them to work around the lack of
> procedure pointers?

You are distracting from the argument.  The above is irrelevant.

> Overgeneralize, my foot!

Of course you overgeneralized!  The design process and goals of Java and
Spark are unrelated, and you cannot attribute any perceived language faults
of either to the same process (this vague notion of language-design
hubris).  You also presented a strawman argument.  There was no hubris
involved in the creation of Spark.  You should at least read the history
and rationale of Spark before you pass judgment upon its designers' hubris.
Spark does not pretend to provide you with all the capabilities of most
general-purpose programming languages.  Its designers instead recognized
that some code is going to have to exist outside the Spark boundary, and
Spark provides you with a nice method to interface with that code.  This is
a very nice feature, since it allows you to explicitly state where you are
separating your concerns.  This is not hubris but instead a very good idea!

You must also remember that Spark is a formal method.  Learning formal
methodologies requires a `slight' paradigm change in your developmental
thinking.  This is why they are often misunderstood and mischaracterized by
strawman arguments.





^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [Spark] Arrays of Strings
  2003-04-11 12:32 ` Rod Chapman
  2003-04-11 14:50   ` Peter Amey
@ 2003-04-11 18:41   ` Matthew Heaney
  2003-04-11 21:25     ` Chad R. Meiners
  2003-04-12 10:08     ` Peter Amey
  1 sibling, 2 replies; 42+ messages in thread
From: Matthew Heaney @ 2003-04-11 18:41 UTC (permalink / raw)


rod.chapman@praxis-cs.co.uk (Rod Chapman) wrote in message news:<cf2c6063.0304110211.596b6b2b@posting.google.com>...
> Lutz Donnerhacke <lutz@iks-jena.de> wrote in message news:<slrnb95ehu.ob.lutz@taranis.iks-jena.de>...
> > In order to implement thin an execve binding, I wonder how to emulate the
> > C-Type "char const * const x []". Any bright ideas?
> 
> 
> Lutz and others,
>   Here are some thoughts and a possible solution to this problem.
> The execve() API is indeed a complex one for SPARK, since the
> type "pointer to unconstrained array of pointer to null terminated
> C String" is a tricky one for SPARK to model.

[Rod's example snipped]

Personally, I find Rod's solution way too heavy.

Why can't Spark model an access constant to a constant object?  For
example:

declare
   X : aliased constant String := "arg1";
   Y : aliased constant String := "arg2";

   type String_Constant_Access is 
     access constant String;

   type String_Constant_Access_Array is
     array (Positive range <>) of aliased String_Constant_Access;

   A : constant String_Constant_Access_Array := 
     (X'Access, Y'Access);
begin
   Op (A (A'First)'Access);
end;

Everything is static.  Yes, X and Y are aliased by the components of
array A, but neither X nor Y is variable, and the aliased view through
A is only a constant view.  So what's the problem?

I could understand your difficulty if X and Y were variable, but that
is not the case here.

I must say, I think the simplest solution is:

exec (filename, X);  -- for argv with one arg
exec (filename, X, Y);  -- for argv with two args
etc

and then turn off Spark checking in the body of exec.
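
A sketch of that idea in plain Ada (hypothetical names; note that classic
SPARK forbids overloading, so distinct names would be needed there):

   procedure Exec_1 (Filename : in String; Arg1 : in String);
   procedure Exec_2 (Filename : in String; Arg1, Arg2 : in String);
   -- each body builds its argv locally and is excluded from analysis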

Contrary to some of the views expressed in this thread, I find the C
syntax perfectly natural and perfectly elegant.  Rod's solution is
neither.



^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [Spark] Arrays of Strings
  2003-04-11 12:05           ` Marin David Condic
  2003-04-11 13:19             ` John R. Strohm
@ 2003-04-11 18:47             ` Chad R. Meiners
  1 sibling, 0 replies; 42+ messages in thread
From: Chad R. Meiners @ 2003-04-11 18:47 UTC (permalink / raw)



"Marin David Condic" <mcondic.auntie.spam@acm.org> wrote in message
news:b76b3a$slo$1@slb9.atl.mindspring.net...
> I can agree with Hyman's point to some extent. Trying to eliminate all
> possible sources of errors via language design is not going to work and
> only succeeds in making the language less useful in solving problems.
> (Sort of like trying to teach a pig to dance - it doesn't work and only
> succeeds in annoying the pig. :-)

If this was Hyman's point, then it is one that most people who deal with
formal methods agree with.  Almost all the people who deal with formal
methods understand logic and expressibility; thus they are not going to
intentionally try to eliminate all sources of errors via language design.
That extreme is simple: "My language allows nothing to be expressed".  Of
course you can also go to the opposite extreme: "All things are expressions
of my language".  Both languages are useless.  The challenge with formal
methods is to get the balance right with regard to requirements,
specifications and finally implementations.  In my professional opinion
(formal methods are the focus of my graduate studies), I think that Spark
demonstrates a very good sense of balance in this respect.





^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [Spark] Arrays of Strings
  2003-04-11 18:41   ` Matthew Heaney
@ 2003-04-11 21:25     ` Chad R. Meiners
  2003-04-12 10:08     ` Peter Amey
  1 sibling, 0 replies; 42+ messages in thread
From: Chad R. Meiners @ 2003-04-11 21:25 UTC (permalink / raw)



"Matthew Heaney" <mheaney@on2.com> wrote in message >
>Personally, I find Rod's solution way too heavy.
>
> Why can't Spark model an access constant to a constant object?  For
> example:

You probably could make rules to allow access constants to constant objects,
but why?  Its usefulness would probably be outweighed by its added
complexity.  Furthermore, it would make the language more complicated to
learn.  If you intend to prove results about your implementation, you
should understand the implementation language so you can validate your proof
with respect to your implementation's requirements.

> declare
>    X : aliased constant String := "arg1";
>    Y : aliased constant String := "arg2";
>
>    type String_Constant_Access is
>      access constant String;
>
>    type String_Constant_Access_Array is
>      array (Positive range <>) of aliased String_Constant_Access;
>
>    A : constant String_Constant_Access_Array :=
>      (X'Access, Y'Access);
> begin
>    Op (A (A'First)'Access);
> end;
>
> Everything is static.  Yes, X and Y are aliased by the components of
> array A, but neither X nor Y is variable, and the aliased view through
> A is only a constant view.  So what's the problem?

I would assume that the value added does not offset the cost of providing it.

> I must say, I think the simplest solution is:
>
> exec (filename, X);  -- for argv with one arg
> exec (filename, X, Y);  -- for argv with two args
> etc
>

How is that so much different from Spawn(filename, X & Separator & Y,
MySuccess); ?

> and then turn off Spark checking in the body of exec.
>
> Contrary to some of the views expressed in this thread, I find the C
> syntax perfectly natural and perfectly elegant.  Rod's solution is
> neither.

Rod's solution is correct, though; furthermore, it isn't too heavy, since all
the work of deciding what needs to be proved and what will be left unproven
has to be done explicitly one way or the other.





^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [Spark] Arrays of Strings
  2003-04-11 18:41   ` Matthew Heaney
  2003-04-11 21:25     ` Chad R. Meiners
@ 2003-04-12 10:08     ` Peter Amey
  1 sibling, 0 replies; 42+ messages in thread
From: Peter Amey @ 2003-04-12 10:08 UTC (permalink / raw)




Matthew Heaney wrote:
> rod.chapman@praxis-cs.co.uk (Rod Chapman) wrote in message news:<cf2c6063.0304110211.596b6b2b@posting.google.com>...
> 
>>Lutz Donnerhacke <lutz@iks-jena.de> wrote in message news:<slrnb95ehu.ob.lutz@taranis.iks-jena.de>...
>>
>>>In order to implement thin an execve binding, I wonder how to emulate the
>>>C-Type "char const * const x []". Any bright ideas?
>>
>>
>>Lutz and others,
>>  Here are some thoughts and a possible solution to this problem.
>>The execve() API is indeed a complex one for SPARK, since the
>>type "pointer to unconstrained array of pointer to null terminated
>>C String" is a tricky one for SPARK to model.
> 
> 
> [Rod's example snipped]
> 
>
> [Matthew Heaney...]
> Personally, I find Rod's solution way too heavy.
> 
> Why can't Spark model an access constant to a constant object?  For
> example:
> 

The answer is it probably _could_ (but whether it _should_ is a much 
closer value judgement).  In fact we are looking very seriously at 
"static" access discriminants as part of our work on adding the 
Ravenscar tasking profile to SPARK. (We have a working Ravenscar SPARK 
already without this feature).

We have never claimed that SPARK is the largest subset of Ada for which 
formal verification is feasible; we claim only that it is big enough to 
write a large class of programs efficiently and that it represents a 
reasonable tradeoff of power and simplicity.

Peter




^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [Spark] Arrays of Strings
  2003-04-11 13:19             ` John R. Strohm
@ 2003-04-12 23:09               ` Robert A Duff
  0 siblings, 0 replies; 42+ messages in thread
From: Robert A Duff @ 2003-04-12 23:09 UTC (permalink / raw)


"John R. Strohm" <strohm@airmail.net> writes:

> "Testing cannot be used to show the absence of errors, only their
> resence."  --Edsger W. Dijkstra
 ^
 p

True, but this fact does not imply that so-called "proof of correctness"
can show the absence of errors.  In fact, it cannot.

- Bob



^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [Spark] Arrays of Strings
  2003-04-10 14:48       ` Hyman Rosen
  2003-04-11  6:20         ` Chad R. Meiners
  2003-04-11  7:35         ` Phil Thornley
@ 2003-04-12 23:51         ` Robert A Duff
  2003-04-13  5:47           ` Hyman Rosen
  2 siblings, 1 reply; 42+ messages in thread
From: Robert A Duff @ 2003-04-12 23:51 UTC (permalink / raw)


Hyman Rosen <hyrosen@mail.com> writes:

> Matthew Heaney wrote:
> > Doesn't Spark have a #hide clause, to turn off Spark checking?  After
> > all, you're calling a C library function, so what difference does
> > Spark make?
> 
> Getting rid of features that are presumed to cause problems is a bit of
> hubris that language designers always seem to fall victim to. 

As one of the designers of Ada 95, I must say I wholeheartedly agree
with the sentiment here.  Language designers ought to make it easy to
write good programs, not try to make it hard to write bad programs.  Ada
is guilty of the latter in some cases.

However, I think you're being unfair to SPARK here.  SPARK does not say
"you can't do X".  It says, "If you don't do X, then we can prove some
useful properties of your program.  But if you need to do X, then you
can isolate that code, and apply #hide".  That doesn't strike me as
hubris; they're offering you a benefit, in those cases where you can
abide by some restrictions.  You hope that the vast majority of your
code can naturally abide by the restrictions, and isolate the rest in
small packages.
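
A hedged sketch of that isolation (hypothetical names; the --# hide
annotation excludes a body from the Examiner's analysis):

   package body Spawner is
      --# hide Spawner;

      procedure Spawn (Path    : in  String;
                       Args    : in  String;
                       Success : out Boolean) is
      begin
         Success := False;  -- a real body would call the imported
                            -- execve here, outside SPARK's view
      end Spawn;

   end Spawner;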

Rod Chapman said it well in another post:

>  The trick with bindings like this is to come up with a package
> specification which is pure SPARK, ...

Non-spark Ada does the same thing in many cases.  Ada does not say, "you
can't do bit-wise type conversions".  Instead it says, "if you want to
do bit-wise conversions, you have to say 'Unchecked_Conversion'".  That
doesn't seem like a restriction to me.  It seems liberating.  It allows
me to have some confidence that the code I'm reading obeys the type
model, so long as I don't see Unchecked_Conversion (or address clauses,
or...).
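
For example, the bit-wise conversion announces itself at the point of use
(assuming a 32-bit Float, as on most machines):

   with Ada.Unchecked_Conversion;
   with Interfaces;

   procedure Show_Bits is
      function To_Bits is new Ada.Unchecked_Conversion
        (Source => Float, Target => Interfaces.Unsigned_32);
      Bits : constant Interfaces.Unsigned_32 := To_Bits (1.0);
   begin
      null;  -- nothing here is hidden: the instantiation of
             -- Unchecked_Conversion flags the unsafe step to any reader
   end Show_Bits;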

Contrast with C, where perfectly safe conversions use the same syntax as
potentially troublesome unsafe ones.

My point is:  The language designer should not say "you can't do X"
(assuming X is sometimes useful).  But it's perfectly reasonable for the
language designer to say "if you're going to do X, then you have to say
so".

Back to Hyman:

>...Ada itself
> had a huge problem because the designers thought that function pointers
> could be eliminated.

Agreed.  And C has a huge problem because nested functions are not
allowed.  And both languages (C and Ada) have a huge problem because you
can't pass [pointers to] nested procedures as parameters.

>... Spark gets rid of all pointers, Java gets rid of
> templates, and so on and so on.

I'm not sure Java's lack of templates/generics is due to hubris.
More likely, it's due to trying to [over]simplify the language.

> Then everyone who uses these languages has to figure out how to work
> around the lack of the feature they need, essentially duplicating it in
> some kludgy way. Meanwhile the language designers have their heads in the
> sand and their noses in the air while they pat themselves on the back (:-)
> in self-congratulation on how they have created perfection.

Agreed.

- Bob



^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [Spark] Arrays of Strings
  2003-04-12 23:51         ` Robert A Duff
@ 2003-04-13  5:47           ` Hyman Rosen
  2003-04-14  8:05             ` Lutz Donnerhacke
  0 siblings, 1 reply; 42+ messages in thread
From: Hyman Rosen @ 2003-04-13  5:47 UTC (permalink / raw)


Robert A Duff wrote:
> However, I think you're being unfair to SPARK here.

I'm willing to accept that. As I say here frequently, I don't know Ada,
much less subsets thereof (although I did once work with VHDL). I was
reacting to the OP's code. I agree with the principle that Spark is what
it is, and that non-Spark code should be segregated away into hidden
modules that do what they need to. That's much more reasonable than
trying to express the inexpressible in Spark itself.

Whether or not it's reasonable to program in a language without pointers
is another matter, but I guess if people are happy, more power to them.
I suspect that a lot of Spark code may wind up using arrays and indexes,
just like Fortran.
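
Something like this, presumably (a hypothetical fragment, with indices into
a fixed table standing in for pointers):

   Max_Nodes : constant := 100;
   subtype Node_Index is Natural range 0 .. Max_Nodes;  -- 0 plays null
   type Node is record
      Value : Integer;
      Next  : Node_Index;
   end record;
   type Node_Table is array (Node_Index range 1 .. Max_Nodes) of Node;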




^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [Spark] Arrays of Strings
  2003-04-13  5:47           ` Hyman Rosen
@ 2003-04-14  8:05             ` Lutz Donnerhacke
  0 siblings, 0 replies; 42+ messages in thread
From: Lutz Donnerhacke @ 2003-04-14  8:05 UTC (permalink / raw)


* Hyman Rosen wrote:
> Whether or not it's reasonable to program in a language without pointers
> is another matter, but I guess if people are happy, more power to them.
> I suspect that a lot of Spark code may wind up using arrays and indexes,
> just like Fortran.

In order to clarify this and finish my part in this discussion: I prefer the
Spark way of programming for two reasons.  First, I learned that the failing
proofs point me directly to logical errors in my program, which are
otherwise hard to find, because I can't trigger them (but my users will).  So
developing programs with those constraints results in much more stable
programs for me.  Second, the prohibition of dynamic storage (including
recursion) urges me to look for more efficient algorithms with bounded space
constraints.  Because I work with embedded (or long-lived) systems, my code
written in Spark is better adapted to my problems than Ada code.

Of course, there are several points I'd like to change.  Especially
user-defined constraint types (in order to represent dependent types as a
single one) would be fine.
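
(Full Ada approximates this with discriminated records, which classic SPARK
excludes; a hedged, hypothetical sketch:

   type Bounded_String (Max : Positive) is record
      Len  : Natural := 0;        -- current length, at most Max
      Data : String (1 .. Max);   -- the constraint travels with the value
   end record;

This is roughly the dependent-pair idea: the shape of Data depends on the
value of Max.)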

But back to the topic: Rod's idea is right, but does not go far enough.  It
requires a dynamically allocated array of pointers or an error message from
the syscall wrapper.  Therefore I separated this part further and use a
user-provided space for this purpose.
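
A hedged sketch of that separation (all names are hypothetical): the caller
supplies the space, so the wrapper allocates nothing and cannot fail for
lack of storage.

   subtype Arg_Buffer is String (1 .. 256);

   procedure Exec_With_Buffer
     (Filename : in     String;
      Args     : in     String;
      Buffer   : in out Arg_Buffer;
      Success  :    out Boolean);
   -- the hidden body lays out NUL-separated arguments in Buffer and
   -- passes their addresses to execve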



^ permalink raw reply	[flat|nested] 42+ messages in thread

end of thread, other threads:[~2003-04-14  8:05 UTC | newest]

Thread overview: 42+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2003-04-08 12:02 [Spark] Arrays of Strings Lutz Donnerhacke
2003-04-08 18:08 ` Martin Krischik
2003-04-09  9:23   ` Lutz Donnerhacke
2003-04-09 12:38     ` Hyman Rosen
2003-04-09 12:47       ` Vinzent Hoefler
2003-04-09 14:27         ` Hyman Rosen
2003-04-09 15:13           ` Vinzent Hoefler
2003-04-09 17:21         ` Hyman Rosen
2003-04-09 18:41           ` Vinzent Hoefler
2003-04-09 21:04           ` Randy Brukardt
2003-04-10 23:21           ` John R. Strohm
2003-04-11 12:19             ` Hyman Rosen
2003-04-11 13:14               ` John R. Strohm
2003-04-09  7:50 ` Eric G. Miller
2003-04-09  8:10   ` Lutz Donnerhacke
2003-04-09 18:23   ` Matthew Heaney
2003-04-09 17:42 ` Matthew Heaney
2003-04-09 21:06   ` Randy Brukardt
2003-04-10  8:23   ` Lutz Donnerhacke
2003-04-10 14:09     ` Matthew Heaney
2003-04-10 14:48       ` Hyman Rosen
2003-04-11  6:20         ` Chad R. Meiners
2003-04-11 12:31           ` Hyman Rosen
2003-04-11 18:27             ` Chad R. Meiners
2003-04-11  7:35         ` Phil Thornley
2003-04-11 12:05           ` Marin David Condic
2003-04-11 13:19             ` John R. Strohm
2003-04-12 23:09               ` Robert A Duff
2003-04-11 18:47             ` Chad R. Meiners
2003-04-12 23:51         ` Robert A Duff
2003-04-13  5:47           ` Hyman Rosen
2003-04-14  8:05             ` Lutz Donnerhacke
2003-04-10 15:02       ` Lutz Donnerhacke
2003-04-10 15:50         ` Hyman Rosen
2003-04-10 18:32           ` Randy Brukardt
2003-04-11  6:28         ` Chad R. Meiners
2003-04-11  8:11           ` Lutz Donnerhacke
2003-04-11 12:32 ` Rod Chapman
2003-04-11 14:50   ` Peter Amey
2003-04-11 18:41   ` Matthew Heaney
2003-04-11 21:25     ` Chad R. Meiners
2003-04-12 10:08     ` Peter Amey

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox