From: "Randy Brukardt"
Newsgroups: comp.lang.ada
Subject: Re: [Spark] Arrays of Strings
Date: Thu, 10 Apr 2003 13:32:55 -0500
Organization: Posted via Supernews, http://www.supernews.com
References: <1ec946d1.0304090942.3106b4e4@posting.google.com> <1ec946d1.0304100609.52b0fac0@posting.google.com> <1049989859.683662@master.nyc.kbcfp.com>

Hyman Rosen wrote in message <1049989859.683662@master.nyc.kbcfp.com>...
>Lutz Donnerhacke wrote:
>> How do I do this in a Spark frontend? Of course, I can hide the whole
>> program, but this is not really a solution.
>
>"Doctor, it hurts when I do that."
>"Then don't do that!"
>
>Why have you chosen to work in a language which deliberately
>prevents you from expressing what you explicitly need to do?

I hate to do it, but I have to agree with Hyman here. There is no way to make the code that directly interfaces with C any safer than the underlying C, and that is never going to be anywhere near the standards of Spark.
What has to be done in cases like this is to write an appropriate thick wrapper that IS expressible in Spark, and then the ugly interfacing code has to live in 'real' Ada with the Spark checking turned off. I don't use Spark, but the Prog_Call routine of Janus/Ada (which of course calls one of the exec functions) has an interface like:

    procedure Prog_Call (Program_Name : in String;
                         Command      : in String;
                         Wait         : in Boolean := True);

and all of the ugly pointer stuff needed to implement that interface is hidden in the body of the package.

Abstracting the environment variable stuff will be hard (if you actually need it; that's rare -- Prog_Call just passes null, which on Windows means the child gets a copy of the caller's environment; I forget what Unix does with that), but I don't think you have any choice if you want proper Spark.

The overhead of such a wrapper can be reduced by inlining and the like, but it hardly matters: a call to one of the exec functions costs far more than any amount of wrapper code, and we're certainly not talking about large volumes of Ada code here anyway.

                        Randy.
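To make the thick-wrapper idea concrete, here is a minimal sketch of what such a package body might look like. This is my own illustration, not the actual Janus/Ada implementation: the package name, the use of the C library's system() instead of the exec family, and the omission of Wait handling are all simplifications. The point is only that the chars_ptr juggling stays inside the body, so a Spark-checked caller never sees it.

    -- Hypothetical thick wrapper (illustrative names, not Janus/Ada's).
    with Interfaces.C;         use Interfaces.C;
    with Interfaces.C.Strings; use Interfaces.C.Strings;

    package body Program_Spawn is

       -- Binding to the standard C function: int system(const char *);
       function C_System (Command : chars_ptr) return int;
       pragma Import (C, C_System, "system");

       procedure Prog_Call (Program_Name : in String;
                            Command      : in String;
                            Wait         : in Boolean := True) is
          -- All of the raw C pointer handling is confined to this body.
          Full   : chars_ptr := New_String (Program_Name & " " & Command);
          Status : int;
       begin
          Status := C_System (Full);  -- Wait ignored in this sketch;
                                      -- system() always waits.
          Free (Full);                -- release the C-side copy
       end Prog_Call;

    end Program_Spawn;

The spec that clients (and the Spark tools) see contains only the clean String-based declaration; a real body would build an argv array and call execvp/CreateProcess, handle the Wait flag, and report failures, but none of that changes what is visible above the line.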