From: Lutz Donnerhacke
Newsgroups: comp.lang.ada
Subject: Re: [Spark] Arrays of Strings
Date: Mon, 14 Apr 2003 08:05:27 +0000 (UTC)
Organization: IKS GmbH Jena
References: <1ec946d1.0304090942.3106b4e4@posting.google.com> <1ec946d1.0304100609.52b0fac0@posting.google.com> <1049986095.779228@master.nyc.kbcfp.com>

* Hyman Rosen wrote:
> Whether or not it's reasonable to program in a language without pointers
> is another matter, but I guess if people are happy, more power to them.
> I suspect that a lot of Spark code may wind up using arrays and indexes,
> just like Fortran.

To clarify this and to finish my part in this discussion: I prefer the Spark
way of programming for two reasons.

First, I learned that failing proofs point me directly to logical errors in
my program which are otherwise hard to find, because I cannot trigger them
(but my users will). Developing programs under these constraints therefore
gives me much more stable programs.

Second, the prohibition of dynamic storage (including recursion) urges me to
look for more efficient algorithms with bounded space requirements. Since I
work with embedded (or long-lived) systems, the code I write in Spark is
better adapted to my problems than my Ada code.

Of course, there are several points I would like to change. In particular,
user-defined constraint types (in order to represent dependent types as a
single type) would be nice.

But back to the topic: Rod's idea is right, but it does not go far enough.
It still requires a dynamically allocated array of pointers, or an error
message from the syscall wrapper. Therefore I separated this part further
and use caller-provided space for this purpose.
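A rough sketch of what I mean by caller-provided space (plain Ada, not the
annotated, Examiner-checked Spark code I actually use; the package name, the
capacity constants and the Append interface are made up for this posting):

package Bounded_Args is

   Max_Args  : constant := 32;    --  assumed capacity limits
   Max_Chars : constant := 1024;

   subtype Arg_Count  is Natural range 0 .. Max_Args;
   subtype Arg_Index  is Positive range 1 .. Max_Args;
   subtype Char_Count is Natural range 0 .. Max_Chars;

   type Slice is record
      First : Positive;
      Last  : Natural;
   end record;

   type Slice_Table is array (Arg_Index) of Slice;

   --  The whole argument vector lives in caller-provided space:
   --  no access types, no heap, bounded by the constants above.
   type Argument_Vector is record
      Text   : String (1 .. Max_Chars);
      Slices : Slice_Table;
      Args   : Arg_Count;
      Used   : Char_Count;
   end record;

   --  Append one argument; Ok is False when the capacity is exhausted,
   --  which replaces the error message from the syscall wrapper.
   procedure Append
     (Vector : in out Argument_Vector;
      Item   : in     String;
      Ok     :    out Boolean);

end Bounded_Args;

package body Bounded_Args is

   procedure Append
     (Vector : in out Argument_Vector;
      Item   : in     String;
      Ok     :    out Boolean)
   is
   begin
      if Vector.Args >= Max_Args
        or else Vector.Used + Item'Length > Max_Chars
      then
         Ok := False;
      else
         --  Copy the characters into the caller's buffer and record
         --  the slice bounds instead of keeping a pointer.
         Vector.Text (Vector.Used + 1 .. Vector.Used + Item'Length) := Item;
         Vector.Args := Vector.Args + 1;
         Vector.Slices (Vector.Args) :=
           (First => Vector.Used + 1, Last => Vector.Used + Item'Length);
         Vector.Used := Vector.Used + Item'Length;
         Ok := True;
      end if;
   end Append;

end Bounded_Args;

The point of the Ok flag is that the caller decides what to do when the
fixed capacity is exceeded; nothing is allocated at run time, and all bounds
are known statically.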