From: Hyman Rosen
Newsgroups: comp.lang.ada
Subject: Re: [Spark] Arrays of Strings
Date: Sun, 13 Apr 2003 05:47:13 GMT

Robert A Duff wrote:
> However, I think you're being unfair to SPARK here.

I'm willing to accept that. As I say here frequently, I don't know Ada, much less subsets thereof (although I did once work with VHDL). I was reacting to the OP's code. I agree with the principle that SPARK is what it is, and that non-SPARK code should be segregated into hidden modules that do what they need to. That's much more reasonable than trying to express the inexpressible in SPARK itself.
Whether or not it's reasonable to program in a language without pointers is another matter, but I guess if people are happy, more power to them. I suspect that a lot of Spark code may wind up using arrays and indexes, just like Fortran.