From: tmoran@acm.org
Newsgroups: comp.lang.ada
Subject: Re: segfault with large-ish array with GNAT
Date: Thu, 18 Mar 2010 16:46:19 +0000 (UTC)
Organization: Aioe.org NNTP Server

> So here's me being naive: I would have thought that Ada (or GNAT
> specifically) would be smart enough to allocate memory for large
> objects such as my long array in a transparent way so that I don't
> have to worry about it, thus (in the Ada spirit) making it harder to
> screw up. (Like not having to worry about whether arguments to
> subprograms are passed by value or by reference--it just happens.)
>
> But it seems that I will have to allocate memory for large objects
> using pointers (and thus take the memory from the heap). Is that
> right?

A couple of years ago I wrote some code to look at the (large)
Netflix data set. It used Janus Ada and ran in a 2 GB Windows system.
I thought about switching to Gnat (for faster floating point) but
discovered that would require changing all large arrays to heap
allocation, so I dropped that idea. IMO, that's a ridiculous
limitation in this day and age.
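
For anyone hitting the same stack-overflow segfault, the usual workaround is the one the original poster suspected: put the big array behind an access type so its storage comes from the heap rather than the task stack. A minimal sketch (the names Big_Demo, Big_Array, and Data are mine, not from the poster's code, and the array size is just an illustration):

```ada
--  Sketch of the heap workaround discussed above.  A directly
--  declared object of Big_Array (~80 MB) can overflow the default
--  stack under GNAT; allocated with "new", it lives on the heap.
with Ada.Unchecked_Deallocation;

procedure Big_Demo is
   type Big_Array is array (1 .. 10_000_000) of Long_Float;

   type Big_Array_Access is access Big_Array;
   procedure Free is new Ada.Unchecked_Deallocation
     (Big_Array, Big_Array_Access);

   --  Heap allocation; the default-initialized aggregate avoids
   --  building a large temporary on the stack.
   Data : Big_Array_Access := new Big_Array'(others => 0.0);
begin
   Data (1) := 3.14;
   Free (Data);
end Big_Demo;
```

The other common fix, if you would rather not touch the declarations, is to raise the stack limit in the environment (e.g. `ulimit -s` on Linux) before running the program.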