From: Dr Adrian Wrigley
Newsgroups: comp.lang.ada
Subject: Re: gnat and heap size
Date: Tue, 25 Sep 2001 23:10:20 GMT

Claude Marinier wrote:

> We want to use large arrays (well, large for us: 10000 x 10000 complex
> numbers). We are using gnat 3.13p on Solaris 7. We run out of heap (heap
> exhausted) and have not yet found a way to increase it.

I have had a similar problem, and raised it on this NG a few months
ago... I was trying to store the past five years of the US stock
markets in a big array. I wanted to reload the data from disk
"instantly", so I could do statistics easily.

The solution I came up with was to use an "mmap" call to map the data.
That way, I could get an access value to data as big as the OS could
give me. This was big enough for my data (about 0.5GB). Mapping the
data back in was extremely quick. Unfortunately, it makes the code a
bit less portable, but that wasn't too bad. (A rough sketch of the idea
is appended after my sig.)

But there were problems with mapping big records and taking 'Size
attributes. Since Ada measures sizes in bits, the values can overflow a
signed 32-bit representation. This makes handling data in excess of
256MB tricky: the locations of record components come out wrong, as do
the size attributes (eg a problem in generics).

I worked around this by calculating the size of each element in bytes,
multiplying by the number of elements, and adding in each record
component's size (also sketched below). Nasty. But it was 10-100 times
the speed of more obvious solutions.

I suspect you have a 64-bit architecture, and *may* not hit the 'Size
problems I had with GNAT. If you stick to simple arrays, you might be
OK. I was using GNAT 3.12p on a '686 processor.

I don't know why these other guys seem so surprised. Doesn't everybody
else have 1280MB of RAM nowadays, too? ;-)  It's only a few hundred
bucks!
--
Adrian Wrigley

(by the way... what do you use such large matrices for?)
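
The mmap approach looked roughly like this. This is a from-memory
sketch, not my actual code: the binding names, file name, and types are
made up, and all error checking is omitted.

--  Sketch: map a data file into memory and overlay an Ada array on it.
--  Names are illustrative; no error handling.

with Interfaces.C;  use Interfaces.C;
with System;
with Ada.Text_IO;

procedure Map_Demo is

   --  Thin bindings to the POSIX calls (assumed available on the target)
   PROT_READ  : constant := 1;
   MAP_SHARED : constant := 1;

   function C_Open (Path : char_array; Flags : int) return int;
   pragma Import (C, C_Open, "open");

   function C_Mmap
     (Addr   : System.Address;
      Length : size_t;
      Prot   : int;
      Flags  : int;
      Fd     : int;
      Offset : long) return System.Address;
   pragma Import (C, C_Mmap, "mmap");

   type Price is digits 6;
   type Price_Array is array (1 .. 10_000_000) of Price;

   Fd   : int;
   Addr : System.Address;

begin
   Fd   := C_Open (To_C ("prices.dat"), 0);          --  0 = O_RDONLY
   Addr := C_Mmap (System.Null_Address,
                   size_t (Price_Array'Size / 8),    --  bytes, not bits
                   PROT_READ, MAP_SHARED, Fd, 0);

   declare
      Data : Price_Array;
      for Data'Address use Addr;   --  overlay the array on the mapping
   begin
      Ada.Text_IO.Put_Line (Price'Image (Data (Data'First)));
   end;
end Map_Demo;

The nice part is that "reloading" the data is just the mmap call; the
OS pages it in on demand.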
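
And the size-calculation workaround, again only as a sketch. The record
and element counts are made up, and Long_Long_Integer is a GNAT-specific
64-bit type, not standard Ada.

--  Sketch: compute object sizes in bytes with 64-bit arithmetic,
--  instead of taking 'Size of the whole array (which is in bits and
--  can overflow a signed 32-bit value for objects over 256MB).

with Ada.Text_IO;

procedure Size_Demo is

   type Sample is record
      Time  : Long_Float;
      Price : Long_Float;
   end record;

   type Sample_Array is array (1 .. 50_000_000) of Sample;

   --  Bytes per element, computed while the bit count is still small
   Element_Bytes : constant Long_Long_Integer :=
     Long_Long_Integer (Sample'Size) / 8;

   --  Total bytes for the array, computed in 64-bit arithmetic
   Total_Bytes : constant Long_Long_Integer :=
     Element_Bytes * Long_Long_Integer (Sample_Array'Length);

begin
   Ada.Text_IO.Put_Line
     ("Total bytes:" & Long_Long_Integer'Image (Total_Bytes));
end Size_Demo;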