From: Matthew Heaney
Newsgroups: comp.lang.ada
Subject: Re: Resizing Charles Map
Date: Wed, 29 Sep 2004 13:13:26 GMT

"Alex R. Mosteo" writes:

> I'm using a Charles Map (hashed strings) to hold a monotonically
> increasing collection of objects. It will contain thousands of them,
> and while the upper limit is unknown, I think it will rarely reach,
> say, a million. Is there anything that makes it inadvisable to issue
> a Resize (Map, Natural'Last) at the start?

That will preallocate the hash table array to a length >= Natural'Last.
That's very large, so there's a good chance the allocation will fail.
If you think the upper limit is around a million, then you'd be better
off specifying that as the resize value.
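As a sketch of what that looks like (the package name, generic parameters, and
Container_Type below are assumptions about the Charles API, not verified
against the library; only the Resize call itself comes from the question):

```ada
--  Hedged sketch: package name and instantiation are assumed, not
--  verified against the actual Charles library.
with Charles.Maps.Hashed.Strings.Unbounded;

procedure Demo is
   --  Hypothetical instantiation: a map from String keys to Integer.
   package Object_Maps is
     new Charles.Maps.Hashed.Strings.Unbounded (Element_Type => Integer);
   use Object_Maps;

   Map : Container_Type;
begin
   --  Preallocate buckets for the expected upper bound,
   --  not Natural'Last.
   Resize (Map, 1_000_000);
end Demo;
```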
Better yet, if your map will contain thousands of items, just use a resize
value of 1000 or 10_000 or 50_000 or whatever, and let the map expand
automatically in the rare case that it holds more items than that.

Realize that expansion is automatic, so you never need to resize manually
unless you're trying to optimize away expansion (which is admittedly
expensive when the number of items is large). On the other hand, a very
large hash table requires a large chunk of contiguous virtual address
space. So you have to strike a balance between too many rehashing events
and too much memory.

I recommend you join the charles.tigris.org mailing lists and post
questions like this there.
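To see that balance concretely: under a doubling growth policy (an
assumption for illustration; Charles does not necessarily double), growing
from 1_000 buckets to a million costs only about ten rehashes, because the
number of expansions grows logarithmically with the final size:

```ada
--  Hedged sketch: assumes the table doubles on each expansion, which
--  Charles may not actually do. The point is only that the number of
--  expansions is logarithmic in the final table size.
with Ada.Text_IO; use Ada.Text_IO;

procedure Rehash_Count is
   Size     : Natural := 1_000;               --  initial Resize value
   Target   : constant Natural := 1_000_000;  --  expected upper bound
   Rehashes : Natural := 0;
begin
   while Size < Target loop
      Size     := Size * 2;       --  one expansion = one full rehash
      Rehashes := Rehashes + 1;
   end loop;
   Put_Line ("Rehashes:" & Natural'Image (Rehashes));  --  prints 10
end Rehash_Count;
```

So a modest initial size is usually cheap; only if every rehash of a large
table is unacceptable does preallocating near the upper bound pay off.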