From: "Nick Roberts"
Subject: Re: Large Look-up Table question ?
Date: 1999/06/02
Message-ID: <3755a21c@eeyore.callnetuk.com>
References: <375299F4.CDEA8A81@hotmail.com>
Organization: RemarQ http://www.remarQ.com
Newsgroups: comp.lang.ada

I think the main question here is: how much memory does/will the target
machine have? If it will have enough memory, then use an array (as the
simplest and fastest implementation); otherwise, you will have to use
something else (maybe a hashed array), or get more memory.

Note that I use the word 'memory' here, not RAM, since on virtual memory
systems I am talking about virtual memory (not the same thing as RAM). A
huge array, particularly a linear (one-dimensional) one, can work very
efficiently on a virtual memory machine when implemented in the language
as a simple array: the virtual memory management system effectively
converts the array into a sparse array invisibly. However, this strategy
can sink like the Titanic if the granularity of array usage (i.e. the
typical gap between used components) is similar to the granularity (i.e.
the page size) of the virtual memory system.

Sometimes a good hashing scheme can be very effective in delivering fast
access while making very good use of memory.
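To show what the "simple array" approach looks like in Ada 83, here is a
minimal sketch; the type names and bounds (Part_Number, Price, and the
values used) are invented purely for illustration:

```ada
-- Sketch only: a direct-indexed lookup table in Ada 83.
-- On a virtual memory system, only the pages actually touched
-- by indexing operations need ever be resident in RAM.
with Text_IO;
procedure Lookup_Demo is
   type Part_Number is range 1 .. 100_000;
   type Price is delta 0.01 range 0.0 .. 10_000.0;

   -- One component per possible key: lookup is a single index
   -- operation, the simplest and fastest implementation.
   Price_Table : array (Part_Number) of Price;
begin
   Price_Table (42) := 19.99;

   if Price_Table (42) > 10.0 then
      Text_IO.Put_Line ("expensive");
   end if;
end Lookup_Demo;
```

Note that Ada 83 uses plain Text_IO (the Ada. prefix came later); the
table is left mostly uninitialised here, as a real program would fill it
from its data source.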
Sometimes there's nothing for it but a balanced binary tree, or even an
indexed file or database table (especially if the data is updated in
place and is precious).

Occasionally, a re-think about the structure of your data can let you
drastically reduce the size of the array. Think, in particular, about
the use of redirection indexes or pointers. If it is practical to impose
some serious limits on the numbers of things, it may well be possible to
use an index array to map a big index type onto a much smaller one; then
your 'big' array can become much, much smaller. Quite often, you simply
need to be devious about these things! For example, it may be perfectly
acceptable to ask the user "please give a number between 1 and 100 to
identify this book (the LLIN)", and then use the LLIN to index books (in
a lending library), instead of, say, the humungous ISBN.

There's never, ever, anything inherently 'clunky' about using a big
array, no matter how big. It is only ever clunky to use an array that's
bigger than it could ever actually need to be (a common mistake), or to
use an array of any size when some other structure is called for.

-------------------------------------
Nick Roberts
http://www.adapower.com/lab/adaos
-------------------------------------

Nick Wilson wrote in message <375299F4.CDEA8A81@hotmail.com>...
|I'm implementing a large matrix of values which is basically just a big
|lookup table. What would be the best way to implement this in only ada83
|code ? My initial idea is to use a static table for all the values but
|this seems clunky, can anyone offer some advice ?
|
|Thanks,
|
|
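To make the index-mapping (LLIN) idea above concrete, here is a minimal
Ada 83 sketch; all the type names, bounds, and values are invented for
illustration (the ISBN range is deliberately truncated to fit a 32-bit
base type):

```ada
-- Sketch of the redirection-index idea: keep the huge, sparse key
-- (the ISBN) as a field inside each record, and index the table by
-- a small, dense key (the LLIN) instead.
procedure Library_Demo is
   type ISBN is range 0 .. 999_999_999;  -- huge, sparse key space
   type LLIN is range 1 .. 100;          -- small, dense index

   type Book_Record is
      record
         Id     : ISBN;
         Copies : Natural;
      end record;

   -- 100 components instead of one per possible ISBN.
   Books : array (LLIN) of Book_Record;
begin
   Books (1) := (Id => 140_449_264, Copies => 3);
end Library_Demo;
```

The point of the design is that the array's size is governed by how many
books the library can actually hold, not by how many ISBNs could ever
exist.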