comp.lang.ada
From: Adam Beneschan <adam@irvine.com>
Subject: Re: Memory Useage
Date: Fri, 08 Jun 2007 17:43:24 -0700
Message-ID: <1181349804.839474.212720@r19g2000prf.googlegroups.com>
In-Reply-To: <1181335115.659050.135860@q69g2000hsb.googlegroups.com>

On Jun 8, 1:38 pm, mhamel...@yahoo.com wrote:
> Hello c.l.a.  Another question, I have a program that stores data on
> the disk using sequential_io.  When I later read that data into an
> array, the memory growth after ingesting a file is much much larger
> than the disk footprint.  A file that takes 26.8MB on disk (over 134k
> records) causes the program to swell by over 600MB!  Holy bloatware.
> A short overview of what I'm trying to do - each sequential_io data
> file has an associated header file with stuff like number of records,
> etc.  The header is read, and an array is then created based on how
> many records are said to be in the data file.  The data file is then
> read, sticking each node into the array.  Some abbreviated code below,
> the spec:
>
> generic
>   type Node_Type is private;
> package Node_Manager is
>
>   package Seq is new Sequential_Io (Node_Type);
>
>   type Node_Array is array (positive range <>) of Node_Type;
>   type Node_Ptr is access Node_Array;
>
>   type Data_Rec is
>     record
>       Hdr : Node_Hdr;
>       Data : Node_Ptr;
>     end record;
>
> Body stuff:
>
>   procedure Free is new Unchecked_Deallocation (Node_Array, Node_Ptr);
>   procedure Open (File : in out Data_Rec;
>                   Name : in String) is
>     Curr : Positive := 1;
>     Node : Node_Type;
>   begin
>     Read_Hdr (Name, File.Hdr);
>     File.Data := new Node_Array (1 .. File.Hdr.Size);
>
>     Seq.Open (Dat_File, Seq.In_File, Name & ".dat");
>     while not Seq.End_of_File (Dat_File) loop
>       Seq.Read (Dat_File, Node);
>       File.Data.all (Curr) := Node;
>       Curr := Curr + 1;
>     end loop;
>     Seq.Close (Dat_File);
>     ...
>
> The program works as I've wanted, though up until recently I've only
> dealt with very small data sets, which is why I've never noticed undue
> memory growth.  Now that I'm working with some "large" data sets, the
> bloat is unbearable.  Any suggestions? (Besides look for another line
> of work ;) )
> Platform is ObjectAda 7.2 on WinNT.


Are you sure File.Hdr.Size is correct?  (I.e., is it the same as the
number of records in the file?)
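
For what it's worth, 26.8 MB over 134k records works out to roughly 200
bytes per record on disk, while 600 MB over the same count is roughly
4.5 KB per record in memory, so it's worth confirming both the element
count and the element size.  A quick way to do that (just a sketch, not
tested; it assumes you add "with Ada.Text_IO;" to the package body and
that Hdr.Size is an integer type) is to print the header count, the
number of records actually read, and the in-memory element size just
before the Seq.Close in Open:

  --  Just before Seq.Close (Dat_File); Curr - 1 is the number of
  --  records actually read from the .dat file.
  Ada.Text_IO.Put_Line
    ("Header says"        & Integer'Image (Integer (File.Hdr.Size)) &
     " records, read"     & Integer'Image (Curr - 1)                &
     "; Node_Type'Size =" & Integer'Image (Node_Type'Size / 8)      &
     " bytes");

If the two counts disagree, or if Node_Type'Size is much larger than
the file suggests, that should point at where the extra memory is
going.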

                   -- Adam




Thread overview: 5+ messages
2007-06-08 20:38 Memory Useage mhamel_98
2007-06-09  0:43 ` Adam Beneschan [this message]
2007-06-09  3:09   ` mhamel_98
2007-06-09  5:25 ` Niklas Holsti
2007-06-11 15:28   ` mhamel_98