From: tmoran@acm.org
Newsgroups: comp.lang.ada
Subject: Re: How to speed up stream & record handling?
Date: Fri, 22 Feb 2002 20:25:50 GMT

>How do I avoid calling Unsigned_8'Read 200 times per packet while still
>using streams?

  Go up a level: define your own My_Record'Read.  I made a stream type
that "is new Ada.Streams.Root_Stream_Type" (the Not_Slow package below)
and changed the reading code to:

  procedure My_Record_Read
    (Stream : access Ada.Streams.Root_Stream_Type'Class;
     Item   : out My_Record)
  is
     use type Ada.Streams.Stream_Element_Offset;
     --  Overlay a Stream_Element_Array on the byte-array component so
     --  all of it can be read with a single call.
     The_Data_Bytes : Ada.Streams.Stream_Element_Array
       (1 .. Data_Array'Size / Unsigned_8'Size);
     for The_Data_Bytes'Address use Item.A'Address;
     Last : Ada.Streams.Stream_Element_Offset;
  begin
     Not_Slow.Read (Not_Slow.My_File_Type (Stream.all), The_Data_Bytes, Last);
     if Last /= The_Data_Bytes'Last then
        null;   -- what to do?
     end if;
     B_And_C'Read (Stream, Item.BC);
  end My_Record_Read;

  My_File : aliased Not_Slow.My_File_Type;
  ...
  Not_Slow.Open (My_File,
                 Mode => Ada.Streams.Stream_IO.In_File,
                 Name => "r:big");
  for I in 1 .. 100_000 loop
     My_Record'Read (My_File'Access, Item);
  end loop;
  Not_Slow.Close (My_File);

and it ran in 0.22 seconds (Gnat 3.14p, -O2).  Of course, since the
record size is fixed, Sequential_IO is still a faster option.
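  For context, My_Record_Read above assumes declarations along these
lines.  The original question isn't quoted here, so the package name
Packets, the field names, and the array bound 200 are guesses; the point
is the procedure profile and the attribute definition clause that makes
My_Record'Read call the handcrafted reader:

  with Interfaces;  use Interfaces;
  with Ada.Streams;
  package Packets is
     --  Hypothetical shapes; only the component names A and BC are
     --  taken from the code above.
     type Data_Array is array (1 .. 200) of Unsigned_8;
     type B_And_C is record
        B, C : Unsigned_16;   -- guessed fields
     end record;
     type My_Record is record
        A  : Data_Array;
        BC : B_And_C;
     end record;

     procedure My_Record_Read
       (Stream : access Ada.Streams.Root_Stream_Type'Class;
        Item   : out My_Record);

     --  Wire the fast reader in as the stream attribute, before the
     --  type is frozen:
     for My_Record'Read use My_Record_Read;
  end Packets;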
with Ada.Streams, Ada.Streams.Stream_IO;
package Not_Slow is

   type My_File_Type is new Ada.Streams.Root_Stream_Type with private;

   procedure Open (Stream : in out My_File_Type;
                   Mode   : in Ada.Streams.Stream_IO.File_Mode;
                   Name   : in String;
                   Form   : in String := "");

   procedure Close (Stream : in out My_File_Type);

   procedure Read (Stream : in out My_File_Type;
                   Item   : out Ada.Streams.Stream_Element_Array;
                   Last   : out Ada.Streams.Stream_Element_Offset);

   procedure Write (Stream : in out My_File_Type;
                    Item   : in Ada.Streams.Stream_Element_Array);

private

   type My_File_Type is new Ada.Streams.Root_Stream_Type with record
      The_File : Ada.Streams.Stream_IO.File_Type;
   end record;

end Not_Slow;

package body Not_Slow is

   procedure Open (Stream : in out My_File_Type;
                   Mode   : in Ada.Streams.Stream_IO.File_Mode;
                   Name   : in String;
                   Form   : in String := "") is
   begin
      Ada.Streams.Stream_IO.Open (Stream.The_File, Mode, Name, Form);
   end Open;

   procedure Close (Stream : in out My_File_Type) is
   begin
      Ada.Streams.Stream_IO.Close (Stream.The_File);
   end Close;

   procedure Read (Stream : in out My_File_Type;
                   Item   : out Ada.Streams.Stream_Element_Array;
                   Last   : out Ada.Streams.Stream_Element_Offset) is
   begin
      Ada.Streams.Stream_IO.Read (Stream.The_File, Item, Last);
   end Read;

   procedure Write (Stream : in out My_File_Type;
                    Item   : in Ada.Streams.Stream_Element_Array) is
   begin
      Ada.Streams.Stream_IO.Write (Stream.The_File, Item);
   end Write;

end Not_Slow;
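  For comparison, the Sequential_IO alternative mentioned above looks
roughly like this.  It's only a sketch: it borrows the hypothetical
Packets.My_Record declaration and the "r:big" file name from the code
above, and it works because My_Record has a fixed size:

  with Ada.Sequential_IO;
  with Packets;  use Packets;
  procedure Read_Fixed_Records is
     --  Instantiate Sequential_IO once for the fixed-size record type.
     package My_Record_IO is new Ada.Sequential_IO (My_Record);
     My_File : My_Record_IO.File_Type;
     Item    : My_Record;
  begin
     My_Record_IO.Open (My_File, My_Record_IO.In_File, "r:big");
     for I in 1 .. 100_000 loop
        My_Record_IO.Read (My_File, Item);   -- one call per record
     end loop;
     My_Record_IO.Close (My_File);
  end Read_Fixed_Records;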