From mboxrd@z Thu Jan 1 00:00:00 1970
From: Ted Dennison
Newsgroups: comp.lang.ada
Subject: Re: Duration vs. Ada.Real_Time
Date: Wed, 31 Jan 2001 20:53:26 GMT
Organization: Deja.com
Message-ID: <959u02$1o0$1@nnrp1.deja.com>

In article <3A786497.1C791722@acm.org>,
  Marin David Condic wrote:
> In your example you have three possible time sources. Which would you
> pick for dealing with Duration? Which would you pick for
> Ada.Real_Time.Time_Span?

For a PC, if I were making TEDAda, I'd probably want to match Duration with the units the OS uses for its time-of-day primitives (which are possibly based on source 2), or the Ada minimum requirements, whichever is higher-res.

For Ada.Real_Time.Time, it'd be nice to use units of ticks on the high-frequency clock (time source 3), as that gives me the highest possible resolution (which is critical for being able to accurately measure elapsed time for small tasks).
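For concreteness, here's a small sketch that prints the resolutions an implementation actually claims for those clocks (the values printed are implementation-defined, which is rather the point):

```ada
with Ada.Text_IO;
with Ada.Real_Time;
with System;

procedure Clock_Resolutions is
   use Ada.Text_IO;
begin
   --  Smallest representable Duration value (the RM only requires this
   --  to be no greater than 20 ms; most implementations do far better).
   Put_Line ("Duration'Small     =" & Duration'Image (Duration'Small));

   --  Claimed resolution of the time-of-day (Calendar) clock.
   Put_Line ("System.Tick        =" & Duration'Image (System.Tick));

   --  Resolution of the Real_Time clock (Annex D requires Tick to be
   --  no greater than 1 ms).
   Put_Line ("Ada.Real_Time.Tick =" &
      Duration'Image (Ada.Real_Time.To_Duration (Ada.Real_Time.Tick)));
end Clock_Resolutions;
```

Running that on a few compilers for the same target is an instructive way to see how differently vendors pick their units.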
For Ada.Real_Time.Time_Span, it would probably be best to use the same units, as I'd want to be able to use Real_Time.Time_Span in calculations with Real_Time.Time without losing accuracy. That means any use of "delay" or "delay until" is going to require a conversion from one of those other time sources into the OS's units for its thread timed-rescheduling primitives (perhaps based on source 1). However, that is often in RTC "ticks", so a conversion would have been necessary anyway.

I know our Ada vendor (GreenHills) chose to use microseconds as its units for Ada.Real_Time.Time. I believe they made that decision because support for the high-res timer is a kernel-configurable item (not always available), and because they use the same compiler codebase on multiple architectures under the same OS. So it was a good decision for them. But it has the unfortunate effect that any use of the high-res timer has to be done via direct OS calls.

> I'm not saying all Ada compilers everywhere should do this for every
> target - just where it makes some sense because of what the hardware
> or OS provides you. It might be possible to make it user configurable

Is there some RTC hardware out there that uses seconds instead of Hz as its units? If so, and if its OS rescheduling primitive kept the same units, then yes, the Ada vendor for that platform should also use those same units.

--
T.E.D.

http://www.telepath.com/~dennison/Ted/TED.html
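P.S. To illustrate where the conversion shows up at the source level (a sketch only): a relative "delay" takes a Duration, so a span held in Real_Time units must pass through To_Duration, while "delay until" accepts a Real_Time.Time directly:

```ada
with Ada.Real_Time; use Ada.Real_Time;

procedure Delay_Conversion is
   Period : constant Time_Span := Microseconds (250);
   Next   : Time := Clock + Period;
begin
   --  "delay until" takes an Ada.Real_Time.Time directly, so no unit
   --  conversion is visible here (the runtime still converts to the
   --  OS's rescheduling units underneath).
   delay until Next;

   --  A relative "delay" takes a Duration, so the Time_Span must be
   --  converted first -- this is where a change of units occurs, and
   --  where accuracy can be lost if the units differ.
   delay To_Duration (Period);
end Delay_Conversion;
```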