From: Marin David Condic
Newsgroups: comp.lang.ada
Subject: Re: Duration vs. Ada.Real_Time
Date: Fri, 26 Jan 2001 15:58:31 -0500
Organization: MindSpring Enterprises
Message-ID: <3A71E4F6.6D7015AD@acm.org>
References: <980495512.529981@edh3> <3A71814B.7E8CCF60@acm.org> <94s5bl$r1r$1@nnrp1.deja.com>

Ah! We appear to have here a good illustration of the difference
between accuracy and precision. Duration'Small gives the precision with
which objects of type Duration can represent values. The LSB can be no
coarser than 20 ms (the RM requires Duration'Small to be no greater
than twenty milliseconds). However, if the clock is accurate to no
better than 1 second, the low-order bits are meaningless: the precision
exceeds the accuracy of the actual device. (Hence my misunderstanding
of the original question.)

I would think that a reasonable implementation of Ada for real-time
systems would want to ensure that the precision bore some relationship
to the accuracy of the clock available. Obviously, for workstations/PCs
with a non-real-time OS, etc., you can't exactly insist that the
platform be changed to suit the language. But stating, for example,
that the precision of a time representation can go down to attoseconds
can easily mislead one into believing that the measurement of time will
be something close to that precision.

Clearly, one needs to know two things: What is the accuracy of the
clock(s) available to me through the hardware/OS? (Talk to the hardware
manufacturer.) And what is the relationship between my hardware and the
Ada implementation I have? (Talk to the compiler vendor.) Id est, does
it do me any good to say "delay 0.020;" with a given compiler and
target system?

MDC

Robert Dewar wrote:
> This may be confusing: the question was not about the precision
> of Duration, it was about the resolution of the timer.
>
> The value of Duration'Small is only an upper bound for the
> resolution; there is nothing in the RM that forbids an
> implementation where the clock only updates every second (and
> indeed, do not be surprised if some Unix implementations
> are like this).

-- 
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

"I'd trade it all for just a little more"
    --  Charles Montgomery Burns,  [4F10]
======================================================================
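
P.S. One way to answer that last question empirically: below is a
minimal sketch (Ada 95; the procedure name Probe_Clock is my own
invention) that prints Duration'Small, spins on Ada.Calendar.Clock to
estimate the clock's actual update interval, and then measures what
"delay 0.020;" really delivers on the compiler and target at hand.

with Ada.Text_IO;  use Ada.Text_IO;
with Ada.Calendar; use Ada.Calendar;

procedure Probe_Clock is
   T0, T1 : Time;
   Start  : Time;
begin
   --  Precision: the smallest interval Duration can represent.
   --  The RM requires this to be no greater than 20 ms.
   Put_Line ("Duration'Small      =" & Duration'Image (Duration'Small));

   --  Accuracy: busy-wait until Clock changes; the difference
   --  approximates the underlying clock's actual update interval.
   T0 := Clock;
   loop
      T1 := Clock;
      exit when T1 > T0;
   end loop;
   Put_Line ("Observed clock tick =" & Duration'Image (T1 - T0));

   --  What does "delay 0.020;" really buy you?  The RM only
   --  guarantees that the delay lasts at least that long.
   Start := Clock;
   delay 0.020;
   Put_Line ("delay 0.020 took    =" & Duration'Image (Clock - Start));
end Probe_Clock;

If the observed tick comes out near one second, then the trailing bits
of Duration'Small are indeed meaningless on that platform, and a 20 ms
delay may well stretch to the next full clock tick.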