From: Marin David Condic
Newsgroups: comp.lang.ada
Subject: Re: Duration vs. Ada.Real_Time
Date: Wed, 31 Jan 2001 09:55:36 -0500
Organization: MindSpring Enterprises
Message-ID: <3A782767.32574700@acm.org>
References: <980495512.529981@edh3> <3A71814B.7E8CCF60@acm.org> <94s5bl$r1r$1@nnrp1.deja.com> <3A71E4F6.6D7015AD@acm.org> <94vo82$kst$1@nnrp1.deja.com>

Stephen Leake wrote:

> Perhaps it would be more appropriate for Marin to request that type
> Ada.Real_Time.Time_Span (LRM appendix D) match the hardware clock;
> that is more clearly intended to be a "hard real-time" clock type.

Oh, I'm not really fussy about how you get there. That might work
reasonably well. And I certainly don't insist that *every*
implementation of Ada go off and do it my way. My point is that when
one is working with an embedded compiler for a specific target that
has a real-time clock, one just kind of expects Duration to have some
relationship to that clock. If you do delays, you expect to be able
to get a delay approximately as good as the resolution of the clock -
no more, no less.

An analogy would be someone implementing the type Character as a
32-bit word. (I don't know if that's *legal*, but suppose it were.)
Yes, a 32-bit word will hold a single ASCII character just fine, and
provided the compiler is consistent about it, the choice will mostly
be invisible to the programmer. But if someone declares an object of
type Character (or an array of them), one rather expects it to
allocate and use a single byte per character. Anything else would be
regarded as kind of silly. For most applications it may not matter,
but for embedded work it can be quite important because you are so
close to the hardware.

> > And also, programs do various calculations with Duration; if there
> > is more precision, these calculations are more accurate.
>
> I agree with Marin here; convert to a type that you know or control
> the precision of to do computations.

Thanks. :-)

MDC
--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at: http://www.mcondic.com/

"I'd trade it all for just a little more" -- Charles Montgomery
Burns, [4F10]
======================================================================
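
To make the "convert to a type whose precision you control" advice
concrete, here is a minimal Ada sketch. It prints the granularity the
implementation actually provides (Duration'Small and the
Ada.Real_Time.Time_Span_Unit of LRM D.8), converts a Duration into a
fixed-point type with an exactly specified small before doing any
arithmetic, and runs a drift-free periodic delay whose accuracy is
bounded by the underlying clock. The type name Millisec, its range,
and the 1 ms resolution are assumptions made for the sketch, not
anything from the thread.

   with Ada.Text_IO;   use Ada.Text_IO;
   with Ada.Real_Time; use Ada.Real_Time;

   procedure Precision_Demo is
      --  Fixed-point type with a programmer-chosen resolution of
      --  1 ms.  Name, range, and resolution are assumptions made
      --  for this sketch.
      type Millisec is delta 0.001 range 0.0 .. 86_400.0;
      for Millisec'Small use 0.001;

      D    : constant Duration := 1.5;
      M    : constant Millisec := Millisec (D);  --  precision known
      Next : Time := Clock;
   begin
      --  What the implementation actually provides:
      Put_Line ("Duration'Small =" & Duration'Image (Duration'Small));
      Put_Line ("Time_Span_Unit ="
                & Duration'Image (To_Duration (Time_Span_Unit)));
      Put_Line ("Millisec value =" & Millisec'Image (M));

      --  Drift-free periodic delay; accuracy is bounded by the
      --  clock, as discussed above.  Milliseconds here is the
      --  function Ada.Real_Time.Milliseconds.
      for I in 1 .. 3 loop
         Next := Next + Milliseconds (10);
         delay until Next;
      end loop;
   end Precision_Demo;

On an embedded target, comparing the first two lines of output shows
at a glance whether Duration bears any relation to the hardware
clock, which is exactly the property being asked for above.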