From: tmoran@acm.org
Newsgroups: comp.lang.ada
Subject: Re: Using "delay until" in real-time
Date: Wed, 13 Dec 2000 05:35:53 GMT

>It's more likely to be fixed-point.  If you're using
>Gnat, Ada.Real_Time.Time and Ada.Real_Time.Time_Span are both derived
>from Duration, which is a 64-bit fixed-point type with a Small of
>exactly 1.0e-9 (one nanosecond).

(A win for mathematical oversimplification over engineering ;)

Clearly you need to do your arithmetic in a system that correctly
represents a single tick.  (1/60 of a second is 16_666_666.666... ns,
so a Time_Span whose Small is one nanosecond can't hold it exactly,
and the error accumulates every time you add it.)  Integers come to
mind.  How about

  Iteration_Count : Natural := 0;
  ...
  delay until Start_Time
    + Ada.Real_Time.To_Time_Span (Duration (Iteration_Count)) / 60;

If To_Time_Span (Duration (Iteration_Count)) is going to get too large
before your program is done, then you need something that isn't so big
it overflows, but will count ticks without roundoff.  A fixed-point
value with a suitable range and a 'Small of 1/60 will do that.  Then
you need a way to convert this accurate time to an Ada.Real_Time.Time.
How about:

  with Ada.Real_Time;
  package Accurate_Time is

    type Time_Span is delta 1.0/60 range 0.0 .. 86_400.0;
    --  the upper bound just has to be "long enough" for your run;
    --  one day is shown here purely as an example
    for Time_Span'Small use 1.0/60;

    type Time is private;

    function Clock return Time;
    function "+" (Left : Time; Right : Time_Span) return Time;
    function To_Real_Time (T : in Time) return Ada.Real_Time.Time;

  private
    type Time is record
      Start_SC : Ada.Real_Time.Seconds_Count;  -- whole seconds at Clock
      Start_TS : Ada.Real_Time.Time_Span;      -- fraction of a second at Clock
      Since    : Time_Span;                    -- exact 1/60 ticks since then
    end record;
  end Accurate_Time;

  package body Accurate_Time is
    use Ada.Real_Time;

    function Clock return Time is
      Result : Time;
      Now    : Ada.Real_Time.Time := Ada.Real_Time.Clock;
    begin
      Ada.Real_Time.Split (Now, Result.Start_SC, Result.Start_TS);
      Result.Since := 0.0;
      return Result;
    end Clock;

    function "+" (Left : Time; Right : Time_Span) return Time is
      Result : Time := Left;
    begin
      Result.Since := Result.Since + Right;  -- exact fixed-point addition
      return Result;
    end "+";

    function To_Real_Time (T : in Time) return Ada.Real_Time.Time is
      Whole_Seconds_Since      : Ada.Real_Time.Seconds_Count;
      Fractional_Seconds_Since : Time_Span;
      SC : Ada.Real_Time.Seconds_Count;
      TS : Ada.Real_Time.Time_Span;
    begin
      --  Integer conversion rounds to nearest, so back off by one if
      --  it rounded up; then the fractional part is nonnegative.
      Whole_Seconds_Since := Ada.Real_Time.Seconds_Count (T.Since);
      if Time_Span (Whole_Seconds_Since) > T.Since then
        Whole_Seconds_Since := Whole_Seconds_Since - 1;
      end if;
      Fractional_Seconds_Since := T.Since - Time_Span (Whole_Seconds_Since);
      SC := T.Start_SC + Whole_Seconds_Since;
      TS := T.Start_TS
        + Ada.Real_Time.To_Time_Span (Duration (Fractional_Seconds_Since));
      return Ada.Real_Time.Time_Of (SC, TS);
    end To_Real_Time;
  end Accurate_Time;
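
For what it's worth, a calling loop could then look something like
this (just a sketch; the unit name Sixty_Hz_Loop, the frame count,
and the "do the work" placeholder are mine, not part of the package
above):

  with Accurate_Time;
  use type Accurate_Time.Time;
  procedure Sixty_Hz_Loop is
    Period : constant Accurate_Time.Time_Span := 1.0 / 60;  -- exactly one tick
    Next   : Accurate_Time.Time := Accurate_Time.Clock;
  begin
    for Frame in 1 .. 600 loop        -- ten seconds' worth of 60 Hz frames
      Next := Next + Period;          -- exact tick arithmetic, no drift
      delay until Accurate_Time.To_Real_Time (Next);
      --  ... do one frame's worth of work here ...
    end loop;
  end Sixty_Hz_Loop;

Since Next is built by adding exact 1/60 ticks, any rounding happens
only once, inside To_Real_Time, and never accumulates.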

>However, it's quite possible that the real-time clock hardware on my
>platform (PC) uses something like nanoseconds, and the OS call just
>approximates that.

If it uses the usual 8253 descendant, that clock hardware ticks at
1,193,182 Hz (one third of the NTSC TV color-subcarrier crystal
frequency), which is about 0.838 microseconds per tick.
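
As a quick sanity check on those figures (the constants below are just
the numbers quoted above, nothing measured on real hardware):

  with Ada.Text_IO;
  procedure Pit_Tick_Check is
    PIT_Hz      : constant := 1_193_182.0;   -- 8253/8254 input clock, ~3.579545 MHz / 3
    Tick_Period : constant := 1.0 / PIT_Hz;  -- seconds per tick
  begin
    Ada.Text_IO.Put_Line
      ("microseconds per tick:" & Float'Image (Float (Tick_Period * 1.0E6)));
    --  about 8.38096E-01, i.e. roughly 0.838 microseconds
    Ada.Text_IO.Put_Line
      ("hardware ticks per 1/60 second:" & Float'Image (Float (PIT_Hz / 60.0)));
    --  about 1.98864E+04, so a 60 Hz period isn't a whole number of
    --  these hardware ticks either
  end Pit_Tick_Check;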