From: Niklas Holsti
Newsgroups: comp.lang.ada
Subject: Re: Ada.Execution_Time
Date: Sun, 19 Dec 2010 13:00:28 +0200
Organization: Tidorum Ltd
Message-ID: <8n66ucFnavU1@mid.individual.net>
References: <4d05e737$0$6980$9b4e6d93@newsspool4.arcor-online.net>

Vinzent Hoefler wrote:
> BrianG wrote:
>
>> If you mean that they both define a Clock and a Split, maybe. If you
>> mean any program that actually does anything, that's not possible.
>> That was my original comment: Execution_Time does not provide any
>> types/operations useful without also 'with'ing Real_Time.
>
> Yes, but so what? The intention of Ada.Execution_Time wasn't to
> provide the user with means to instrument the software and to Text_IO
> some mostly meaningless values (any decent profiler can do that for
> you), but rather a way to implement user-defined schedulers based on
> actual CPU usage.

That is also my understanding of the intention. Moreover, since task
scheduling for real-time systems unavoidably deals both with execution
times and with real times, I think it is natural that both
Ada.Execution_Time and Ada.Real_Time are required.
> And well, if you're ranting about CPU_Time, Real_Time.Time_Span is
> not much better. It's a pain in the ass to convert an
> Ada.Real_Time.Time_Span to another type to interface with OS-specific
> time types (like time_t) if you're opting for speed, portability and
> accuracy.

If your target type is OS-specific, it seems harsh to require full
portability of the conversion.

> BTW, has anyone any better ideas to convert a Time_Span into a record
> containing seconds and nanoseconds than this:

I may not have better ideas, but I do have some comments on your code.

>    function To_Interval (TS : in Ada.Real_Time.Time_Span)
>       return ASAAC_Types.TimeInterval is

The following are constants, independent of the parameter:

>       Nanos_Per_Sec : constant := 1_000_000_000.0;
>       One_Second    : constant Ada.Real_Time.Time_Span :=
>          Ada.Real_Time.Milliseconds (1000);

(Why not ... := Ada.Real_Time.Seconds (1) ?)

>       Max_Interval  : constant Ada.Real_Time.Time_Span :=
>          Integer (ASAAC_Types.Second'Last) * One_Second;

... so I would move the above declarations into the surrounding
package, at least for One_Second and Max_Interval. Of course a compiler
might do that optimization in the code, too. (By the way, Max_Interval
is a bit less than the largest value of TimeInterval, since the above
expression has no NSec part.)

>       Seconds      : ASAAC_Types.Second;
>       Nano_Seconds : ASAAC_Types.Nanosec;
>    begin
>       declare
>          Sub_Seconds : Ada.Real_Time.Time_Span;
>       begin

The following tests for ranges seem unavoidable in any conversion
between types defined by different sources. I don't see how
Ada.Real_Time can be blamed for this. Of course I cannot judge whether
the result (saturation at 'Last or 'First) is right for your
application. As you say, there are potential overflow problems, already
in the computation of Max_Interval above.
>          if TS >= Max_Interval then
>             Seconds      := ASAAC_Types.Second'Last;
>             Nano_Seconds := ASAAC_Types.Nanosec'Last;

An alternative approach to the over-range condition TS >= Max_Interval
is to make the definition of the application-specific type
ASAAC_Types.Second depend on the actual range of
Ada.Real_Time.Time_Span, so that over-range becomes impossible.
Unfortunately I don't see how this could be done portably by static
expressions in the declaration of ASAAC_Types.Second, so it would have
to be a subtype declaration with an upper bound of
To_Duration (Time_Span_Last) - 1.0. This raises Constraint_Error at
elaboration if the base type is too narrow.

>          elsif TS < Ada.Real_Time.Time_Span_Zero then
>             Seconds      := ASAAC_Types.Second'First;
>             Nano_Seconds := ASAAC_Types.Nanosec'First;

The above under-range test seems to be forced by the fact that
ASAAC_Types.TimeInterval is unable to represent negative time
intervals, while Ada.Real_Time.Time_Span can do that. This problem is
hardly a shortcoming in Ada.Real_Time.

>          else
>             Seconds      := ASAAC_Types.Second (TS / One_Second);
>             Sub_Seconds  := TS - (Integer (Seconds) * One_Second);
>             Nano_Seconds :=
>                ASAAC_Types.Nanosec
>                   (Nanos_Per_Sec * Ada.Real_Time.To_Duration (Sub_Seconds));

An alternative method converts the whole TS to Duration and then
extracts the seconds and nanoseconds:

   TS_Dur : Duration;

   TS_Dur  := To_Duration (TS);
   Seconds := ASAAC_Types.Second (TS_Dur - 0.5);
   Nano_Seconds := ASAAC_Types.Nanosec (
      Nanos_Per_Sec * (TS_Dur - Duration (Seconds)));

This, too, risks overflow in the multiplication, since the guaranteed
range of Duration only extends to 86_400. Moreover, using Duration may
lose precision (see below).
>          end if;
>       end;
>
>       return
>          ASAAC_Types.TimeInterval'(Sec  => Seconds,
>                                    NSec => Nano_Seconds);
>    end To_Interval;
>
> The solution I came up with here generally works, but suffers some
> potential overflow problems

I think they are unavoidable, unless you take care to make the range of
the application-defined types (ASAAC_Types) depend on the range of the
implementations of the standard types, and also do the multiplication
in some type with sufficient range, that you define.

> and doesn't look very efficient to me (although that's a minor
> problem given the context it's usually used in).

Apart from the definition of the constants (which can be moved out of
the function), and the range checks (which depend on the application
types in ASAAC_Types), the real conversion consists of a division, a
subtraction, two multiplications and one call of To_Duration. This does
not seem excessive to me, considering the nature of the target type.
The alternative method that starts by converting all of TS to Duration
avoids the division.

Still, this example suggests that Ada.Real_Time perhaps should provide
a Split operation that divides a Time_Span into an integer number of
seconds and a sub-second Duration.

A problem that you don't mention is that the use of Duration may cause
loss of precision. Duration'Small may be as large as 20 milliseconds
(RM 9.6(27)), although at most 100 microseconds are advised
(RM 9.6(30)), while the Time_Span resolution must be 20 microseconds or
better (RM D.8(30)). Perhaps Annex D should require better Duration
resolution?

Loss of precision could be avoided by doing the multiplication in
Time_Span instead of in Duration:

   Nano_Seconds := ASAAC_Types.Nanosec (
      To_Duration (Nanos_Per_Sec * Sub_Seconds));

but the overflow risk is perhaps larger, since Time_Span_Last is only
guaranteed to reach 3600 seconds (RM D.8(31)).

I have met similar tricky problems in conversions between types of
different origins in other contexts, too.
I don't think that these problems mean that Ada.Real_Time is defective.

-- 
Niklas Holsti
Tidorum Ltd

niklas holsti tidorum fi
      .      @      .