"Jeffrey R. Carter" wrote in message news:o628eh$2vi$1@dont-email.me...
> On 01/22/2017 12:37 PM, reinkor wrote:
>> As I understand from my textbook, type Duration is guaranteed to cover
>> the range -86_400 .. 86_400 (number of seconds for one day). Does this
>> mean that I take a considerable risk if I assume "Duration" can represent
>> more than one day?
>
> Representing that range with an accuracy of 20 ms takes about 22 bits, so
> most implementations will use at least 32 bits. The additional bits can be
> used for a greater range, a smaller accuracy, or both. ±86,400 with an
> accuracy of 20 µs takes about 32 bits, so a range of ±86,400 seems likely
> for some 32-bit compilers.

I believe that in Janus/Ada, we allowed +/- 2 days to make the math a bit
easier, but the limit is not wildly different from the textbook version. (I
think we ignored the recommendation for 20 µs.)

It would be odd for a non-64-bit compiler to use a 64-bit Duration, as
64-bit operations are relatively expensive on smaller machines.

                             Randy.
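A quick back-of-the-envelope check of the bit counts quoted above (Python is used here purely for the arithmetic; the 20 ms and 20 µs values for Duration'Small are taken from the post, not computed):

```python
import math

def magnitude_bits(range_seconds: float, small: float) -> float:
    """Bits needed to count the steps from 0 to range_seconds at a
    fixed-point resolution of `small` seconds.  A signed type needs
    roughly one more bit for the sign."""
    return math.log2(range_seconds / small)

# Range of one day (86_400 s) at 20 ms resolution: ~22 bits of
# magnitude, so a signed value fits comfortably in 32 bits.
coarse = magnitude_bits(86_400.0, 0.02)

# The same range at 20 us resolution: ~32 bits of magnitude, i.e.
# a signed value just overflows 32 bits, which is why a 32-bit
# Duration cannot have both the full range and the fine resolution.
fine = magnitude_bits(86_400.0, 20e-6)

print(round(coarse, 1))  # about 22
print(round(fine, 1))    # about 32
```

This matches the reasoning in the quoted text: a 32-bit implementation must trade range against resolution, which is why a guaranteed range much beyond ±86_400 should not be assumed portable.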