From mboxrd@z Thu Jan 1 00:00:00 1970
X-Spam-Checker-Version: SpamAssassin 3.4.4 (2020-01-24) on polar.synack.me
X-Spam-Level: 
X-Spam-Status: No, score=-1.3 required=5.0 tests=BAYES_00,INVALID_MSGID autolearn=no autolearn_force=no version=3.4.4
X-Google-Language: ENGLISH,ASCII-7-bit
X-Google-Thread: 103376,353ec4aa58e82326
X-Google-Attributes: gid103376,public
From: Matthew Heaney
Subject: Re: String to Integer conversion?
Date: 1998/08/02
Message-ID: #1/1
X-Deja-AN: 377153094
Sender: matt@mheaney.ni.net
References: <6ndi5d$uj3@gcsin3.geccs.gecm.com>
NNTP-Posting-Date: Sun, 02 Aug 1998 00:38:36 PDT
Newsgroups: comp.lang.ada
Date: 1998-08-02T00:00:00+00:00
List-Id: 

dewar@merv.cs.nyu.edu (Robert Dewar) writes:

> <<... instantiation, so that everyone (ie working programmers like me) can
> share the same one.
> >>
>
> Why?

Perhaps this is just an Ada 83 problem, because that version of Ada didn't
have the attributes Duration'Image and Duration'Value.  Without those
attributes, there is a real need for a Duration_IO.

It may indeed be the case that neophyte programmers in a classroom setting
don't require the use of Duration.  But they can't avoid using a delay
statement forever.

In my experience, the need to manipulate Duration strings is _very_ common.
One thing I do quite often is to translate an environment variable that
represents the value of an entry call delay, for example, or a timeout for
I/O to an external device.  This way I can tune delays and timeouts without
recompiling anything.
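
To make that concrete, here is a minimal sketch (Ada 95 or later, since Ada
83 lacks Duration'Value; the names Demo_Duration_Value and To_Timeout are
invented for the example, and fetching the string from the environment is
left to whatever OS binding is handy):

with Ada.Text_IO;

procedure Demo_Duration_Value is

   --  The "Duration_IO" in question: an instantiation of Text_IO.Fixed_IO
   package Duration_IO is new Ada.Text_IO.Fixed_IO (Duration);

   --  Convert a string (e.g. the value of an environment variable) into a
   --  Duration, falling back to a default if the string is malformed.
   function To_Timeout
     (Image   : String;
      Default : Duration) return Duration is
   begin
      return Duration'Value (Image);  --  Ada 95: 'Value covers all scalars
   exception
      when Constraint_Error =>
         return Default;
   end To_Timeout;

   Timeout : constant Duration := To_Timeout ("2.5", Default => 10.0);

begin
   --  Print the value back out using the Duration_IO instantiation.
   Duration_IO.Put (Timeout, Fore => 2, Aft => 3, Exp => 0);
   Ada.Text_IO.New_Line;
end Demo_Duration_Value;

The point is that Duration'Value (and 'Image) handle the conversion itself,
while an explicit Duration_IO instantiation is what you need in Ada 83, or
whenever you want formatted output with Fore/Aft control.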