comp.lang.ada
* Re: Duration vs. Ada.Real_Time
       [not found] ` <8mac6.9236$R5.473526@news1.frmt1.sfba.home.com>
@ 2001-01-26 15:30   ` Robert Dewar
  0 siblings, 0 replies; 29+ messages in thread
From: Robert Dewar @ 2001-01-26 15:30 UTC (permalink / raw)


In article <8mac6.9236$R5.473526@news1.frmt1.sfba.home.com>,
  tmoran@acm.org wrote:
> >I figured that using Duration could give imprecise result. So I wrote a
> >small program to print out Duration'Small.
>   Don't believe everything that a computer prints out.


Don't believe anything you read in CLA!

Of COURSE GNAT is correct when you ask it to print
Duration'Small, and I suggest believing it :-)

What is wrong is the incorrect idea that it has anything
to do with the resolution of the timer.
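
For reference, a minimal sketch of the kind of program the original
poster describes (untested, and the unit name is purely illustrative):

   with Ada.Text_IO;
   with Ada.Real_Time;
   procedure Show_Units is
      use Ada.Text_IO;
   begin
      --  A static property of the fixed point type, not of any timer:
      Put_Line ("Duration'Small      = " & Duration'Image (Duration'Small));
      --  Time_Unit is a named number (universal_real) in Ada.Real_Time:
      Put_Line ("Real_Time.Time_Unit = "
                & Long_Float'Image (Ada.Real_Time.Time_Unit));
   end Show_Units;

With GNAT 3.12 both values come out as 1.0E-09, which is exactly the
observation that started this thread; neither number says anything
about how often the underlying clock actually ticks.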


Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
       [not found] ` <3A71814B.7E8CCF60@acm.org>
@ 2001-01-26 15:33   ` Robert Dewar
  2001-01-26 20:58     ` Marin David Condic
  0 siblings, 1 reply; 29+ messages in thread
From: Robert Dewar @ 2001-01-26 15:33 UTC (permalink / raw)


In article <3A71814B.7E8CCF60@acm.org>,
  Marin David Condic <mcondic.auntie.spam@acm.org> wrote:
> So if you are concerned about portability, you'd use Duration
> only in applications where 20mSec of accuracy is sufficient.

This may be confusing: the question was not about the precision
of Duration, it was about the resolution of the timer.

The value of Duration'Small is only an upper bound on the
resolution; there is nothing in the RM that forbids an
implementation where the clock only updates every second (and
indeed, do not be surprised if some Unix implementations
are like this).
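
A two-minute experiment shows the difference (an illustrative sketch,
not anything from the original posts): spin on the runtime clock and
see how big the smallest observable step actually is.

   with Ada.Text_IO;
   with Ada.Calendar;  use Ada.Calendar;
   procedure Clock_Tick is
      T0 : Time := Clock;
      T1 : Time;
   begin
      loop                       --  align to a tick boundary first
         T1 := Clock;
         exit when T1 /= T0;
      end loop;
      T0 := T1;
      loop                       --  then measure one full tick
         T1 := Clock;
         exit when T1 /= T0;
      end loop;
      Ada.Text_IO.Put_Line
        ("Observed clock step of about" & Duration'Image (T1 - T0));
   end Clock_Tick;

The step you see is a property of the underlying clock source, not of
Duration.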


> For portability, your use of Real_Time should never assume accuracy
> greater than 20uSec. If you don't need portability, then you can rely
> on whatever the implementation gives you for Duration'Small etc.
>
> MDC
>
> Atle Røstad wrote:
>
> > Hi
> >
> > I have some code that has a max of 30 milliseconds to process, and
> > need to measure if this is possible. But the requirement for Duration
> > is that Duration'Small must be less than 20 milliseconds, and
> > Ada.Real_Time.Time_Unit must be less than 20 microseconds.
> >
> > I figured that using Duration could give imprecise result. So I wrote
> > a small program to print out Duration'Small.
> >
> > I'm using gnat 3.12 and printed out both Duration'Small and
> > Real_Time.Time_Unit and they were both 1.0E-09. I thought
> > Duration'Small would be larger than Real_Time.Time_Unit but they were
> > the same.
> >
> > Why should I use Real_Time when Duration has the same resolution?
> >
> > I will run the program on Solaris 8. How will this affect my time
> > measuring? What resolution can I expect?
> >
> > Thanks,
> > Atle
>
> --
> ======================================================================
> Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
> Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
> Visit my web site at:  http://www.mcondic.com/
>
>     "I'd trade it all for just a little more"
>         --  Charles Montgomery Burns, [4F10]
> ======================================================================
>
>


Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-26 15:33   ` Robert Dewar
@ 2001-01-26 20:58     ` Marin David Condic
  2001-01-26 21:32       ` Ted Dennison
  2001-01-28  0:13       ` Robert Dewar
  0 siblings, 2 replies; 29+ messages in thread
From: Marin David Condic @ 2001-01-26 20:58 UTC (permalink / raw)


Ah! We appear to have here a good illustration of the difference between
accuracy and precision. Duration'Small represents the precision with which
objects of type Duration can represent a time value. The LSB has to be no
more than 20mSec. However, if the clock has accuracy no better than 1Sec,
the LSBs are meaningless. The precision exceeds the accuracy of the actual
device. (My misunderstanding of the original question.)

I would think that a reasonable implementation of Ada for realtime
systems would want to insure that the precision had some relationship to
the accuracy of the clock available. Obviously, for workstations/PCs
with a non-realtime OS, etc., you can't exactly insist that the platform
be changed for the language. But stating, for example, that the
precision of a time representation can go down to atto-seconds can
easily mislead one to believe that the measurement of time is going to
be something close to that precision.

Clearly, one needs to know two things: What is the accuracy of the
clock(s) available to me through the hardware/OS? (Talk to the hardware
manufacturer.) And what is the relationship between my hardware and the
Ada implementation I have? (Talk to the compiler vendor.) Id est, does
it do me any good to say "delay 0.020;" with a given compiler and target
system?
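
One crude but effective way to answer that last question is simply to
measure it on the target. A sketch (illustrative only, not anyone's
production code):

   with Ada.Text_IO;
   with Ada.Real_Time;  use Ada.Real_Time;
   procedure Check_Delay is
      Start : constant Time := Clock;
      Stop  : Time;
   begin
      delay 0.020;
      Stop := Clock;
      Ada.Text_IO.Put_Line
        ("Requested 0.020, got"
         & Duration'Image (To_Duration (Stop - Start)));
   end Check_Delay;

Run it a few times; the spread between requested and measured delay
tells you what the compiler, OS and clock together actually deliver.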

MDC

Robert Dewar wrote:

> This may be confusing: the question was not about the precision
> of Duration, it was about the resolution of the timer.
>
> The value of Duration'Small is only an upper bound on the
> resolution; there is nothing in the RM that forbids an
> implementation where the clock only updates every second (and
> indeed, do not be surprised if some Unix implementations
> are like this).

--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

    "I'd trade it all for just a little more"
        --  Charles Montgomery Burns, [4F10]
======================================================================





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-26 20:58     ` Marin David Condic
@ 2001-01-26 21:32       ` Ted Dennison
  2001-01-27  5:01         ` Keith Thompson
  2001-01-27 14:34         ` Marin David Condic
  2001-01-28  0:13       ` Robert Dewar
  1 sibling, 2 replies; 29+ messages in thread
From: Ted Dennison @ 2001-01-26 21:32 UTC (permalink / raw)


In article <3A71E4F6.6D7015AD@acm.org>,
  Marin David Condic <mcondic.auntie.spam@acm.org> wrote:

> I would think that a reasonable implementation of Ada for realtime
> systems would want to insure that the precision had some relationship
> to the accuracy of the clock available. Obviously, for
Do you mean "accuracy" or "frequency"?
...
> be changed for the language. But stating, for example, that the
> precision of a time representation can go down to atto-seconds can
> easily mislead one to believe that the measurement of time is going to
> be something close to that precision.

This gets back around to the issue we were discussing a couple of weeks
ago. If the language requires units of seconds, but the system's clock
used some incompatible system like Hz (e.g. 60 Hz, which can't be
represented exactly in terms of fixed-point seconds), then you (or at
least *I*) would actually prefer a much higher precision, so that your
error in representation isn't so great when you start to do math with it.
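
To put a number on that (an illustrative sketch, not part of the
original post): see how far 1/60 of a second lands from the nearest
representable Duration value on a given compiler.

   with Ada.Text_IO;  use Ada.Text_IO;
   procedure Sixty_Hz_Error is
      --  Long_Float is itself only an approximation of 1/60, but a much
      --  finer one than a coarse fixed point type would give.
      Exact : constant Long_Float := 1.0 / 60.0;
      As_D  : constant Duration   := Duration (1.0 / 60.0);
   begin
      Put_Line ("Representation error ="
                & Long_Float'Image (abs (Long_Float (As_D) - Exact)));
   end Sixty_Hz_Error;

The smaller Duration'Small is, the smaller that error, and the less it
accumulates when you start adding thousands of frame times together.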

--
T.E.D.

http://www.telepath.com/~dennison/Ted/TED.html


Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-26 21:32       ` Ted Dennison
@ 2001-01-27  5:01         ` Keith Thompson
  2001-01-27 14:40           ` Marin David Condic
  2001-01-27 14:34         ` Marin David Condic
  1 sibling, 1 reply; 29+ messages in thread
From: Keith Thompson @ 2001-01-27  5:01 UTC (permalink / raw)


Ted Dennison <dennison@telepath.com> writes:
[...]	
> This gets back around to the issue we were discussing a couple of weeks
> ago. If the language requires units of seconds, but the system's clock
> used some incompatible system like Hz (e.g. 60 Hz, which can't be
> represented exactly in terms of fixed-point seconds), then you (or at
> least *I*) would actually prefer a much higher precision, so that your
> error in representation isn't so great when you start to do math with it.

An implementation could easily declare type Duration in such a way
that 1.0/60.0 is exactly representable.

On the other hand, you probably don't want Duration to have that kind
of dependency on a particular system.
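
The same trick is available at the user level if the implementation's
Duration doesn't oblige. A hypothetical type (sketch only, names made
up):

   package Frame_Time is
      --  'Small is exactly one sixtieth of a second, so 1.0/60.0 is
      --  representable with no rounding error at all.
      type Tick_60Hz is delta 1.0 / 60.0 range -86_400.0 .. 86_400.0;
      for Tick_60Hz'Small use 1.0 / 60.0;
   end Frame_Time;

Nothing requires a specified 'Small to be a power of two or ten; a
vendor could do exactly the same thing for Duration itself.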

-- 
Keith Thompson (The_Other_Keith) kst@cts.com  <http://www.ghoti.net/~kst>
San Diego Supercomputer Center           <*>  <http://www.sdsc.edu/~kst>
MAKE MONEY FAST!!  DON'T FEED IT!!



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-26 21:32       ` Ted Dennison
  2001-01-27  5:01         ` Keith Thompson
@ 2001-01-27 14:34         ` Marin David Condic
  2001-01-28  0:18           ` Robert Dewar
  2001-01-29 14:54           ` Ted Dennison
  1 sibling, 2 replies; 29+ messages in thread
From: Marin David Condic @ 2001-01-27 14:34 UTC (permalink / raw)


Ted Dennison wrote:

> In article <3A71E4F6.6D7015AD@acm.org>,
>   Marin David Condic <mcondic.auntie.spam@acm.org> wrote:
>
> > I would think that a reasonable implementation of Ada for realtime
> > systems would want to insure that the precision had some relationship
> > to the accuracy of the clock available. Obviously, for
> Do you mean "accuracy" or "frequency"?

O.K. Pick nits. :-) I meant "frequency" but this is in a way related to
"accuracy". If my micrometer is marked off in thousandths of an inch, it does
me no good to try to measure ten-thousandths of an inch, so in manufacturing
parts, I can't be any more accurate than to a thousandth of an inch. My
micrometer may have been bounced off the milling machine a few times too many
and may actually not be "accurate" in measuring a 1" dimension.

>
> ...
> > be changed for the language. But stating, for example, that the
> > precision of a time representation can go down to atto-seconds can
> > easily mislead one to believe that the measurement of time is going to
> > be something close to that precision.
>
> This gets back around to the issue we were discussing a couple of weeks
> ago. If the language requires units of seconds, but the system's clock
> used some incompatible system like Hz (e.g. 60 Hz, which can't be
> represented exactly in terms of fixed-point seconds), then you (or at
> least *I*) would actually prefer a much higher precision, so that your
> error in representation isn't so great when you start to do math with it.

Well, you do want some precision beyond the smallest unit of the actual clock
time if you have this sort of situation. But most of the clocks I've seen are
some version of a scaled integer, thus allowing Duration'Small to be whatever
it wants to be as long as the LSB is under 20mSec. Am I incorrect in this?

MDC
--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

    "I'd trade it all for just a little more"
        --  Charles Montgomery Burns, [4F10]
======================================================================





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-27  5:01         ` Keith Thompson
@ 2001-01-27 14:40           ` Marin David Condic
  0 siblings, 0 replies; 29+ messages in thread
From: Marin David Condic @ 2001-01-27 14:40 UTC (permalink / raw)


Keith Thompson wrote:

> An implementation could easily declare type Duration in such a way
> that 1.0/60.0 is exactly representable.
>
> On the other hand, you probably don't want Duration to have that kind
> of dependency on a particular system.

Why not? For a specific target that is going to have a specific clock used for
timing of delays, etc., Duration can be an exact line-up with that clock. If
the implementation goes to another target, you'd have to change the declaration
of Duration to match its clock. What would be wrong with dependency on that
sort of system? (Presumption being that in both cases, the clock has a
granularity less than 20mSec so it is within the standard.)

This is assuming we are talking about realtime compilers as opposed to
something that targets, say, Unix & the time source is of questionable value.

MDC
--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

    "I'd trade it all for just a little more"
        --  Charles Montgomery Burns, [4F10]
======================================================================





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-26 20:58     ` Marin David Condic
  2001-01-26 21:32       ` Ted Dennison
@ 2001-01-28  0:13       ` Robert Dewar
  2001-01-29 14:02         ` Marin David Condic
  2001-01-30 14:33         ` Stephen Leake
  1 sibling, 2 replies; 29+ messages in thread
From: Robert Dewar @ 2001-01-28  0:13 UTC (permalink / raw)


In article <3A71E4F6.6D7015AD@acm.org>,
  Marin David Condic <mcondic.auntie.spam@acm.org> wrote:

> I would think that a reasonable implementation of Ada for
> realtime systems would want to insure that the precision had
> some relationship to the accuracy of the clock available.

Why do you think that?

The RM contains no encouragement for this thought!

Duration is used for other things besides the delay statement.

And also, programs do various calculations with Duration, if
there is more precision, these calculations are more accurate.


Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-27 14:34         ` Marin David Condic
@ 2001-01-28  0:18           ` Robert Dewar
  2001-01-29 14:54           ` Ted Dennison
  1 sibling, 0 replies; 29+ messages in thread
From: Robert Dewar @ 2001-01-28  0:18 UTC (permalink / raw)


In article <3A72DC5E.4C1CE092@acm.org>,
  Marin David Condic <mcondic.auntie.spam@acm.org> wrote:

> I can't be any more accurate than to a thousandth of an inch.
> My micrometer may have been bounced off the milling machine a
> few times too many and may actually not be "accurate" in
> measuring a 1" dimension.

You are assuming that type Duration is solely for use by the
delay statement or for use in conjunction with values read
from the clock (these may of course have totally unrelated
low order accuracy), but this is just not true.

It would clearly be MUCH less useful if Duration were target
dependent and had only a precision corresponding to the delay
accuracy. No one could really sensibly prefer this!


Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
       [not found] <980495512.529981@edh3>
       [not found] ` <8mac6.9236$R5.473526@news1.frmt1.sfba.home.com>
       [not found] ` <3A71814B.7E8CCF60@acm.org>
@ 2001-01-28 19:32 ` Simon Wright
  2001-01-31  6:13   ` Robert Dewar
  2 siblings, 1 reply; 29+ messages in thread
From: Simon Wright @ 2001-01-28 19:32 UTC (permalink / raw)


"Atle R�stad" <aer@edh.ericsson.se> writes:

> Hi
> 
> I have some code that has a max of 30 milliseconds to process, and need to
> measure if this is possible. But the requirement for Duration is that
> Duration'Small must be less than 20 milliseconds, and
> Ada.Real_Time.Time_Unit must be less than 20 microseconds.
[...]
> I will run the program on Solaris 8. How will this affect my time
> measuring? What resolution can I expect?

(1) On Solaris (>=2.6 or so, I think) the default operating system
    tick is 10 mS.

(2) GNAT uses nanosleep() to implement delay (pretty sure of this).

(3) nanosleep() sleeps for *at least* the time you specify, rounded up
    to an integral number of ticks.

(4) So, if you "delay 0.000_001;", you'll delay for at least 10 mS and
    up to 20 mS.

However, you can change the OS tick to 1 mS by writing

  set hires_tick 1

in /etc/system (I may have the exact grammar wrong, mail me at work
for the details). This will give you a maximum repetition frequency of
500 Hz.

NB, that 1 is a boolean meaning 'true', _not_ the number of milliseconds!

Running as root gives you real-time dispatching but doesn't change the
behaviour of nanosleep().

Linux is similar, though there's no easy way to change the tick rate
(you can edit /usr/include/asm/param.h, or figure a way to redefine
HZ, and rebuild the kernel; worked fine for us, but YMMV, since the
library won't match). And don't try delays < 2 mS as root, they're
executed as busy loops.

-- 
Simon Wright                       Work Email: simon.j.wright@amsjv.com
Alenia Marconi Systems                        Voice: +44(0)23-9270-1778
Integrated Systems Division                     FAX: +44(0)23-9270-1800



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-28  0:13       ` Robert Dewar
@ 2001-01-29 14:02         ` Marin David Condic
  2001-01-30 14:33         ` Stephen Leake
  1 sibling, 0 replies; 29+ messages in thread
From: Marin David Condic @ 2001-01-29 14:02 UTC (permalink / raw)


Robert Dewar wrote:

> In article <3A71E4F6.6D7015AD@acm.org>,
>   Marin David Condic <mcondic.auntie.spam@acm.org> wrote:
>
> > I would think that a reasonable implementation of Ada for
> > realtime systems would want to insure that the precision had
> > some relationship to the accuracy of the clock available.
>
> Why do you think that?
>

Because as a real-time programmer, I feel like it. :-) And because it
would be reasonably useful to me if Duration were a scaled integer that
just so happened to match exactly the scaled integer representation of
the time source I had available to me.

>
> The RM contains no encouragement for this thought!
>

And that matters to me why? Just because the ARM didn't make a
pronouncement on it doesn't stop it from being useful to me. And if you
are selling me a product and I express my desires for what that product
should do, and you appeal to the ARM to tell me my desires are not
reasonable, would you be surprised when I pick a different compiler?

>
> Duration is used for other things besides the delay statement.
>

O.K. And I suppose I could probably use Duration for a thousand things
that had nothing whatsoever to do with time. If Duration were the same
size as Long_Long_Float, I might use it to do matrix multiplication -
just because it was there and handy.

I don't think that would change my position on why I'd like the type to
correspond (as much as is possible) to whatever time source I have
available.


>
> And also, programs do various calculations with Duration, if
> there is more precision, these calculations are more accurate.
>

That may very well be true depending on the implementation. I don't
think that changes anything for me. I'd bet a nickel that if you looked
at all the math ops performed on anything of type Duration, you'd find
that most of them were adds, the next level would be subtracts and all
other math ops would fall so far into the weeds as to be of little
concern. Yes, I know you can come up with cases where this may not hold.
In practice, I just don't think it would come up that often.

If Duration is a fixed point type with a delta that lines up to the LSB
of my time source (and I have never seen a hardware source that gave me
time as a floating point number - maybe they exist - I've never seen
one.) then it would seem to me that I am computing at the resolution of
my ability to measure. Given that (for most real time systems) I'm 99%
of the time going to be adding and subtracting, I don't think extra low
order bits are going to help me out here much. If I *really* needed to
do some sort of complex math with a bunch of Durations where I thought
that rounding errors might accumulate & become a problem, I guess I'd
convert them to a Long_Long_Float (or Long_Long_Long_Float? :-), do the
math, then convert back. I find that doing any sort of complex math with
scaled integers is a good way to inject errors into the code anyway.
Maybe you've got to do that because of hardware, but it isn't what I'd
prefer.

MDC
--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

    "I'd trade it all for just a little more"
        --  Charles Montgomery Burns, [4F10]
======================================================================





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-27 14:34         ` Marin David Condic
  2001-01-28  0:18           ` Robert Dewar
@ 2001-01-29 14:54           ` Ted Dennison
  2001-01-29 18:40             ` Marin David Condic
  1 sibling, 1 reply; 29+ messages in thread
From: Ted Dennison @ 2001-01-29 14:54 UTC (permalink / raw)


In article <3A72DC5E.4C1CE092@acm.org>,
  Marin David Condic <mcondic.auntie.spam@acm.org> wrote:

> O.K. Pick nits. :-) I meant "frequency" but this is in a way related
> to "accuracy". If my micrometer is marked off in thousandths of an
> inch, it does me no good to try to measure ten-thousandths of an inch,
> so in manufacturing parts, I can't be any more accurate than to a
> thousandth of an inch. My micrometer may have been bounced off the
> milling machine a few times too many and may actually not be
> "accurate" in measuring a 1" dimension.

Hmmm. Now it sounds like you are talking about frequency vs.
"resolution". :-)  When you say "accuracy", I think of things like clock
drift. I'm currently dealing with networked realtime systems. Thus I've
had problems associated with both the clock frequencies and with the
clock accuracy (clock drift between two machines that are trying to
operate in lock-step).

> Well, you do want some precision beyond the smallest unit of the
> actual clock time if you have this sort of situation. But most of the
> clocks I've seen are some version of a scaled integer, thus allowing
> Duration'Small to be whatever it wants to be as long as the LSB is
> under 20mSec. Am I incorrect in this?

GreenHills on vxWorks (x86 at least) uses a record type. The units of
the smallest field are in microseconds. Thus if the frequency divides
evenly into micros, you're OK.

However, I don't see how an Ada vendor could arrive at a good number
ahead of time. Your best bet is probably to just use some ridiculously
high resolution like Green Hills did. The frequency on vxWorks is
something that can be changed by programs on the fly. Our (Ada) system
reads the requested frequency from a configuration file and sets it at
startup. The default frequency is 60Hz. We have one system that uses the
default, one that sets it at 240Hz, and one that sets it at 1,000Hz.
With PC's getting faster all the time, I wouldn't be shocked to see
folks wanting to use multiple KHz. Microseconds might even be too coarse
by the end of the decade!

--
T.E.D.

http://www.telepath.com/~dennison/Ted/TED.html


Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-29 14:54           ` Ted Dennison
@ 2001-01-29 18:40             ` Marin David Condic
  2001-02-08  3:32               ` Buz Cory
  0 siblings, 1 reply; 29+ messages in thread
From: Marin David Condic @ 2001-01-29 18:40 UTC (permalink / raw)


Ted Dennison wrote:

> Hmmm. Now it sounds like you are talking about frequency vs.
> "resolution". :-)  When you say "accuracy", I think of things like clock
> drift. I'm currently dealing with networked realtime systems. Thus I've
> had problems associated with both the clock frequencies and with the
> clock accuracy (clock drift between two machines that are trying to
> operate in lock-step).
>

Yes, drift and all that. I think we're really splitting hairs on this when I
think basically, we're talking the same thing. There are lots of ways a
clock can be inaccurate, right?

BTW: I've asked before when I was dealing with that same situation - two
machines that want to operate in lockstep. Do you know of any sort of
formal, published writings on algorithms to do this? We had a 1.024mSec
interrupt on both sides - no clock so no "delay until" - and some hardware
signalling between the two. We came up with ways of doing the sync, but I
kept thinking in the back of my mind that there were better/faster/easier
ways of doing it. Not horribly important now since I'm not dealing with
that same problem,
but it has always bothered me enough to still want to read some books/papers
on the subject.

> GreenHills on vxWorks (x86 at least) uses a record type. The units of
> the smallest field are in microseconds. Thus if the frequency divides
> evenly into micros, you're OK.
>

That may be the internal representation they keep of the current time.
Somewhere there is some underlying hardware register or port or memory
address or something that, when interrogated, gives you a collection of
bits that represents time from some epoch as either a) a scaled integer
or b) a floating point number (never seen a floating point one - maybe
you have?). If it is a scaled integer (LSB represents one fortnight?),
then there is an exact representation as a fixed point type.
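
As a sketch of that idea (the package name and the 1uSec LSB are
hypothetical, and reading the raw count from the hardware is left out),
the mapping is just a fixed point type whose 'Small equals the
counter's LSB:

   with Interfaces;
   package HW_Clock is
      --  Hypothetical timer whose raw count has an LSB of one microsecond.
      type HW_Time is delta 1.0E-6 range 0.0 .. 4_295.0;
      for HW_Time'Small use 1.0E-6;

      function To_Time (Raw : Interfaces.Unsigned_32) return HW_Time;
   end HW_Clock;

   package body HW_Clock is
      function To_Time (Raw : Interfaces.Unsigned_32) return HW_Time is
      begin
         --  Exact: a 64-bit float carries 53 mantissa bits, and the result
         --  is by construction a whole multiple of HW_Time'Small.
         return HW_Time (Long_Float (Raw) * 1.0E-6);
      end To_Time;
   end HW_Clock;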

>
> However, I don't see how an Ada vendor could arrive at a good number
> ahead of time. Your best bet is probably to just use some ridiculously
> high resolution like Green Hills did. The frequency on vxWorks is
> something that can be changed by programs on the fly. Our (Ada) system

Why does the Ada vendor have to determine this "ahead of time"? What do you
mean? Let's say you are targeting board X, which is either going to have some
standard clock fixture that always goes with boards of type X, or you are
going to have some custom one-off board where maybe the processor is common,
but everything else is rather unusual. If it is case one, the Ada
implementation uses that standard issue clock that is always on boards of
type X. (Seen that before with 1750a - it had two standard clocks - take
your pick.) If it is case two, you're going to have to provide some low
level packages that can be tailored to identify the time source, etc. and be
recompiled for that target. (Couldn't you have type Duration end-user
configurable? How else would you retarget?)

Sure the type two situation almost certainly blows your validation, but so
what? The vendor validates the compiler on some specific board with some
specific clock that supports the 0.020Sec requirement and that's that. The
end user has a similar board but a different clock? Hey! Once you break the
sticker on the case that says "Caution: No user serviceable parts inside" -
the validation is null and void. Does anybody care? (Maybe the Military did
at one point in time - but if they don't require Ada, I can't see how they
could require *validated* Ada. Just rename the language and scratch out the
requirements you don't support, right? :-) Call it Bda?



>
> reads the requested frequency from a configuration file and sets it at
> startup. The default frequency is 60Hz. We have one system that uses the
> default, one that sets it at 240Hz, and one that sets it at 1,000Hz.
> With PC's getting faster all the time, I wouldn't be shocked to see
> folks wanting to use multiple KHz. Microseconds might even be too coarse
> by the end of the decade!

Well, O.K., here you are describing a situation where you are getting time
from a particular OS and you can't do much to change that. This is the same
as if you were getting it from Unix and you can't do anything about the
accuracy. I'd put this in the "impractical to have Duration reflect the
clock" category.

It's a little different if you are trying to implement your own vxWorks in
Ada and have direct contact with the hardware.

MDC
--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

    "I'd trade it all for just a little more"
        --  Charles Montgomery Burns, [4F10]
======================================================================





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-28  0:13       ` Robert Dewar
  2001-01-29 14:02         ` Marin David Condic
@ 2001-01-30 14:33         ` Stephen Leake
  2001-01-31 14:55           ` Marin David Condic
  2001-01-31 16:03           ` Ted Dennison
  1 sibling, 2 replies; 29+ messages in thread
From: Stephen Leake @ 2001-01-30 14:33 UTC (permalink / raw)


Robert Dewar <robert_dewar@my-deja.com> writes:

> In article <3A71E4F6.6D7015AD@acm.org>,
>   Marin David Condic <mcondic.auntie.spam@acm.org> wrote:
> 
> > I would think that a reasonable implementation of Ada for
> > realtime systems would want to insure that the precision had
> > some relationship to the accuracy of the clock available.
> 
> Why do you think that?
> 
> The RM contains no encouragement for this thought!
> 
> Duration is used for other things besides the delay statement.

Perhaps it would be more appropriate for Marin to request that type
Ada.Real_Time.Time_Span (LRM appendix D) match the hardware clock;
that is more clearly intended to be a "hard real-time" clock type.

> And also, programs do various calculations with Duration, if there
> is more precision, these calculations are more accurate.

I agree with Marin here; convert to a type that you know or control
the precision of to do computations.
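
For instance (an illustrative sketch, not from any of the posts, and
the package, type and function names are made up): do the arithmetic
in a type whose precision you have chosen yourself, and convert back
to Duration only at the end.

   package Duration_Math is
      type Seconds is digits 15;   --  precision under our control
      function Average (A, B, C : Duration) return Duration;
   end Duration_Math;

   package body Duration_Math is
      function Average (A, B, C : Duration) return Duration is
         Sum : constant Seconds := Seconds (A) + Seconds (B) + Seconds (C);
      begin
         return Duration (Sum / 3.0);
      end Average;
   end Duration_Math;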

-- 
-- Stephe



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
@ 2001-01-31  5:51 Christoph Grein
  2001-02-01  6:27 ` Simon Wright
  0 siblings, 1 reply; 29+ messages in thread
From: Christoph Grein @ 2001-01-31  5:51 UTC (permalink / raw)
  To: comp.lang.ada

> (1) On Solaris (>=2.6 or so, I think) the default operating system
>    tick is 10 mS.

What do you mean by measuring times in millisiemens (mS)?

        10 mS = 10E-3 S = 10E-3 / Ohm,

which definitely does not have the unit of time.

Please see the new thread "Wrong SI unit specification" ;>)





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-28 19:32 ` Simon Wright
@ 2001-01-31  6:13   ` Robert Dewar
  2001-01-31 15:07     ` Marin David Condic
  0 siblings, 1 reply; 29+ messages in thread
From: Robert Dewar @ 2001-01-31  6:13 UTC (permalink / raw)


In article <x7vd7d7bhgt.fsf@smaug.pushface.org>,
  Simon Wright <simon@pushface.org> wrote:

> However, you can change the OS tick to 1 mS by writing
>
>   set hires_tick 1
>
> in /etc/system

Do you suppose that those who favor Duration'Small matching
the tick would want this OS command to magically change the
value of this static constant retroactively? :-)



Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-30 14:33         ` Stephen Leake
@ 2001-01-31 14:55           ` Marin David Condic
  2001-01-31 16:03           ` Ted Dennison
  1 sibling, 0 replies; 29+ messages in thread
From: Marin David Condic @ 2001-01-31 14:55 UTC (permalink / raw)


Stephen Leake wrote:

> Perhaps it would be more appropriate for Marin to request that type
> Ada.Real_Time.Time_Span (LRM appendix D) match the hardware clock;
> that is more clearly intended to be a "hard real-time" clock type.
>

Oh, I'm not really fussy about how you get there. That might work
reasonably well. And I certainly don't insist that *every*
implementation of Ada go off and do it my way. My point is that when one
is working with an embedded compiler for a specific target that has a
real time clock, one just kind of expects Duration to have some
relationship to that clock. If you do delays, you kind of expect to be
able to get a delay approximately as good as the resolution of the clock
- no more - no less.

An analogy would be, for example, someone implementing the type
Character as a 32 bit word. (Don't know if this is *legal*, but suppose
that it was?) Yes, a 32 bit word will hold a single ASCII character just
fine and provided the compiler is consistent in this, it will mostly be
invisible to the programmer. But if someone declares an object of type
Character (or array of them) one rather expects to allocate and use a
single byte. Anything else would be regarded as kind of silly. For most
applications, it may not matter, but for embedded work, it could be
quite important because you are so close to the hardware.


>
> > And also, programs do various calculations with Duration, if there
> > is more precision, these calculations are more accurate.
>
> I agree with Marin here; convert to a type that you know or control
> the precision of to do computations.

Thanks. :-)

MDC
--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

    "I'd trade it all for just a little more"
        --  Charles Montgomery Burns, [4F10]
======================================================================





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-31  6:13   ` Robert Dewar
@ 2001-01-31 15:07     ` Marin David Condic
  2001-02-01  5:43       ` Robert Dewar
  0 siblings, 1 reply; 29+ messages in thread
From: Marin David Condic @ 2001-01-31 15:07 UTC (permalink / raw)


Robert Dewar wrote:

> Do you suppose that those who favor Duration'Small matching
> the tick would want this OS command to magically change the
> value of this static constant retroactively? :-)

As I said somewhere back in this thread, I'd only advocate it where it
was practical and useful. When you're going through an OS, you have to
make allowances for what the OS is doing for you. And again, it is only
important for realtime and/or embedded work where timing of things is
pretty critical. Someone selling/using a platform application compiler
that is being used for payroll processing is probably only going to use
delays of a granularity of a second or so just for user interaction
things. They don't really care if it's off by a few microseconds. Do
whatever you like.

What I *might* advocate here is that the 'Small be set to something
representing the smallest possible unit that the OS will handle - or
possibly just a little more accuracy than that to allow for rounding. I
just think it would be misleading if a compiler says that 'Small is
going to be 0.000_000_000_000_000_001 and this has no relationship
whatsoever to what the actual hardware/OS is really capable of.

MDC
--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

    "I'd trade it all for just a little more"
        --  Charles Montgomery Burns, [4F10]
======================================================================





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-30 14:33         ` Stephen Leake
  2001-01-31 14:55           ` Marin David Condic
@ 2001-01-31 16:03           ` Ted Dennison
  2001-01-31 19:16             ` Marin David Condic
  1 sibling, 1 reply; 29+ messages in thread
From: Ted Dennison @ 2001-01-31 16:03 UTC (permalink / raw)


In article <uk87d9kjh.fsf@gsfc.nasa.gov>,
  Stephen Leake <stephen.a.leake.1@gsfc.nasa.gov> wrote:

> Perhaps it would be more appropriate for Marin to request that type
> Ada.Real_Time.Time_Span (LRM appendix D) match the hardware clock;
> that is more clearly intended to be a "hard real-time" clock type.

OK. I've tried to just say "it's not that simple". Since no one seems to
be accepting that, let's look at this suggestion in detail. Let's
postulate a PC system, since that's the most common one in use.

The PC system has 3 built-in timing methods. The first is the
"Real-Time" clock, which is usually used to drive OS scheduling
operations. This is essentially the 8253 timer chip, which divides
4.77272MHz signals from the processor (I'm not sure if this is standard,
or varies depending on the processor or bus speed. I'm also not sure if
it's exact, or an approximation) into some convenient number of RTC
interrupts (like by 65536 to produce interrupts at 18.20648Hz).
However, it is possible for software to change this amount. Thus, if an
Ada compiler were to use "the units of the clock" for this clock, it
would have to use either some huge number in units of 4.77272 x 10^-6
(or whatever the particular system uses for its base), or it would have
to somehow dynamically pick the units that the RTC happens to be spewing
out at the moment (and be ready to change if software changes it).
However, the OS most likely uses its own approximation for these units,
so the compiler would have to approximate whenever passing control to
the OS for actual scheduling.

There is also a "time of day" clock that just keeps a counter in memory.
For most Ada compilers this clock will only be accessed through OS
system calls. Whatever units the OS uses would be OK here.

There is also a high-frequency timer. This also just keeps a counter in
memory. However in this case it is the CPU doing it, and the counter
increments by one each time the CPU cycles. Thus its rate is dependent
on the clock rate of the CPU. It is possible to set up interrupts when
the counter reaches a (software-set) value. Thus this could be used, but
the Ada compiler would then have to pick units that match the processor
clock speed (which would probably not match any of the above).

Short answer: it's not that simple.

--
T.E.D.

http://www.telepath.com/~dennison/Ted/TED.html


Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-31 16:03           ` Ted Dennison
@ 2001-01-31 19:16             ` Marin David Condic
  2001-01-31 20:53               ` Ted Dennison
  0 siblings, 1 reply; 29+ messages in thread
From: Marin David Condic @ 2001-01-31 19:16 UTC (permalink / raw)


You are right. It is "not that simple" - certainly for a number of possible
targets. I think my only point here was that there are some hardware
configurations where you *do* have a time source that *could* be reflected
by Duration and when that is the case, that's probably what you want to do.

In your example you have three possible time sources. Which would you pick
for dealing with Duration? Which would you pick for Ada.Real_Time.Time_Span?
Maybe for this particular hardware configuration you make Duration have some
arbitrarily large precision and move on. Maybe where you have a gate array
or some on-board register or some similar configuration wherein you can
interrogate the time source & get a scaled integer with an LSB of - say -
1uSec, then Duration ought to reflect this.

I'm not saying all Ada compilers everywhere should do this for every target
- just where it makes some sense because of what the hardware or OS provides
you. It might be possible to make it user configurable (select time source
and select precision, then rebuild so the compiler gives you what you want.)
For embedded and realtime applications, I think this is a useful feature.

MDC

Ted Dennison wrote:

> In article <uk87d9kjh.fsf@gsfc.nasa.gov>,
>   Stephen Leake <stephen.a.leake.1@gsfc.nasa.gov> wrote:
>
> > Perhaps it would be more appropriate for Marin to request that type
> > Ada.Real_Time.Time_Span (LRM appendix D) match the hardware clock;
> > that is more clearly intended to be a "hard real-time" clock type.
>
> OK. I've tried to just say "it's not that simple". Since no one seems to
> be accepting that, let's look at this suggestion in detail. Let's
> postulate a PC system, since that's the most common one in use.
>
> The PC system has 3 built-in timing methods. The first is the
> "Real-Time" clock, which is usually used to drive OS scheduling
> operations. This is essentially the 8253 timer chip, which divides
> 4.77272MHz signals from the processor (I'm not sure if this is standard,
> or varies depending on the processor or bus speed. I'm also not sure if
> it's exact, or an approximation) into some convenient number of RTC
> interrupts (like by 65536 to produce interrupts at 18.20648Hz).
> However, it is possible for software to change this amount. Thus, if an
> Ada compiler were to use "the units of the clock" for this clock, it
> would have to use either some huge number in units of 4.77272 x 10^-6
> (or whatever the particular system uses for its base), or it would have
> to somehow dynamically pick the units that the RTC happens to be spewing
> out at the moment (and be ready to change if software changes it).
> However, the OS most likely uses its own approximation for these units,
> so the compiler would have to approximate whenever passing control to
> the OS for actual scheduling.
>
> There is also a "time of day" clock that just keeps a counter in memory.
> For most Ada compilers this clock will only be accessed through OS
> system calls. Whatever units the OS uses would be OK here.
>
> There is also a high-frequency timer. This also just keeps a counter in
> memory. However in this case it is the CPU doing it, and the counter
> increments by one each time the CPU cycles. Thus its rate is dependent
> on the clock rate of the CPU. It is possible to set up interrupts when
> the counter reaches a (software-set) value. Thus this could be used, but
> the Ada compiler would then have to pick units that match the processor
> clock speed (which would probably not match any of the above).
>
> Short answer: it's not that simple.
>
> --
> T.E.D.
>
> http://www.telepath.com/~dennison/Ted/TED.html
>
> Sent via Deja.com
> http://www.deja.com/

--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

    "I'd trade it all for just a little more"
        --  Charles Montgomery Burns, [4F10]
======================================================================





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-31 19:16             ` Marin David Condic
@ 2001-01-31 20:53               ` Ted Dennison
  2001-01-31 21:30                 ` tmoran
  2001-01-31 21:47                 ` Marin David Condic
  0 siblings, 2 replies; 29+ messages in thread
From: Ted Dennison @ 2001-01-31 20:53 UTC (permalink / raw)


In article <3A786497.1C791722@acm.org>,
  Marin David Condic <mcondic.auntie.spam@acm.org> wrote:

> In your example you have three possible time sources. Which would you
> pick for dealing with Duration? Which would you pick for
> Ada.Real_Time.Time_Span?

For a PC, if I were making TEDAda, I'd probably want to match Duration
with the units the OS uses for its time of day primitives (which are
possibly based on source 2) or the Ada minimum requirements, whichever
is higher-res. For Ada.Real_Time.Time, it'd be nice to use units of
ticks on the high-frequency clock (time source 3), as that will give me
the most possible resolution (which is critical for being able to
accurately measure elapsed time for small tasks). For
Ada.Real_Time.Time_Span, it would probably be best to use the same
units, as I'd want to be able to use Real_Time.Time_Span in
calculations with Real_Time.Time without losing accuracy. That means
any use of "delay" or "delay until" is going to require a conversion from
one of those other time sources into the OS's units for its thread timed
rescheduling primitives (perhaps based on source 1). However, that is
often in RTC "ticks", so a conversion would have been necessary anyway.

I know our Ada vendor (GreenHills) chose to use microseconds as its
units for Ada.Real_Time.Time. I believe they made that decision because
support for the high-res timer is a kernel-configurable item (not always
available), and because they use the same compiler codebase on multiple
architectures under the same OS. So it was a good decision for them. But
it has the unfortunate effect that any use of the high-res timer has to
be done via direct OS calls.


> I'm not saying all Ada compilers everywhere should do this for every
> target - just where it makes some sense because of what the hardware
> or OS provides you. It might be possible to make it user configurable

Is there some RTC hardware out there that uses seconds instead of Hz as
its units? If so, and if its OS rescheduling primitive kept the same
units, then yes, the Ada vendor for that platform should also use those
same units.

--
T.E.D.

http://www.telepath.com/~dennison/Ted/TED.html


Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-31 20:53               ` Ted Dennison
@ 2001-01-31 21:30                 ` tmoran
  2001-01-31 21:47                 ` Marin David Condic
  1 sibling, 0 replies; 29+ messages in thread
From: tmoran @ 2001-01-31 21:30 UTC (permalink / raw)


>I know our Ada vendor (GreenHills) chose to use microseconds as its
>units for Ada.Real_Time.Time.  ...
>it has the unfortunate effect that any use of the high-res timer has to
>be done via direct OS calls.
  Can you replace the body of Ada.Real_Time that they supply with one
of your own?



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-31 20:53               ` Ted Dennison
  2001-01-31 21:30                 ` tmoran
@ 2001-01-31 21:47                 ` Marin David Condic
  2001-02-01 14:18                   ` Ted Dennison
  1 sibling, 1 reply; 29+ messages in thread
From: Marin David Condic @ 2001-01-31 21:47 UTC (permalink / raw)


Ted Dennison wrote:

> Is there some RTC hardware out there that uses seconds instead of Hz as
> its units? If so, and if its OS rescheduling primitive kept the same
> units, then yes, the Ada vendor for that platform should also use those
> same units.

Well, even if it's Hz, that doesn't mean Duration'Small can't have that
value. Let's say that the clock goes at 1000Hz - each tick is 1/1000 of a
second, correct? (1Hz = 1 cycle/sec - unless senility is setting in? :-)
Duration'Small is 0.001. If the clock is at 1024Hz then each tick is 1/1024
of a second. Duration'Small is 0.0009765625. (Somewhere you have a
declaration: "for Duration'Small use 0.0009765625;" - change the target? -
change this statement in the compiler and off you go.)

All you're saying is that the LSB of Duration has some quantum value which
is measured in units of seconds. As long as this falls under 0.020, it's
within the requirements of the Ada standard and Duration coincides nicely
with your available clock.

Now if your clock gives you an IEEE 80 bit floating point value, the story
has to change. I've just never seen one of those.

MDC
--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

    "I'd trade it all for just a little more"
        --  Charles Montgomery Burns, [4F10]
======================================================================





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-31 15:07     ` Marin David Condic
@ 2001-02-01  5:43       ` Robert Dewar
  0 siblings, 0 replies; 29+ messages in thread
From: Robert Dewar @ 2001-02-01  5:43 UTC (permalink / raw)


In article <3A782A43.680425AE@acm.org>,
  Marin David Condic <mcondic.auntie.spam@acm.org> wrote:
> I just think it would be misleading if a compiler says that
> 'Small is going to be 0.000_000_000_000_000_001 and this has
> no relationship whatsoever to what the actual hardware/OS is
> really capable of.

This would only mislead someone who thinks there should be a
relationship between the two. Since the RM does not give any
hint that such a relationship is required or desirable, no one
should be under this impression, and thus no one should be
misled who clearly understands the language.


Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-31  5:51 Christoph Grein
@ 2001-02-01  6:27 ` Simon Wright
  0 siblings, 0 replies; 29+ messages in thread
From: Simon Wright @ 2001-02-01  6:27 UTC (permalink / raw)


Christoph Grein <christoph.grein@eurocopter.de> writes:

> > (1) On Solaris (>=2.6 or so, I think) the default operating system
> >    tick is 10 mS.
> 
> What do you mean by measuring times in millisiemens (mS)?
> 
>         10 mS = 10E-3 S = 10E-3 / Ohm,
> 
> which definitely does not have the unit of time.

::aaaaargh::

got me there :-(

-- 
Simon Wright                       Work Email: simon.j.wright@amsjv.com
Alenia Marconi Systems                        Voice: +44(0)23-9270-1778
Integrated Systems Division                     FAX: +44(0)23-9270-1800
Ex-physicist                             Paid-up member of Pedants`R`Us



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-31 21:47                 ` Marin David Condic
@ 2001-02-01 14:18                   ` Ted Dennison
  0 siblings, 0 replies; 29+ messages in thread
From: Ted Dennison @ 2001-02-01 14:18 UTC (permalink / raw)


In article <3A7887E2.49CF12C5@acm.org>,
  Marin David Condic <mcondic.auntie.spam@acm.org> wrote:

> Well, even if it's Hz, that doesn't mean Duration'Small can't have that
> value. Let's say that the clock goes at 1000Hz - each tick is 1/1000
> of a second, correct? (1Hz = 1 cycle/sec - unless senility is setting
> in? :-)

Our simulation uses clocks of 60Hz and 240Hz. The default for vxWorks is
60Hz. Try those (or any other Hz value divisible by 3) on for size. :-)

--
T.E.D.

http://www.telepath.com/~dennison/Ted/TED.html


Sent via Deja.com
http://www.deja.com/



^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-01-29 18:40             ` Marin David Condic
@ 2001-02-08  3:32               ` Buz Cory
  2001-02-08 15:34                 ` Marin David Condic
  0 siblings, 1 reply; 29+ messages in thread
From: Buz Cory @ 2001-02-08  3:32 UTC (permalink / raw)


On 1/29/01, 1:40:29 PM, Marin David Condic <mcondic.auntie.spam@acm.org> 
wrote regarding Re: Duration vs. Ada.Real_Time:

[snip]

> BTW: I've asked before when I was dealing with that same situation
> - two machines that want to operate in lockstep.

You failed to say to what degree of precision. Am assuming below that
a discrepancy on the order of 10ms is OK.

> Do you know of any sort of formal, published writings on
> algorithms to do this? We had a 1.024mSec interrupt on both sides
> - no clock so no "delay until"

You mean you had a periodic interrupt and weren't counting?

If you were counting, then there was your clock. It may have had to
be set externally to sync w/ external time, but it was still there.

One would assume that the 1.024 ms interrupt came from a HW device
counting 1MHz pulses. If that register was readable somehow, now you
had a time resolution of 1µs.

Naturally, if the 1MHz signals are separately generated for each
box, there will be some drift (though I wouldn't expect a lot). One
way to solve this would be to provide the base tick externally.

> - and some hardware signaling between the two. We came up
> with ways of doing the sync, but I kept thinking in the back of my
> mind that better/faster/easier ways of doing it. Not horribly
> important now since I'm not dealing with that same problem, but it
> has always bothered me enough to still want to read some
> books/papers on the subject.

Assuming that since the boxen had to be in lock-step, they were
networked in some way, there is definitely a software solution

Assuming an ethernet LAN, synching w/ an error < 10ms is claimed by
the author. (Maybe an order of magnitude worse for a WAN.)

The docs that came w/ xntpd may be read at:
    "http://BuzCo.PenguinPowered.com/imports/net/time/".
Unfortunately, the PenguinPowered.com nameserver is down as I write
this :-< (Between ISP problems and nameserver problems I seem to get
some 80% availability).

You can find basic material on this at "http://www.ntp.org/".
The software may be downloaded (documented in *_great_* detail) from
    "ftp://ftp.udel.edu/pub/ntp/".

The algorithms, the software, and the related RFCs (referenced in
the docs) are all the work of one Dr. David Mills, who seems to be
The Expert on network time synchronization. I know I could not
follow the details, I just use the software.

BTW, using the supplied software assumes that you are running some
variant of Un*x; the provided daemon runs in user-space but can talk
to the kernel software that maintains time-of-day.

The code is in "C" and gives some warning messages when compiled w/
gcc.

Hope this helps,
== Buz :)
--
Buz Cory of Buzco Systems -- New York NY USA 
http://BuzCo.PenguinPowered.com
<netadm@BuzCo.PenguinPowered.com> (Buz as Net Admin)
write to <helpdesk@BuzCo.PenguinPowered.com> for FREE help with:
    Installing/Configuring Linux
    Getting started with the Ada Programming Language.
Replacing DOS/MS-Win with Linux is like replacing a Fokker with an 
F14.
Programmer? Bugs got you down? Ada is the answer.






^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-02-08  3:32               ` Buz Cory
@ 2001-02-08 15:34                 ` Marin David Condic
  2001-02-10  3:08                   ` Steve Whalen
  0 siblings, 1 reply; 29+ messages in thread
From: Marin David Condic @ 2001-02-08 15:34 UTC (permalink / raw)


Actually, the problem was a lot more specialized than what you seem to be
gathering from my post. Basically, it was two separate, identical computers
running identical software with independent 1.024mS interrupts. (No common
time source, or that would have been a source of common-mode failure.)

Given the presence of watchdog timers in both boxes that had to be stroked
on specific cycles and the need for each computer to be operating on
identical sensor input, you wanted to have them sync'ed on the same cycle
and starting each cycle with a high degree of precision. (I believe we
settled on about 50uS as a jitter amount, so that was roughly the tolerance
of the sync.)

We had primitive communications between the two CPU's - a Manchester data
link and a couple of discretes.

We achieved sync by basically having the two machines look for each other on
the discretes and one machine taking small delays until they both arrived at
the same cycle. From there, we pulled a few other tricks involving delays
until the machines were in fine sync. (BTW, the same basic problem happens
with frequency-hopping radios where you don't have a common time source or
you can't get the time source until you achieve coarse sync.)

Basically, we came up with a solution that worked, but I could never get
over the feeling that some academic type had probably written a book or a
paper discussing various methods of doing this which might have shed some
light on better ideas.

Network time synchronization is far more heavily studied, but doesn't have
quite the same set of problems.

MDC

Buz Cory wrote:

> On 1/29/01, 1:40:29 PM, Marin David Condic <mcondic.auntie.spam@acm.org>
> wrote regarding Re: Duration vs. Ada.Real_Time:
>
> [snip]
>
> > BTW: I've asked before when I was dealing with that same situation
> > - two machines that want to operate in lockstep.
>
> You failed to say to what degree of precision. Am assuming below that
> a discrepancy on the order of 10ms is OK.
>
> > Do you know of any sort of formal, published writings on
> > algorithms to do this? We had a 1.024mSec interrupt on both sides
> > - no clock so no "delay until"
>
> You mean you had a periodic interrupt and weren't counting?
>
> If you were counting, then there was your clock. It may have had to
> be set externally to sync w/ external time, but it was still there.
>
> One would assume that the 1.024 ms interrupt came from a HW device
> counting 1MHz pulses. If that register was readable somehow, now you
> had a time resolution of 1 µs.
>
> Naturally, if the 1MHz signals are separately generated for each
> box, there will be some drift (though I wouldn't expect a lot). One
> way to solve this would be to provide the base tick externally.
>
> > - and some hardware signaling between the two. We came up
> > with ways of doing the sync, but I kept thinking in the back of my
> > mind that there were better/faster/easier ways of doing it. Not horribly
> > important now since I'm not dealing with that same problem, but it
> > has always bothered me enough to still want to read some
> > books/papers on the subject.
>
> Assuming that, since the boxen had to be in lock-step, they were
> networked in some way, there is definitely a software solution.
>
> Assuming an ethernet LAN, synching w/ an error < 10ms is claimed by
> the author. (Maybe an order of magnitude worse for a WAN.)
>
> The docs that came w/ xntpd may be read at:
>     "http://BuzCo.PenguinPowered.com/imports/net/time/".
> Unfortunately, the PenguinPowered.com nameserver is down as I write
> this :-< (Between ISP problems and nameserver problems I seem to get
> some 80% availability).
>
> You can find basic material on this at "http://www.ntp.org/".
> The software may be downloaded (documented in *_great_* detail) from
>     "ftp://ftp.udel.edu/pub/ntp/".
>
> The algorithms, the software, and the related RFCs (referenced in
> the docs) are all the work of one Dr. David Mills, who seems to be
> The Expert on network time synchronization. I know I could not
> follow the details, I just use the software.
>
> BTW, using the supplied software assumes that you are running some
> variant of Un*x; the provided daemon runs in user-space but can talk
> to the kernel software that maintains time-of-day.
>
> The code is in "C" and gives some warning messages when compiled w/
> gcc.
>
> Hope this helps,
> == Buz :)
> --
> Buz Cory of Buzco Systems -- New York NY USA
> http://BuzCo.PenguinPowered.com
> <netadm@BuzCo.PenguinPowered.com> (Buz as Net Admin)
> write to <helpdesk@BuzCo.PenguinPowered.com> for FREE help with:
>     Installing/Configuring Linux
>     Getting started with the Ada Programming Language.
> Replacing DOS/MS-Win with Linux is like replacing a Fokker with an
> F14.
> Programmer? Bugs got you down? Ada is the answer.

--
======================================================================
Marin David Condic - Quadrus Corporation - http://www.quadruscorp.com/
Send Replies To: m c o n d i c @ q u a d r u s c o r p . c o m
Visit my web site at:  http://www.mcondic.com/

    "I'd trade it all for just a little more"
        --  Charles Montgomery Burns, [4F10]
======================================================================





^ permalink raw reply	[flat|nested] 29+ messages in thread

* Re: Duration vs. Ada.Real_Time
  2001-02-08 15:34                 ` Marin David Condic
@ 2001-02-10  3:08                   ` Steve Whalen
  0 siblings, 0 replies; 29+ messages in thread
From: Steve Whalen @ 2001-02-10  3:08 UTC (permalink / raw)


You think you had problems... did you read the NSA monograph on the
WWII SIGSALY system?
 
 http://www.nsa.gov/wwii/papers/start_of_digital_revolution.htm

In glancing at it again, I don't think their synchronization solution
would have helped you because they primarily relied on an external 
time source, but maybe some of their other tricks for maintaining
synchronization might be adaptable to your situation (somehow 
translating hardware / mechanical / analog into Ada!).

Steve

Marin David Condic <mcondic.auntie.spam@acm.org> wrote in
<3A82BC7B.8EF005AB@acm.org>: 

>Actually, the problem was a lot more specialized than what you seem to
>be gathering from my post. Basically, it was two separate, identical
>computers running identical software with independent 1.024mS
>interrupts. (No common time source or this would have been a source for
>common-mode failure) 
>
>Given the presence of watchdog timers in both boxes that had to be
>stroked on specific cycles and the need for each computer to be
>operating on identical sensor input, you wanted to have them sync'ed on
>the same cycle and starting each cycle with a high degree of precision.
>(I believe we settled on about 50uS as a jitter amount, so that was
>roughly the tolerance of the sync.)
>
>We had primitive communications between the two CPU's - a Manchester
>data link and a couple of discretes.
>
>We achieved sync by basically having the two machines look for each
>other on the discretes and one machine taking small delays until they
>both arrived at the same cycle. From there, we pulled a few other tricks
>involving delays until the machines were in fine sync. (BTW, the same
>basic problem happens with frequency-hopping radios where you don't have
>a common time source or you can't get the time source until you achieve
>coarse sync.) 
>
>Basically, we came up with a solution that worked, but I could never get
>over the feeling that some academic type had probably written a book or
>a paper discussing various methods of doing this which might have shed
>some light on better ideas.
>
>MDC

-- 
------------------------------------------------------------
---   Steve Whalen                  swhalen@micron.net   ---
------------------------------------------------------------



^ permalink raw reply	[flat|nested] 29+ messages in thread

end of thread, other threads:[~2001-02-10  3:08 UTC | newest]

Thread overview: 29+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
     [not found] <980495512.529981@edh3>
     [not found] ` <8mac6.9236$R5.473526@news1.frmt1.sfba.home.com>
2001-01-26 15:30   ` Duration vs. Ada.Real_Time Robert Dewar
     [not found] ` <3A71814B.7E8CCF60@acm.org>
2001-01-26 15:33   ` Robert Dewar
2001-01-26 20:58     ` Marin David Condic
2001-01-26 21:32       ` Ted Dennison
2001-01-27  5:01         ` Keith Thompson
2001-01-27 14:40           ` Marin David Condic
2001-01-27 14:34         ` Marin David Condic
2001-01-28  0:18           ` Robert Dewar
2001-01-29 14:54           ` Ted Dennison
2001-01-29 18:40             ` Marin David Condic
2001-02-08  3:32               ` Buz Cory
2001-02-08 15:34                 ` Marin David Condic
2001-02-10  3:08                   ` Steve Whalen
2001-01-28  0:13       ` Robert Dewar
2001-01-29 14:02         ` Marin David Condic
2001-01-30 14:33         ` Stephen Leake
2001-01-31 14:55           ` Marin David Condic
2001-01-31 16:03           ` Ted Dennison
2001-01-31 19:16             ` Marin David Condic
2001-01-31 20:53               ` Ted Dennison
2001-01-31 21:30                 ` tmoran
2001-01-31 21:47                 ` Marin David Condic
2001-02-01 14:18                   ` Ted Dennison
2001-01-28 19:32 ` Simon Wright
2001-01-31  6:13   ` Robert Dewar
2001-01-31 15:07     ` Marin David Condic
2001-02-01  5:43       ` Robert Dewar
2001-01-31  5:51 Christoph Grein
2001-02-01  6:27 ` Simon Wright

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox