Newsgroups: comp.lang.ada
From: firth@sei.cmu.edu (Robert Firth)
Subject: Priority Inversion
Date: 13 May 88 21:31:46 GMT
Message-ID: <5466@aw.sei.cmu.edu>


	Priority Inversion
	==================

Ben Brosgol explained (correctly) the problem of priority inversion
in these terms:

A server task of low priority - SERVER(L) - is approaching an accept
statement.  Queued on its entry is a high priority task CALLER(H).
However, an unrelated task of medium priority - RANDOM(M) - preempts
SERVER, and so CALLER stays blocked.  Hence, RANDOM(M) is being
given the processor at the expense of CALLER(H).
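
To make the first scenario concrete, here is a rough Ada sketch of the
three tasks.  The priority values (1, 5, 10), the entry name REQUEST,
and the procedure name INVERSION_DEMO are mine, purely for illustration;
on a single processor, RANDOM's busy loop keeps SERVER(L) from ever
reaching its accept, so CALLER(H) never gets served and the PUT_LINE
below never executes.

    with TEXT_IO;
    procedure INVERSION_DEMO is

       task SERVER is
          pragma PRIORITY (1);          -- SERVER(L)
          entry REQUEST;
       end SERVER;

       task CALLER is
          pragma PRIORITY (10);         -- CALLER(H)
       end CALLER;

       task RANDOM is
          pragma PRIORITY (5);          -- RANDOM(M)
       end RANDOM;

       task body SERVER is
       begin
          loop
             for I in 1 .. 100_000 loop
                null;                   -- setup work on the way to the accept;
             end loop;                  -- RANDOM(M) preempts SERVER here
             accept REQUEST;
          end loop;
       end SERVER;

       task body CALLER is
       begin
          SERVER.REQUEST;               -- CALLER(H) queues on the entry
          TEXT_IO.PUT_LINE ("caller served");
       end CALLER;

       task body RANDOM is
       begin
          delay 0.05;                   -- let the others get going, then take
          loop                          -- the processor with unrelated
             null;                      -- medium-priority busywork
          end loop;
       end RANDOM;

    begin
       null;
    end INVERSION_DEMO;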

Here is another example:  We have CALLER(H) queued behind CALLER(L)
on the entry queue of SERVER(L), which is in rendezvous.  The priority
of the rendezvous is, of course, L.  Again, RANDOM(M) preempts,
suspending SERVER and holding up CALLER(H) on the entry queue.

One proposed solution is to have a SERVER inherit the priority of
a CALLER at the moment the entry is called.  This doesn't work, of
course, for reasons we shall see.

Consider now this example:  the SERVER(L) is a resource allocator
allocating buffers, and so is suspended on the usual select statement
"accept CLAIM_BUFFER... or accept RELEASE_BUFFER...".  A high-priority
task CALLER(H) calls the CLAIM_BUFFER entry, but unfortunately SERVER
has run out of buffers, the guard is closed, and CALLER(H) blocks.
Now over in the blue corner, another task RUMPLESTILTSKIN(L) is
closing fast on a RELEASE_BUFFER call that will give back a buffer.
But, as you might guess, RANDOM(M) preempts, suspending poor RUMPLE(L),
and so, indirectly, blocking CALLER(H).  Priority inversion!
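
Here, for concreteness, is roughly what such an allocator and its
clients look like in Ada.  The names, the priority values, and the
reduction of the buffer pool to a bare counter are all mine; the point
is only the shape of the guarded select, and the fact that nothing in
the program text tells the runtime that RELEASE_BUFFER is what will
eventually open the guard on CLAIM_BUFFER.

    procedure BUFFER_DEMO is

       task SERVER is
          pragma PRIORITY (1);          -- SERVER(L), the allocator
          entry CLAIM_BUFFER;
          entry RELEASE_BUFFER;
       end SERVER;

       task CALLER is
          pragma PRIORITY (10);         -- CALLER(H)
       end CALLER;

       task RUMPLESTILTSKIN is
          pragma PRIORITY (1);          -- RUMPLESTILTSKIN(L)
       end RUMPLESTILTSKIN;

       task RANDOM is
          pragma PRIORITY (5);          -- RANDOM(M)
       end RANDOM;

       task body SERVER is
          FREE : NATURAL := 0;          -- out of buffers: the guard is closed
       begin
          loop
             select
                when FREE > 0 =>
                   accept CLAIM_BUFFER;
                   FREE := FREE - 1;
             or
                accept RELEASE_BUFFER;
                FREE := FREE + 1;
             end select;
          end loop;
       end SERVER;

       task body CALLER is
       begin
          SERVER.CLAIM_BUFFER;          -- blocks behind the closed guard
       end CALLER;

       task body RUMPLESTILTSKIN is
       begin
          delay 0.1;                    -- closing fast on the release call ...
          SERVER.RELEASE_BUFFER;        -- ... but RANDOM(M) gets there first
       end RUMPLESTILTSKIN;

       task body RANDOM is
       begin
          delay 0.05;                   -- medium-priority busywork that, on
          loop                          -- one processor, starves
             null;                      -- RUMPLESTILTSKIN(L) indefinitely
          end loop;
       end RANDOM;

    begin
       null;
    end BUFFER_DEMO;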

The thing about this last example is that there is no way the Ada
runtime can detect what is going on, and so no magic "solution" can
be achieved by tweaking the language semantics.  The only recourse
is not to write code that way in the first place.  So we observe
that the priority mechanism of Ada still requires the user to take
some care in writing the code.

But the same sort of care also solves the other inversion examples.
For example, the rule "give a server a priority no lower than its
highest-priority user" guarantees there will be no inversions such
as the ones given above.  This is a simple, obvious, and, I submit,
largely acceptable rule.  Again, it requires the user to understand
the language semantics and use them with care.
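
In Ada terms the rule costs nothing more than choosing the pragma
PRIORITY values with a little thought.  Applied to the first example
(only the task specifications are shown, with the bodies as in the
earlier sketch; the values remain purely illustrative):

       task SERVER is
          pragma PRIORITY (10);         -- no lower than its highest-priority
          entry REQUEST;                -- user, CALLER(H)
       end SERVER;

       task CALLER is
          pragma PRIORITY (10);         -- CALLER(H)
       end CALLER;

       task RANDOM is
          pragma PRIORITY (5);          -- RANDOM(M) can no longer shut SERVER
       end RANDOM;                      -- out on its way to the accept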

However, isn't it better to change the language to remove the possibility
of priority inversion, or, at any rate, of some priority inversions?
Maybe; the problem is that the proposed solutions are wrong.  Consider
again CALLER(H) blocked on entry of SERVER(L), in turn preempted by
RANDOM(M).  Suppose we now raise the server's priority.  Suppose further
that SERVER is rather sick, and in fact will never reach the accept
statement.  Then CALLER(H) will still never unblock; all we have
achieved is to let SERVER(L) shut out RANDOM(M) indefinitely, which
is of course itself a priority inversion.  And how can the Ada runtime
possibly know whether a given server will indeed reach the accept?
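
Concretely, suppose the server body from the first sketch were replaced
by something like the following (again purely illustrative):

       task body SERVER is
       begin
          loop
             null;                      -- a bug: SERVER spins here forever
          end loop;                     -- and never reaches the accept
          accept REQUEST;               -- unreachable
       end SERVER;

Boosting SERVER to priority H the moment CALLER(H) calls REQUEST merely
turns this bug into a high-priority infinite loop.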

Hence, with either current Ada or Ada as changed, a programmer can
always code priority inversions, even in the cases the changes are
intended to exclude.  I conclude that priority inversion is not a
language design error; it is a program design error.  Given that
there are simple design rules that will in most cases avoid the
problem, I see no reason to further complicate Ada runtimes - they
are surely complicated enough.

The root cause of the trouble is that word "priority".  It has an
English meaning, and an Ada meaning.  People who take the English
meaning and ignore the Ada meaning tend to go wrong - for instance,
they give the highest priority to the task that is psychologically
"most important".  Priority is merely a number that controls a
comparison somewhere in a scheduler; it is a last-resort way of
deciding which task to run when the user has not fully synchronized
the tasks.

Just because the language provides the word priority, the user is not
absolved from the responsibility of understanding tasking semantics,
and of careful design of real-time algorithms.  Likewise, just because
the language Fortran contains the word REAL, the user is not absolved
from the responsibility of understanding the semantics of floating-point
arithmetic, and of careful design of numerical algorithms.

Robert Firth
