From: "Michael" <fvit@shaw.ca>
Subject: Re: Ada UK conference: SPARK safety: is no delivery better than 1 defect?
Date: Tue, 17 Mar 2009 17:54:33 -0700
Message-ID: <rDXvl.42092$3S3.6081@newsfe22.iad> (raw)
In-Reply-To: lp-dnetIfJeXtyPUnZ2dnUVZ8jGWnZ2d@posted.plusnet
"Tim Rowe" <spamtrap@tgrowe.plus.net> wrote in message
news:lp-dnetIfJeXtyPUnZ2dnUVZ8jGWnZ2d@posted.plusnet...
> I don't know what CAATS!
The two Automated Air Traffic Systems developed in Vancouver included an
integrated flight conflict prediction (COPR) capability looking up to 20
minutes into the future.
The specifications and system architecture differ from iFACTS, but the
functional concept is much the same.
The software development process, however, was formal and well established,
i.e. SEI level 3, which met the military requirement for developing safety
critical software (e.g.: CAATS's military counterpart, MAATS).
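To make the idea of a 20-minute conflict prediction concrete, here is a
minimal sketch of one common approach (closest point of approach over a
fixed look-ahead window). This is purely illustrative: the separation
minimum, the flat 2-D geometry, and the function name are my assumptions,
not CAATS's actual COPR algorithm.

```python
import math

SEP_NM = 5.0        # assumed lateral separation minimum (nautical miles)
HORIZON_MIN = 20.0  # look-ahead window, as in COPR

def predict_conflict(p1, v1, p2, v2):
    """Return (conflict?, time of closest approach in minutes).

    p1, p2 are (x, y) positions in NM; v1, v2 are (vx, vy) in NM/min.
    Straight-line extrapolation only -- no intent or trajectory data.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]          # relative position
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]        # relative velocity
    dv2 = dvx * dvx + dvy * dvy
    if dv2 == 0.0:
        t_cpa = 0.0                                # constant separation
    else:
        # Minimise |d + t*dv|, clamped to the prediction horizon.
        t_cpa = max(0.0, min(HORIZON_MIN, -(dx * dvx + dy * dvy) / dv2))
    cx, cy = dx + t_cpa * dvx, dy + t_cpa * dvy
    return math.hypot(cx, cy) < SEP_NM, t_cpa

# Two aircraft converging head-on, 100 NM apart, 8 NM/min each:
print(predict_conflict((0, 0), (8, 0), (100, 0), (-8, 0)))
```

A real system would of course work on modelled trajectories rather than
straight-line extrapolation, which is exactly why the trajectory modeller
matters so much in the Nav Canada quote further down.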
> I know is that it very difficult indeed to define how it should work.
In 1995, COPR was first cancelled to ease delivery of a less complex system:
http://www.flightglobal.com/articles/1995/06/21/25542/hughes-rethinks-canadian-atc-project.html
But the Transportation Safety Board suggested bringing it back; i.e., they
knew how it should work, having just experienced and documented a test case
scenario:
http://www.tsb.gc.ca/eng/rapports-reports/aviation/1995/a95c0127/a95c0127.asp
At the CAATS delivery in 2001, COPR was working, which allowed Nav Canada to
refine the requirements. COPR was then upgraded to better fit operational
characteristics and constraints.
> It's hard to specify, and it's hard to determine whether algorithms are
> adequate.
Good engineering practices will help!
E.g.: prototyping, modelling, iterative development, and unit,
non-regression and integration testing.
Many lessons have been documented in industry and were reinforced on CAATS:
- System requirements must be ready, well written and complete
- A strong system engineering team is needed to support the software
developers
- Support tools for software development/validation must be in place early
- Continuous measurement and analysis of the development team's
effectiveness and progress help improve and streamline the process in place
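Of the practices above, non-regression testing is the cheapest to show in
miniature: every defect that is found and fixed contributes a pinned
scenario, so the fix cannot silently regress in a later build. The function
and its cases below are my own illustrative inventions, not CAATS code.

```python
def separation_nm(p1, p2):
    """Flat-earth separation between two tracks, in nautical miles
    (illustrative only -- a real system would use geodesic distance)."""
    return ((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2) ** 0.5

# Non-regression suite: each entry is ((arguments), expected result).
REGRESSION_CASES = [
    (((0, 0), (0, 0)), 0.0),   # coincident tracks
    (((0, 0), (3, 4)), 5.0),   # 3-4-5 triangle
    (((-3, 0), (3, 0)), 6.0),  # symmetric about the origin
]

def run_regression():
    """Run every pinned case; raise on the first mismatch."""
    for args, expected in REGRESSION_CASES:
        got = separation_nm(*args)
        assert abs(got - expected) < 1e-9, f"{args}: {got} != {expected}"
    return len(REGRESSION_CASES)

print(run_regression(), "cases passed")
```

The point is the discipline, not the code: the suite only earns its keep if
it is run on every build, which is why the tools must be "in place early".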
> Each of those (related) issues could lead to delays (or even failure) that
> have nothing to do with the language or the use of formal methods.
"even" is odd.
Probably something is missing between formal specifications and code
programming/formal verification (SPARK examiner, proof); and between formal
specification and formal verification.
Adding all of the above, might have help to fix the problem.
Avoid "cutting corners at the expense of the engineering". That the best
way to re-inventing the wheel!
> Using Ada with formal methods you can have a great deal of confidence that
> an algorithm is correctly implemented, but if you don't have confidence
> that the algorithm correctly solves your problem
then you test it. You will see where it doesn't work, probably figure out
what needs to be done and how to do it, and test it again. Simplifying the
model, dividing it into elementary modules, and reducing the application
domain might help the design process. You might not find the deficiency
itself, but a trace left by a deficiency (an anomaly), or you might identify
a specific context in which a deficiency can occur. Follow the trace and
think about what is specific to it; it could say more about the problem. If
you have nothing to start with, start with something absurd and try to make
sense of it.
If you find nothing, ask someone. If it is not your discipline, don't stand
still.
> This is certainly an issue for development of complex systems, but it's
> not an Ada or formal methods issue.
There are always issues in the development of complex systems. Assuming that
what is necessary is also sufficient is one of them.
(i.e.: Engineering is not a game, but a responsible approach to designing
and developing safe technological solutions that serve people and protect
the environment.)
> I don't know whether this is the actual issue with iFacts, but it should
> be enough to show that "Ada + formal methods project has difficulties"
> doesn't necessarily mean that the difficulties are with Ada or formal
> methods.
No, but reciprocally, it is enough to say that
"if the difficulties are with Ada or formal methods, your project has
difficulties"
(e.g.: dropping testing on the assumption that a proof cannot contain any
(syntax, semantics, assumption) error).
> If you think Ada or formal methods is causing the difficulty, could you
> clarify why you think that, and explain what the specific issues are?
Generally speaking, I can find a lot of difficulties in the ways Ada or
formal methods are used. There is no magic bullet.
The difficulty is not really the question; being responsible and assuming
one's responsibilities is an answer.
CAATS and MAATS had their share of difficulties, and we never underestimated
them. Whatever it took to tackle them, we almost always got, and I think the
results were worth all those difficulties and what they cost.
The problem is that Nav Canada promptly started cutting a lot of effort from
the CAATS integration, then ran into more difficulties and lost some
functionality (including conflict prediction). Since, in that context, the
defence department chose to cancel MAATS, no such conflict prediction tool
is available in Canada any longer.
Air traffic controllers still insist on getting conflict prediction back.
"Based on other ANS's experiences with implementing MTCD the implementation
of MTCD into the CAATS environment will be a significant effort. The amount
of existing CAATS code that can be leveraged, particularly in the Trajectory
Modeller and Flight Plan Management areas, is unknown." Nav Canada
At least the CAATS MTCD algorithms are known. That is all Nav Canada needs
to know to do it itself. But such an MTCD is hardly compliant with the Nav
Canada version of the CAATS architecture.
I doubt the iFACTS delay can be attributed to inaccuracy in the algorithms
NATS has developed (over almost ten years).
They didn't need to be perfect. They needed to be verified.
(A formal proof is based on proven inputs, not on assumptions about inputs.
Establishing the SPARK proof of a system based on assumptions is a waste of
effort and time. Ditto for the validation testing.)
If NATS had known how to do it, it would have been able to do it itself, and
would have done it (they have enough engineers who would have liked to do
it, and who would have done it, too).
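The point about proofs resting on proven inputs can be shown in miniature:
check the precondition at the trust boundary instead of assuming it, so that
everything downstream rests on a verified fact. The function, its units, and
its guards below are my own illustrative assumptions, not SPARK or iFACTS
code (in SPARK the guards would instead be a proved precondition).

```python
def track_eta_minutes(distance_nm, ground_speed_kt):
    """ETA in minutes for a track, computed only from *verified* inputs.

    The guards below are the runtime analogue of a proved precondition:
    without them, the result silently inherits whatever assumption the
    caller made about its data.
    """
    if distance_nm < 0:
        raise ValueError("distance must be non-negative")
    if ground_speed_kt <= 0:
        raise ValueError("ground speed must be positive")
    return distance_nm / ground_speed_kt * 60.0

print(track_eta_minutes(120, 480))  # verified inputs: 15.0 minutes
```

A proof built on an unchecked `ground_speed_kt > 0` is exactly the
"assumption on inputs" the paragraph above warns about.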
It is the (Praxis) engineer's duty:
To undertake and accept responsibility for professional assignments only
when qualified by training or experience;
To provide an opinion on a professional subject only when it is founded upon
adequate knowledge and honest conviction;
To present clearly to clients the possible consequences if professional
decisions or judgments are overruled or disregarded;
To conduct themselves with fairness, courtesy and good faith towards
clients, colleagues and others, give credit where it is due, and accept, as
well as give, honest and fair professional comment.
Lady Ada created a program for an Analytical Engine that didn't exist yet.
When such an engine was built, her program was tested and it worked.
That formally proves there are not enough women in engineering.
Cheers
Michael,
Vancouver, BC