comp.lang.ada
* Ada UK conference: SPARK safety: is no delivery better than 1 defect?
@ 2009-03-10  5:47 Michael
  2009-03-10 14:54 ` (see below)
  0 siblings, 1 reply; 11+ messages in thread
From: Michael @ 2009-03-10  5:47 UTC (permalink / raw)



Hi all,

The next Ada Conference UK 2009 (March 24, in London) will highlight the 
increased relevance of Ada in safety- and security-critical programming.

Software reliability and conformance are Ada's raison d'être and the 
main objectives of software engineering.  Based on that, safety engineering 
focuses on the global system vision (e.g. unforeseen interactions of 
reliable sub-systems, modifications to the system, changes to the 
operational environment).

Regarding safety, does Ada still need to evolve, or do engineers need to be 
more responsible?

The SPARK Ada enhancement was recently brought to our attention via an 
open-source demonstration mini-project named Tokeneer.  Based on a subset 
of Ada, SPARK code "should be correct by virtue of the techniques used in 
its construction".  Tokeneer might be, but not iFACTS (a medium-term flight 
conflict detection system "scheduled for delivery by Dec-07, re-approved by 
the NATS Board in January 2008, with a revised cost, delivery and benefits 
profile", and "re-planned again for 2009, and now with an optimised schedule 
of Winter 2010").

In regard to the Tokeneer mini-project findings, were the safety-critical 
iFACTS project's delays and deficiencies predictable?

That should be one of the main safety concerns at the Ada Conference. 
(Tokeneer and SPARK are both on the Ada conference programme, but not 
iFACTS yet!)

Cheers,

Michael 





^ permalink raw reply	[flat|nested] 11+ messages in thread


* Re: Ada UK conference: SPARK safety: is no delivery better than 1 defect?
  2009-03-10  5:47 Michael
@ 2009-03-10 14:54 ` (see below)
  2009-03-11 10:34   ` Michael
  0 siblings, 1 reply; 11+ messages in thread
From: (see below) @ 2009-03-10 14:54 UTC (permalink / raw)


On 10/03/2009 05:47, in article q9ntl.33022$l71.13235@newsfe23.iad,
"Michael" <fvit@shaw.ca> wrote:

> Hi all,
> 
> The next Ada Conference UK 2009 (March 24, in London), is to highlight the
> increased relevance of Ada in safety-and security-critical programming.
> 
...
> 
> In regards to the Tokeneer mini-project findings, were the safety critical
> iFACTS project's delays and deficiencies predictable?
> 
> That should be one of the main Ada Conference safety concerns. (Tookeneer
> and SPARK are both in the Ada conference program, but not iFACTS yet!).

Do you have any evidence that iFACTS problems are due to the Ada language?

-- 
Bill Findlay
<surname><forename> chez blueyonder.co.uk






* Re: Ada UK conference: SPARK safety: is no delivery better than 1 defect?
  2009-03-10 14:54 ` (see below)
@ 2009-03-11 10:34   ` Michael
  2009-03-11 14:46     ` (see below)
  0 siblings, 1 reply; 11+ messages in thread
From: Michael @ 2009-03-11 10:34 UTC (permalink / raw)


Hi all,



Thank you for the question Bill.



iFACTS's deficiencies and delays are not due to Ada (the software 
engineering language), but more likely stem from SPARK's constraints and 
limitations.

SPARK is built on a subset of Ada and uses specific annotations (e.g. 
data-flow contracts and pre- and post-condition assertions) to establish a 
mathematical proof of the code's correctness.

http://www.adacore.com/wp-content/uploads/2008/10/11_safe_secure_ada_2005_certified_safe_with_spark.pdf
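To illustrate, here is a minimal sketch in classic SPARK (2005-era) 
annotation style; the package and names are hypothetical, not taken from 
Tokeneer or iFACTS:

```ada
-- Hypothetical sketch of SPARK annotations: the --# comments are read
-- by the SPARK Examiner and proof tools, and ignored by the Ada compiler.
package Counter
--# own State : Natural;                -- abstract state owned by the package
is
   procedure Increment (Amount : in Natural);
   --# global in out State;             -- data-flow contract
   --# derives State from State, Amount;
   --# pre  State <= Natural'Last - Amount;  -- rules out overflow
   --# post State = State~ + Amount;    -- State~ denotes the initial value
end Counter;
```

The Examiner checks the flow annotations statically, and the proof tools 
generate verification conditions from the pre- and post-conditions, to be 
discharged mathematically rather than by testing.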



Within large projects using C/C++, it becomes increasingly difficult to 
clearly separate design concerns that end up scattered across classes.

("The High-Integrity C++ Coding Standard Manual" won't help that much: 
http://www.literateprogramming.com/CppHighIntegrity.pdf)



Within large projects using SPARK, it becomes increasingly difficult to 
focus on establishing an abstract proof of correctness without compromising 
testing effectiveness.

("Is Proof More Cost Effective than Testing?" won't help either:
http://www.praxis-his.com/pdfs/cost_effective_proof.pdf)



Working at too strong a level of abstraction will not leave many tools and 
utilities for emulating and testing operational problems. Within a 
proof-of-correctness context, verification and validation procedures are 
more likely to be concerned with monitoring divergences than with ensuring 
compliance with specifications.



At the opposite end, Ada is a powerful engineering tool which allows 
engineers to know what they are doing.  And as complexity increases, doing 
it right and doing the right thing is mostly a consequence of experience and 
of a mature software engineering process.



The next Ada UK Conference is an opportunity for Praxis to clarify SPARK's 
capabilities (Tokeneer) and limitations (iFACTS).

http://www.ada-uk-conference.co.uk/programme.html



Cheers



Michael,

Vancouver, BC



CC: to Rod Chapman and Janet Barnes



>
> Do you have any evidence that iFACTS problems are due to the Ada language?
>
> -- 
> Bill Findlay






* Re: Ada UK conference: SPARK safety: is no delivery better than 1 defect?
  2009-03-11 10:34   ` Michael
@ 2009-03-11 14:46     ` (see below)
  2009-03-12 10:36       ` Michael
  0 siblings, 1 reply; 11+ messages in thread
From: (see below) @ 2009-03-11 14:46 UTC (permalink / raw)


On 11/03/2009 10:34, in article zsMtl.48994$Tp5.224@newsfe13.iad, "Michael"
<fvit@shaw.ca> wrote:

... [many assertions snipped]

Do you have any evidence for these claims (which, by the way, are primarily
to do with failures of management to control the development process, and
little or nothing to do with the SPARK language)? Do you have personal
experience and inside knowledge of iFACTS development? If so, you should say
what it is if you want to be taken seriously.
-- 
Bill Findlay
<surname><forename> chez blueyonder.co.uk






* Re: Ada UK conference: SPARK safety: is no delivery better than 1 defect?
  2009-03-11 14:46     ` (see below)
@ 2009-03-12 10:36       ` Michael
  2009-03-12 10:52         ` Ludovic Brenta
  2009-03-12 12:39         ` (see below)
  0 siblings, 2 replies; 11+ messages in thread
From: Michael @ 2009-03-12 10:36 UTC (permalink / raw)


Hi Bill and all,

iFACTS has failed as an engineering result. Engineers are expected to find 
solutions rather than excuses (i.e. the formerly successful SPARK software 
development process involves substituting formal proofs for testing, and the 
iFACTS managers were confident that proven procedures were being used to 
deal with problems.  When a problem arises, it is the engineers' duty to 
report that the way the engineering is done must change).

Tokeneer is an open project, so open for discussion:

My initial question was: "In regard to the Tokeneer mini-project findings, 
were the safety-critical iFACTS project's delays and deficiencies 
predictable?"

A few months before NATS realised iFACTS was ending up with a lot of 
deficiencies, Praxis and NATS were still celebrating a great iFACTS success 
story (just prior to the final iFACTS trials at Hurn):

Paul Barron, NATS' Chief Executive, commented: "This is one of the most 
exciting developments in the aviation industry in decades and we're now 
close to introducing it."

http://www.praxis-his.com/news/radarmar2007.as
http://www.nats.co.uk/uploads/user/NERL%20Business%20Plan%20Report%202008.pdf 
page 6

"In all engineering disciplines it is accepted that it is better to avoid 
the introduction of errors rather than have to correct them at a later 
stage."

How does "correctness by construction" fail to meet such a fundamental goal?

Pre-condition: there must be no effective validation or verification 
procedure.

There might be a few known limitations: e.g. SPARK can ensure intrinsic 
code robustness, but not the software's compliance with system 
specifications.

But if there were enough deficiencies to prevent iFACTS from converging to 
a compliant solution, there were enough for SPARK to catch. And SPARK, as 
well as the Z formal specifications, caught none of them.

What's the SPARK problem?  Good enough for Tokeneer, blind as a bat for 
iFACTS!

The next Ada UK Conference is still a responsibility for Rod Chapman and 
Janet Barnes to clarify Praxis's SPARK capabilities (Tokeneer) and 
limitations (iFACTS).
In 2007, the conclusion of "Correctness by Construction: Putting Engineering 
into Software" was quite amazing:
http://www.adacore.com/multimedia/aa_videos/lectures/adauk07_05_slides.pdf
 "A future?
     What happens if we put together:
        - ...
        - CbyC,
        - SPARK and strong static verification,

     Don't know! But we aim to have some fun finding out."

Have fun,

Cheers

Michael,

Vancouver, BC

CC: to Rod Chapman and Janet Barnes


Just in case:

Formal specifications can describe in a precise way the properties of an 
information system, without constraining the software architecture and 
design in which these properties are to be achieved.  In other words, formal 
specifications describe what the system must do without saying how it is to 
be done.  Formal specifications are therefore the mathematical foundation 
which can ensure that system requirements are transformed into consistent 
and accurate system specifications.

The general goal of SPARK is to provide a language which increases the 
probability of the software behaving as intended.  In other words, SPARK 
provides additional information which allows rigorous mathematical analysis 
to be performed, in order to significantly increase the code's intrinsic 
integrity and runtime correctness.  The benefit is a reduced risk of 
processing erroneous data, which in turn reduces error-handling and 
system-recovery risks, ultimately preventing system instability, saturation 
or even crashes. 






* Re: Ada UK conference: SPARK safety: is no delivery better than 1 defect?
  2009-03-12 10:36       ` Michael
@ 2009-03-12 10:52         ` Ludovic Brenta
  2009-03-16  9:18           ` Michael
  2009-03-12 12:39         ` (see below)
  1 sibling, 1 reply; 11+ messages in thread
From: Ludovic Brenta @ 2009-03-12 10:52 UTC (permalink / raw)


I read your post three times and I still can't figure out what your
point is.

You claim that "iFACTS has failed as an engineering results" without
explaining what the failure is. What are the criteria for success that
iFACTS does not meet, in your opinion? And what are your sources of
information?

Also you say "iFACTS was ending with a lot of deficiencies" but I have
no idea what you are talking about because I am not an insider. Please
provide some details about your claims (links to published reports
would be excellent); explain why you think the "deficiencies" are
attributable to the development process or tools; propose a solution.
That would really contribute to the knowledge and wisdom of your
fellow software engineers. Right now, I for one am totally in the dark
and I have no idea what conclusion I'm supposed to draw from your
post.

--
Ludovic Brenta.




* Re: Ada UK conference: SPARK safety: is no delivery better than 1 defect?
  2009-03-12 10:36       ` Michael
  2009-03-12 10:52         ` Ludovic Brenta
@ 2009-03-12 12:39         ` (see below)
  1 sibling, 0 replies; 11+ messages in thread
From: (see below) @ 2009-03-12 12:39 UTC (permalink / raw)


On 12/03/2009 10:36, in article lB5ul.64542$Rg3.54974@newsfe17.iad,
"Michael" <fvit@shaw.ca> wrote:

> Hi Bill and all,
> 
[verbiage snipped] 
> 

You did not answer my question.

-- 
Bill Findlay
<surname><forename> chez blueyonder.co.uk






* Re: Ada UK conference: SPARK safety: is no delivery better than 1 defect?
  2009-03-12 10:52         ` Ludovic Brenta
@ 2009-03-16  9:18           ` Michael
  2009-03-16 10:29             ` Tim Rowe
  0 siblings, 1 reply; 11+ messages in thread
From: Michael @ 2009-03-16  9:18 UTC (permalink / raw)


"Ludovic Brenta" <ludovic@ludovic-brenta.org> wrote in message 
news:355dc49a-3414-4883-a268-fa3bcc493e7b@o11g2000yql.googlegroups.com...
>I read your post three times and I still can't figure out what your
> point is.
>
> You claim that "iFACTS has failed as an engineering results" >
> --
> Ludovic Brenta.

Hi Ludovic,

Make it "CAATS" and you get what Nav Canada is reporting in its latest 
Operations Strategic Plan, section 08T.4, page 53:
http://www.navcanada.ca/ContentDefinitionFiles/Publications/CorpPublications/AdditionalPublications/Ops_Strategic_Plan_2008_2011_en.pdf

The medium is the message.  It's a short message, but the audience is its 
content. (McLuhan http://www.youtube.com/watch?v=RtycdRBAbXk)

Nav Canada is not the problem.  Like Praxis, Nav Canada simply has no 
solution to a specific engineering problem.

The common denominator: both assumed they could undertake a project far 
beyond their usual level of expertise and capability.

Behind the iFACTS and CAATS delays and deficiencies is a "correctness by 
construction" complex: quite the Saint Bernard (Madoff) syndrome: "providing 
the latest and most up-to-date 'world class' tools available", continuously 
underestimating risks, overestimating self-confidence, and a great lack of 
global vision.
There is no cure for that.

Such a syndrome is even contagious if engineers fail to respond.
(Engineers are trusted; therefore they should avoid fooling themselves so 
as to avoid fooling others.)

Instead, the engineers' reaction went onto autopilot: we adjusted the level 
of effort to the level of difficulty until we ran out of resources.

The problem: we can get far, even too far.
E.g.: during their training, during CAATS integration and while in charge 
of flight traffic control, the air traffic controllers got progressively 
caught up in the system verification.  At the beginning they got the 
deficiencies we had identified and resolved but had not had time to 
integrate; then the ones we had identified but not yet resolved; and then 
others we might have anticipated but not yet identified.
The system was barely converging to a compliant solution, and confidence's 
ups and downs increased the inertia.

The response time due to such inertia was critical: i.e. reactions were 
pending until it was almost too late.

http://www.navcanada.ca/ContentDefinitionFiles/Publications/CorpPublications/AdditionalPublications/Ops_Strategic_Plan_2005_2008_en.pdf - 
nothing yet about CAATS deficiencies.

http://www.catca.ca/English/Association_Minutes/May7-07.doc, page 2, A10
(QM is Moncton, QX Gander, QG Winnipeg, UL Montreal ACC)

http://www.catca.ca/English/Branches_and_Facilities/BF-May24-07.html
(St-Laurent Region Branches and Facilities: Montreal ACC)

http://www.navcanada.ca/ContentDefinitionFiles/Publications/CorpPublications/AdditionalPublications/Ops_Strategic_Plan_2007_2008_en.pdf, 
page 41 - 2005-2008 OSP is updated

A question: what triggered the controllers' reaction: the cumulative error 
effect, or the error amplitude?

By using Ada as a formal language to address the increasing dependence of 
complex engineering systems on software, and by better evaluating and 
reducing risks to an acceptable level through almost all of the following:
 - Using proven procedures in dealing with problems
 - Using experience and knowledge of the facts to make decisions
 - Having a well-thought-out process that utilises careful advance 
preparation
 - Providing stability, consistency, predictability and efficiency
 - Considering the reality of the situation

Answer: that is the error amplitude.
Why: deficiency impacts remain unpredictable even after an anomaly is 
identified.

CAATS is a $1 billion Ada project that is currently operational in 6 of 
7 ACCs. Not all deficiencies have actually been resolved, but CAATS safety 
will likely improve, depending on how priorities are perceived.

By using formal methods' proofs as an abstract certitude of future 
correctness:

Answer: that is the cumulative error effect.
Why: real-world anomalies are not easily identified until the final trial. 
And you might not realise what you are doing until everything has been done 
right, but not necessarily the right thing.

Therefore, iFACTS is safe since it doesn't work, but that is not 
sufficient: iFACTS is a required safety component which should already be 
working reliably.
From that point of view, formal methods' proofs are much better!

Cheers

Michael,
Vancouver, BC


Did I miss something?

The real world is not mathematically perfect.  Safety is largely based on 
robustness to unexpected events, and on the perception and/or control of 
anomalies and errors: e.g. numerical error in computer calculations, digital 
signal conversion, measurement precision, error processing (handling, 
tolerance), system recovery, interfaces with systems at risk of 
deficiencies, different mathematical approximations (possibly transposed 
into different specific application domains), ..., and the unforeseen 
interactions between any of the above, scattered across more than one 
system, at different system levels.

In comparison, the defect Tokeneer caught is a trivial abstract case of 
range overflow.  By specifying a range, this defect can be caught by 
boundary testing or by SPARK data-flow analysis.
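As a minimal sketch of that point (hypothetical names; nothing from the 
Tokeneer code itself), a constrained Ada subtype turns such an overflow 
into an explicit, checkable boundary:

```ada
-- Hypothetical sketch: a declared range makes the defect catchable.
procedure Demo is
   subtype Fill_Level is Integer range 0 .. 100;
   Level : Fill_Level := 100;
begin
   Level := Level + 1;  -- out of range: raises Constraint_Error at runtime,
                        -- and is a target for boundary testing or for
                        -- SPARK-style static analysis before the code runs
end Demo;
```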

"System stability is based on the notion that one change will not cause one 
or more problems shortly after that change is made. In fact, the system 
should go from one stable state to another to another. Such stability needs 
to be system-wide, not just as applied to one sector, so that changes made 
by one operator will be consistent with the needs of other operators. One 
aspect of conflict detection which does not change is the role of the 
operator as being in control and responsible." Nav Canada

"I am very optimistic about the future. With an air safety regime that is 
already among the best, our commitment to safety and to promoting a stronger 
safety culture will help to maintain Canada's position as a world leader." 
Transport Canada's "Dear Minister"

If any claim: Canada is not in Somerset.






* Re: Ada UK conference: SPARK safety: is no delivery better than 1 defect?
  2009-03-16  9:18           ` Michael
@ 2009-03-16 10:29             ` Tim Rowe
  2009-03-18  0:54               ` Michael
  0 siblings, 1 reply; 11+ messages in thread
From: Tim Rowe @ 2009-03-16 10:29 UTC (permalink / raw)


Michael wrote:
> "Ludovic Brenta" <ludovic@ludovic-brenta.org> wrote in message 
> news:355dc49a-3414-4883-a268-fa3bcc493e7b@o11g2000yql.googlegroups.com...
>> I read your post three times and I still can't figure out what your
>> point is.
>>
>> You claim that "iFACTS has failed as an engineering results" >
>> --
>> Ludovic Brenta.
> 
> Hi Ludovic,
> 
> Made it "CAATS" and you get what Nav Canada is reporting in its latest 
> Operation Strategic Plan. section - 08T.4, page 53.
> http://www.navcanada.ca/ContentDefinitionFiles/Publications/CorpPublications/AdditionalPublications/Ops_Strategic_Plan_2008_2011_en.pdf
[detail snipped]

You are still not making clear what you think the cause of the delays in 
these projects is (well, you seem to be saying that it's Ada + formal 
methods, but not saying in what way they are the problem). I don't know 
what CAATS is, but I have some knowledge of iFACTS, and one of the 
things I know is that it is very difficult indeed to define how it should 
work. It's hard to specify, and it's hard to determine whether 
algorithms are adequate. Each of those (related) issues could lead to 
delays (or even failure) that have nothing to do with the language or 
the use of formal methods. Using Ada with formal methods you can have a 
great deal of confidence that an algorithm is correctly implemented, but 
if you don't have confidence that the algorithm correctly solves your 
problem then where's the value? This is certainly an issue for 
development of complex systems, but it's not an Ada or formal methods 
issue. I don't know whether this is the actual issue with iFacts, but it 
should be enough to show that "Ada + formal methods project has 
difficulties" doesn't necessarily mean that the difficulties are with 
Ada or formal methods. If you think Ada or formal methods is causing the 
difficulty, could you clarify why you think that, and explain what the 
specific issues are?




* Re: Ada UK conference: SPARK safety: is no delivery better than 1 defect?
  2009-03-16 10:29             ` Tim Rowe
@ 2009-03-18  0:54               ` Michael
  0 siblings, 0 replies; 11+ messages in thread
From: Michael @ 2009-03-18  0:54 UTC (permalink / raw)


"Tim Rowe" <spamtrap@tgrowe.plus.net> wrote in message 
news:lp-dnetIfJeXtyPUnZ2dnUVZ8jGWnZ2d@posted.plusnet...

> I don't know what CAATS is!

The two automated air traffic systems developed at Vancouver included an 
integrated flight conflict prediction (COPR) capability up to 20 minutes 
into the future.

The specifications and system architecture are different from iFACTS, but 
the functional concept is pretty much the same.

But the software development process is formal and well established, i.e. 
SEI Level 3, which meets the military requirement for developing 
safety-critical software (e.g. CAATS's military counterpart, MAATS).

> I know is that it very difficult indeed to define how it should work.

In 1995, COPR was first cancelled to ease delivery of a less complex system.
http://www.flightglobal.com/articles/1995/06/21/25542/hughes-rethinks-canadian-atc-project.html

But the Transportation Safety Board suggested bringing it back, i.e. they 
knew how it would have to work; they had just experienced and documented a 
test-case scenario:
http://www.tsb.gc.ca/eng/rapports-reports/aviation/1995/a95c0127/a95c0127.asp

At the CAATS delivery in 2001, COPR was working, which allowed Nav Canada 
to refine the requirements.  COPR was upgraded accordingly, to better fit 
operational characteristics and constraints.

> It's hard to specify, and it's hard to determine whether algorithms are 
> adequate.

Good engineering practices will help!
E.g.: prototyping, modelling, iterative development, and unit, 
non-regression and integration testing.

Many lessons have been documented in industry, and reinforced on CAATS:
 - System requirements must be ready, well written and complete
 - A strong system engineering team is needed to support the software 
developers
 - Support tools for software development/validation must be in place early
 - Continuous measurement and analysis of the software development team's 
effectiveness and progress help improve and streamline the process in place


> Each of those (related) issues could lead to delays (or even failure) that 
> have nothing to do with the language or the use of formal methods.

"even" is odd.

Probably something is missing between the formal specifications and the 
code programming/formal verification (SPARK Examiner, proofs), and between 
the formal specification and the formal verification.
Adding all of the above might have helped to fix the problem.

Avoid "cutting corners at the expense of the engineering".  That is the 
best way to re-invent the wheel!


> Using Ada with formal methods you can have a great deal of confidence that 
> an algorithm is correctly implemented, but if you don't have confidence 
> that the algorithm correctly solves your problem

then you test it.  You will see where it doesn't work, probably figure out 
what needs to be done and how to do it, and test it again.  Simplifying the 
model, dividing it into elementary modules, or reducing the application 
domain might help the design process.  You might not find a deficiency, but 
a trace left by a deficiency (an anomaly), or you might identify a specific 
context in which a deficiency can occur.  Follow the trace and think about 
what is specific; it could say more about the problem.  If you have nothing 
to start with, start with something absurd and try to make sense of it.

If you find nothing, ask someone.  If it is not your discipline, don't 
stand still.


> This is certainly an issue for development of complex systems, but it's 
> not an Ada or formal methods issue.

There are always issues in the development of complex systems.  Assuming 
that what is necessary is also sufficient is one of them.
(I.e. engineering is not a game, but a responsible approach to designing 
and developing safe technological solutions to serve people and protect the 
environment.)

> I don't know whether this is the actual issue with iFacts, but it should 
> be enough to show that "Ada + formal methods project has difficulties" 
> doesn't necessarily mean that the difficulties are with Ada or formal 
> methods.

No, but reciprocally:

it is enough to say that

"if the difficulties are with Ada or formal methods, your project has 
difficulties"
(e.g. dropping testing on the assumption that a proof can't contain any 
(syntax, semantics, assumption) error).

> If you think Ada or formal methods is causing the difficulty, could you 
> clarify why you think that, and explain what the specific issues are?

Generally speaking, I can find a lot of difficulties in the ways Ada or 
formal methods are used. There is no magic bullet.
The difficulty as such is not the question; being responsible and assuming 
responsibilities is an answer.

CAATS and MAATS had their share of difficulties, and we never 
underestimated them. Whatever it took to tackle them, we almost always got 
it, and I think the results were worth all those difficulties and what they 
cost.

The problem: Nav Canada promptly started to spare a lot of effort on the 
CAATS integration, then ran into more difficulties and lost some 
functionality (including conflict prediction).  Since, in that context, the 
defence chose to cancel MAATS, such a conflict prediction tool is no longer 
available in Canada.

Air traffic controllers still insist on getting conflict prediction back.
"Based on other ANS's experiences with implementing MTCD the implementation 
of MTCD into the CAATS environment will be a significant effort.  The amount 
of existing CAATS code that can be leveraged, particularly in the Trajectory 
Modeller and Flight Plan Management areas, is unknown."  Nav Canada

At least the CAATS MTCD algorithms are known.  That is all Nav Canada 
needs in order to do it itself.  But such an MTCD is hardly compliant with 
Nav Canada's version of the CAATS architecture.


I doubt the iFACTS delay can be attributed to inaccuracy in the algorithms 
NATS has developed (over almost ten years).
They didn't need to be perfect.  They needed to be verified.

(A formal proof is based on proven inputs, not on assumptions about the 
inputs.  Establishing a SPARK proof of a system based on assumptions is a 
waste of effort and time.  Ditto for the validation testing.)

If NATS knew how to do it, it would have been able to do it itself, and 
would have done it (they have enough engineers who would have liked to do 
it, and who would have done it, too).


It is the (Praxis) engineering duty:
 - to undertake and accept responsibility for professional assignments only 
when qualified by training or experience;
 - to provide an opinion on a professional subject only when it is founded 
upon adequate knowledge and honest conviction;
 - to present clearly to clients the possible consequences if professional 
decisions or judgments are overruled or disregarded;
 - to conduct themselves with fairness, courtesy and good faith towards 
clients, colleagues and others, give credit where it is due and accept, as 
well as give, honest and fair professional comment.


Lady Ada created a program for an Analytical Engine which didn't exist yet.
When such engine was built, her program was tested and it worked.

That formally proves there are not enough women in engineering.

Cheers

Michael,
Vancouver, BC





