comp.lang.ada
* SPARK User Group 2008
@ 2008-04-28 16:02 roderick.chapman
  2008-05-12 10:29 ` Michael
                   ` (2 more replies)
  0 siblings, 3 replies; 14+ messages in thread
From: roderick.chapman @ 2008-04-28 16:02 UTC (permalink / raw)


We're pleased to announce that the next SPARK User Group
meeting will be held in Bath, UK, on Wednesday 15th October 2008.

Booking a place

Capacity is limited, so priority will be given to our supported
customers, other clients, tool partners and academic users.  If you
have not already received an invitation, then please contact us via
sparkinfo@praxis-his.com.

Programme

The provisional list of speakers includes:

Guest speaker: Duncan Brown (Rolls-Royce plc).
"Formal Methods and DO-178C"

Industrial case-study: Neil White (iFACTS Team, Praxis).
"The iFACTS project"

Research report: Dr Paul Jackson (University of Edinburgh)
"Using SMT Solvers to Prove SPARK VCs"

R&D Report: Rod Chapman (Praxis)
"SPARK Update and Release 7.6 Highlights"




* Re: SPARK User Group 2008
  2008-04-28 16:02 SPARK User Group 2008 roderick.chapman
@ 2008-05-12 10:29 ` Michael
  2008-05-13  7:47   ` Simon Wright
  2008-05-25 20:14 ` Michael
  2008-06-01 20:47 ` SPARK User Group 2008 Michael
  2 siblings, 1 reply; 14+ messages in thread
From: Michael @ 2008-05-12 10:29 UTC (permalink / raw)


Is the iFACTS project going to be an Ada success story?



iFACTS is an engineering project made without engineering.

Engineering should have been a must for Medium Term Conflict Detection
software conceived to enhance ATC operations and assist the UK flight
controllers in assuring safety.



The results are delays and an unusual number of deficiencies.   Some Ada
projects are like that:  ignoring the engineering for the sake of Ada.



Ensuring system safety, stability and non-saturation is one of the main
engineering challenges.  Ada is an engineering tool made to meet such
objectives, and it works pretty well when used with the best software
engineering practices and processes (e.g. SEI CMM level III and above).



SPARKS is a subset of Ada.  The general goal of SPARK is to provide a
language which increases the probability of the code behaving as
intended.  In other words, SPARK provides additional information which
allows rigorous mathematical analysis to be performed, in order to
significantly increase the code's intrinsic integrity and runtime
correctness.  The benefits are a reduced risk of processing erroneous
data, and hence less exposure to error-handling and system-recovery risks.
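
To make that concrete, here is a minimal sketch (an invented example,
the era's --# annotation syntax assumed, nothing to do with iFACTS) of
the kind of additional information SPARK adds: contracts embedded in Ada
comments that the Examiner and Simplifier check statically, before the
code ever runs.

   package Separation
   is
      Minimum : constant Integer := 5;  -- nautical miles, illustrative

      -- The --# lines are the extra information: a flow contract and a
      -- proof contract, both checked against the body by the tools.
      procedure Check (Distance : in     Integer;
                       Alert    :    out Boolean);
      --# derives Alert from Distance;
      --# post Alert <-> (Distance < Minimum);
   end Separation;

   package body Separation
   is
      procedure Check (Distance : in     Integer;
                       Alert    :    out Boolean)
      is
      begin
         Alert := Distance < Minimum;   -- any other data flow or result
      end Check;                        -- would be rejected statically
   end Separation;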



SPARKS has its advantages and its limitations.  It works better for small
embedded projects such as airborne systems.



Praxis-High Integrity Systems would have been well advised to favour High
Engineering Reliability.  That's still not sufficient for safety, but
better than nothing (i.e. iFACTS).



Cheers,



Michael

Vancouver








* Re: SPARK User Group 2008
  2008-05-12 10:29 ` Michael
@ 2008-05-13  7:47   ` Simon Wright
  2008-05-16  6:57     ` Michael
  0 siblings, 1 reply; 14+ messages in thread
From: Simon Wright @ 2008-05-13  7:47 UTC (permalink / raw)


SPARK, Michael, not SPARKS.

I think you should give some references for your remarks. Not saying
you're right or wrong, but there seem to be no reports of problems on
the BBC, the Guardian, or El Reg.




* Re: SPARK User Group 2008
  2008-05-13  7:47   ` Simon Wright
@ 2008-05-16  6:57     ` Michael
  2008-05-16  8:21       ` stefan-lucks
  2008-05-16 21:41       ` Simon Wright
  0 siblings, 2 replies; 14+ messages in thread
From: Michael @ 2008-05-16  6:57 UTC (permalink / raw)


"Simon Wright" <simon.j.wright@mac.com> wrote in message 
news:m263tiir0x.fsf@mac.com...

> I think you should give some references for your remarks.

Engineering is the reference.

That is also the reference for most of this audience.  So, we already knew.

What went wrong?   We just didn't say a word about that!

To compete for better reliability, shall we put safety at risk?
We could be close to losing the reliability as well!

This time it was called "Correctness by Construction".
As usual, it could have been called "running into the wall".  Luckily they
were running fast, so no one else got hurt - only a few egos.

"Correctness by Construction" had emerged from some Praxis-HIS's "White 
papers", and intrigued at some of the previous annual international 
conferences - on reliable software technologies (e.g.: Ada Europe, next in 
Venice, Italy, 16-20 June) or on the Ada Programming Languages (e.g.: Sig 
Ada, next in Portland, Oregon, 26-30 October).

Ada is not a toy!  Engineering is not a game!

Putting together diverse abstract methods, without engineering experience
and without concern for their limitations, is quite irresponsible.

The immediate result to be expected is a definitive loss of visibility
into the development process.

Without understanding what the software development process has to be, how
can one make decisions that keep divergences and deficiencies under control?
Worse, where do all these divergences and deficiencies get lost?
It looks like such a surprise when another operational dysfunction occurs!

Luckily, they were all supposed to have vanished, by Correctness by
Construction, as soon as the iFACTS project ended.

Indeed, it is Correctness by Construction that seems to have vanished from
the next annual international Ada conferences, and from the SPARK User Group
meeting as well.


Is the iFACTS project going to be an Ada success story?  That is the only 
question!

   Yes: there are engineers; they didn't hit the wall.
   No: there are dummies; they are already in the wall.
   No answer: there are ghosts; they went through the wall.

Cheers,

Michael,
Vancouver, (Beautiful British Columbia)


Ada Europe 2005 (York, UK, 20-24 June 2005)
Correctness by Construction - A Manifesto for High Integrity Engineering
"Correctness by Construction is based on a set of principles, distilled from 
practical project experience, to realize systems and software engineering 
outputs with very low defect rate and very high resilience to change."

SIGAda 2007 (Fairfax, Virginia, USA)
Correctness by Construction: Putting Engineering (back) into Software
"The need to "engineer" our software implies that discipline and reasoning 
are required, yet most processes and languages seem to obstruct rather than 
assist such attempts."

NATS news release: March 2007
"Final trials of the iFACTS tools are currently under way.
"Praxis is using a unique software engineering approach known as 
"Correctness by Construction" (CbyC) which delivers ultra low defect 
software for critical applications."

NATS Strategic plan for safety 2007.
"iFACTS is planned to enhance Human ATC Performance Risk Prediction at the 
London Area Control in June 2008."






* Re: SPARK User Group 2008
  2008-05-16  6:57     ` Michael
@ 2008-05-16  8:21       ` stefan-lucks
  2008-05-16 21:41       ` Simon Wright
  1 sibling, 0 replies; 14+ messages in thread
From: stefan-lucks @ 2008-05-16  8:21 UTC (permalink / raw)


On Fri, 16 May 2008, Michael wrote:

> What went wrong?   We just didn't say a word about that!

If you have any references that something actually did go wrong with 
iFACTS, could you share them in this newsgroup?

I am really curious. Using Google, I couldn't find anything about 
any failure of the iFACTS project. 

So long

Stefan Lucks

-- 
------ Stefan Lucks   --  Bauhaus-University Weimar  --   Germany  ------
               Stefan dot Lucks at uni minus weimar dot de
------  I  love  the  taste  of  Cryptanalysis  in  the  morning!  ------





* Re: SPARK User Group 2008
  2008-05-16  6:57     ` Michael
  2008-05-16  8:21       ` stefan-lucks
@ 2008-05-16 21:41       ` Simon Wright
  1 sibling, 0 replies; 14+ messages in thread
From: Simon Wright @ 2008-05-16 21:41 UTC (permalink / raw)


"Michael" <Michael@home.ca> writes:

> "Simon Wright" <simon.j.wright@mac.com> wrote in message 
> news:m263tiir0x.fsf@mac.com...
>
>> I think you should give some references for your remarks.
>
> Engineering is the reference.
>
> That is also the reference for most of this audience.  So, we already knew.
>
> What went wrong?   We just didn't say a word about that!

I am asking you to tell us what went wrong.

If you are saying "if they don't do it right, it will be a failure" --
well, duh, this is news?




* Re: SPARK User Group 2008
  2008-04-28 16:02 SPARK User Group 2008 roderick.chapman
  2008-05-12 10:29 ` Michael
@ 2008-05-25 20:14 ` Michael
  2008-05-26 10:06   ` Simon Wright
  2008-06-01 20:47 ` SPARK User Group 2008 Michael
  2 siblings, 1 reply; 14+ messages in thread
From: Michael @ 2008-05-25 20:14 UTC (permalink / raw)


"Simon Wright" <simon.j.wright@mac.com> wrote in message
news:m21w42hqp8.fsf@mac.com...
> If you are saying "if they don't do it right, it will be a failure" --
> well, duh, this is news?

OUCH, that sparks!


Being distracted by discussions is not focusing on safety.
On March the 26th, 2006, BC Ferries' Queen of the North vanished just like
that!
"Crew had also turned off alarms that would have alerted them to danger."

Engineering is not so much about doing wrong or right as about knowing what
is being done.

Even before a system becomes operational, what matters is identifying
vulnerabilities early, locating anomalies and resolving any deficiency
before a dysfunction occurs.  (A clue: dysfunctions mostly occur in a
specific context (not necessarily the current application domain), but the
anomalies are already there - and not without a trace!)

Apparently it wasn't one of the "Correctness by Construction" capabilities.
Therefore Praxis's iFACTS verification never noticed any defects - but
NATS's iFACTS validation did.
(On the 7th of March 2007, a NATS news release proudly announced the iFACTS
final trials.  Since then, NATS has never announced that iFACTS has been
commissioned - find the errors! - anomalies, I meant.)

As it says, "Correctness by Construction" is a brand-new distilled set of
formal principles, to realize systems and software engineering outputs with
a very low defect rate and very high resilience to change.

"Is Proof More Cost-Effective Than Testing?"
Or is CbC "just another way to make things so complicated that there are no 
obvious deficiencies?"

Praxis-HIS got almost $20 million to develop iFACTS and to demonstrate that
formal methods, formal verification and SPARK all together "appear to be
substantially more efficient at finding faults than the most efficient
testing phase".

Preliminary answers are expected at Ada Europe, in Venice, Italy, 16-20 
June.

Better to know how easily removable the iFACTS deficiencies are!
If so, has NATS rescheduled iFACTS with new funding?

Cheers,

Michael,
Vancouver, BC

NOTE:
Never underestimate engineering capabilities:
Running without visibility, faster than you can, and hitting the sound
barrier is hardly probable.
More likely you go from your speed to zero in less than one second!

Big bang on the nose?

"Much ado about nothing" indeed.






* Re: SPARK User Group 2008
  2008-05-25 20:14 ` Michael
@ 2008-05-26 10:06   ` Simon Wright
  2008-05-27 18:43     ` Michael
  0 siblings, 1 reply; 14+ messages in thread
From: Simon Wright @ 2008-05-26 10:06 UTC (permalink / raw)


"Michael" <Michael@home.ca> writes:

> Apparently it wasn't one of the "Correctness by Construction"
> capabilities.  Therefore Praxis's iFACTS verification never noticed
> any defects - but NATS's iFACTS validation did.
> (On the 7th of March 2007, a NATS news release proudly announced the
> iFACTS final trials.  Since then, NATS has never announced that iFACTS
> has been commissioned - find the errors! - anomalies, I meant.)

But see http://www.airflights.co.uk/news/default.aspx?category=438006664&story=18081798&archive=-&AspxAutoDetectCookieSupport=1




* Re: SPARK User Group 2008
  2008-05-26 10:06   ` Simon Wright
@ 2008-05-27 18:43     ` Michael
  2008-05-27 19:23       ` Simon Wright
  2008-05-28  9:51       ` iFACTS (was: SPARK User Group 2008) Stuart
  0 siblings, 2 replies; 14+ messages in thread
From: Michael @ 2008-05-27 18:43 UTC (permalink / raw)


"Simon Wright" <simon.j.wright@mac.com> wrote in message 
news:m2iqx1z8fe.fsf@mac.com...

> But see 
> http://www.airflights.co.uk/news/default.aspx?category=438006664&story=18081798&archive=-&AspxAutoDetectCookieSupport=1

Nice picture!

Thank you Simon, your contribution is greatly appreciated.

Did you also notice your finding was posted: Wed, 07 Mar 2007 14:53:47 GMT,
That's a few minutes after NATS proudly announced the iFACTS final trials.

In other words you just confirm the problem: iFACTS has vanished after the 
NATS's final trial and no one can say what the problems were.
(Since UK air traffic is still growing, delay is the safety problem)


Such problem seems to emanate from a UK military standard (00-55)
Requirements for safety related software in defence equipment - 1st August 
1997.
"The testing shall exercise all properties and functions which have not been 
demonstrated by formal verification."
http://www.dstan.mod.uk/data/00/055/01000200.pdf

Since CbC is just about formal methods, as per standard 00-55, iFACT 
reliability was merely demonstrated by formal verification.

No testing means no visibility on problems, not even a doubt it might have 
any problem.

e.g.: "Crew had also turned off alarms that would have alerted them to 
danger."


WHAT ENGINEERING SAYS:
"Formal verification will not make complex systems simple either. There is 
no magic here. Some limitations of formal verification include that the 
system of formal mathematics, such as proofs, may be larger than the program 
being verified. Often, the formal verification is very difficult to 
construct. Like anything difficult to construct, it is likely to contain 
errors. Some limited aspects of system behavior may be proven, but are these 
errors likely to be caught in testing anyway? Are these errors ones that are 
likely to cause accidents? Formal verification probably has little effect on 
safety unless it is aimed at the safety constraints."
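
To make that last sentence concrete, here is a hedged illustration (an
invented example in SPARK-style contract notation; it has nothing to do
with the actual iFACTS code): a contract can be proved completely and
still say nothing about the property that actually matters for safety.

   package Conflict_Times
   is
      subtype Minutes is Integer range 0 .. 120;

      -- The proof can show that the result is in range and equals the
      -- stated expression - and still say nothing about whether a warning
      -- this late is of any use to a controller.  That judgement is the
      -- safety constraint, and it lives outside this contract.
      function Time_To_Conflict (Distance      : Natural;
                                 Closing_Speed : Positive) return Minutes;
      --# pre    Distance / Closing_Speed <= Minutes'Last;
      --# return Distance / Closing_Speed;
   end Conflict_Times;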


In facts, the iFACTS fiasco was predictable before the iFACTS project even 
begins.

With some unit and non-regression testing, iFACTS divergences would have 
been early detected.
i.e.: iFACTS would have had a chance to be commissioned, and another safety 
risk could have been addressed.

http://seattletimes.nwsource.com/html/travel/2004128412_webaircanada16.html
e.g.: January the 10th, 2008, over BC, "Wake from passing plane may have 
caused Air Canada jet's plunge: 10 passengers injured."
i.e.: Medium Term Conflict Detection strongly needed, but not iFACTS.

Cheers,
Michael, BC


NOTES:
In 2006, an integrated product team comprising NATS, Lockheed Martin and 
Praxis launched the interim Future Area Control Tools Support (iFACTS) 
system,
http://www.lockheedmartin.com/products/london-area-control-center/index.html

March 12, 2007 - iFACTS poised to transform ATC:
"Praxis managing director Keith Williams says that, at present, the system 
is "85% of the way to being operational",
http://www.tmcnet.com/usubmit/2007/03/12/2407775.htm

SigAda 2007 - Fairfax, Virginia, USA, November 4-9, 2007 - as usual, 
Praxis-HIS proudly presented CbC
http://www.sigada.org/conf/sigada2007/

Next Ada Europe, Venice, Italy, 16 - 20, June 2008 - CbC is gone (imbroglio 
or fiasco?)
http://www.math.unipd.it/ae2008/
SIGAda 2008: Portland, Oregon, USA, October 26-30, 2008 - is CbC yet coming? 






* Re: SPARK User Group 2008
  2008-05-27 18:43     ` Michael
@ 2008-05-27 19:23       ` Simon Wright
  2008-05-27 20:07         ` Jeffrey R. Carter
  2008-05-28  9:51       ` iFACTS (was: SPARK User Group 2008) Stuart
  1 sibling, 1 reply; 14+ messages in thread
From: Simon Wright @ 2008-05-27 19:23 UTC (permalink / raw)


I don't understand.

Either you *know* that iFacts has got problems/has failed, in which
case *WHY DON'T YOU POST THE EVIDENCE SO WE CAN SEE IT TOO*, or you
*assume* that no news is bad news.




* Re: SPARK User Group 2008
  2008-05-27 19:23       ` Simon Wright
@ 2008-05-27 20:07         ` Jeffrey R. Carter
  2008-05-27 21:57           ` Ed Falis
  0 siblings, 1 reply; 14+ messages in thread
From: Jeffrey R. Carter @ 2008-05-27 20:07 UTC (permalink / raw)


Simon Wright wrote:
> I don't understand.
> 
> Either you *know* that iFacts has got problems/has failed, in which
> case *WHY DON'T YOU POST THE EVIDENCE SO WE CAN SEE IT TOO*, or you
> *assume* that no news is bad news.

Don't you recognize proof by blatant assertion?

-- 
Jeff Carter
"What's the amount of the insult?"
Never Give a Sucker an Even Break
104




* Re: SPARK User Group 2008
  2008-05-27 20:07         ` Jeffrey R. Carter
@ 2008-05-27 21:57           ` Ed Falis
  0 siblings, 0 replies; 14+ messages in thread
From: Ed Falis @ 2008-05-27 21:57 UTC (permalink / raw)


On Tue, 27 May 2008 16:07:33 -0400, Jeffrey R. Carter  
<spam.jrcarter.not@spam.acm.org> wrote:

> Simon Wright wrote:
>> I don't understand.
>>  Either you *know* that iFacts has got problems/has failed, in which
>> case *WHY DON'T YOU POST THE EVIDENCE SO WE CAN SEE IT TOO*, or you
>> *assume* that no news is bad news.
>
> Don't you recognize proof by blatant assertion?
>

One of my favorite rhetorical devices, especially when it's a peripheral  
phrase in a statement about something else.  Amazing how those slip under  
people's radar.




* Re: iFACTS (was: SPARK User Group 2008)
  2008-05-27 18:43     ` Michael
  2008-05-27 19:23       ` Simon Wright
@ 2008-05-28  9:51       ` Stuart
  1 sibling, 0 replies; 14+ messages in thread
From: Stuart @ 2008-05-28  9:51 UTC (permalink / raw)


"Michael" <Michael@home.ca> wrote in message 
news:ZCY_j.169009$rd2.39062@pd7urf3no...

<replying to Simon Wright>
> Did you also notice your finding was posted: Wed, 07 Mar 2007 14:53:47 
> GMT,
> That's a few minutes after NATS proudly announced the iFACTS final trials.
>
> In other words you just confirm the problem: iFACTS has vanished after the 
> NATS's final trial and no one can say what the problems were.
> (Since UK air traffic is still growing, delay is the safety problem)

Nor could one reasonably conclude there is a problem with iFACTS.  Deploying 
a new system into service is not a trivial undertaking!

> Such problem seems to emanate from a UK military standard (00-55)
> Requirements for safety related software in defence equipment - 1st August 
> 1997.
> "The testing shall exercise all properties and functions which have not 
> been demonstrated by formal verification."
> http://www.dstan.mod.uk/data/00/055/01000200.pdf

Presumably you are citing 37.1.1 a)

Might I suggest that you go on to read 37.1.1 b)

> Since CbC is just about formal methods, as per standard 00-55, iFACT 
> reliability was merely demonstrated by formal verification.

Aside from your selective quoting of 00-55, this also appears to be a poor 
presentation of the facts.  The first thing that is not clear is whether you 
mean 'formal method' in the sense 00-55/56 means them - or just some methods 
with a degree of prescription about them.

From what I understand CbC, as it might have been used on iFACTS, does use 
formalism in the 00-55/56 sense - but to say it is just about that is a 
gross distortion, given the criticisms being made.

The conclusion about how iFACT was demonstrated is a non-sequitur.  The use 
of CbC in the creation of iFACT tells us nothing about what other processes 
may have been used to underpin the system certification - or reliability!


> No testing means no visibility on problems, not even a doubt it might have 
> any problem.

This is an untrue statement!  Problems can, and are, revealed by processes 
other than testing.

> e.g.: "Crew had also turned off alarms that would have alerted them to 
> danger."

This example does not seem to illustrate the point you appeared to be trying 
to make.  Operational misuse of equipment is highly unlikely to be revealed 
by testing!

> In facts, the iFACTS fiasco was predictable before the iFACTS project even 
> begins.

As Simon Wright asks - please provide details of your claims.

> With some unit and non-regression testing, iFACTS divergences would have 
> been early detected.

This is bordering on the libellous.  I believe those involved in the 
development of iFACTS are highly professional and would use the appropriate 
level of testing.  They would be fully aware of the need to validate (not 
simply verify).

> i.e.: iFACTS would have had a chance to be commissioned, and another 
> safety risk could have been addressed.

Nothing you have presented to date even supports a claim that iFACTS is not 
due to be commissioned let alone any reasons why!

> http://seattletimes.nwsource.com/html/travel/2004128412_webaircanada16.html
> e.g.: January the 10th, 2008, over BC, "Wake from passing plane may have 
> caused Air Canada jet's plunge: 10 passengers injured."
> i.e.: Medium Term Conflict Detection strongly needed, but not iFACTS.

Whether iFACTS is suited to the Canadian Air Traffic situation or not may be 
a reasonable point of discussion - but it hardly impinges upon whether iFACTS 
meets its requirements for the UK Air Traffic situation!


> NOTES:
<snip>
> Next Ada Europe, Venice, Italy, 16 - 20, June 2008 - CbC is gone 
> (imbroglio or fiasco?)
> http://www.math.unipd.it/ae2008/

So because a conference does not discuss a particular topic which has 
previously been discussed you conclude that the subject matter is 
discredited!!!  0/10 - must do better!  Show working!

-- 
Stuart 






* Re: SPARK User Group 2008
  2008-04-28 16:02 SPARK User Group 2008 roderick.chapman
  2008-05-12 10:29 ` Michael
  2008-05-25 20:14 ` Michael
@ 2008-06-01 20:47 ` Michael
  2 siblings, 0 replies; 14+ messages in thread
From: Michael @ 2008-06-01 20:47 UTC (permalink / raw)


"Stuart" <stuart@0.0> wrote in message 
news:483d26f5$1_1@glkas0286.greenlnk.net...
> "Michael" <Michael@home.ca> wrote in message
> news:ZCY_j.169009$rd2.39062@pd7urf3no...
>
>  must do better!
> Stuart

Right, Stuart, you had a point: you extended the safety view, but got stuck
on details and got mixed up.

From that view, you could almost tell that the Praxis Correctness by 
Construction formal methods are incompatible with a software-intensive 
safety-critical large-scale system like NERC.

(Please do not confuse these with the formal methods from engineering
control theory, which provide a mathematical guarantee of the modelled
system's behaviour in the presence of state and input constraints - e.g. the
design of multi-objective control laws that, by construction, guarantee
stability and prevent input saturation.)
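
For readers outside that field, a hedged, textbook-style sketch of what
"by construction" means there (standard notation; the symbols below are
generic and have nothing to do with NERC or iFACTS):

   dx/dt = A*x + B*u,    with an actuator limit |u| <= u_max
   u     = K*x,          K chosen so that A + B*K is stable
   V(x)  = x^T * P * x,  P > 0 chosen so that V decreases along every
                         closed-loop trajectory
   Omega = { x : V(x) <= c },  with c small enough that |K*x| <= u_max
                               holds everywhere inside Omega

   Every trajectory starting in Omega then stays in Omega, converges to 0,
   and never saturates the actuator: stability and non-saturation hold by
   the construction itself, not by testing.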

Praxis's CbC formal methods are mathematical abstractions transposed into
the software development domain (i.e. Z, SPARK).  The intent is to enhance
reliability by verifying the coherence of specifications, the semantic
correctness of code, and resilience to dataflow defects (of the kind
sketched below).
Such a set of CbC principles, distilled from formal methods (as he said),
should have been diluted in some engineering crystal-clear water.  That
would quite probably have prevented Praxis-HIS from driving iFACTS crazy -
as well as most of its engineers.
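
As a hedged sketch of that dataflow checking (an invented example, the
era's --# annotation syntax assumed): the derives annotation states
exactly which inputs each output may depend on, and the Examiner rejects
a body that reads an uninitialized variable, drops a stated dependency,
or adds an extra one.

   package Track_Merge
   is
      procedure Merge (Radar, Plan : in  Integer;
                       Best        : out Integer);
      --# derives Best from Radar, Plan;
   end Track_Merge;

   package body Track_Merge
   is
      procedure Merge (Radar, Plan : in  Integer;
                       Best        : out Integer)
      is
      begin
         -- Consistent with the derives annotation above.  Writing
         -- "Best := Radar;" instead would be reported: the declared
         -- dependency on Plan would no longer be honoured.
         Best := (Radar + Plan) / 2;
      end Merge;
   end Track_Merge;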


NATS's NERC was developed by Lockheed Martin in Rockville, MD.  After some
adaptations in the UK (e.g. problems with 0 degrees longitude, and so on)
NERC became solid as a rock - finally.

iFACTS is almost plug and play (Lockheed Martin enhanced the NERC interfaces 
accordingly).

Announcements of iFACTS's successful development were interrupted soon after
March the 7th, 2007 - indeed, soon after iFACTS hit NERC at the NATS final
trial (probably just about the time Praxis-HIS consulted its formal
specifications and tried to figure out what could have happened).

A solution: keep the rock and get a new ferry (rock-proof)?
Praxis-HIS probably hasn't yet realized that it would be more cost- and
time-effective if Lockheed Martin restarted the job, but in a manner
compatible with the NERC Maryland style (i.e. SEI CMMI level 3 or higher).


iFACTS development is following an old, predictable pattern (e.g. SEI CMMI -
between levels 1 and 2), i.e. increasing the divergences without addressing
deficiencies, delays or risk reduction; that is not an innovation.   ATC
projects are all interconnected.  Better to keep them on track, like this:
http://www.bcferries.com/about/super_c_tracker.html


NERC using Praxis's iFACTS composes one hybrid system, made from hybrid
technology and hybrid development processes and methods.
It is increasingly common to see safety risks arising from the unforeseen
interactions of reliable sub-systems.
Given a problem, to which of these sub-systems can a correction be applied?

Consider that both systems can have a few intrinsic defects (one more than
the other).
Knowing one dysfunction, can you trace back the related deficiency by
following the anomalies across different conceptual models?
You would have to continuously transpose your findings from one model to the
other, and back again.
(Worse: the software development environments do not have to be compatible,
nor do the problem-solving tools.  It is easy to get caught by a lot of
unrelated points - depending on the size of the views they might look weird,
but they are not necessarily related to any anomaly.)

Stuart, you already got mixed up by the transposition of a simple lack of
visibility (iFACTS's lack of testing vs. misuse of ferry equipment).

Ada wrote an application without a target to run it on.  Later, it worked!

iFACTS's conceptual methods are also about math!
Never heard of the Laplace transform?  The application domains and purposes
are different, but one corollary is the same: an error made in one domain
can hardly be seen in the other, only within the application's results -
i.e. rather too late.

Please, would you be kind enough to report your findings to Praxis-HIS; they
might appreciate your support.  (While completing SHOLIS, they perceived a
few of the limitations for more complex projects.  Either they didn't
remember, or they wanted a second opinion.)



On the other hand, Praxis-HIS inappropriately promotes its High Integrity
Software capability by emphasizing the SPARK approach to safety and
security.

Better to say, "SPARK is suited to writing programs that need to be
reliable, and is thus particularly relevant to those application areas where
safety and security are important." (John Barnes)

Reliability is based on the probability an item will perform its required 
function in the specified manner over a given time period and under 
specified or assumed conditions.
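
As a hedged back-of-the-envelope illustration of that definition (the
figures are invented, not iFACTS data), under the usual constant-failure-
rate assumption:

   R(t) = exp(-lambda * t)
   lambda = 1.0e-4 failures per hour   (illustrative figure only)
   R(1000 hours) = exp(-0.1) ~ 0.905

   i.e. about a 90% probability of performing the required function over
   that period, under the stated conditions - and nothing more.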

Beyond reliability, safety is not exactly a SPARK capability.

Safety is a planned, disciplined, and systematic approach to preventing or 
reducing accidents throughout the life cycle of a system.  That requires 
getting the global safety picture (even the configuration data).

"Too narrow, a view of architecture will lead to a very pretty 
infrastructure, but the wrong infrastructure for the problem at hand". 
(Grady Booch)

"The safety risk associated with a complex software-intensive system is a 
dynamic property that could easily vary with modifications to the system or 
changes to the operational environment.

Mechanical, electrical, chemical and many other engineering disciplines have 
well established approaches for managing safety risk. However, the 
increasing dependence of complex systems on software control has 
fundamentally changed the source of safety risk for such systems."

Engineering is not an exact science.  There is no simple solution. Safe 
systems require time, effort, special engineering knowledge and experience.


As was said, the Praxis-HIS SPARK approach still has one advantage: it
didn't make iFACTS reliable, but it did make it safe, since iFACTS can't be
used.

Assuming another safety risk is just a matter of not delaying for long some 
Medium Term Conflict Detection capabilities within NERC.

Cheers,

Michael,
Vancouver, BC


THE POINT:
"It is not enough to talk about "absolute safety" and of "zero accident". 
There must also be proper organisation and management to ensure that actions 
live up to words." (UK Department of Transport)

iFACTS didn't go straight to this point!  Praxis-HIS is not even pretending
it could!
To satisfy the DoT, a US detour seems appropriate.

http://www.praxis-his.com/pdfs/cost_effective_proof.pdf
http://www.anthonyhall.org/html/papers_on_formal_methods.html






end of thread, other threads:[~2008-06-01 20:47 UTC | newest]

Thread overview: 14+ messages
2008-04-28 16:02 SPARK User Group 2008 roderick.chapman
2008-05-12 10:29 ` Michael
2008-05-13  7:47   ` Simon Wright
2008-05-16  6:57     ` Michael
2008-05-16  8:21       ` stefan-lucks
2008-05-16 21:41       ` Simon Wright
2008-05-25 20:14 ` Michael
2008-05-26 10:06   ` Simon Wright
2008-05-27 18:43     ` Michael
2008-05-27 19:23       ` Simon Wright
2008-05-27 20:07         ` Jeffrey R. Carter
2008-05-27 21:57           ` Ed Falis
2008-05-28  9:51       ` iFACTS (was: SPARK User Group 2008) Stuart
2008-06-01 20:47 ` SPARK User Group 2008 Michael
