comp.lang.ada
From: "Randy Brukardt" <randy@rrsoftware.com>
Subject: Re: The future of Spark . Spark 2014 : a wreckage
Date: Wed, 10 Jul 2013 19:36:42 -0500
Message-ID: <krkums$ldf$1@loke.gir.dk>
In-Reply-To: 7ebe0439-fa4c-4374-a6c7-365ac78e6e39@googlegroups.com

<vincent.diemunsch@gmail.com> wrote in message 
news:7ebe0439-fa4c-4374-a6c7-365ac78e6e39@googlegroups.com..., replying to 
me

>> I think existing Spark is evil, in that it funnels off effort best 
>> applied
>> to the entire Ada language (or at least, a much larger subset). Computers
>> are now powerful enough that the excuses for avoiding proofs of 
>> exceptions,
>> dynamic memory allocation, and the like no longer are valid.
>
>Spark imposes static memory allocation to be able to prove total 
>correctness,
>which means that the program will always do what is expected, which is 
>required
>by embedded softwares...

The idea of "total correctness" is exactly what I object to. It's a false 
promise that only works on tiny, toy programs. Real Spark programs almost 
always interface to "regular" Ada code to do stuff that Spark can't handle, 
and once you've done that, you've lost your "total correctness".

In any case, static memory allocation is not necessary to prove correctness. 
All that's needed is discipline in managing dynamically allocated memory 
(as happens inside of the containers, for instance). Ada already has strong 
typing of pointers, so the only problem affecting correctness that must be 
avoided is premature destruction. Moreover, in Ada you can use storage pools 
to keep the memory "theaters" separate, so you can avoid problems with 
fragmentation. (Although those worries are way overblown; fragmentation 
problems are very unlikely to occur in practice.)
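As a sketch of that pool idea (all names here are illustrative, and the 
Allocate body is omitted): a pool type that carves its allocations out of 
one fixed arena, tied to a single access type, so those allocations can 
never fragment anything else:

```ada
with System.Storage_Pools;    use System.Storage_Pools;
with System.Storage_Elements; use System.Storage_Elements;

package Separate_Theaters is

   --  Hypothetical pool: all Node allocations come from Arena.
   type Arena_Pool (Size : Storage_Count) is
     new Root_Storage_Pool with record
      Arena : Storage_Array (1 .. Size);
      Next  : Storage_Offset := 1;
   end record;

   overriding procedure Allocate
     (Pool      : in out Arena_Pool;
      Address   : out System.Address;
      Size      : Storage_Count;
      Alignment : Storage_Count);
   --  (body elided: bump Pool.Next, aligned, and fail past Pool.Size)

   overriding procedure Deallocate
     (Pool      : in out Arena_Pool;
      Address   : System.Address;
      Size      : Storage_Count;
      Alignment : Storage_Count) is null;  --  arena is freed as a whole

   overriding function Storage_Size
     (Pool : Arena_Pool) return Storage_Count is (Pool.Size);

   Node_Pool : Arena_Pool (Size => 64 * 1024);

   type Node;
   type Node_Ptr is access Node;
   for Node_Ptr'Storage_Pool use Node_Pool;  --  ties the type to the pool

   type Node is record
      Value : Integer;
      Link  : Node_Ptr;
   end record;

end Separate_Theaters;
```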

It's certainly true that a tiny minority of systems cannot tolerate any 
possibility of downtime, and those probably have to stick with static 
allocation (although they could still use pools as an alternative). But it 
is silly to describe that as any sort of mainstream need.

>> The Globals notation is much more important. We tried to design such a
>> notation for Ada 2012, but it became too much work to complete and there 
>> was
>> a lot of concern about getting it wrong. So we left this for the 
>> "experts".
>> It appears that the experts are working on it, which is good, because we
>> still want this capability in the next version of Ada (whenever that will
>> be). And for that to be the case, we'll need it to have proper Ada syntax
>> using aspects.

>Ada has been designed so as to let global variables be freely accessed. 
>Only a
>pragma was used to specify that a function is Pure (of side effects) and 
>can be
>skipped for optimisation. Therefore it is now almost impossible to change 
>that

Right, we'd never change that.

>and that is probably the reason why you had so much difficulties to define 
>the
>global annotation, which is trivial,

It's not remotely trivial, because you have to deal with global variables 
that you don't have visibility onto and thus must not name (those in package 
bodies, especially), you have to deal intelligently with storage pools and 
access types, and you have to deal with generic units. Spark sidesteps almost 
all of these issues by not allowing the constructs, which is a non-starter 
for the full Ada language. We also required checking that the annotations 
were accurate, and that too is a fairly complex issue.

>because how will you deal with a procedure that have no annotation, but 
>still
>uses global variables ? A simple warning at compile time ? Refuse to 
>compile ?

No annotation means "uses anything", of course. You have to specify that a 
routine does NOT use globals. Perhaps that's backwards of what we would do 
if starting from scratch, but we're not starting from scratch. This is the 
same way we dealt with "overriding" and "limited" and lots of other things.
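In the aspect form that SPARK 2014 uses (and that a future Ada globals 
annotation would presumably resemble), the default and the explicit cases 
look like this (the routines themselves are hypothetical):

```ada
--  No annotation: the routine may read or write anything.
procedure Log (Msg : String);

--  Explicitly states that no globals are touched.
procedure Add (X, Y : Integer; Sum : out Integer)
  with Global => null;

--  Explicitly names the global that is read and written.
Counter : Natural := 0;
procedure Bump
  with Global => (In_Out => Counter);
```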

>Aspects syntax is ugly. It is simply the return of a kind of primitive 
>language like LISP,
>done with words and arrows. It shows clearly that it is simply a 
>workaround.

It's ten times better than using pragmas. Pragmas are the worst possible way 
to do anything. The rest is just Ada expression syntax, which has the 
distinct advantage that programmers don't have to learn anything new to use 
it. If you don't believe that Ada syntax is far superior to any other 
language's syntax, go back to curly bracket hell. :-) The barrier to using 
Spark is far too high, because it's a completely new syntax.

I agree with you that the Depends syntax is ugly, but that's mainly because 
Depends is a stupid thing that should not exist, and thus does not fit at 
all into the Ada model.
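For comparison, here is the same kind of contract in the pragma style 
versus the aspect style, plus the Depends "words and arrows" (Stack, Item, 
and the query functions are hypothetical; pragma Precondition is GNAT's 
pre-aspect spelling):

```ada
--  Pragma form (the old way): the assertion is detached from
--  the declaration it describes.
procedure Push (S : in out Stack; X : Item);
pragma Precondition (not Is_Full (S));

--  Aspect form (Ada 2012): ordinary Ada expressions, attached
--  directly to the declaration.
procedure Pop (S : in out Stack; X : out Item)
  with Pre  => not Is_Empty (S),
       Post => Length (S) = Length (S)'Old - 1;

--  The SPARK 2014 Depends aspect -- the "words and arrows":
procedure Swap (A, B : in out Integer)
  with Depends => (A => B, B => A);
```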

>> No, to allow *compile-time* analysis of pre and post conditions. An Ada
>> compiler cannot look into the body of a package to see what is there, and
>> without expression functions, almost no compile-time analysis could be
>> accomplished. (We also need globals annotations for similar reasons.) The
>> real end-game here ought to be compile-time proof (that is, proofs that 
>> are
>> created as part of the normal compilation of Ada code).

>A proof can't be generated at compile time ! The proof must be written
>independently, based on formal specification, and checked by an analyser,
>that may or may not be the compiler.

Wow, what an elitist attitude! *Every* compiler, for every language, uses 
proof techniques when it does optimizations and code generation. "I can 
prove that V always has value 5 here, because every path to here either 
fails to modify V or sets it to 5." Why shouldn't Ada programmers have 
access to that sort of checking?

Moreover, only a very small number of people/projects are willing to spend 
extra time to prove a working program. Schedules and the like make those 
sorts of extras the first thing to go. If we want more people to use these 
sorts of tools, they have to be integrated into the compiler so that using 
them is just as automatic as getting type and range checks.
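The expression-function point quoted above is concrete: because an 
expression function's body is part of its declaration, the compiler can 
see it from the spec without breaking privacy. A minimal sketch:

```ada
package Stacks is
   type Stack is private;

   function Is_Empty (S : Stack) return Boolean;

   procedure Pop (S : in out Stack)
     with Pre => not Is_Empty (S);

private
   type Stack is record
      Top : Natural := 0;
   end record;

   --  An expression function: the "body" is visible in the spec,
   --  so the compiler can reason about Pre at every call site.
   function Is_Empty (S : Stack) return Boolean is (S.Top = 0);
end Stacks;
```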

>> Proofs of any kind should never, ever depend on anything in the package 
>> body
[other than the subprogram that you are proving, of course.]

>As others told you, a proof is a proof that the implementation respects the 
>specification,
>therefore it needs to make an interpretation of the package body.

It sees only the one, single package body that it is currently proving. 
Looking into any *other* package body adds additional dependencies.

>> Spark users are on a certain dead-end, IMHO. The future is getting proof
>> into the hands of all programmers, as a normal part of their day-to-day
>>  programming. That requires that proofs aren't necessarily complete, that
>> they use whatever information is available, and most of all, that they 
>> only
>> depend on the specifications of packages (never, ever, on the contents of 
>> a
>>  body).

>But you seem to make also the confusion between a test and a proof, and 
>that is
>exactly what AdaCore is doing now : they simply write the tests as post-
>conditions and uses symbolic interpretation during compile time. I think 
>that is
>what you mention by "into the hands of all programmers", ie even those with
>very few theoretical background.

Right; no one should need to understand the details in order to get the 
benefits. Do you understand how your Ada compiler does code selection or 
optimization? Of course not, but you still get the benefits.

And the entire point is that there is a strong correlation between test and 
proof. The programmer writes a "test" (as you put it; Ada 2012 calls it an 
assertion), and the compiler uses proof to verify that it is true (including 
all of the optimization tricks that compiler writers have been using for 
decades). If the proof succeeds, the "test" is eliminated from the generated 
code; if the proof fails, the "test" is generated, and the programmer gets a 
warning or error (depending on switches).
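In other words, something like this hypothetical Clamp: every path through 
the body establishes the postcondition, so the compiler can prove it and 
drop the generated check entirely:

```ada
procedure Clamp (V : in out Integer)
  with Post => V in 0 .. 100;

procedure Clamp (V : in out Integer) is
begin
   if V < 0 then
      V := 0;
   elsif V > 100 then
      V := 100;
   end if;
   --  Every path either leaves V in 0 .. 100 or sets it there,
   --  so the Post is provable and no run-time check is needed.
end Clamp;
```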

>It's a nice feature to sell and great fun for compiler writers but it's far 
>from a mathematical
>proof and what Spark was able to do. But mathematical proof requires formal 
>specification
>and that is far from the "all programmers" knowledge. The problem is that 
>recent books on
>Spark completly forgot to speak of the necessary formal specifications 
>(like Z for instance).

This is a case of the tail wagging the dog. (It's not surprising though, 
people often get so wrapped up in something that they forget why they 
started to do it in the first place.)

The goal here is to reduce the number of bugs in programs. No one cares how 
that is accomplished. "Formal mathematical proof" is one such tool, but it's 
almost completely inaccessible to the average programmer and project. And no 
project manager cares about a formal proof; they only care that their 
project is finished on time with a minimal number of bugs.

>To conclude here is my vision of Spark and Ada :
>- use Ada for things that are necessary for the compiler to produce the 
>code and to
>  express the semantic of the program.
>- use Pragmas only as compiler directives, for specific implementation or 
>optimisation
> (InLine, Ravenscar, Remote-Call-Interface...). Not for assertion, tests or 
> whatever.

Pragmas shouldn't be used for anything, IMHO. Maybe, other than for global 
properties; certainly nothing specific to an entity.

>- everything that is related to formal verification should stay as comment 
>in
>the code. These comments can then be verified by external tools. There can 
>be
>different levels of verification, not only the Spark vision.
>- formal specifications are necessary for formal verification, therefore 
>Spark
>should be able to make the link between formal annotations and formal 
>specification.
>- tests can be put either as comments or as external files, but please not 
>as aspects.
>It is ugly to write the detail of a procedure in the spec.

>Forget Aspects and worst expression functions : they are simply an error.
>Mixing testing, specification and proofs as pragma or aspects or a 
>combination
>of both of them will result in a too complicated language lacking of 
>abstraction and
>a proper syntax.

Too late, Ada 2012 was standardized in December and couldn't be rescinded if 
we wanted to. Far too much code already exists. You should have come to the 
ARG and made these comments 3 years ago. Of course, it's unlikely that they 
would have changed anything -- we had comments from members of the formal 
methods community and they were taken very seriously. And we decided to go 
with aspects, expression functions and the like. Now that the path has been 
set, it's likely that most will follow along.

You of course don't have to follow the path -- you can stay with Ada 2005 
and Spark 2005 (which I'm sure will continue to be supported for a long time 
to come). But if you want to stay at the bleeding edge, then you're going to 
have to adopt Ada 2012, and everything that comes with it.

It's fairly clear that you have a real problem with the Ada model of 
specification (as it has existed forever), because there is no mixing here: 
assertions, proofs, and specifications are all different aspects (sorry 
about the pun) of the same whole; what's artificial is separating them in the 
first place.

The truly formal stuff shouldn't be in the Ada source code at all (I'd go so 
far as to say it shouldn't *exist* at all, but who am I to prevent people 
from wasting their time).

Anyway, good luck, no matter how you decide to proceed with Ada. I hope of 
course, that you will get over your initial reaction to Ada 2012 and join us 
in a better future way of programming, but if not, good luck wasting time. 
:-)

                               Randy.



