From: "Randy Brukardt"
Newsgroups: comp.lang.ada
Subject: Re: The future of Spark . Spark 2014 : a wreckage
Date: Wed, 10 Jul 2013 19:36:42 -0500
Organization: Jacob Sparre Andersen Research & Innovation
References: <7ebe0439-fa4c-4374-a6c7-365ac78e6e39@googlegroups.com>

wrote in message news:7ebe0439-fa4c-4374-a6c7-365ac78e6e39@googlegroups.com..., replying to me:

>> I think existing Spark is evil, in that it funnels off effort best applied
>> to the entire Ada language (or at least, a much larger subset). Computers
>> are now powerful enough that the excuses for avoiding proofs of exceptions,
>> dynamic memory allocation, and the like no longer are valid.
>
> Spark imposes static memory allocation to be able to prove total
> correctness, which means that the program will always do what is expected,
> which is required by embedded software...

The idea of "total correctness" is exactly what I object to. It's a false
promise that only works on tiny, toy programs. Real Spark programs almost
always interface to "regular" Ada code to do stuff that Spark can't handle,
and once you've done that, you've lost your "total correctness".

In any case, static memory allocation is not necessary to prove correctness.
All that's needed is discipline in managing dynamically allocated memory (as
happens inside the containers, for instance). Ada already has strong typing
of pointers, so the only problem affecting correctness that has to be avoided
is premature destruction. Moreover, in Ada you can use storage pools to keep
the memory "theaters" separate, so you can avoid problems with fragmentation.
(Although those concerns are way overblown; they're very unlikely to occur in
practice.)

It's certainly true that a tiny minority of systems cannot tolerate any
possibility of downtime, and those probably do have to stick with static
allocation (although they could still use pools as an alternative). But it is
silly to describe that as any sort of mainstream need.
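[To make the pool idea concrete, here is a minimal sketch of a fixed-size
arena pool. The Arena_Pools package, its bump-allocation scheme, and the
sizes are invented for illustration and are not from this discussion; binding
an access type to such a pool confines all of its allocations, and any
fragmentation, to one statically sized buffer.]

   with System;
   with System.Storage_Elements; use System.Storage_Elements;
   with System.Storage_Pools;

   package Arena_Pools is
      --  A fixed-size arena: every allocation is carved out of one
      --  statically sized buffer owned by the pool object.
      type Arena (Size : Storage_Count) is
        new System.Storage_Pools.Root_Storage_Pool with record
         Data : Storage_Array (1 .. Size);
         Next : Storage_Count := 1;
      end record;

      overriding procedure Allocate
        (Pool                     : in out Arena;
         Storage_Address          : out System.Address;
         Size_In_Storage_Elements : Storage_Count;
         Alignment                : Storage_Count);

      --  A real pool would reclaim storage here; a pure arena does not.
      overriding procedure Deallocate
        (Pool                     : in out Arena;
         Storage_Address          : System.Address;
         Size_In_Storage_Elements : Storage_Count;
         Alignment                : Storage_Count) is null;

      overriding function Storage_Size (Pool : Arena) return Storage_Count
        is (Pool.Size);
   end Arena_Pools;

   package body Arena_Pools is
      overriding procedure Allocate
        (Pool                     : in out Arena;
         Storage_Address          : out System.Address;
         Size_In_Storage_Elements : Storage_Count;
         Alignment                : Storage_Count)
      is
         --  Round the offset up to the requested alignment (assumes the
         --  buffer itself starts on a sufficiently aligned address).
         Skew  : constant Storage_Count :=
           (if Alignment = 0 then 0
            else (Alignment - (Pool.Next - 1) mod Alignment) mod Alignment);
         Start : constant Storage_Count := Pool.Next + Skew;
      begin
         if Start > Pool.Size
           or else Pool.Size - Start + 1 < Size_In_Storage_Elements
         then
            raise Storage_Error;   --  arena exhausted
         end if;
         Storage_Address := Pool.Data (Start)'Address;
         Pool.Next := Start + Size_In_Storage_Elements;
      end Allocate;
   end Arena_Pools;

[An access type is then tied to a particular arena with
"for Node_Access'Storage_Pool use My_Arena;", after which allocators for that
type never touch the global heap.]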
>> The Globals notation is much more important. We tried to design such a
>> notation for Ada 2012, but it became too much work to complete, and there
>> was a lot of concern about getting it wrong. So we left this for the
>> "experts". It appears that the experts are working on it, which is good,
>> because we still want this capability in the next version of Ada (whenever
>> that will be). And for that to be the case, we'll need it to have proper
>> Ada syntax using aspects.
>
> Ada has been designed so as to let global variables be freely accessed.
> Only a pragma was used to specify that a function is Pure (of side effects)
> and can be skipped for optimisation. Therefore it is now almost impossible
> to change that

Right, we'd never change that.

> and that is probably the reason why you had so many difficulties defining
> the global annotation, which is trivial,

It's not remotely trivial, because you have to deal with global variables
that you don't have visibility onto and thus must not name (those in package
bodies, especially), you have to deal intelligently with storage pools and
access types, and you have to deal with generic units. Spark ignores almost
all of these issues by not allowing the constructs, which is a non-starter
for a construct intended for the full Ada language. We also required checking
that the annotations were accurate, and that too is a fairly complex issue.

> because how will you deal with a procedure that has no annotation, but
> still uses global variables ? A simple warning at compile time ? Refuse to
> compile ?

No annotation means "uses anything", of course. You have to specify that a
routine does NOT use globals. Perhaps that's backwards from what we would do
if starting from scratch, but we're not starting from scratch. This is the
same way we dealt with "overriding" and "limited" and lots of other things.

> Aspects syntax is ugly. It is simply the return of a kind of primitive
> language like LISP, done with words and arrows. It shows clearly that it is
> simply a workaround.

It's ten times better than using pragmas. Pragmas are the worst possible way
to do anything. The rest is just Ada expression syntax, which has the
distinct advantage that programmers don't have to learn anything new to use
it. If you don't believe that Ada syntax is far superior to any other
language's syntax, go back to curly bracket hell. :-)

The barrier to using Spark is far too high, because it's a completely new
syntax. I agree with you that the Depends syntax is ugly, but that's mainly
because Depends is a stupid thing that should not exist, and thus does not
fit at all into the Ada model.
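[For readers who haven't seen the notation being argued about, it looks
roughly like this. Global and Depends are SPARK 2014 aspects (Pre and Post
are standard Ada 2012); the package, its variable, and the bound are made up
purely for illustration, and the SPARK aspects need the SPARK tools rather
than a plain Ada compiler to be analysed.]

   package Tally is
      Total : Natural := 0;   --  a package-level (global) variable

      procedure Add (X : Natural)
        with Global  => (In_Out => Total),       --  reads and updates Total
             Depends => (Total => (Total, X)),   --  the "words and arrows" style
             Pre     => X <= Natural'Last - Total,
             Post    => Total = Total'Old + X;

      --  Global => null would promise that a subprogram touches no global
      --  state at all; omitting the aspect means "may use anything".
      function Current return Natural
        with Global => (Input => Total);
   end Tally;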
>> No, to allow *compile-time* analysis of pre and post conditions. An Ada
>> compiler cannot look into the body of a package to see what is there, and
>> without expression functions, almost no compile-time analysis could be
>> accomplished. (We also need globals annotations for similar reasons.) The
>> real end-game here ought to be compile-time proof (that is, proofs that
>> are created as part of the normal compilation of Ada code).
>
> A proof can't be generated at compile time ! The proof must be written
> independently, based on formal specification, and checked by an analyser,
> that may or may not be the compiler.

Wow, what an elitist attitude! *Every* compiler, for every language, uses
proof techniques when it does optimizations and code generation. "I can prove
that V always has value 5 here, because every path to here either fails to
modify V or sets it to 5." Why shouldn't Ada programmers have access to that
sort of checking?

Moreover, only a very small number of people/projects are willing to spend
extra time to prove a working program. Schedules and the like make those
sorts of extras the first thing to go. If we want more people to use these
sorts of tools, they have to be integrated into the compiler so that using
them is just as automatic as getting type and range checks.

>> Proofs of any kind should never, ever depend on anything in the package
>> body [other than the subprogram that you are proving, of course].
>
> As others told you, a proof is a proof that the implementation respects the
> specification, therefore it needs to make an interpretation of the package
> body.

Of the one, single package body that it is currently proving, yes. Looking
into any *other* package body adds additional dependencies.

>> Spark users are on a certain dead-end, IMHO. The future is getting proof
>> into the hands of all programmers, as a normal part of their day-to-day
>> programming. That requires that proofs aren't necessarily complete, that
>> they use whatever information is available, and most of all, that they
>> only depend on the specifications of packages (never, ever, on the
>> contents of a body).
>
> But you seem to also make the confusion between a test and a proof, and
> that is exactly what AdaCore is doing now : they simply write the tests as
> post-conditions and use symbolic interpretation during compile time. I
> think that is what you mean by "into the hands of all programmers", i.e.
> even those with very little theoretical background.

Right; no one should need to understand the details in order to get the
benefits. Do you understand how your Ada compiler does code selection or
optimization? Of course not, but you still get the benefits.

And the entire point is that there is a strong correlation between test and
proof. The programmer writes a "test" (as you put it; Ada 2012 calls it an
assertion), and the compiler uses proof to verify that it is true (including
all of the optimization tricks that compiler writers have been using for
decades). If the proof succeeds, the "test" is eliminated from the generated
code; if the proof fails, the "test" is generated, and the programmer gets a
warning or error (depending on switches).
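[A small illustration of that workflow; the package, the bound, and the names
are invented for this example. The expression function's "body" is visible in
the spec, so a compiler or prover can discharge the postcondition without
looking into the package body; where it cannot, Ada 2012 simply compiles the
contract into a run-time check.]

   package Counters is
      Max : constant := 100;
      subtype Count is Integer range 0 .. Max;

      --  Expression function: its completion is part of the spec, so callers
      --  and provers can reason about it with no body dependence at all.
      function Saturated (C : Count) return Boolean is (C = Max);

      procedure Increment (C : in out Count)
        with Pre  => not Saturated (C),
             Post => C = C'Old + 1;
   end Counters;

   package body Counters is
      procedure Increment (C : in out Count) is
      begin
         C := C + 1;
         --  From the precondition, C'Old <= Max - 1, so C = C'Old + 1 stays
         --  in range and the postcondition holds: a prover can eliminate both
         --  the range check and the Post check. Failing that, the checks are
         --  simply executed when assertions are enabled.
      end Increment;
   end Counters;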
> It's a nice feature to sell and great fun for compiler writers, but it's
> far from a mathematical proof and from what Spark was able to do. But
> mathematical proof requires formal specification, and that is far from the
> "all programmers" knowledge. The problem is that recent books on Spark
> completely forgot to speak of the necessary formal specifications (like Z,
> for instance).

This is a case of the tail wagging the dog. (It's not surprising, though;
people often get so wrapped up in something that they forget why they started
to do it in the first place.) The goal here is to reduce the number of bugs
in programs. No one cares how that is accomplished. "Formal mathematical
proof" is one such tool, but it's almost completely inaccessible to the
average programmer and project. And no project manager cares about a formal
proof; they only care that their project is finished on time with a minimal
number of bugs.

> To conclude, here is my vision of Spark and Ada :
> - use Ada for things that are necessary for the compiler to produce the
>   code and to express the semantics of the program.
> - use pragmas only as compiler directives, for specific implementation or
>   optimisation (Inline, Ravenscar, Remote_Call_Interface...). Not for
>   assertions, tests or whatever.

Pragmas shouldn't be used for anything, IMHO. Maybe, other than for global
properties; certainly nothing specific to an entity.

> - everything that is related to formal verification should stay as comments
>   in the code. These comments can then be verified by external tools. There
>   can be different levels of verification, not only the Spark vision.
> - formal specifications are necessary for formal verification, therefore
>   Spark should be able to make the link between formal annotations and
>   formal specification.
> - tests can be put either as comments or as external files, but please not
>   as aspects. It is ugly to write the detail of a procedure in the spec.
>
> Forget Aspects and, worse, expression functions : they are simply an error.
> Mixing testing, specification and proofs as pragmas or aspects or a
> combination of both of them will result in a too-complicated language
> lacking abstraction and a proper syntax.

Too late. Ada 2012 was standardized in December and couldn't be rescinded
even if we wanted to. Far too much code already exists. You should have come
to the ARG and made these comments 3 years ago. Of course, it's unlikely that
they would have changed anything -- we had comments from members of the
formal methods community, and they were taken very seriously. And we decided
to go with aspects, expression functions and the like.

Now that the path has been set, it's likely that most will follow along. You
of course don't have to follow the path -- you can stay with Ada 2005 and
Spark 2005 (which I'm sure will continue to be supported for a long time to
come). But if you want to stay at the bleeding edge, then you're going to
have to adopt Ada 2012, and everything that comes with it.

It's fairly clear that you have a real problem with the Ada model of
specification (as it has existed forever), because there is no mixing here:
assertions, proofs, and specifications are all different aspects (sorry about
the pun) of the same whole; what's artificial is separating them in the first
place. The truly formal stuff shouldn't be in the Ada source code at all (I'd
go so far as to say it shouldn't *exist* at all, but who am I to prevent
people from wasting their time).

Anyway, good luck, no matter how you decide to proceed with Ada. I hope, of
course, that you will get over your initial reaction to Ada 2012 and join us
in a better future way of programming, but if not, good luck wasting time.
:-)

                                   Randy.