Newsgroups: comp.lang.ada
Date: Wed, 10 Jul 2013 05:19:52 -0700 (PDT)
From: vincent.diemunsch@gmail.com
Subject: Re: The future of Spark . Spark 2014 : a wreckage

Hello Randy,

> I think existing Spark is evil, in that it funnels off effort best applied
> to the entire Ada language (or at least, a much larger subset). Computers
> are now powerful enough that the excuses for avoiding proofs of exceptions,
> dynamic memory allocation, and the like no longer are valid.

Spark imposes static memory allocation in order to be able to prove total
correctness, meaning that the program will always do what is expected. That
is required for embedded software...

> The Globals notation is much more important. We tried to design such a
> notation for Ada 2012, but it became too much work to complete and there was
> a lot of concern about getting it wrong. So we left this for the "experts".
> It appears that the experts are working on it, which is good, because we
> still want this capability in the next version of Ada (whenever that will
> be). And for that to be the case, we'll need it to have proper Ada syntax
> using aspects.

Ada was designed to let global variables be accessed freely; the only control
was a pragma specifying that a function is Pure (free of side effects) and may
be skipped for optimisation. It is now almost impossible to change that, and
that is probably why you had so much difficulty defining the Global
annotation, which is otherwise trivial: how will you deal with a procedure
that has no annotation but still uses global variables? A simple warning at
compile time? A refusal to compile?
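For illustration, here is roughly what such an aspect-based Global annotation
looks like in SPARK 2014 (a sketch only; the package and names are mine):

   package Counters is
      Count : Natural := 0;

      --  The globals touched by Increment are declared on the spec,
      --  as an aspect, so the analyser need not look at the body.
      procedure Increment
        with Global => (In_Out => Count);
   end Counters;

As I understand it, when the aspect is missing the tools must either
reconstruct an approximation of the globals from the body or assume the worst,
which is exactly the question above.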
Aspect syntax is ugly. It is simply the return of a kind of primitive
language, like LISP, done with words and arrows. It shows clearly that aspects
are a workaround.

> No, to allow *compile-time* analysis of pre and post conditions. An Ada
> compiler cannot look into the body of a package to see what is there, and
> without expression functions, almost no compile-time analysis could be
> accomplished. (We also need globals annotations for similar reasons.) The
> real end-game here ought to be compile-time proof (that is, proofs that are
> created as part of the normal compilation of Ada code).

A proof can't be generated at compile time! A proof must be written
independently, based on a formal specification, and checked by an analyser
that may or may not be the compiler.

> Proofs of any kind should never, ever depend on anything in the package body

As others have told you, a proof is a proof that the implementation respects
the specification; therefore it has to interpret the package body.

> Spark users are on a certain dead-end, IMHO. The future is getting proof
> into the hands of all programmers, as a normal part of their day-to-day
> programming. That requires that proofs aren't necessarily complete, that
> they use whatever information is available, and most of all, that they only
> depend on the specifications of packages (never, ever, on the contents of a
> body).

But you also seem to confuse a test with a proof, and that is exactly what
AdaCore is doing now: they simply write the tests as postconditions and use
symbolic interpretation at compile time. I think that is what you mean by
"into the hands of all programmers", i.e. even those with very little
theoretical background.

It's a nice feature to sell, and great fun for compiler writers, but it is far
from a mathematical proof and from what Spark was able to do. Mathematical
proof requires a formal specification, and that is far beyond the knowledge of
"all programmers". The problem is that recent books on Spark completely forget
to speak of the necessary formal specifications (like Z, for instance).

To conclude, here is my vision of Spark and Ada:
- use Ada for what is necessary for the compiler to produce the code and to
  express the semantics of the program;
- use pragmas only as compiler directives, for specific implementation or
  optimisation concerns (Inline, Ravenscar, Remote_Call_Interface...), not
  for assertions, tests or whatever;
- everything related to formal verification should stay as comments in the
  code (see the sketch in the postscript below); these comments can then be
  verified by external tools, and there can be different levels of
  verification, not only the Spark vision;
- formal specifications are necessary for formal verification, therefore
  Spark should be able to make the link between formal annotations and a
  formal specification;
- tests can be put either in comments or in external files, but please not in
  aspects: it is ugly to write the details of a procedure in its spec.

Forget Aspects and, worse, expression functions: they are simply an error.
Mixing testing, specification and proofs as pragmas, aspects or a combination
of the two will result in a language that is too complicated, lacking
abstraction and a proper syntax.

Regards,

Vincent
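P.S. As a reminder of what annotations-as-comments look like, here is a small
sketch in the classical SPARK (SPARK 2005) style; the package and names are
only illustrative:

   -- To the compiler everything after "--" is an ordinary comment;
   -- only the external Examiner interprets the "--#" annotations.
   package Stack
   --# own State;
   --# initializes State;
   is
      procedure Push (X : in Integer);
      --# global in out State;
      --# derives State from State, X;
   end Stack;

The compiler ignores these lines entirely; an external tool such as the SPARK
Examiner reads and checks them, which is precisely the separation I am arguing
for.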