comp.lang.ada
From: "G.B." <rm-dash-bau-haus@dash.futureapps.de>
Subject: Re: I'm a noob here.
Date: Mon, 05 May 2014 18:40:11 +0200
Message-ID: <5367beeb$0$6706$9b4e6d93@newsspool2.arcor-online.net>
In-Reply-To: <c5be2835-dbb0-490d-a484-19c1a252a175@googlegroups.com>

On 05.05.14 15:52, sdalemorrey@gmail.com wrote:
> On Monday, May 5, 2014 6:43:50 AM UTC-6, Dan'l Miller wrote:
>> In brief, what the OP is asking is:  precisely how does a SPARK compiler actually utilize the meta-information annotations?  E.g., precisely how does a SPARK compiler make logical inferences regarding correctness of Ada-subset source code to verify that the imperative code does what the annotations have declared?
>
> Well there is that and then there is the more general question of "What is the relationship of Ada to Spark" is one a subset or superset of the other?

To those unfortunate ones who cannot buy this information
about Ada vs SPARK from AdaCore,

    SPARK (originally) defines a subset of the Ada language
    in such a way that the absence of a class of errors can
    be proved mathematically, with the help of SPARK tools.

Think "lint on formal steroids".

SPARK's analysis needs more information than is available in a
typical Ada program; this information is added in formal
comment syntax. Since they are comments, an Ada compiler
ignores them. Since they are formal comments, SPARK tools
interpret them as applying to the program text proper.
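As a sketch of what that looks like (classic, pre-2012 SPARK syntax;
the procedure itself is a hypothetical example, not from the OP's code):

```ada
-- Hypothetical example of classic SPARK formal-comment annotations.
-- An Ada compiler treats the "--#" lines as ordinary comments;
-- the SPARK Examiner reads them as part of the specification.
procedure Increment (X : in out Integer);
--# derives X from X;
--# pre  X < Integer'Last;
--# post X = X~ + 1;
```

Here `X~` is SPARK's notation for the value of X before the call, so
the postcondition says the procedure adds exactly one.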

More recently, AdaCore has integrated SPARK with Ada 2012,
so that their recent Ada toolset provides the same
functionality as the separate SPARK tools. This removes the
need for SPARK tools plus an Ada compiler: the work can be
done with AdaCore's recent GNAT alone.
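In the SPARK 2014 style, the same hypothetical contract is written
with Ada 2012 aspects instead of formal comments, so one toolset
handles both compilation and proof:

```ada
-- The same hypothetical contract, expressed as Ada 2012 aspects.
-- GNATprove can attempt to prove it; an Ada 2012 compiler can
-- also check the same contract at run time.
procedure Increment (X : in out Integer)
  with Pre  => X < Integer'Last,
       Post => X = X'Old + 1;
```

Note how `X'Old` plays the role that `X~` played in the old
formal-comment syntax.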

> Is every valid Spark program a valid Ada program,

Yes.

> or is every valid Ada program also a valid Spark program?

No, since SPARK restricts the Ada language.
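For illustration (the exact subset varies by SPARK version, and
SPARK 2014 has since relaxed several restrictions), classic SPARK
excluded constructs such as access types, exception handlers, and
gotos. So perfectly legal Ada like this falls outside the subset:

```ada
-- Illustrative Ada declarations outside the classic SPARK subset:
-- access types were excluded because pointer aliasing defeats
-- the flow analysis that the proofs depend on.
type Cell;
type Cell_Ptr is access Cell;
type Cell is record
   Value : Integer;
   Next  : Cell_Ptr;
end record;
```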

> Ada documentation seems to be a bit thin, but Spark documentation has been nearly impossible to track down.  To my mind this means I'm either asking the question wrong, or asking the wrong question.
>
> A good primer on Spark as well as what tools I would need to see Spark in it's full glory would be a good place for me to start I believe.

There is quite a bit of documentation, including examples,
in the distributions themselves.


Thread overview: 13+ messages
2014-05-05 11:28 I'm a noob here sdalemorrey
2014-05-05 12:43 ` Dan'l Miller
2014-05-05 13:52   ` sdalemorrey
2014-05-05 13:59     ` roderick.chapman
2014-05-05 16:40     ` G.B. [this message]
2014-05-05 16:47     ` G.B.
2014-05-05 17:35     ` Shark8
2014-05-06 10:47   ` Brian Drummond
2014-05-05 13:57 ` roderick.chapman
2014-05-05 17:34 ` Shark8
2014-05-05 17:46 ` Jeffrey Carter
2014-05-05 18:07   ` Nasser M. Abbasi
2014-05-05 19:02     ` Georg Bauhaus
