From: Phil Thornley <phil.jpthornley@gmail.com>
Subject: Re: Psuedo-Dynamic memory with Spark?
Date: Tue, 5 Feb 2013 09:56:57 -0000
Message-ID: <MPG.2b7ae7a35a28d5ab9896a2@news.zen.co.uk> (raw)
In-Reply-To: 996a2b43-d409-4762-b795-85831b62419b@googlegroups.com
In article <996a2b43-d409-4762-b795-85831b62419b@googlegroups.com>,
phathax0r@gmail.com says...
>
> Normally Spark does not allow dynamic memory allocation during program
> execution. However, is there a way to allow a Spark program to allocate
> memory at program elaboration, i.e. at the program/process/partition
> boot sequence? Each instance of the program might have different memory
> requirements.
>
> The memory requirements would remain static during the execution of
> the main program; I just need the ability to allocate different
> amounts of memory during program initialization.
You can hide the elaboration code of a package from the Examiner by
putting a --# hide annotation immediately after the 'begin' for that
code. You may also need to hide some of the code that accesses the
memory - the bodies of individual subprograms can be hidden with
--# hide in the same way.
Section 9.2 of the SPARK book has a complete list of which sections of
code can be hidden - and, of course, a completely non-SPARK package body
is not examined at all.
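As a sketch (all names here are invented for illustration): the spec
below is ordinary SPARK, while the body is entirely hidden, so the
allocator that runs at elaboration is never seen by the Examiner:

```ada
--  Spec is plain SPARK; clients see only a clean, pointer-free interface.
package Memory_Manager
--# own State;
--# initializes State;
is
   function Read (Index : Positive) return Character;
   --# global in State;
end Memory_Manager;

--  Body is hidden in its entirety, so access types and the allocator
--  at elaboration are permitted.
package body Memory_Manager
is
   --# hide Memory_Manager;

   type Byte_Array is array (Positive range <>) of Character;
   type Byte_Array_Ptr is access Byte_Array;

   Buffer : Byte_Array_Ptr;

   function Read (Index : Positive) return Character is
   begin
      return Buffer (Index);
   end Read;

begin
   --  Runs once, at elaboration; the size can differ per instance.
   Buffer := new Byte_Array (1 .. 1024);
end Memory_Manager;
```

If only the elaboration part needs hiding, the --# hide annotation goes
immediately after the body's 'begin' instead - but then any access-type
declarations above it must themselves be kept out of the examined text.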
If the visible part of a package specification must contain non-SPARK
code, then you may be able to create a 'shadow' specification - one that
is used by the Examiner in place of the actual spec used for
compilation.
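A minimal sketch of the idea (file layout and names are my own
assumptions - in practice you arrange the build so the Examiner and the
compiler each pick up their own copy of the spec):

```ada
--  shadow/memory_manager.ads : the copy the Examiner reads.
--  Pure SPARK: no access types in the visible part.
package Memory_Manager
--# own State;
is
   procedure Write (Index : in Positive; Value : in Character);
   --# global in out State;
end Memory_Manager;

--  real/memory_manager.ads : the copy the compiler reads.
--  Same subprogram profiles, but the visible part is free to use
--  access types and other non-SPARK features.
```

The two specs must declare identical subprogram profiles, otherwise the
examined program and the compiled program diverge.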
>
> Also, I was thinking that it might be useful if Spark, by default, barred dynamic memory allocation, except in cases where the programmer called System.Storage_Pools explicitly. And then only allowed it at the library level, and only with the packages that explicitly "with"ed the Storage_Pool implementation.
From what was said at the SPARK User Group meeting last October it
appears that the definition of SPARK is going to change substantially in
the next couple of years - and I'm fairly sure that dynamic memory was
mentioned as one of the possible extensions. We may find out before the
end of this year what the new SPARK will contain.
Cheers,
Phil
Thread overview: 3+ messages
2013-02-04 20:04 Psuedo-Dynamic memory with Spark? Diogenes
2013-02-05 9:56 ` Phil Thornley [this message]
2013-02-05 13:03 ` Stephen Leake