comp.lang.ada
* Pseudo-Dynamic memory with SPARK?
@ 2013-02-04 20:04 Diogenes
  2013-02-05  9:56 ` Phil Thornley
  2013-02-05 13:03 ` Stephen Leake
  0 siblings, 2 replies; 3+ messages in thread
From: Diogenes @ 2013-02-04 20:04 UTC


Normally SPARK does not allow dynamic memory allocation during program execution. However, is there a way to allow a SPARK program to allocate memory at program elaboration, i.e. during the program/process/partition boot sequence? Each instance of the program might have different memory requirements.

The memory requirements would remain static during the execution of the main program; I just need the ability to allocate different amounts of memory during program initialization.
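
Roughly the shape of what I'm after (an untested sketch with made-up names, written as plain Ada rather than legal SPARK):

   package Elab_Buffer is
      type Byte_Array is array (Positive range <>) of Character;
      type Byte_Array_Access is access Byte_Array;

      Buffer : Byte_Array_Access;   --  sized once, at elaboration
   end Elab_Buffer;

   with Ada.Command_Line;
   package body Elab_Buffer is
      Size : Positive := 4_096;     --  default when no argument given
   begin
      --  Runs exactly once, while the partition elaborates; each
      --  instance of the program can pick a different size.
      if Ada.Command_Line.Argument_Count > 0 then
         Size := Positive'Value (Ada.Command_Line.Argument (1));
      end if;
      Buffer := new Byte_Array (1 .. Size);
   end Elab_Buffer;

After elaboration finishes, nothing is ever allocated or freed again.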

Also, I was thinking that it might be useful if SPARK, by default, barred dynamic memory allocation except where the programmer uses System.Storage_Pools explicitly, and even then only at the library level, and only in packages that explicitly "with" the storage pool implementation.
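
To make that concrete (a sketch; I'm assuming GNAT here only because GNAT.Debug_Pools is a ready-made concrete pool - any descendant of Root_Storage_Pool would do, and the names are made up):

   with GNAT.Debug_Pools;

   package Buffers is
      Pool : GNAT.Debug_Pools.Debug_Pool;   --  library-level pool object

      type Buffer is array (Positive range <>) of Character;
      type Buffer_Ptr is access Buffer;
      for Buffer_Ptr'Storage_Pool use Pool;
      --  every "new Buffer'(...)" now visibly draws storage from Pool
   end Buffers;

Something like this would be the only allocation route SPARK tolerates.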

Yes, no?

Diogenes




* Re: Pseudo-Dynamic memory with SPARK?
  2013-02-04 20:04 Pseudo-Dynamic memory with SPARK? Diogenes
@ 2013-02-05  9:56 ` Phil Thornley
  2013-02-05 13:03 ` Stephen Leake
  1 sibling, 0 replies; 3+ messages in thread
From: Phil Thornley @ 2013-02-05  9:56 UTC


In article <996a2b43-d409-4762-b795-85831b62419b@googlegroups.com>, 
phathax0r@gmail.com says...
> 
> Normally SPARK does not allow dynamic memory allocation during program
> execution. However, is there a way to allow a SPARK program to allocate
> memory at program elaboration, i.e. during the program/process/partition
> boot sequence? Each instance of the program might have different memory
> requirements.
> 
> The memory requirements would remain static during the execution of
> the main program; I just need the ability to allocate different
> amounts of memory during program initialization.

You can hide the elaboration code of a package from the Examiner by 
putting a --# hide annotation immediately after the 'begin' of that code.
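
For instance (a sketch from memory of the SPARK 2005 syntax, with 
invented names - check the book for the exact placement rules):

   package Memory_Setup
   --# own Buffer_Size;
   --# initializes Buffer_Size;
   is
      Buffer_Size : Natural;
   end Memory_Setup;

   with Ada.Command_Line;
   package body Memory_Setup is
   begin
      --# hide Memory_Setup;
      --  From here to the end of the body the Examiner sees nothing,
      --  so the elaboration code can do whatever the compiler accepts.
      if Ada.Command_Line.Argument_Count > 0 then
         Buffer_Size := Natural'Value (Ada.Command_Line.Argument (1));
      else
         Buffer_Size := 4_096;
      end if;
   end Memory_Setup;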

You may also need to hide some of the code that accesses the memory - 
and you can also --# hide the bodies of individual subprograms.
Section 9.2 of the book has a complete list of which sections of code 
can be hidden - and, of course, a completely non-SPARK package body 
would not be examined at all.

If the visible part of a package specification must contain non-SPARK 
code, then you may be able to create a 'shadow' specification - one that 
is used for the Examiner in place of the actual spec used for 
compilation.
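
As a sketch (names invented), the compiler might see

   package Buffers is
      type Buffer is array (Positive range <>) of Character;
      type Buffer_Ptr is access Buffer;   --  not SPARK
      procedure Store (C : in Character);
   end Buffers;

while the Examiner is handed a shadow with the pointer abstracted away:

   package Buffers
   --# own State;
   is
      procedure Store (C : in Character);
      --# global in out State;
      --# derives State from State, C;
   end Buffers;

The two specs must of course declare the same subprograms, and keeping 
them in step is a manual job.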

> 
> Also, I was thinking that it might be useful if SPARK, by default, barred dynamic memory allocation except where the programmer uses System.Storage_Pools explicitly, and even then only at the library level, and only in packages that explicitly "with" the storage pool implementation.

From what was said at the SPARK User Group meeting last October, it 
appears that the definition of SPARK is going to change substantially in 
the next couple of years - and I'm fairly sure that dynamic memory was 
mentioned as one of the possible extensions. We may find out before the 
end of this year what the new SPARK will contain.

Cheers,

Phil





* Re: Pseudo-Dynamic memory with SPARK?
  2013-02-04 20:04 Pseudo-Dynamic memory with SPARK? Diogenes
  2013-02-05  9:56 ` Phil Thornley
@ 2013-02-05 13:03 ` Stephen Leake
  1 sibling, 0 replies; 3+ messages in thread
From: Stephen Leake @ 2013-02-05 13:03 UTC


Diogenes <phathax0r@gmail.com> writes:

> Normally SPARK does not allow dynamic memory allocation during program
> execution. However, is there a way to allow a SPARK program to allocate
> memory at program elaboration, i.e. during the program/process/partition
> boot sequence? 

How would you define that time precisely?

Note that you can have generic packages instantiated in functions, so
"elaboration" can happen at any time during a program's execution.

-- 
-- Stephe



