comp.lang.ada
* Pseudo-Dynamic memory with SPARK?
@ 2013-02-04 20:04 Diogenes
  2013-02-05  9:56 ` Phil Thornley
  2013-02-05 13:03 ` Stephen Leake
  0 siblings, 2 replies; 3+ messages in thread
From: Diogenes @ 2013-02-04 20:04 UTC (permalink / raw)


Normally, SPARK does not allow dynamic memory allocation during program execution. However, is there a way to let a SPARK program allocate memory at elaboration, i.e., during the program/process/partition boot sequence? Each instance of the program might have different memory requirements.

The memory requirements would remain static during execution of the main program; I just need the ability to allocate different amounts of memory during program initialization.
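For what it's worth, one common workaround, outside the SPARK-analysed boundary, is to fix the size once at start-up and never change it afterwards. A minimal plain-Ada sketch (the command-line argument as the size source is purely an illustration):

```ada
--  Minimal sketch, plain Ada rather than SPARK: the buffer size is
--  fixed once at start-up (here from the first command-line argument,
--  just as an illustration) and stays constant afterwards.  The array
--  is a stack object, so no heap allocation is involved.
with Ada.Command_Line;
with Ada.Text_IO;

procedure Elab_Sized is
   Size : constant Positive :=
     Positive'Value (Ada.Command_Line.Argument (1));
   type Buffer_Type is array (1 .. Size) of Integer;
   Buffer : Buffer_Type := (others => 0);
begin
   Ada.Text_IO.Put_Line
     ("Buffer of" & Natural'Image (Buffer'Length) & " elements");
end Elab_Sized;
```

The SPARK-analysed code would then operate only on the already-sized object, behind a spec that hides how the size was chosen.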

Also, I was thinking it might be useful if SPARK, by default, barred dynamic memory allocation except where the programmer uses System.Storage_Pools explicitly, and then only at the library level, and only in the packages that explicitly "with" the storage-pool implementation.
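As a sketch of how that might look at the language level (My_Pools is a hypothetical package here; the real pool abstraction is System.Storage_Pools, Ada RM 13.11): a client package would tie its access type to an explicit pool via an attribute clause, so the dependency shows up in its context clause.

```ada
--  Illustrative only: My_Pools is a hypothetical package exporting a
--  pool object of a type derived from
--  System.Storage_Pools.Root_Storage_Pool.  The access type is bound
--  to that pool by an attribute clause, making the dependency
--  explicit in this package's context clause.
with My_Pools;

package Pool_Client is
   type Node;
   type Node_Access is access Node;
   for Node_Access'Storage_Pool use My_Pools.Pool;
   type Node is record
      Value : Integer;
      Next  : Node_Access;
   end record;
end Pool_Client;
```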

Yes, no?

Diogenes



