comp.lang.ada
From: Diogenes <phathax0r@gmail.com>
Subject: Pseudo-Dynamic memory with Spark?
Date: Mon, 4 Feb 2013 12:04:32 -0800 (PST)
Message-ID: <996a2b43-d409-4762-b795-85831b62419b@googlegroups.com>

Normally SPARK does not allow dynamic memory allocation during program execution. However, is there a way to allow a SPARK program to allocate memory at program elaboration, i.e., during the program/process/partition boot sequence? Each instance of the program might have different memory requirements.

The memory requirements would remain static during the execution of the main program; I just need the ability to allocate different amounts of memory during program initialization.
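For illustration, here is one shape such a program might take: a library-level array whose bound is fixed once, at elaboration, and never changes afterwards. This is only a sketch; whether a given SPARK toolset accepts a non-static bound at library level is exactly what is in question, and Config.Read_Config is a hypothetical helper standing in for whatever per-instance configuration lookup the program uses.

```ada
--  Sketch only: size fixed once at elaboration, static thereafter.
--  Config.Read_Config is a hypothetical helper returning the
--  per-instance size for this run of the program.
with Config;  --  hypothetical configuration package

package Elab_Sized is

   Buffer_Size : constant Positive := Config.Read_Config ("buffer_size");

   type Buffer_Type is array (Positive range 1 .. Buffer_Size) of Integer;

   --  Created once when the package elaborates; no allocators involved,
   --  and the size never changes during the main program's execution.
   Buffer : Buffer_Type := (others => 0);

end Elab_Sized;
```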

Also, I was thinking that it might be useful if SPARK, by default, barred dynamic memory allocation except in cases where the programmer used System.Storage_Pools explicitly. And then only allowed it at the library level, and only in the packages that explicitly "with"ed the storage pool implementation.
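In plain Ada the explicit mechanism looks something like the sketch below: an access type tied to a named pool, so all allocations for that type are visibly routed through it. Bounded_Pools and its Bounded_Pool type are hypothetical here; they stand in for a user-defined pool derived from System.Storage_Pools.Root_Storage_Pool, written elsewhere.

```ada
--  Sketch, assuming a package Bounded_Pools providing a pool type
--  Bounded_Pool derived from System.Storage_Pools.Root_Storage_Pool
--  (both hypothetical) has been written elsewhere.
with Bounded_Pools;

package Nodes is

   --  A library-level pool of fixed capacity, elaborated once.
   Pool : Bounded_Pools.Bounded_Pool (Capacity => 4096);

   type Node;
   type Node_Access is access Node;

   --  Allocations via Node_Access can only draw from Pool; the
   --  dependency on the pool package is visible in the "with" clause.
   for Node_Access'Storage_Pool use Pool;

   type Node is record
      Value : Integer;
      Next  : Node_Access;
   end record;

end Nodes;
```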

Yes, no?

Diogenes




Thread overview: 3+ messages
2013-02-04 20:04 Diogenes [this message]
2013-02-05  9:56 ` Pseudo-Dynamic memory with Spark? Phil Thornley
2013-02-05 13:03 ` Stephen Leake
