From: Phil Thornley
Newsgroups: comp.lang.ada
Subject: Re: Psuedo-Dynamic memory with Spark?
Date: Tue, 5 Feb 2013 09:56:57 -0000
References: <996a2b43-d409-4762-b795-85831b62419b@googlegroups.com>
Reply-To: phil.jpthornley@gmail.com
Organization: Zen Internet

In article <996a2b43-d409-4762-b795-85831b62419b@googlegroups.com>,
phathax0r@gmail.com says...
>
> Normally SPARK does not allow dynamic memory allocation during program
> execution. However, is there a way to allow a SPARK program to allocate
> memory at program elaboration, i.e. at the program/process/partition
> boot sequence? Each instance of the program might have different memory
> requirements.
>
> The memory requirements would remain static during the execution of the
> main program; I just need the ability to allocate different amounts of
> memory during program initialization.

You can hide the elaboration code of a package from the Examiner by
putting a --# hide annotation immediately after the 'begin' for that
code. You may also need to hide some of the code that accesses the
memory - and you can also --# hide the bodies of individual
subprograms. Section 9.2 of the book has a complete list of which
sections of code can be hidden - and, of course, a completely non-SPARK
package body would not be examined at all.

If the visible part of a package specification must contain non-SPARK
code, then you may be able to create a 'shadow' specification - one
that is given to the Examiner in place of the actual spec used for
compilation.

> Also, I was thinking that it might be useful if SPARK, by default,
> barred dynamic memory allocation, except in cases where the programmer
> called Ada.Storage_Pools explicitly - and then only allowed it at the
> library level, and only in the packages that explicitly "with"ed the
> Storage_Pool implementation.

From what was said at the SPARK User Group meeting last October, it
appears that the definition of SPARK is going to change substantially
over the next couple of years - and I'm fairly sure that dynamic memory
was mentioned as one of the possible extensions. We may find out before
the end of this year what the new SPARK will contain.

Cheers,

Phil
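
For instance, the shape of the 'completely hidden body' route might be
roughly this (an untested sketch - Memory_Store, Configured_Size and
the other names are invented, and the per-instance size would really
come from wherever your configuration lives):

   package Memory_Store
   --# own State;
   --# initializes State;
   is
      procedure Read (Index : in Positive; Value : out Natural);
      --# global in State;
      --# derives Value from State, Index;
   end Memory_Store;

   package body Memory_Store is
      --# hide Memory_Store;
      --  Everything below is invisible to the Examiner (but still
      --  compiled), so ordinary Ada allocators are allowed here.

      type Byte_Array is array (Positive range <>) of Natural;
      type Byte_Array_Access is access Byte_Array;

      Buffer : Byte_Array_Access;

      --  Invented stand-in for however the per-instance size is
      --  obtained (command line, configuration file, environment...).
      function Configured_Size return Positive is
      begin
         return 1024;
      end Configured_Size;

      procedure Read (Index : in Positive; Value : out Natural) is
      begin
         Value := Buffer (Index);
      end Read;

   begin
      --  One-off allocation at elaboration; the size can differ from
      --  one instance of the program to the next, but it never
      --  changes after this point.
      Buffer := new Byte_Array (1 .. Configured_Size);
   end Memory_Store;

If only the elaboration statements need to escape the Examiner, the
--# hide goes immediately after the 'begin' instead - but then all the
declarations that the Examiner can still see have to stay within SPARK.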
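
The 'shadow' specification is just two versions of the same spec - the
Examiner is pointed at one (for example via its index file or a
separate source directory) and the compiler at the other. Roughly
(again with invented names):

   --  The spec the compiler uses; its visible part is not SPARK
   --  because of the access type:
   package Frame_Buffer is
      type Frame_Ptr is access all Integer;
      procedure Next (Value : out Integer);
   end Frame_Buffer;

   --  The shadow spec the Examiner sees instead - same unit name and
   --  subprogram profiles, but a SPARK-legal abstraction:
   package Frame_Buffer
   --# own State;
   is
      procedure Next (Value : out Integer);
      --# global in out State;
      --# derives Value, State from State;
   end Frame_Buffer;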