From: "Marc A. Criley"
Subject: Re: Task stack overflow problem with GNAT
Date: 2000/10/06
Message-ID: <39DE061B.51F62B69@icdc.com>
References: <8rkhnt$4ac$1@nnrp1.deja.com>
Organization: Posted via Supernews, http://www.supernews.com
Newsgroups: comp.lang.ada

fabien_bousquet@my-deja.com wrote:
>
> Hi,
> I have a task which is raising a Storage_Error exception ("Stack
> overflow").
> I am using the compiler GNAT 3.13p.
> I have tried to put in the specification of my task type a "pragma
> Storage_Size(32000)" and it has changed nothing !!
> Is there another way to change a task stack size with GNAT, or have
> I misunderstood the usage of the pragma Storage_Size?

One thing that needs to be done to help narrow down the problem is to
determine exactly where the Storage_Error is being raised. Is there
something obvious there, like the declaration of a huge object that
would be placed in the stack space? Or is there nothing suspicious at
the point where the Storage_Error is raised?

I experienced this frequently on a previous program I worked on, and
there were two solutions:

1) Find the big data items and allocate them dynamically from the heap,
   rather than having them reside on the stack.

2) Increase the task storage size, as you said you've done. You state
   that you raised it to 32K and there's still a problem. Actually,
   that's not surprising to me.
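A minimal sketch of both approaches (the task and type names here are
made up for illustration, not taken from the poster's code):

```ada
with Ada.Unchecked_Deallocation;

procedure Stack_Demo is

   --  Approach 2: request a larger stack in the task specification.
   --  The value is in storage elements (bytes); bump it up until the
   --  overflow stops.
   task type Worker is
      pragma Storage_Size (160_000);
   end Worker;

   task body Worker is
      --  Approach 1: keep big objects off the task stack entirely by
      --  allocating them from the heap via an access type.
      type Big_Buffer is array (1 .. 100_000) of Integer;
      type Big_Buffer_Access is access Big_Buffer;
      procedure Free is
        new Ada.Unchecked_Deallocation (Big_Buffer, Big_Buffer_Access);

      Buffer : Big_Buffer_Access := new Big_Buffer;  --  heap, not stack
   begin
      --  ... work with Buffer.all ...
      Free (Buffer);
   end Worker;

   W : Worker;
begin
   null;
end Stack_Demo;
```

Declaring Buffer directly as a `Big_Buffer` object inside the task body
would put roughly 400K on the task stack, which is exactly the kind of
thing that blows through a 32K allocation.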
We usually started at 32K and went up; the largest Storage_Size
allocation we had to use was on the order of 160K. We didn't do this
across the board, but would just bump it up wherever the problem
occurred (after scrubbing for any instances of unwarranted large
objects, or looking for places where option #1 might be useful).

Marc