Newsgroups: comp.lang.ada
From: Ted Dennison
Subject: Re: Expected bytes per sloc (semicolons) performance
Date: Fri, 21 Sep 2001 16:22:05 GMT

In article <8f23da36.0109201115.2f708535@posting.google.com>, Mike Harrison says...
>
>Thanks for the responses. Sorry the thread turned into a RISC discussion.
>
>So let me try to get back on track with a different question:
>If you were bidding a 5K sloc estimated size program (algorithmic in
>nature) to go into a satellite with limited memory, what sort of
>bytes/per sloc estimate would you use, 5, 10, 20, 50, 100? Our
>experience is 50 or more. Is this what others are seeing
>with today's Ada compiler technology?

I'm not sure I'd want to use a random number from a newsgroup in such a
calculation. A lot of folks here don't work in such constricted memory
environments, and are apt to go "hog wild" on memory from time to time. To
give you an idea: on my last job, I inadvertently ended up needing a data
buffer of more than 100MB. I offered to recode the thing, but was told it
would be easier (and cheaper) to just respec the system with newer
motherboards loaded with 1GB of RAM (which they promptly did). We run in a
non-embedded real-time environment (simulations), so speed and regularity
are much more important to us than memory usage.

I'd also expect the count to be affected by style factors: how large
subprograms tend to be, how many subprograms and declarations are placed
in one package, how extensively generics are used, use of tagged types and
dynamic dispatch, optimization settings, etc.

For the record, the project I just mentioned currently has 79K SLOC
(according to ada_count) and generates a 12.5MB executable. That works out
to very roughly 150 bytes/SLOC (see the sketch after my sig). If you throw
in the debug symbol file, that number more than doubles. The executable
does include a few linked-in extras, such as the GreenHills Ada library,
which is about 900K by itself, but factoring that out doesn't change the
final number much.

---
T.E.D.

homepage - http://www.telepath.com/dennison/Ted/TED.html
home email - mailto:dennison@telepath.com
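
P.S. If you want to play with the arithmetic yourself, here's a trivial
back-of-the-envelope sketch in Ada. The inputs are just assumptions pulled
from this thread (Mike's 5K SLOC program, and the roughly 150 bytes/SLOC we
measured); plug in whatever numbers you trust.

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Size_Estimate is
      --  Assumed inputs: estimated program size in SLOC, and an
      --  empirically measured bytes-per-SLOC ratio for your compiler
      --  and coding style.
      SLOC           : constant := 5_000;  --  the 5K SLOC satellite program
      Bytes_Per_SLOC : constant := 150;    --  roughly what our project saw
      Estimate       : constant := SLOC * Bytes_Per_SLOC;
   begin
      --  5_000 * 150 = 750_000 bytes, i.e. about 750KB of image.
      Put_Line ("Estimated image size:" &
                Integer'Image (Estimate) & " bytes");
   end Size_Estimate;

Obviously the whole game is in picking Bytes_Per_SLOC, which is why I'd
measure it on a representative chunk of your own code with your own
compiler rather than take anyone's number (including mine) on faith.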