From: "Marin David Condic"
Newsgroups: comp.lang.ada
Subject: Re: What happens to DEC Ada?
Date: Mon, 6 Aug 2001 10:24:43 -0400
Organization: Posted on a server owned by Pace Micro Technology plc
Message-ID: <9km9bd$dg7$1@nh.pace.co.uk>
References: <9ff447f2.0107310659.36d01e9@posting.google.com> <%jra7.3545$257.153834@ozemail.com.au> <5ee5b646.0108031100.2be1c7d6@posting.google.com> <9ketvd$t21$1@nh.pace.co.uk> <3B6B04F0.63E9602B@sparc01.ftw.rsc.raytheon.com>

Well, 20,000 files certainly qualifies in my mind as "pretty large". That's an unusually large project - I'll bet that of all the Ada projects in all the world, it ranks in, say, the top 5%. It certainly isn't the center of the bell curve.

Obviously, in a case like that, minimizing recompilation is a noble goal. However, I'd suspect there are cases where it takes more time to try to reduce the number further than to simply take the hit and recompile. Once you've gone to the effort of scanning a file to determine whether it *really* needs to be recompiled, you probably should just go ahead and recompile it (a rough sketch of that kind of check follows the quoted message below). (Maybe a smart editor and a project database could spread the load to edit-time rather than compile-time? But that would start me haranguing on how wonderful it would be for Ada to have a really spiffy IDE with seamlessly integrated tools, etc. :-) And of course, the less expensive option is usually to apply faster hardware.

That depends on the environment you work in. I once worked for a defense contractor in New Jersey that did lots of cost-plus projects. Engineers spent huge amounts of time standing in line to xerox things because there was only one available xerox machine. "Buy more" seems like the obvious answer. However, an engineer's time on a cost-plus contract was making money, while a xerox machine was a capital expenditure that couldn't be billed back entirely to the government. So I could imagine situations in which it would be more profitable to make the engineers wait for massive recompiles - or laboriously figure it out longhand. :-)

MDC
--
Marin David Condic
Senior Software Engineer
Pace Micro Technology Americas    www.pacemicro.com
Enabling the digital revolution
e-Mail:    marin.condic@pacemicro.com
Web:      http://www.mcondic.com/

"Wes Groleau" wrote in message news:3B6B04F0.63E9602B@sparc01.ftw.rsc.raytheon.com...
>
> We have about 20,000 Ada files, many of them VERY large,
> and if ALL are obsolete (forced to be), we can code all
> in ten hours on ONE processor with Apex. Faster with
> GNAT.
> When only a few hundred are obsolete, Apex takes
> longer to figure out which ones than it does to actually
> compile them.
>
> --
> Wes Groleau
> http://freepages.rootsweb.com/~wgroleau
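
A minimal sketch of the kind of "does this really need recompiling?" check discussed above, assuming a purely hypothetical timestamp-based dependency walk in Python (the file names and the DEPENDS_ON table are made up, and this is not how Apex or GNAT actually decide). The point is that even answering "no" means visiting every file in the unit's closure, which is why the check itself can cost more than the recompile.

import os

# Hypothetical dependency table: source file -> files it depends on
# (withed specs, parent units, and so on).
DEPENDS_ON = {
    "main.adb":  ["pkg_a.ads", "pkg_b.ads"],
    "pkg_a.ads": [],
    "pkg_b.ads": ["pkg_a.ads"],
}

def needs_recompile(source, object_suffix=".o"):
    """Return True if the object file is missing, or older than the
    source or anything in its transitive dependency closure."""
    obj = os.path.splitext(source)[0] + object_suffix
    if not os.path.exists(obj):
        return True
    obj_time = os.path.getmtime(obj)

    seen, stack = set(), [source]
    while stack:                      # walk the transitive closure
        f = stack.pop()
        if f in seen:
            continue
        seen.add(f)
        # A missing or newer dependency forces a recompile.
        if not os.path.exists(f) or os.path.getmtime(f) > obj_time:
            return True
        stack.extend(DEPENDS_ON.get(f, []))
    return False                      # nothing changed; skip this unit

# Example: list the units that would actually need recompiling.
obsolete = [f for f in DEPENDS_ON if needs_recompile(f)]

Whether this wins or loses against a blind recompile comes down to how expensive the dependency walk (and any deeper source comparison) is relative to the compiler itself, which is exactly the trade-off described above.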