From: gab@rational.com (Greg Bek)
Newsgroups: comp.lang.ada
Subject: Re: What happens to DEC Ada?
Date: 8 Aug 2001 12:24:38 -0700
References: <9ff447f2.0107310659.36d01e9@posting.google.com> <%jra7.3545$257.153834@ozemail.com.au> <5ee5b646.0108031100.2be1c7d6@posting.google.com> <9ketvd$t21$1@nh.pace.co.uk> <3B6B04F0.63E9602B@sparc01.ftw.rsc.raytheon.com> <9km9bd$dg7$1@nh.pace.co.uk>

The DEC Ada smart compilation mechanism was based on the research into
compilation signatures by Walter Tichy, of RCS fame. The mechanisms that
Rational Apex uses are also based on that research, although the two
implementations do differ. Interestingly, in the original paper (my copy
has long since disappeared) Tichy thought the technique would be most
appropriate for C and Pascal and that it had little relevance to Ada (at
least that's how I remember it). The only two commercial implementations
of it that I'm aware of were the above two Ada compilers. I'm surprised
to find GNAT faster than Apex for a really large set of source files.
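Neither compiler's actual algorithm is public in this thread, but the core
of a compilation signature can be sketched: hash only the normalized text
of a unit's visible interface, so edits that touch no declaration leave
the signature, and hence every dependent, untouched. This is a toy sketch
in Python with a made-up normalization rule (strip Ada-style comments and
whitespace), not either vendor's implementation:

```python
import hashlib

def interface_signature(spec_source: str) -> str:
    """Signature of a unit's visible interface: comments and blank
    lines are normalized away, so edits that change no declaration
    leave the signature unchanged."""
    significant = []
    for line in spec_source.splitlines():
        code = line.split("--", 1)[0].strip()  # drop Ada-style comments
        if code:
            significant.append(code)
    return hashlib.sha256(" ".join(significant).encode()).hexdigest()

# The same spec before and after a comment-only edit:
before = """package Stacks is
   procedure Push (X : Integer);  -- add an element
end Stacks;"""

after = """package Stacks is
   -- Reworded documentation; no declaration changed.
   procedure Push (X : Integer);  -- append an element
end Stacks;"""

# Identical signatures: dependents of Stacks need no recompilation.
unchanged = interface_signature(before) == interface_signature(after)
```

A timestamp-based build tool would recompile every dependent after the
comment edit; the signature comparison recompiles none of them.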
My own experience is that for a large program Apex is usually 20% - 50%
faster to compile a system from scratch. Now, my definition of "large" is
at least 20 KSLOC; Apex is not a very fast compiler of hello_world. Our
experience is that the stored representation (DIANA) of the already
compiled units does have a significant impact on the compilation
performance of the units that follow. But as people have pointed out,
building that representation and then using it imposes an overhead.

When it comes to recompilation, though, I'm in no doubt about the benefit
of the DIANA. Sure, it takes time to read, but the compiler's ability to
compare change sets of the underlying declaration signatures means that
many dependent units don't even need to be recompiled. For some classes
of changes to specifications, the compilation system knows that no
dependent units could have been affected, so there is no recompilation of
any dependents at all. But situations differ, so the usual "your mileage
will vary" disclaimer applies.

Greg

"Marin David Condic" wrote in message news:<9km9bd$dg7$1@nh.pace.co.uk>...
> Well, 20,000 files certainly qualifies in my mind as "pretty large". That's
> an unusually sized project - I'll bet that of all the Ada projects in all
> the world, that may rank in, say, the top 5%? It certainly isn't the center
> of the bell curve.
>
> Obviously in a case like that, minimizing recompilation is a noble goal.
> However, I'd suspect that there are cases where it takes more time to try
> to reduce the number further than to simply take the hit and recompile.
> Once you've gone to the effort of scanning the file to determine if it
> *really* needs to be recompiled, you probably should just go ahead and
> recompile it. (Maybe a smart editor and a project database could spread
> the load to edit-time rather than compile-time?
> But that would start me haranguing on how wonderful it would be for Ada
> to have a really spiffy IDE with seamlessly integrated tools, etc. :-)
>
> And of course, usually the less expensive option is to apply faster
> hardware. It depends on the environment you work in. I once worked for a
> defense contractor in New Jersey that did lots of cost-plus projects.
> Engineers spent huge amounts of time in line waiting to xerox stuff
> because there was only one available xerox machine. "Buy more" seems the
> obvious answer. However, an engineer's time on a cost-plus contract was
> making money, while a xerox machine was a capital expenditure that
> couldn't be billed back entirely to the government. So I could imagine
> situations in which it would be more profitable to make the engineers
> wait for massive recompiles - or laboriously figure it out by longhand. :-)
>
> MDC
> --
> Marin David Condic
> Senior Software Engineer
> Pace Micro Technology Americas www.pacemicro.com
> Enabling the digital revolution
> e-Mail: marin.condic@pacemicro.com
> Web: http://www.mcondic.com/
>
>
> "Wes Groleau" wrote in message
> news:3B6B04F0.63E9602B@sparc01.ftw.rsc.raytheon.com...
> >
> > We have about 20,000 Ada files, many of them VERY large,
> > and if ALL are obsolete (forced to be), we can compile all
> > in ten hours on ONE processor with Apex. Faster with
> > GNAT. When only a few hundred are obsolete, Apex takes
> > longer to figure out which ones than it does to actually
> > compile them.
> >
> > --
> > Wes Groleau
> > http://freepages.rootsweb.com/~wgroleau