From mboxrd@z Thu Jan 1 00:00:00 1970
X-Spam-Checker-Version: SpamAssassin 3.4.4 (2020-01-24) on polar.synack.me
X-Spam-Level:
X-Spam-Status: No, score=-0.3 required=5.0 tests=BAYES_00, REPLYTO_WITHOUT_TO_CC autolearn=no autolearn_force=no version=3.4.4
X-Google-Language: ENGLISH,ASCII-7-bit
X-Google-Thread: 103376,c397a9e135b263db
X-Google-Attributes: gid103376,public
X-Google-ArrivalTime: 2001-08-14 12:47:27 PST
Path: archiver1.google.com!newsfeed.google.com!newsfeed.stanford.edu!news-spur1.maxwell.syr.edu!news.maxwell.syr.edu!cpk-news-hub1.bbnplanet.com!cambridge1-snf1.gtei.net!news.gtei.net!bos-service1.ext.raytheon.com!bos-service2.ext.raytheon.com.POSTED!not-for-mail
Message-ID: <3B7981CC.3135BF89@sparc01.ftw.rsc.raytheon.com>
From: Wes Groleau
Reply-To: wwgrol@sparc01.ftw.raytheon.com
X-Mailer: Mozilla 4.77 [en] (Windows NT 5.0; U)
X-Accept-Language: en,es-MX,es,pt,fr-CA,fr
MIME-Version: 1.0
Newsgroups: comp.lang.ada
Subject: Re: What happens to DEC Ada?
References: <9ff447f2.0107310659.36d01e9@posting.google.com> <%jra7.3545$257.153834@ozemail.com.au> <5ee5b646.0108031100.2be1c7d6@posting.google.com> <9ketvd$t21$1@nh.pace.co.uk> <3B6B04F0.63E9602B@sparc01.ftw.rsc.raytheon.com> <9km9bd$dg7$1@nh.pace.co.uk>
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
Date: Tue, 14 Aug 2001 14:53:48 -0500
NNTP-Posting-Host: 151.168.135.96
X-Complaints-To: news@ext.ray.com
X-Trace: bos-service2.ext.raytheon.com 997818447 151.168.135.96 (Tue, 14 Aug 2001 15:47:27 EDT)
NNTP-Posting-Date: Tue, 14 Aug 2001 15:47:27 EDT
Organization: Raytheon Company
Xref: archiver1.google.com comp.lang.ada:11934
Date: 2001-08-14T14:53:48-05:00
List-Id:

> I'm surprised to find GNAT faster than Apex for a really
> large set of source files.  My own experience is that for

For the "all files" case, GNAT is faster, but not by much, in
SINGLE THREAD mode.  However, because GNAT has no "DIANA," it
can easily do many files in parallel.
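For what it's worth, the parallel driver I have in mind is nothing fancier than a find/xargs pipeline.  The directory name, job count, and flags below are invented for illustration, not taken from our actual build:

```shell
# Since GNAT keeps no shared DIANA-style database, each compilation is
# independent, so a dumb parallel driver is safe.  'src', the 8-way
# parallelism, and the gnatmake flags are illustrative placeholders.
find src -name '*.adb' -print0 | xargs -0 -n 1 -P 8 gnatmake -c
```

(xargs -P is a GNU/BSD extension; substitute your favorite job dispatcher.)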
Try that with Apex on a set of files this big (most with LOTS of
context clauses) and invariably there will be a collision on the
same file, resulting in "severe errors," which means a large
portion of the job must be repeated.

When doing _only_ obsolete files, a small fraction of the whole
job, Apex does not usually have that collision problem and we can
work in parallel.  However, even with more than two dozen
processors, the time savings is only about 60 percent.

With a project our size, Apex takes a long time just to figure
out which files are obsolete.  In fact, even if NO files are
obsolete, Apex takes quite a while to verify that.

--
Wes Groleau
http://freepages.rootsweb.com/~wgroleau