comp.lang.ada
From: phil.brashear@acm.org
Subject: Re: Embedded Processor/Compiler Selection
Date: 1998/02/25
Message-ID: <6d14ju$ue$1@nnrp1.dejanews.com>
In-Reply-To: dewar.888332332@merv


In article <dewar.888332332@merv>,
  dewar@merv.cs.nyu.edu (Robert Dewar) wrote:
>
> This advice from Phil, as one of the creators of ACES, is not a surprise :-)
>

Right!  As I said, this is my standard reply.

> But I have a question: has anyone successfully used the latest version of
> the full ACES suite?  We had a couple of customers look at using it and
> decide it was far too much work.  One customer used the Quick-Look facility,
> but I would not base much on the results from this, since it is a rather
> miscellaneous collection of tests.

Yes, I have exchanged information with various people who appear to be using
ACES successfully.  (Sometimes I wish they were just a little bit less
successful, so they could pay me to help -- but that's rather mercenary of me,
isn't it!)  To cite one example of ACES use, the primary software developer on
an ongoing major weapon system used ACES to do formal evaluations of several Ada
implementations as part of selecting an appropriate combination of platform
and compiler.

Yes, it's a lot of work, though less than with previous versions, thanks to
the much-improved user interface.  The hardest part, of course, is deciding
what you're trying to accomplish and selecting the appropriate tests and
analysis techniques.

>
> Any set of canned benchmarks is always a bit dubious. I think you usually
> do better to create some test cases of your own, trying to mirror your
> intended application as closely as possible.
>

One of the nice features of the ACES is that it provides a facility for
inserting your own benchmarks into the ACES testing and analysis processes.
It also provides a mechanism for selecting, processing, and analyzing subsets
tailored to your needs.
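
(Just as an illustration -- the actual format for inserted tests is spelled
out in the ACES documentation, and the little program below is only a rough,
hypothetical sketch of a self-timing kernel you might start from, not the
ACES harness interface:)

   with Ada.Text_IO;
   with Ada.Calendar;

   procedure My_Kernel_Test is
      --  Hypothetical user benchmark: time a simple floating-point loop.
      use type Ada.Calendar.Time;
      Iterations  : constant := 1_000_000;
      Start, Stop : Ada.Calendar.Time;
      Sum         : Long_Float := 0.0;
   begin
      Start := Ada.Calendar.Clock;
      for I in 1 .. Iterations loop
         Sum := Sum + Long_Float (I) * 0.5;   --  the code being measured
      end loop;
      Stop := Ada.Calendar.Clock;
      --  Printing the checksum keeps the optimizer from removing the loop.
      Ada.Text_IO.Put_Line
        ("Elapsed:" & Duration'Image (Stop - Start) & " seconds, checksum"
         & Long_Float'Image (Sum));
   end My_Kernel_Test;

Writing such a test is the trivial part; as I said above, the hard part is
deciding what it should measure.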

All this said, the fact remains that ACES is big, and the amount of
information produced can be overwhelming if you aren't selective.
Performance evaluation is a very tricky thing, not because ACES is hard to
use, but because the concept of measuring performance is loaded with
difficulties.

The worst thing is that, after all the work of selecting, processing,
analyzing, and understanding the results, you run into a manager who wants
you to give him a single number representing the performance of each
implementation.  One might as well give such a manager the Douglas Adams
answer: 42.
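
To make the point concrete with purely made-up numbers: if compiler X runs
test A in 1 ms and test B in 10 ms, while compiler Y runs A in 10 ms and B
in 1 ms, both "score" 5.5 ms on average -- yet which one you want depends
entirely on whether your application looks more like A or like B.  The single
number hides exactly the information you did all that work to obtain.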

Phil






Thread overview: 14+ messages
1998-02-23  0:00 Embedded Processor/Compiler Selection Marin David Condic, 561.796.8997, M/S 731-96
1998-02-24  0:00 ` Robert Dewar
1998-02-25  0:00   ` phil.brashear [this message]
  -- strict thread matches above, loose matches on Subject: below --
1998-02-27  0:00 Marin David Condic, 561.796.8997, M/S 731-96
1998-02-24  0:00 Marin David Condic, 561.796.8997, M/S 731-96
1998-02-24  0:00 ` Rakesh Malhotra
1998-02-23  0:00 Marin David Condic, 561.796.8997, M/S 731-96
1998-02-24  0:00 ` Robert Dewar
1998-02-21  0:00 Robert C. Leif, Ph.D.
1998-02-21  0:00 ` Robert Dewar
1998-02-20  0:00 Marin David Condic, 561.796.8997, M/S 731-96
1998-02-19  0:00 Marin David Condic, 561.796.8997, M/S 731-96
1998-02-20  0:00 ` phil.brashear
1998-02-20  0:00 ` John E. Doss
