comp.lang.ada
* ACVC tests
@ 1996-04-27  0:00 Robert Dewar
From: Robert Dewar @ 1996-04-27  0:00 UTC (permalink / raw)



Ken said


 "Why not read the older posts where I said I _did_ look, going so far as
 to quote from the ACVC documents I reviewed on the AdaIC server? Why not
 read my response to Mr. McCabe, discussing what I found? Why not answer
 my questions regarding what I _couldn't_ find, AFTER the review?
 
 If you have other documents you wish for me to review (and they, in
 fact, can be accessed in a reasonably easy fashion), please feel free to
 send me pointers to them. If not, please discontinue asking me to review
 something that you can't identify?"

Ken, I am suggesting looking at the *tests* themselves. The entire 1.11
suite is of course available for review, and so is the 2.0.1 suite that
contains good examples of the new philosophy. Preliminary versions of
many 2.1 tests are also available for review.

There! Identified! :-)

Actually, EVERYONE is encouraged to look at the new ACVC tests, and
comments are welcome. As Ken so often points out, we lack objective
criteria for some aspects of these tests, e.g. whether they are really
usage-oriented. Comments from lots of users would be most helpful :-)






* ACVC tests
@ 1996-04-28  0:00 Robert Dewar
From: Robert Dewar @ 1996-04-28  0:00 UTC (permalink / raw)



Ken said

  "Well, I did. Unfortunately, I don't know what represents common usage.
  I only know how I (and my group) use Ada. Is it common usage? Beats me.
  This was exactly my comment: without some sort of survey, etc., how can
  _any_ one person or small group of people know what represents common usage?

  There's another issue. Common usage today is based on Ada 83 (unless you
  wish to claim that most existing Ada code was developed using Ada 95).
  Therefore, even if such a survey was done, how could it predict how people
  will be using Ada 95 unique features?"

That's much too pessimistic, I think. These "unique features" were not
handed out "deus ex machina"; they were carefully designed with a pretty
good idea of how they are expected to be used. This means that you can
quite accurately predict how they will be used. Sure, you will miss some
interesting unintended or unforeseen uses, which can inform later versions
of the ACVC tests, but I think you can do quite a good job of figuring
out how features in the language will be used.

  > The resulting test is then
  > reviewed by the ACVC review team, which represents implementors and
  > users, to see how well it seems to match potential use.

  How does this team represent me, if they have never contacted me? Or have
  they contacted most users, and I represent some irrelevant minority?


I did not mean represent in a political sense. I meant that the user
viewpoint is represented. This committee was chosen by an open competitive
selection process. I don't know whether you applied to be a member, but
you certainly could have. I also did not make the choices!

Of course the team has not "contacted most users", who would number in
the tens of thousands. In fact, the experience in the past has been that,
although the ACVC tests were generally available for review, it has been
extremely difficult to get ANY technical input from anyone. Even vendors
do not in general look at the tests in advance of the formal release
of the suite, and it is extremely rare to get any technical input from
users on the tests (I can't remember any example of such). Thus the
philosophy behind the committee was precisely to get at least *some*
users, implementors and testers looking at the tests carefully in advance.

  > It is hard to establish objective criteria for how well one is doing
  > in this process. The review group certainly has not found any way of
  > pre-establishing what will turn out to be typical usage.

  Exactly. EXACTLY. I don't know how they _could_ do this.

As I say, I think this is an easier task than you think, and certainly,
as you acknowledge, the important thing is to ask the question. At the
very least, anyone who writes a test must be able to defend it on these
grounds, and many of the old ACVC 1.11 tests could clearly NOT be
defended as usage-oriented.






* ACVC tests
@ 1996-04-28  0:00 Robert Dewar
From: Robert Dewar @ 1996-04-28  0:00 UTC (permalink / raw)



John McCabe said

"When I have some time, or when I can persuade my company to let me, I
will look at the ACVC tests, but the fact that I haven't so far must
not exclude me from expressing opinions on the effect of the tests,
based on the experience I have of using the products that have
(mysteriously) passed the tests!"

Ah, that's the point. You don't have this experience, since there are
no products that have passed the new ACVC 2.1 regression suite yet,
and so to make judgments based on this kind of anecdotal experience,
we at least have to get the right anecdotes.

I was not suggesting a thorough review of the tests, just a look
at a couple of the new ones; I think you will be surprised by the
significant difference in style.






* Re: ACVC tests
From: John McCabe @ 1996-05-06  0:00 UTC (permalink / raw)



dewar@cs.nyu.edu (Robert Dewar) wrote:

>Of course the team has not "contacted most users", who would number in
>the tens of thousands. In fact, the experience in the past has been that,
>although the ACVC tests were generally available for review, it has been
>extremely difficult to get ANY technical input from anyone. Even vendors
>do not in general look at the tests in advance of the formal release
>of the suite, and it is extremely rare to get any technical input from
>users on the tests (I can't remember any example of such). Thus the
>philosophy behind the committee was precisely to get at least *some*
>users, implementors and testers looking at the tests carefully in advance.

I am surprised at the vendors' attitude here, since one would have
thought they would be keen to be the first on the market with a
validated product.

Perhaps this attitude should be considered further. It may be, for
example, that they consider validation a hindrance that does not
add any real value to their product. It could then be suggested that
this attitude is promoted by the users (like myself, or the mandators
like Dornier) not being strict enough with the vendors in asking for
VSRs, ACES results, etc.

Any comments?

In general, however, I applaud your attempts here to extract more
detailed comments from users on the new test suite and, when I have
time, I'll see if I can be of any help!


Best Regards
John McCabe <john@assen.demon.co.uk>






* Re: ACVC tests
From: Robert Dewar @ 1996-05-06  0:00 UTC (permalink / raw)



John McCabe said, answering me

">extremely difficult to get ANY technical input from anyone. Even vendors
>do not in general look at the tests in advance of the formal release
>of the suite, and it is extremely rare to get any technical input from
>users on the tests (I can't remember any example of such). Thus the
>philosophy behind the committee was precisely to get at least *some*
>users, implementors and testers looking at the tests carefully in advance.

I am surprised at the vendors' attitude here, since one would have
thought they would be keen to be the first on the market with a
validated product."


Robert replies

Not surprising at all. The suite gets frozen several months before it
is usable for testing, and that is when vendors really start to look
at it. Looking at the suite earlier than this is not efficient, since
it may change under you.

As for rushing to be first: not necessarily. I don't think there are
users out there who buy a compiler just because it is the first on the
block to be validated, nor should they. Sure, there is to some extent
a race to be validated first, but it is not the most important
criterion in choosing a validation schedule.

Back to the issue of looking early. The trouble is that it is always
better to have someone else smoke out the errors, since challenging
the tests takes considerable time and energy. Thus the incentives
are wrong. It is not obvious how to fix this, but as I say, the
committee was set up to make sure that at least there would be
some formal input earlier.

This was tried earlier (actually at my insistence that it would be
useful) with ACVC 1.10. A committee was set up consisting of
several government folks and me -- in practice I was the only
one who showed up for meetings (you can see traces of that work:
length_check and enum_check still have my name on them somewhere :-)

Anyway, this time we have a reasonably sized, functioning committee,
which I think has been very helpful in improving the quality of
the tests; but the more eyes the better, and the ACVC development
team welcomes outside input.






* Re: ACVC tests
From: John McCabe @ 1996-05-06  0:00 UTC (permalink / raw)



dewar@cs.nyu.edu (Robert Dewar) wrote:

>John McCabe said

>"When I have some time, or when I can persuade my company to let me, I
>will look at the ACVC tests, but the fact that I haven't so far must
>not exclude me from expressing opinions on the effect of the tests,
>based on the experience I have of using the products that have
>(mysteriously) passed the tests!"

>Ah, that's the point. You don't have this experience, since there are
>no products that have passed the new ACVC 2.1 regression suite yet,
>and so to make judgments based on this kind of anecdotal experience,
>we at least have to get the right anecdotes.

Don't be so pedantic - I was not only referring to the ACVC 2.1 suite.
As someone who is currently mandated to use Ada 83, ACVC 2.1 is
unlikely to have an effect on me until I can persuade the relevant
people in my company that Ada 95 is the way to go. Until then, I
_must_ also take into account ACVC 1.11.

>I was not suggesting a thorough review of the tests, just a look
>at a couple of the new ones; I think you will be surprised by the
>significant difference in style.

I will - when I find time!

Best Regards
John McCabe <john@assen.demon.co.uk>






* Re: ACVC tests
From: Kevin D. Heatwole @ 1996-05-07  0:00 UTC (permalink / raw)



In article <831410279.2370.2@assen.demon.co.uk>, john@assen.demon.co.uk
(John McCabe) wrote:

>dewar@cs.nyu.edu (Robert Dewar) wrote:
>
>>Of course the team has not "contacted most users", who would number in
>>the tens of thousands. In fact, the experience in the past has been that,
>>although the ACVC tests were generally available for review, it has been
>>extremely difficult to get ANY technical input from anyone. Even vendors
>>do not in general look at the tests in advance of the formal release
>>of the suite, and it is extremely rare to get any technical input from
>>users on the tests (I can't remember any example of such). Thus the
>>philosophy behind the committee was precisely to get at least *some*
>>users, implementors and testers looking at the tests carefully in advance.
>
>I am surprised at the vendors' attitude here, since one would have
>thought they would be keen to be the first on the market with a
>validated product.
>
>Perhaps this attitude should be considered further. It may be, for
>example, that they consider validation a hindrance that does not
>add any real value to their product. It could then be suggested that
>this attitude is promoted by the users (like myself, or the mandators
>like Dornier) not being strict enough with the vendors in asking for
>VSRs, ACES results, etc.
>
>Any comments?

Well, I work for OC Systems and we would be considered an Ada vendor.  I
will not pretend to speak for OC Systems, but I can offer my attitude on
this subject.

The ACVC tests are a great resource for the compiler vendors and provide
a measure of reassurance to our customers that we do indeed implement
the Ada language.  The test suite has literally cost millions of dollars to
develop and is probably the best such suite of tests that I have ever
encountered.  If this test suite did not exist, an individual compiler vendor
would be forced to develop such a test suite internally.  This burden would
increase the cost of Ada compilers to our end users (Ada compilers are
already an incredibly expensive piece of software to develop, and we need
all the help we can get to cut these costs).

Also, as a vendor, we do not consider validation "a hindrance".  Rather,
it is an essential part of developing a commercially acceptable Ada
compiler.  Indeed, even if there were no requirement to "validate" using
this test suite, we would still want to incorporate this set of tests
into our test suite.  Vendors simply do not want to deliver a buggy
product (it will hurt our bottom line eventually).

I wish that OC Systems had the resources to participate more in the
development of the new Ada 95 ACVCs.  Unfortunately, developing an Ada 95
compiler is a huge undertaking, and resources must be allocated to this
effort in view of market realities and current cash flow.  We have chosen
to try to implement the new features in Ada 95 first (implementing them as
best we can independent of running the new ACVCs - which are still in
development and still have some bugs in them).  When the compiler reaches
a level of maturity at which we feel it is beneficial to start
running/debugging the ACVC tests, we will focus our efforts in this
direction.  In this regard, the ACVCs provide valuable feedback to vendors
developing Ada 95 compilers as to how close the compiler is to being
complete and robust.  If you fail most of the ACVCs the first time around,
you probably didn't do a good job implementing the features.  Anyway, this
is valuable feedback.

As to the current ACVC 2.1 team (led by SAIC), they are doing an excellent
job.  We monitor all the email that they exchange and have some insight
into their process.  They have some very good people on the project.  They
are tasked to develop a test suite before there are compilers that
implement the rules they are testing.  This is a very difficult thing to
do, since some of the rules in Ada are very subtle, and without a reliable
compiler to tell you that you messed up, it is a very big undertaking.
(By the way, the ACVC team does have access to the various vendors' Ada 95
compilers, which are at different levels of completeness.  The ACVC team
does a great job of sending the vendors bug reports when they encounter
problems.)

>
>In general, however, I applaud your attempts here to extract more
>detailed comments from users on the new test suite and, when I have
>time, I'll see if I can be of any help!

I think you had best hurry here if you expect to be of any help for ACVC
2.1.  This test suite is nearing completion, and you had better get your
licks in now.  They have been developing the new Ada 95 tests for over two
years now (I am not really sure how long they have been at it, but there
are new tests from 1994 - the Ada 83 tests in the suite were developed
over a number of years in the 1980s).

Kevin Heatwole
OC Systems, Inc.




