Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.3 4.3bsd-beta 6/6/85; site ucbvax.ARPA
Path: utzoo!watmath!clyde!burl!ulysses!ucbvax!usc-eclb.arpa!JBROOKSHIRE
From: JBROOKSHIRE@USC-ECLB.ARPA
Newsgroups: net.lang.ada
Subject: APSE E&V Evaluator Survey
Message-ID: <8510150410.AA04984@UCB-VAX>
Date: Wed, 2-Oct-85 17:35:00 EDT
Article-I.D.: UCB-VAX.8510150410.AA04984
Posted: Wed Oct 2 17:35:00 1985
Date-Received: Wed, 16-Oct-85 05:13:24 EDT
Sender: daemon@ucbvax.ARPA
Organization: The ARPA Internet

TO: MEMBERS OF THE Ada AND SOFTWARE ENGINEERING COMMUNITIES:

The Ada Programming Support Environment (APSE) Evaluation and Validation
(E&V) Task was established to develop the tools and techniques needed to
assess APSEs and to determine their conformance to the Ada Joint Program
Office (AJPO) sponsored Common APSE Interface Set (CAIS).  As the E&V
technology is developed, it will be made available to the community for
use by DoD components, industry, and academia.  The E&V Team will be
developing technology to evaluate specific components (e.g., compilers,
debuggers, and command language interpreters).  In support of these
activities, a number of support services and specific tool evaluators
have been identified.  Your help in determining the relative merits of
these facilities to the community is hereby requested.

APSE EVALUATION FACILITIES SURVEY:

The following list of software development tool evaluation facilities
(EVALUATORS) and other useful facilities (Information System, Trial
Accounts, etc.) has been developed by the APSE E&V Team as candidates
for early definition and development.  In the interest of broadening the
base of support for the selection decisions, we have decided to survey
as much of the "Ada Community" and "Software Engineering Community" as
we can locate.

You are therefore requested to review this list and advise us of your
preferences by ranking the evaluators in order of relative importance:
one through N, where one is your first choice and "N" is the last entry
on the list for which you see a need.  We expect that most people will
choose five or so facilities as being of immediate interest or need.

Please feel free to suggest additions to the list, to comment on the
individual evaluators/facilities or on the list as a whole, and to
provide references to any existing or planned evaluators you may be
aware of.  If you can report on such evaluations, please provide
details, such as status and availability, or a point of contact.

The results of the survey will be distributed through the communities as
soon and as broadly as possible.  If you would like copies of
incremental intermediate results, please say so in your response.

FACILITIES/EVALUATORS:

(Stated in terms of the function or tool to be evaluated.  Note the
difference between "evaluation" - the goodness of something - and
"validation" - conformance to a requirement.)

                                                            RECOMMENDED
                                                               RANK

  1.  Ada PDL (Program Design Language) Guidelines          |_________|

  2.  APSE E&V Network (On-Line) Information Repository     |_________|

  3.  APSE Trial Accounts (via network access)              |_________|

  4.  CAIS (Common APSE Interface Set) Evaluators           |_________|

  5.  CAIS (Common APSE Interface Set) Validation           |_________|

  6.  Compiler Evaluators                                   |_________|

  7.  Configuration Management Support (in APSE)            |_________|

  8.  Database Optimization Evaluators                      |_________|

  9.  Distributed Development Environment Support (in APSE) |_________|

 10.  Distributed Runtime System Support (in APSE)          |_________|

 11.  Interoperability Evaluators                           |_________|

 12.  Methodology Evaluators                                |_________|

 13.  Multilingual APSE Evaluators                          |_________|

 14.  Portability Evaluators                                |_________|

 15.  Program Library Evaluators                            |_________|

 16.  Runtime Evaluators                                    |_________|

 17.  Software Maintainability Support (in APSE)            |_________|

 18.  Software Reliability Support (in APSE)                |_________|

 19.  Target Simulation Evaluators                          |_________|

 20.  Test Management Support (in APSE)                     |_________|

 21.  Usability (User Interface/Performance/Capacity)       |_________|

 22.  "Whole APSE" Evaluators                               |_________|

Please note that the list is alphabetically ordered; you are requested
to reorder it according to your preferred order of importance.

Your cooperation in this survey is greatly appreciated.  The survey may
be returned via electronic mail or, if you prefer, to the U.S. Mail
address given below.  An acceptable alternative to returning the
completed form is a message containing your preferred ranking using the
item numbers from the list above, e.g., "20, 3, 9, 11, 6", where you
perceive #20 as most important, #3 as second most important, and so on.

Jerry R. Brookshire
Member, APSE E&V Team and Requirements Working Group (REQWG)
JBROOKSHIRE @ USC-ECLB  /  brookshir%ti-eg@CSNET-RELAY

Jerry R. Brookshire
Texas Instruments Inc.
P.O. Box 660246 - M/S 3114
Dallas, TX  75266
-------