Date: 15 Nov 91 17:33:00 GMT
From: csusac!csus.edu!wupost!zaphod.mps.ohio-state.edu!menudo.uh.edu!cl2.cl.uh.edu!csci03a9@ucdavis.ucdavis.edu (Dana Newman)
Subject: Re: Software Engineering Education
Message-ID: <15NOV199111332885@cl2.cl.uh.edu>

In article <20600125@inmet>, ryer@inmet.camb.inmet.com writes...

>I was talking with a co-worker about Software Engineering Education, and said:
>
>  "The best thing universities could do would be to give F's to a few hackers
>  who wrote programs that worked perfectly but were completely unmaintainable."
>
>She said:
>
>  When she was at MIT (early 80's), the grading for all exercises in all
>  computer science and software engineering courses was:
>
>    25% - Quality (correctness) of the executable program
>    25% - Quality of the written design document
>    25% - Quality of the written test plan and procedures
>    25% - Quality of the user documentation
>
>I thought this was the most intelligent approach I'd ever heard.  Do any
>of you educators have a better idea?  Is this done in other universities?
>
>Mike Ryer
>Intermetrics

I am not an educator, but a student here at the University of Houston (Clear
Lake).  Here's the rundown on how we are evaluated for our Ada programming
assignments.  (At least, for this particular instructor; it might vary a bit
for others.)

  Documentation and Design               (20)
    Algorithm Structure and Efficiency     15
    Well Structured Output                  5

  Well Structured Program                (40)
    Code matches algorithm                 15
    Meaningful variable names               5
    Comments                               15
    Test for input validity                 5

  Working Program                        (40)
    Compiling/Linking                      10
    Running                                30

There are generally four or five assignments per semester, and together they
contribute around 30% of the final grade.