Article: Designing an Adaptive Learning Module to Teach Software Testing
There is a need to teach software testing skills to computer science undergraduates. Software testing is critical for building successful software, yet it is often viewed as boring and uncreative, and it is traditionally neglected in courses. In this paper we explored the design and use of a series of self-paced, adaptive on-line learning modules for teaching software testing. The adaptive modules were used as a supplement to classroom teaching and learning activities, so they could be adopted without taking lecture time away from other course topics. Their adaptive nature helps create a self-paced learning environment and encourages independent study.

Adaptive learning systems customize teaching and learning to suit each individual user. This tailoring is based on a user model that is incrementally built and updated as the user completes exercises and tests. Such systems often modify the order of presentation or the sequencing of material based on the unique experience of a particular user, provide hyperlinks that customize navigation, and offer intelligent feedback and problem-solving support to aid the student.
In our exploration, we used NetCoach, an adaptive learning system designed to enable authors to develop a course without any programming knowledge. NetCoach provides a web interface through which course developers create and maintain concepts, questions, and their interrelationships. To build the adaptive learning module, we first organized the course as a graph with concepts as nodes and interrelationships as edges. There are two types of edges: prerequisites and inferences. A prerequisite edge indicates that knowledge of one concept is necessary for learning another, while an inference edge indicates that mastery of one concept implies mastery of another. To support an effective level of individualization, creating an adaptive course can require two to three times as much content as a traditional course; we drew on material from previous offerings of our courses to provide this additional content.
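The concept graph described above can be sketched in code. This is an illustrative model only; the class and method names are our own invention, not part of NetCoach's actual interface. It shows the two edge types at work: prerequisite edges gate which concepts are ready to present, and inference edges propagate mastery from one concept to the concepts it implies.

```python
# Hypothetical sketch of a concept graph with prerequisite and inference
# edges, in the spirit of the NetCoach course structure described above.

class ConceptGraph:
    def __init__(self):
        self.prerequisites = {}  # concept -> concepts that must be known first
        self.inferences = {}     # concept -> concepts whose mastery is implied

    def add_prerequisite(self, concept, required):
        self.prerequisites.setdefault(concept, set()).add(required)

    def add_inference(self, concept, implied):
        self.inferences.setdefault(concept, set()).add(implied)

    def is_ready(self, concept, mastered):
        # A concept is ready to present once all its prerequisites are mastered.
        return self.prerequisites.get(concept, set()) <= mastered

    def propagate_mastery(self, concept, mastered):
        # Mastering a concept also marks every concept it implies as mastered,
        # following inference edges transitively.
        stack = [concept]
        while stack:
            c = stack.pop()
            if c not in mastered:
                mastered.add(c)
                stack.extend(self.inferences.get(c, ()))
        return mastered

g = ConceptGraph()
g.add_prerequisite("unit testing", "assertions")
g.add_inference("boundary testing", "equivalence partitioning")
mastered = g.propagate_mastery("boundary testing", set())
# mastered now contains "boundary testing" and "equivalence partitioning"
```

In a real authoring system these relationships would be entered through the web interface rather than coded by hand; the sketch only captures how the two edge types shape sequencing.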
When writing questions, we ensured that each question related to a specific concept, so that a correct answer implies knowledge of that concept; a single question may also imply knowledge of multiple concepts. We also wrote questions catering to varied knowledge levels rather than focusing purely on factual recall. Overall, questions covered the knowledge, application, analysis, and synthesis levels. A knowledge question, for example, required recall of previously learned material or facts (e.g., definitions), while an analysis question required breaking a problem into parts and understanding its intended purpose (e.g., tracing code execution).
Our first adaptive module, “Introduction to Testing”, is designed to help freshmen learn the basics of testing their own software. We tested it in our CS 2 course with this approach:
1. A pre-test at the beginning of the module gauges prior student knowledge and builds the initial user model.
2. Concepts and exercises make up the bulk of the module and are presented adaptively as the student progresses through it.
3. A post-test assesses student learning once the student has completed all the concepts they were guided through, testing knowledge of the whole module.
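The three steps above can be sketched as a small sequencing loop. This is an assumed illustration, not NetCoach code: the pre-test seeds the user model with concepts the student already knows, and the module then presents the first unmastered concept whose prerequisites are satisfied, until none remain and the post-test is due.

```python
# Illustrative sketch (hypothetical, not the NetCoach implementation) of
# seeding a user model from a pre-test and adaptively choosing the next
# concept to present.

def seed_user_model(pretest_results):
    # pretest_results maps each tested concept to whether it was answered
    # correctly; correct answers mark the concept as already mastered.
    return {c for c, correct in pretest_results.items() if correct}

def next_concept(concept_order, prerequisites, mastered):
    # Present the first unmastered concept whose prerequisites are all met.
    for c in concept_order:
        if c not in mastered and prerequisites.get(c, set()) <= mastered:
            return c
    return None  # nothing left: the student is ready for the post-test

order = ["definitions", "assertions", "unit testing"]
prereqs = {"unit testing": {"assertions"}}
mastered = seed_user_model({"definitions": True, "assertions": False})
print(next_concept(order, prereqs, mastered))  # -> "assertions"
```

In this sketch the student who already knows "definitions" skips straight to "assertions", which is the self-paced behavior the module relies on.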
To assess the effectiveness of the adaptive learning module, we designed a set of objective questions covering the range of difficulty levels in the module's material. These questions did not appear on any of the module's tests or self-tests. This set of questions was given to 63 students in our CS 2 course at the end of the spring 2005 semester to collect a baseline for comparison before the first use of our module. Eighteen students in the summer 2005 offering of the same course were required to complete the on-line module over a period of about ten days as a regular homework assignment, and the same assessment questions were then administered at the end of the summer session. The results from the two groups show a clear shift in scores, with means that differ significantly at the 0.01 level (t = 3.08, df = 19). This comparison provides evidence that students gain knowledge from the adaptive module that they do not already get from the regular course content or from writing software tests on their own. In addition, subjective student perception of the adaptive presentation was positive: a majority of students (56%) agreed or strongly agreed that the system was helpful.
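The unequal group sizes (63 vs. 18) together with the reported df of 19 are consistent with Welch's two-sample t-test, which does not assume equal variances. The sketch below shows how such a statistic is computed; the score lists are made-up placeholders for illustration only, NOT the study's data, and the function name is our own.

```python
# Welch's two-sample t-test, computed from scratch with the standard
# library. The data below are invented placeholders, not the study's scores.
from statistics import mean, variance

def welch_t(a, b):
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)   # sample variances
    se2 = va / na + vb / nb             # squared standard error of the difference
    t = (mean(b) - mean(a)) / se2 ** 0.5
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

baseline = [55, 60, 62, 58, 65, 70, 52, 61]  # placeholder pre-module scores
treated = [72, 78, 69, 75, 80, 74]           # placeholder post-module scores
t, df = welch_t(baseline, treated)
```

With very unequal group sizes, the Welch df lands well below the pooled na + nb - 2, which is why a comparison of 63 and 18 students can plausibly report df = 19.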
This work is supported in part by the National Science Foundation under grant DUE-0127225. Any opinions, conclusions or recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of NSF.
Author 1: Rahul Agarwal [email protected]
Author 2: Stephen H. Edwards [email protected]
Author 3: Manuel A. Pérez-Quiñones [email protected]
Article Link: http://portal.acm.org/citation.cfm?id=1121420&coll=portal&dl=ACM&CFID=9122936&CFTOKEN=32438184