Article: Experiences with Formative Assessment in Engineering Classrooms
Background: Student assessment in traditional engineering education is ordinarily summative in nature. Homework assignments, if corrected in a timely manner, can sometimes be considered formative, but the longer it takes for students to receive feedback on an assignment, the less effective the assessment becomes. A number of studies, including the National Academy of Sciences report How People Learn, have shown that providing learners with formative assessment in the classroom is effective in clearing up misconceptions and enhancing retention. Timely feedback in the classroom requires a method that provides nearly instantaneous and simultaneous responses from all students. Electronic classroom communication systems (CCS) are capable of rapidly capturing and anonymously displaying student responses to questions posed by the instructor.
Objectives: We provided in-class formative feedback via a CCS on key concepts, followed by peer-to-peer and whole-class discussions. Our objectives were to determine whether formative feedback increases retention of fundamental concepts and whether there was a correlation between student interaction with the CCS and student performance in the class.
Methods: We used a wireless (infrared) classroom communication system, the InterWrite Personal Response System (PRS). Signals from student transmitters were collected by an infrared receiver at the front of the room connected to the instructor’s computer. The system software quickly sorted the students’ responses and displayed their frequency as a bar chart. This chart was projected after each question to provide both students and instructor with feedback on class performance. The overall class performance on these questions dictated whether the class moved on to the next topic or went back to review the current one. In all cases, ample time was allowed for discussion of the results. Twenty of the 90 questions asked throughout the semester were revisited on the final examination. Student interaction with the PRS was analyzed at the end of the semester and compared with student performance. Students were surveyed at the end of the semester to solicit their feedback on the use of a CCS.
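To make the data flow concrete, the short Python sketch below tallies a set of answer choices and plots their frequency as a bar chart, which is the kind of summary the CCS software projects after each question. The responses shown are hypothetical, and the sketch is only an illustration of the idea, not the InterWrite software.

    # Illustrative sketch only (hypothetical data): tally answer choices and
    # plot their frequency, similar in spirit to the CCS bar-chart display.
    from collections import Counter
    import matplotlib.pyplot as plt

    # Hypothetical responses received from student transmitters for one question
    responses = ["A", "C", "B", "C", "C", "D", "A", "C", "B", "C"]

    counts = Counter(responses)
    choices = sorted(counts)                    # answer choices in order
    frequencies = [counts[c] for c in choices]  # number of students per choice

    plt.bar(choices, frequencies)
    plt.xlabel("Answer choice")
    plt.ylabel("Number of responses")
    plt.title("Class responses to one question (hypothetical)")
    plt.show()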
Results: Retention on the final exam, relative to initial in-class CCS performance, was significantly better on 12 of the 20 questions and significantly worse on only two (P<.05, paired t-test). Both of the latter questions dealt with memorization of factual information rather than concepts. The original questions were asked shortly after the material was presented in class, when it was still fresh in students’ minds; most of the questions repeated on the final exam were posed several weeks after the corresponding PRS questions were asked in class. We found a strong correlation between student performance in the class and the degree to which students participated in class via interaction with the PRS. Student surveys indicate that students liked the anonymity of the system, felt it was a valuable use of class time, and reported that it stimulated them to ask questions and helped them pay closer attention in class.
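As a rough illustration of the statistics reported above, the Python sketch below runs a paired t-test on hypothetical per-student scores for one question (initial in-class CCS attempt versus the final exam) and a correlation between hypothetical participation counts and course grades. All data are invented, and the use of a Pearson correlation is an assumption; the abstract does not name the specific correlation measure.

    # Illustrative sketch only: paired t-test and correlation on hypothetical data.
    from scipy import stats

    # Hypothetical per-student scores on one question (1 = correct, 0 = incorrect)
    in_class   = [0, 1, 0, 1, 0, 0, 1, 0, 1, 0]   # initial CCS response
    final_exam = [1, 1, 1, 1, 0, 1, 1, 0, 1, 1]   # same question on the final

    t_stat, p_value = stats.ttest_rel(final_exam, in_class)
    print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

    # Hypothetical PRS participation (questions answered) vs. course grade
    participation = [60, 75, 82, 88, 90, 45, 70, 85]
    course_grade  = [71, 80, 85, 90, 93, 62, 78, 88]
    r, p_corr = stats.pearsonr(participation, course_grade)
    print(f"participation vs. grade: Pearson r = {r:.2f}, p = {p_corr:.3f}")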
Discussion and Conclusions: Students do not always interpret information that is presented in the classroom the way the instructor intended. There is little that can be gained in moving forward with a new concept if the last one was misunderstood by a large percentage of the class. A CCS provides the learner with the opportunity to assess conceptual understanding while at the same time providing the instructor with a sense of how the students interpreted what was presented in class. More importantly, it provides both instructor and learner a second opportunity to clarify the concept. As a consequence of this formative assessment, more class time is spent on discussing difficult concepts.
Instructors must be prepared to suspend their plan for the remainder of the class period in favor of guiding the class discussion that follows a CCS question. Every concept has a number of misconceptions associated with it, and we try to design questions so that each of these misconceptions appears as a distractor in a multiple-choice question. That way, we can be reasonably confident that students will discover their own misconceptions and have an opportunity to correct them in a timely manner. This formative assessment is preferable to uncovering these difficulties later in homework assignments or examinations.
Acknowledgements: This work was supported primarily by the Engineering Research Center Program of the National Science Foundation under Award Number EEC9876363.
Author 1: Robert J. Roselli [email protected]
Author 2: Sean P. Brophy [email protected]
Article Link: www.asee.org