COS 37-1 - Evaluating interactive activities by measuring student learning gain

Tuesday, August 7, 2012: 8:00 AM
E141, Oregon Convention Center
Malin J. Hansen, Zoology, University of British Columbia, Vancouver, BC, Canada
Background/Question/Methods

A key element in the design and use of interactive activities is evaluating their effect by measuring student learning gain. The evaluation process may look different from case to case, but should include most of the following steps: 1) matching an activity's learning goals to those of the course or module, 2) identifying the appropriate tool(s) for measuring learning gain, e.g., a concept inventory, exam questions, or iClicker questions, 3) assessing student learning before and after the activity, 4) implementing the activity in class, ideally with a control group, 5) identifying the questions on which students did or did not improve, and 6) revising the activity to align its goals with actual student learning. I designed an in-class interactive case-study activity to facilitate student understanding of population dynamics and followed the above steps to evaluate student learning. I used the activity in an introductory ecology course (358 students), in which half of the class participated in the activity while the other half attended a regular lecture. The learning gains of all students in the class were measured using eight conceptual questions administered before and after the activity.
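
To make steps 3 and 5 concrete, the sketch below shows one way such pre/post data could be analyzed, assuming binary (correct/incorrect) scoring and a chi-square test on post-test counts. The counts are illustrative reconstructions from the percentages reported under Results/Conclusions, not the study's actual data, and the abstract does not state which statistical test was used.

    # Sketch: raw gain per group and a per-question comparison between an
    # activity group and a lecture (control) group. Counts are hypothetical,
    # back-calculated from the reported percentages; group sizes assume an
    # even split of the 358-student class.
    import numpy as np
    from scipy.stats import chi2_contingency

    def gain(pre_correct, post_correct, n):
        """Raw gain in percentage points from pre- to post-test."""
        return 100.0 * (post_correct - pre_correct) / n

    def compare_post(correct_a, n_a, correct_b, n_b):
        """Chi-square test on post-test correct/incorrect counts of two groups."""
        table = np.array([[correct_a, n_a - correct_a],
                          [correct_b, n_b - correct_b]])
        chi2, p, _, _ = chi2_contingency(table)
        return chi2, p

    n_act = n_lec = 179                  # roughly half of 358 students each
    pre_act, post_act = 45, 113          # activity group: ~25% -> ~63% correct
    pre_lec, post_lec = 57, 81           # lecture group:  ~32% -> ~45% correct

    print(f"activity gain: {gain(pre_act, post_act, n_act):.0f} points")
    print(f"lecture gain:  {gain(pre_lec, post_lec, n_lec):.0f} points")
    chi2, p = compare_post(post_act, n_act, post_lec, n_lec)
    print(f"post-test difference: chi2 = {chi2:.2f}, p = {p:.3g}")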

Results/Conclusions

Students participating in the activity increased their total score from 56% to 71%, while students attending the regular lecture increased their score from 54% to 70%. There was therefore no significant difference in the performance of the two groups when all questions were considered together. There was, however, a significant difference between the two groups on several of the individual questions. For example, students participating in the activity scored higher on questions that involved data analysis (a gain from 25% to 63% vs. 32% to 45%) and comparison between graphs (a gain from 55% to 74% vs. 56% to 64%). For graph interpretation, the results varied: the activity group had a higher gain on some questions, while the lecture group had a higher gain on others. The results of this first trial run were used to revise the activity to better meet its learning goals and to more effectively facilitate student learning. The results suggest that in-class interactive activities can be superior to lectures in increasing student learning, but that this depends on their design. Several iterations may be needed to ensure that an activity's goals are met and that learning gain is maximized. Activity designs that tend to promote learning will be discussed.
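
As a small illustration of how nearly identical aggregate gains can coexist with large per-question differences, consider per-question gains that average to the reported overall gains. In the sketch below, only the first two entries correspond to reported values (data analysis and graph comparison); the remaining six are hypothetical fillers chosen so the means match the reported totals.

    # Hypothetical per-question gains (percentage points) over eight questions;
    # only the first two entries match reported values (data analysis, graph
    # comparison). Near-identical averages can mask per-question differences.
    activity_gains = [38, 19, 15, 10, 8, 12, 9, 9]
    lecture_gains  = [13, 8, 20, 22, 18, 16, 15, 16]
    print(sum(activity_gains) / len(activity_gains))  # 15.0 points: 56% -> 71%
    print(sum(lecture_gains) / len(lecture_gains))    # 16.0 points: 54% -> 70%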