COS 182-4 - Understanding experimental design: How instant feedback, constrained simulations, and practice help students learn to do good experiments

Friday, August 11, 2017: 9:00 AM
C122, Oregon Convention Center
Eli Meir1, Denise Pope2, Susan Maruca1, Kerry Kim1, Jennifer Palacio1 and Jody Clarke-Midura3, (1)SimBio, Missoula, MT, (2)CIRTL, Northampton, MA, (3)Instructional Technology and Learning Sciences, Utah State University, Logan, UT
Background/Question/Methods

A challenge of using student-centered approaches in large classes is providing useful feedback on open-ended exercises. Simulations let students practice scientific and experimental skills that are often too time- or resource-intensive for typical classrooms, such as designing an experiment to test a student-generated hypothesis. Our group has produced numerous simulation-based labs and found a tradeoff between simulation freedom and our ability to provide detailed automated feedback: low-constraint simulation-based labs give students freedom to design and conduct experiments, but that open-endedness defeats attempts to provide specific automated feedback. Conversely, highly constrained simulation-based labs limit student exploration and open-ended practice, but allow us to give students detailed automated feedback.
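As a minimal sketch of why this tradeoff arises (hypothetical Python, not SimBio's actual implementation; the class and function names here are invented for illustration), note that once student experiments are captured in a constrained, structured representation, each common design flaw can be matched against a rule with a targeted message. With full freedom, there is no such enumerable structure to match against:

```python
# Toy illustration of constraint-enabled feedback. All names here
# (ExperimentDesign, feedback) are hypothetical, invented for this sketch.
from dataclasses import dataclass, field

@dataclass
class ExperimentDesign:
    varied_variables: set = field(default_factory=set)  # variables the student manipulates
    has_control: bool = False                            # includes a baseline treatment?
    replicates_per_treatment: int = 1                    # parallel replicates per treatment

def feedback(design: ExperimentDesign, hypothesized: str) -> list:
    """Match a constrained design against known flaws; return targeted messages."""
    messages = []
    if hypothesized not in design.varied_variables:
        messages.append(f"Your experiment never varies '{hypothesized}', "
                        "so it cannot test your hypothesis.")
    if len(design.varied_variables) > 1:
        messages.append("You vary more than one variable at once, so you "
                        "cannot tell which one caused any effect you see.")
    if not design.has_control:
        messages.append("Add a control treatment as a baseline for comparison.")
    if design.replicates_per_treatment < 2:
        messages.append("Run replicates of each treatment to rule out chance.")
    return messages or ["Good design: systematic, controlled, and replicated."]

# Example: a design that varies the hypothesized cause but lacks replicates.
design = ExperimentDesign(varied_variables={"water_source"}, has_control=True)
for msg in feedback(design, hypothesized="water_source"):
    print(msg)
```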

Here we explore whether greater freedom or more detailed feedback is more useful to students. Specifically, we address the question: how does lightly constraining an experimental design task, and thereby making it possible to give students specific feedback, affect student learning of experimental design?

To answer this question, we wrote a new simulation-based lab called Understanding Experimental Design, in which introductory-biology students form hypotheses about, and then experimentally determine, the cause of a mysterious illness. We created three versions of the lab: (1) medium-constraint with detailed feedback; (2) medium-constraint without feedback; and (3) low-constraint without feedback. We randomly assigned students to one of the three versions and used pre- and post-tests and interviews to assess learning.

Results/Conclusions

Student learning was significantly higher in the medium-constraint version with feedback than in the low-constraint version without feedback. The percentage of students who designed a “good” experiment (systematic variation of the hypothesized variable; valid controls; replicates of each treatment) was 36% in the low-constraint lab, compared with 79% in the medium-constraint version with feedback (N=14 per treatment). Similarly, students were less likely to include parallel replicates in their experiments when using the low-constraint version without feedback (43%) than when using the medium-constraint version with feedback (79%). Results from the medium-constraint version without feedback were intermediate between the other two. The learning gains appeared to be aided by the practice students received in designing and conducting experiments within the simulation, and might not have appeared after a more didactic presentation of the material. In addition to presenting these and other quantitative results, we’ll show the lab itself and discuss some of the design research that went into building it.
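As a back-of-envelope illustration only (the abstract does not state which statistical test was used, and its significance claim concerns learning gains rather than these proportions), the reported percentages with N=14 per treatment correspond to roughly 5, 6, and 11 of 14 students. Proportions at this sample size could be compared with, for example, Fisher's exact test:

```python
# Hypothetical re-analysis sketch; the authors' actual analysis is not stated.
from scipy.stats import fisher_exact

n = 14  # students per treatment, as reported
good_low = round(0.36 * n)     # ~5 of 14 "good" designs: low-constraint, no feedback
good_med_fb = round(0.79 * n)  # ~11 of 14 "good" designs: medium-constraint, feedback

# 2x2 contingency table: rows are treatments, columns are (good, not good).
table = [[good_low, n - good_low],
         [good_med_fb, n - good_med_fb]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```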