OOS 32-5 - Experimenting with constraint: Using technology to recognize mistakes and provide feedback as students practice designing ecological experiments

Thursday, August 11, 2016: 2:50 PM
Grand Floridian Blrm D, Ft Lauderdale Convention Center
Eli Meir1, Denise Pope2, Susan Maruca1, Kerry Kim1, Jennifer Palacio1, Jenna Conversano2 and Jody Clarke-Midura3, (1)SimBio, Missoula, MT, (2)SimBio, Cambridge, MA, (3)Instructional Technology & Education, Utah State University, Logan, UT
Background/Question/Methods

One of the challenges of using student-centered approaches in the classroom is providing students in large classes with the feedback they need to learn from open-ended exercises. We face a similar issue in our simulation-based modules: while they allow students to explore biological systems in complex ways, they often rely on multiple-choice and similar question types to provide feedback when instructors cannot offer personal guidance. This is frustrating because we design our modules to target higher-order thinking skills and the deep conceptual (mis)understandings often encountered in ecological problems, yet multiple-choice questions are poorly suited to assessing such complex ideas. In this project, we combined light constraints on the open-ended simulations in our modules with algorithmic categorization of patterns in student answers, offering students automated feedback specific to their level of understanding. We ask whether these cyberlearning techniques can deliver effective personalized feedback on higher-order thinking skills.
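To make the approach concrete, below is a minimal sketch of rule-based categorization of student experimental designs. The data model, rule set, and feedback messages are invented for illustration and are not SimBio's actual implementation; the real system matches patterns in students' answers within the lightly constrained simulation.

```python
from dataclasses import dataclass

# Hypothetical, simplified representation of a student's experimental design.
# The fields and thresholds below are assumptions for illustration only.
@dataclass
class Design:
    treatments: list[dict]          # each dict maps variable name -> setting
    control_included: bool          # does the design include a baseline group?
    replicates_per_treatment: int   # replicate runs per treatment

def categorize(design: Design) -> list[str]:
    """Match a design against known error patterns; return targeted feedback."""
    feedback = []
    if not design.control_included:
        feedback.append("There is no control group, so you can't tell whether "
                        "the treatment, or something else, caused the outcome.")
    # Confound check: adjacent treatments that differ in >1 variable at once.
    for a, b in zip(design.treatments, design.treatments[1:]):
        changed = [v for v in a if a[v] != b.get(v)]
        if len(changed) > 1:
            feedback.append("These treatments differ in several variables at "
                            f"once ({', '.join(changed)}); vary one at a time.")
    if design.replicates_per_treatment < 3:
        feedback.append("With so few replicates, chance variation could "
                        "explain your results; add more replicates.")
    return feedback or ["Well-designed experiment; run it and see what the data show."]
```

For example, a design with no control, one replicate, and two treatments that vary food and temperature simultaneously would receive three targeted messages rather than a generic "incorrect," which is the kind of category-specific feedback the abstract describes.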

Results/Conclusions

In this talk we’ll discuss a new ecological-detective-themed module called Understanding Experimental Design. Students are presented with a problem (critters are dying; what’s the cause?) and asked to design and carry out experiments to solve the mystery. In a precursor to this module used in 30 classes (1,144 students), we found that automated, personalized feedback improved student experiments: when feedback was available, experiments were more likely to be well-controlled (83% vs. 72% without feedback) and to include appropriate replication (50% vs. 18%). In addition to presenting those results, we’ll describe how we designed the module with constraints to enable feedback, and we’ll present data from a randomized, controlled experiment on the effects of both feedback and constraints on student success in experimental design. Results so far indicate that cyberlearning methods can help teach and reinforce fundamental concepts, such as experimental design, that budding ecologists must master.
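As a rough illustration of how large the reported replication effect (50% vs. 18%) is, the sketch below runs a standard two-proportion z-test. The per-condition group sizes were not reported, so an even split of the 1,144 students across conditions is assumed purely for illustration.

```python
from statsmodels.stats.proportion import proportions_ztest

# Reported replication rates: 50% with feedback vs. 18% without.
# Group sizes were not reported; an even split of the 1,144 students
# across conditions is assumed here purely for illustration.
n_feedback = n_no_feedback = 1144 // 2          # 572 students per group
counts = [round(0.50 * n_feedback),             # ~286 well-replicated designs
          round(0.18 * n_no_feedback)]          # ~103 well-replicated designs

z, p = proportions_ztest(count=counts, nobs=[n_feedback, n_no_feedback])
print(f"z = {z:.1f}, p = {p:.1e}")  # under these assumptions, p << 0.001
```

Under these assumed group sizes, a difference of this magnitude would be far too large to attribute to chance.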