COS 27-8
How to do a good experiment: Giving students automated feedback on experimental design, interpretation, and graphing through SimBio's SimUText system

Tuesday, August 12, 2014: 10:30 AM
Regency Ballroom D, Hyatt Regency Hotel
Eli Meir, SimBio, Cambridge, MA
Denise Pope, SimBio, Cambridge, MA
Susan Maruca, SimBio, Cambridge, MA
Kerry Kim, SimBio, Cambridge, MA
Jody Clarke-Midura, Massachusetts Institute of Technology, Cambridge, MA
Background/Question/Methods

The experimental process for testing hypotheses is the backbone of every field of science, yet many students struggle with it. Recent science standards and frameworks (NGSS, Vision and Change, etc.) strongly encourage active classrooms that help students understand and learn to conduct good experiments. In practice, however, large class sizes make it hard to give students hands-on experience designing their own experiments, and especially hard to give them the timely feedback they need to improve. SimBio offers many virtual labs in which students design and carry out their own realistic experiments in rich biological simulations. In this study, we are developing new analysis methods and exploring how to constrain those virtual experiments just enough to let us characterize student actions and provide immediate feedback on their understanding of the experimental process.

We have implemented elements of constrained experimental process in two simulation labs: Darwinian Snails, which teaches students about natural selection and guides them in testing whether selection is occurring in a system; and Isle Royale, which teaches students about population growth and predator-prey dynamics and includes a section on graphing data. Both are widely used in introductory biology classes worldwide.

Results/Conclusions

From the Darwinian Snails lab, we have student data that include both students' statements of their planned experimental designs and the experiments they actually set up. We have developed an initial algorithm that successfully classifies the errors in over 90% of those experiments. The most common error was omitting replication entirely; most students did include appropriate controls. The design a student outlined before setting up the experiment very often did not match the experiment itself (e.g., almost all students stated that they would include replication), suggesting that while students may recognize the terms used in teaching experimental design, they do not understand the underlying concepts. Incorporating feedback is the next step.
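
As an illustration of the kind of rule-based check such a classification algorithm might perform, here is a minimal sketch in Python. The Trial and Experiment structures, field names, and error labels are hypothetical assumptions for exposition, not SimBio's actual implementation.

    # A minimal sketch, assuming a hypothetical Trial/Experiment structure;
    # error labels and field names are illustrative, not SimBio's actual code.
    from dataclasses import dataclass, field

    @dataclass
    class Trial:
        selection_on: bool   # was the selective agent (e.g., predators) present?
        n_replicates: int    # how many replicate populations were run

    @dataclass
    class Experiment:
        trials: list[Trial] = field(default_factory=list)

    def classify_errors(exp: Experiment) -> list[str]:
        """Return design-error labels for one student experiment."""
        errors = []
        # No replication: no condition was run more than once.
        if all(t.n_replicates < 2 for t in exp.trials):
            errors.append("no_replication")
        # No control: all trials share one condition, so nothing to compare.
        if len({t.selection_on for t in exp.trials}) < 2:
            errors.append("no_control")
        return errors

    # A single treatment trial, run once, trips both checks:
    print(classify_errors(Experiment([Trial(selection_on=True, n_replicates=1)])))
    # -> ['no_replication', 'no_control']

In this sketch a one-trial, one-replicate setup trips both checks; a real classifier would need additional error categories to cover the full range of student experiments.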

In the Isle Royale lab, we have developed a graphing interface that lets students make common graphing mistakes, including confusing the dependent and independent variables and plotting raw data rather than summary data. We provide targeted feedback for each of these mistakes to help students learn better graphing techniques.
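
A minimal sketch of how such checks might be expressed, assuming hypothetical variable names and feedback strings (this is illustrative, not the SimUText implementation):

    def check_graph(x_var: str, y_var: str, plotted_summary: bool) -> list[str]:
        """Return feedback messages for a student's axis and data choices."""
        INDEPENDENT, DEPENDENT = "year", "population size"  # assumed variables
        feedback = []
        # Mistake 1: axes swapped (dependent variable on the x-axis).
        if x_var == DEPENDENT and y_var == INDEPENDENT:
            feedback.append("Put the independent variable (year) on the x-axis "
                            "and population size on the y-axis.")
        # Mistake 2: raw data plotted instead of summary data (e.g., means).
        if not plotted_summary:
            feedback.append("Try plotting means across replicates rather than "
                            "every raw data point.")
        return feedback

    # Swapped axes plus raw data triggers both messages:
    for msg in check_graph("population size", "year", plotted_summary=False):
        print(msg)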

By imposing just enough constraint on otherwise open-ended virtual explorations to enable automated feedback, our approach appears to be a promising avenue for helping students learn the experimental process.