Many research universities require all of their graduates to achieve standards of scientific reasoning and quantitative literacy (SRQL) in the liberal arts component of the curriculum. For example, at Michigan State University graduates are expected to evaluate evidence, construct reasoned arguments, and communicate inferences and conclusions based on scientific and quantitative information. Increasingly, faculty and constituents outside the university are asking: do we know whether students are achieving these goals, and what is the evidence? A cross-disciplinary team of scientists and statisticians addressed these questions by designing a large-scale investigation of student mastery of quantitative and scientific skills, beginning with an assessment of 3500 freshmen before they started their university careers. We developed an instrument to measure SRQL using backward design: we began by developing eight objectives that all students are expected to achieve, then wrote assessment items aligned with these objectives. We asked faculty from six different colleges on campus to rank the SRQL goals. We tested 65 assessment items with students enrolled in seven general education science courses and one biology course for majors. For each tested item, we calculated three parameters (difficulty, discrimination, and a guessing factor) and narrowed the instrument to 21 items.
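The item winnowing described above could be sketched along these lines. This is a minimal illustration, not the authors' actual analysis: it computes classical item statistics (difficulty as proportion correct, discrimination as the point-biserial correlation between an item and the rest-of-test score) for a small hypothetical 0/1 response matrix. Estimating a guessing parameter would additionally require fitting a three-parameter IRT model, which is omitted here.

```python
import numpy as np

def item_statistics(responses):
    """Classical item analysis for a 0/1 response matrix.

    responses: (n_students, n_items) array, 1 = correct, 0 = incorrect.
    Returns per-item (difficulty, discrimination):
      - difficulty: proportion of students answering the item correctly
      - discrimination: point-biserial correlation between the item score
        and the rest-of-test total (item excluded to avoid inflation)
    """
    responses = np.asarray(responses, dtype=float)
    n_students, n_items = responses.shape
    difficulty = responses.mean(axis=0)

    total = responses.sum(axis=1)
    discrimination = np.empty(n_items)
    for j in range(n_items):
        item = responses[:, j]
        rest = total - item  # total score with item j removed
        if item.std() == 0 or rest.std() == 0:
            discrimination[j] = 0.0  # degenerate item: everyone same answer
        else:
            discrimination[j] = np.corrcoef(item, rest)[0, 1]
    return difficulty, discrimination

# Hypothetical data: 6 students by 3 items (illustration only)
resp = np.array([
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 0, 0],
    [1, 1, 1],
    [0, 0, 1],
])
diff, disc = item_statistics(resp)
```

In practice, items with very high or very low difficulty, or with low discrimination, would be candidates for removal in narrowing 65 items down to 21.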
Results/Conclusions
The goals ranked highest by faculty included: students should describe methods of inquiry leading to scientific knowledge, make predictions based on data, and evaluate the credibility, use, and misuse of scientific information. On average, incoming students answered 57% of the items correctly. They were more successful at answering the scientific reasoning questions (63%) than the quantitative literacy questions (46%). We compared the responses of STEM majors with those of non-STEM majors on the preliminary version of the instrument. While responses (both correct and incorrect) to many items showed no difference between the two cohorts, on several items the STEM majors clearly outperformed the non-STEM majors. This identifies potential areas for improving the scientific reasoning skills of non-STEM majors. This conclusion assumes that the items themselves are neither unduly difficult nor dependent on knowledge of specific content, and that they do not assess some innate reasoning ability. We plan to follow cohorts of freshmen through their liberal arts and science courses and to retest them after approximately 60 credits and again at the end of their senior year. This longitudinal assessment will reveal whether scientific reasoning and quantitative literacy improve after our students participate in science and other courses and experiences.