PS 23-100 - Promoting inclusion in STEM fields through REU programs: An evaluation of common program assessment techniques

Tuesday, August 8, 2017
Exhibit Hall, Oregon Convention Center
Manisha V. Patel, Andrew L. McDevitt and Aaron M. Ellison, Harvard Forest, Harvard University, Petersham, MA
Background/Question/Methods

Undergraduate research experiences can strengthen student participation in STEM fields by providing authentic opportunities in scientific research. NSF has supported undergraduate research since 1987 through the Research Experiences for Undergraduates (REU) program, with explicit goals of increasing participation of traditionally under-represented groups in STEM fields beginning in the early 2000s. In 2010, REU programs began using the Undergraduate Research Student Self-Assessment (URSSA) survey to evaluate participants' learning outcomes. Additionally, these programs are tasked with tracking participants' employment and matriculation in STEM fields. Although both measures have expanded program assessment, collaborative efforts to understand facilitative mechanisms and program impact remain limited.

The Harvard Forest Summer Research Program in Ecology (HF-SRPE) has received core support from the NSF REU program for nearly three decades and has one of the largest publicly available archives of programmatic data. As with many REU programs, these data are collected through various surveys and questionnaires, each deployed with a specific purpose. We evaluated HF-SRPE's program assessment data using cultural-historical activity theory (CHAT) to highlight the gaps that must be addressed to better understand the role REU programs play in promoting inclusion in STEM fields.

Results/Conclusions

Demographic information, commonly collected across programs, contextualizes the student populations a program serves. These data highlight HF-SRPE's ability to recruit women and minority students above national baselines, but focusing on demographics alone does not address how students are selected or supported within the research community. URSSA helps link demographic information with student learning gains, but its deployment of individual survey questions is inconsistent. Across its 40 core questions, HF-SRPE cohorts responded at or above national averages for all items. However, URSSA cannot provide a mechanistic understanding of these generalized learning gains because it neither accounts for students' prior knowledge and skills nor contextualizes the learning environment of the individual program. Although URSSA has united programs under a common assessment protocol, these limitations result in ambiguous comparisons among programs that may not predict future success in STEM fields. HF-SRPE's alumni surveys indicate that >80% of respondents remain within ecological disciplines and >50% pursue or receive graduate degrees, but these outcomes are difficult to attribute directly to the REU program. To strengthen the inferences we can draw from survey results, we advocate using a common theoretical framework that adequately contextualizes variation among students and across programs.