Evaluations of Federal STEM Workforce Training Programs
(Science and Technology)
Thursday, November 12, 2015: 8:30 AM-10:00 AM
Grenada (Hyatt Regency Miami)
*Names in bold indicate Presenter
Panel Organizer: Margaret Sullivan, Mathematica Policy Research
Panel Chair: Clemencia Cosentino, Mathematica Policy Research
Discussants: A. James Hicks, National Science Foundation and Cheryl Leggon, Georgia Institute of Technology
The National Science Foundation (NSF) has been at the forefront of efforts to grow and diversify the science, technology, engineering, and mathematics (STEM) workforce, and has supported rigorous testing of its approaches to achieve this goal. Because rigorous evaluations of these types of programs are rare (U.S. Department of Education 2007; GAO 2012), recent efforts provide a unique opportunity to share evidence on the effectiveness of current practices and draw relevant policy implications.
This panel presents results from three recent evaluations of NSF fellowship programs that support students pursuing postsecondary degrees in STEM: the Graduate Research Fellowship Program (GRFP), NSF's flagship fellowship, which provides funding to students of any ethnicity; the Louis Stokes Alliances for Minority Participation (LSAMP) Bridge to the Doctorate (BD) fellowship, which provides grants to LSAMP institutions that select underrepresented minority students; and the Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) program, which awards grants to a geographically diverse set of institutions of higher education to fund scholarships and academic supports for academically talented students who demonstrate financial need and are pursuing STEM degrees.
Each study uses a different quasi-experimental methodology to measure program impacts on students' educational and early career outcomes. Specifically, the studies use different comparison groups depending on the available data: the GRFP evaluation uses applicants who ranked high enough to be placed in the pool of those eligible for a fellowship but who did not make the final cut and instead became Honorable Mention designees; the BD evaluation uses propensity score matching to find appropriate matches among students at the same institutions and programs who did not receive the fellowship; and the S-STEM evaluation also uses propensity score matching, but draws comparable matches from a national data set.
We find that all three programs have largely positive impacts, although some results are mixed or unexpected. The findings also provide useful evidence on which specific approaches are effective or associated with successful outcomes. Because many government agencies and private foundations sponsor similar programs that have not been rigorously evaluated (such as the National Oceanic and Atmospheric Administration's Educational Partnerships Program and the Alfred P. Sloan Foundation Minority Fellowship), these findings offer important evidence to guide policy decisions about implementing similar efforts.
• Government Accountability Office. “Science, Technology, Engineering, and Mathematics Education: Strategic Planning Needed to Better Manage Overlapping Programs Across Multiple Agencies.” GAO-12-108. Washington, DC: Government Accountability Office, January 2012.
• U.S. Department of Education. “Report of the Academic Competitiveness Council.” Washington, DC: U.S. Department of Education, 2007.