
Panel Paper: The Value of Information for Choices of Education and Training: Experimental Evidence on Customer-Focused Scorecard Performance Reporting Systems

Saturday, November 14, 2015 : 11:15 AM
Orchid A (Hyatt Regency Miami)


Alex Ruder and Barry Sopher, Rutgers University
Education and workforce policymakers have placed increasing emphasis on providing consumers with information to improve their educational and training choices. In workforce development, this focus is evident in the Workforce Innovation and Opportunity Act of 2014, which describes easily accessible information tools as a significant benefit to participants and members of the general public seeking to choose effective training programs and providers. Despite the significant efforts undertaken to collect and disseminate information about programs, providers, and outcomes, surprisingly little is known about how workforce system customers interpret and use this information when making choices about training. In particular, the evidence on how customers use information to compare and choose programs, and on whether this information ultimately leads to better choices, is mixed or merely suggestive. Extant research either applies to domains not directly applicable to the unique circumstances of workforce development or offers inconclusive findings.

In collaboration with the State of New Jersey Department of Labor and Workforce Development and several local Workforce Investment Boards, we conduct a field experiment to assess how different information about training providers influences customers' evaluation of occupation-specific variables and their choice of providers. Shortly after workforce customers begin to receive services at one-stop career centers, we randomly assign them to one of three experimental groups that vary the training provider characteristics and post-training outcomes shown on an online information tool. The treatments differ in the information available about the completion rates, placement rates, and post-training earnings of customers who used each training provider. A "basic" information treatment shows subjects aggregated outcomes for each training provider. A "rich" treatment allows subjects to break out this information by relevant demographic variables. The control group uses the existing NJ TOPPS system available in one-stop career centers.
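As a concrete illustration of the assignment step, the sketch below randomizes newly enrolled customers across the three arms. It is a minimal sketch in Python for illustration only; the arm labels, customer identifiers, and seeding scheme are assumptions and do not reflect the study's actual randomization protocol.

    import random

    # Hypothetical labels for the three information conditions.
    ARMS = ["control_njtopps", "basic_aggregate", "rich_breakout"]

    def assign_arm(customer_id, seed=2015):
        """Deterministically assign a one-stop customer to one of the three
        information treatments using a seeded draw keyed to the customer ID."""
        rng = random.Random(f"{seed}-{customer_id}")
        return rng.choice(ARMS)

    # Example: assign a batch of (hypothetical) customer IDs at intake.
    assignments = {cid: assign_arm(cid) for cid in ["C001", "C002", "C003"]}
    print(assignments)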

First, we elicit intentions and expectations from subjects about their intended occupation and training choices, their expected earnings and employability in the intended occupation, and the expected duration of training. Subjects are then guided through "scorecard" information about their intended occupation and the training providers for that occupation, specific to their information treatment. Finally, we again elicit intentions and expectations related to occupational choice and training, allowing us to measure changes in these variables. Beyond these immediate results, which let us test for changes in expectations about occupation-specific earnings and employability, we use administrative data to follow subjects through their choice of training provider, which lets us test for differences in the distribution of provider choices across information treatments. The field experiment results will provide crucial knowledge to aid policymakers as they develop information tools for customers in workforce development and other policy areas and, more broadly, will show to what degree information can help meet policy goals of improving customer training experiences and employment outcomes.
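To make the final comparison concrete, the sketch below applies a chi-square test of independence to ask whether the distribution of chosen training providers differs across the three information arms. The data frame, column names, and provider labels are hypothetical placeholders, and this test is only one plausible way to operationalize the comparison described above, not the authors' stated analysis plan.

    import pandas as pd
    from scipy.stats import chi2_contingency

    # Hypothetical follow-up records: treatment arm and chosen provider per subject.
    df = pd.DataFrame({
        "arm":      ["control", "basic", "rich", "basic", "rich", "control", "rich", "basic"],
        "provider": ["A", "A", "B", "C", "B", "A", "C", "B"],
    })

    # Cross-tabulate provider choice by treatment arm and test for independence.
    table = pd.crosstab(df["arm"], df["provider"])
    chi2, p_value, dof, expected = chi2_contingency(table)
    print(table)
    print(f"chi2={chi2:.2f}, dof={dof}, p={p_value:.3f}")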