
Panel Paper: Using Research Evidence to Transform Technical Assistance in Employment Services

Thursday, November 12, 2015: 3:50 PM
Ibis (Hyatt Regency Miami)


Michelle Derr, Mathematica Policy Research
Federal and state governments invest billions of dollars each year in public assistance programs for disadvantaged populations. Despite this substantial investment, much of the programming is not grounded in the best available research. This paper provides a framework, illustrated with three case studies, for translating academic and applied research into program technical assistance (TA), with the goal of transforming how employment services are provided to disadvantaged populations. With evidence-based technical assistance (EBTA), evidence is infused throughout the content, process, and evaluation of public policy and program change.

Content. Effective program technical assistance rests on reliable evidence about what works. With EBTA, TA content is identified through a rigorous, systematic assessment of policy and program needs: TA providers work with programs to determine their goals and areas for improvement, then review high-quality research to offer evidence-based recommendations for program improvement. Together, the TA provider and programs develop a customized action plan for implementing program change.


Process. To be transformational, the content delivered by TA providers must be accessible and relevant to recipients. EBTA draws on proven methods for how adults learn best, which increase how much recipients learn, retain, and use of the information delivered. Adult learning methods include careful meeting design, participant engagement, clear objectives, a supportive learning environment, and recognition of and support for different learning styles. These methods enhance the quality of the TA provider's interactions with recipients, making trainings, telephone calls, peer-to-peer collaborations, and meetings more productive. Written TA products, such as practice briefs and toolkits, should also incorporate these methods, communicating the most relevant points to the audience in an accessible way.


Evaluation. The ultimate goal of EBTA is to inspire changes and innovations that improve programs. Using rigorous evaluation techniques such as rapid-cycle evaluation or implementation science can help TA providers gauge their success in meeting this goal. Rapid-cycle evaluation—including randomized controlled trials using existing administrative or other data—can be conducted relatively easily and quickly. Implementation science may be used to test program fidelity, or whether a change was implemented consistently and correctly. Data analytics can also be useful for informing and tracking program change. The findings from these types of evaluations can provide useful feedback to the program and build the knowledge base for the field.
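The rapid-cycle evaluation described above can be illustrated with a minimal sketch. This is a hypothetical example, not the authors' method: it assumes a randomized comparison in which participants were assigned to a program change (treatment) or business as usual (control), and compares a binary outcome already tracked in administrative data (e.g., employment at follow-up). All names, sample sizes, and outcome rates are invented for illustration.

```python
import random
import statistics

def estimate_impact(outcomes_treatment, outcomes_control):
    """Return the difference in mean outcomes and a simple standard error
    (Welch-style, treating each group's variance separately)."""
    diff = statistics.mean(outcomes_treatment) - statistics.mean(outcomes_control)
    se = (statistics.variance(outcomes_treatment) / len(outcomes_treatment)
          + statistics.variance(outcomes_control) / len(outcomes_control)) ** 0.5
    return diff, se

# Simulated administrative outcome: 1 = employed at follow-up, 0 = not.
# Invented rates: 55% for the treatment group, 45% for the control group.
random.seed(42)
treatment = [1 if random.random() < 0.55 else 0 for _ in range(200)]
control = [1 if random.random() < 0.45 else 0 for _ in range(200)]

diff, se = estimate_impact(treatment, control)
print(f"Estimated impact: {diff:.3f} (SE {se:.3f})")
```

Because the design is randomized and the data are already collected, an analysis of this kind can be rerun each cycle as new cohorts accumulate, which is what makes the feedback loop rapid.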

This paper describes the framework in greater detail and shares findings from an initiative in which EBTA was used to change how employment services are delivered to disadvantaged populations in three case study sites: Washington State; Routt County, Colorado; and the District of Columbia. We describe how we used analytic techniques to identify and prepare programs for change, designed interventions grounded in rigorous evaluation findings, and conducted "rapid-cycle learning" pilot tests and "rapid-cycle evaluations" of these initiatives. The paper is intended to inspire and improve dialogue between researchers and practitioners.