Panel Paper: Using Implementation Science As a Framework for Bridging Research and Practice

Saturday, November 10, 2018
Wilson C - Mezz Level (Marriott Wardman Park)


Teresa Derrick-Mills1, Samantha Harvell2 and Mary K. Winkler1, (1)Urban Institute, (2)What Works in Reentry Clearinghouse


Implementation science was developed to improve the implementation of evidence-based programs by systematizing the implementation process. It draws attention to the stages and drivers of implementation, fostering successful integration of programs into organizational structures. Implementation evaluations document the features of program implementation, including key processes, procedures, challenges and workarounds, and any inconsistencies that may threaten program fidelity. The goal may be to improve practices, to provide information for scale-up, or to document retrospectively what took place. Rarely, however, do implementation evaluations include all the features of implementation as delineated in implementation science.

Our team hypothesized that if implementation science delineates the stages and processes integral to successful adoption, integration, and sustainability of programs, it would serve as a useful lens for bridging research and practice (Derrick-Mills et al. 2016). Through a cooperative agreement with the federal Office of Juvenile Justice and Delinquency Prevention, we are currently using this approach to translate research into actionable policy and practice guidance, helping youth probation agencies implement practices that align with the latest research on adolescent development and effective interventions with youth.

We used the National Implementation Research Network (NIRN) framework (Fixsen et al. 2005; Fixsen et al. 2015) to code and draw themes from source materials and to frame interview and focus group questions with practitioners. We screened nearly 300 source documents (primarily articles in academic journals); after skimming the articles, we excluded nearly half as not appropriate for our purpose. We coded the remaining 145 sources in NVivo using codes reflecting the implementation science drivers and stages. As we expected, very few sources provided a comprehensive view of program implementation, that is, covering aspects of all implementation drivers (competency, organizational, and leadership drivers) and discussing one or more stages of implementation. Even the most frequently discussed topics, professional development of staff and organizational climate, appeared in only 51 percent and 55 percent of sources, respectively.

In this paper, we discuss the benefits and challenges of using the NIRN implementation science framework to translate implementation research developed under other frameworks into practitioner-friendly materials; the issues that surfaced in assessing the quality of implementation research; and the implications of this work for implementation research and research-to-practice translation more broadly. By the time of the APPAM conference, we expect to have developed and deployed the materials produced under this cooperative agreement.

Derrick-Mills, T., Winkler, M., Harvell, S., Gaddy, M., Liberman, A., Love, H., and Willison Buck, J. 2016. Bridging Research and Practice for Juvenile Justice: Systematizing the Approach. Washington, DC: Urban Institute.

Fixsen, D.L., Naoom, S.F., Blase, K.A., Friedman, R.M., and Wallace, F. 2005. Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network.

Fixsen, D.L., Blase, K., Naoom, S., and Duda, M. 2015. Implementation Drivers: Assessing Best Practices. Chapel Hill, NC: National Implementation Research Network (NIRN), Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill.