Principles of Big Data Practice and the Science of Implementation: Applications to Housing Policy in Child Welfare Interventions
(Housing and Community Development)
Saturday, November 4, 2017: 3:30 PM-5:00 PM
Wright (Hyatt Regency Chicago)
*Names in bold indicate Presenter
Roundtable Organizer: Bridgette Lery, San Francisco Human Services Agency
Moderator: Jocelyn Everroad, San Francisco Human Services Agency
Speakers: Bridgette Lery, San Francisco Human Services Agency; Mike Pergamit, Urban Institute; and Jennifer Haight, University of Chicago
This session reconsiders the role that administrative data systems (“big data”) can play in maximizing the potential for an intervention’s success, particularly in the earlier phases of implementation. The panelists will focus on four principles of big data practice that, if adhered to, can aid predictive analytics, intervention development, implementation, and evaluation design: (1) start by asking a (good) question, (2) arrange and analyze the data in ways that maximize knowledge development, (3) be disciplined in converting data to evidence, and (4) use the evidence to build a theory of change and iteratively refine a program model.
Each presenter demonstrates the durability and applicability of these principles during the implementation of a new initiative through a brief review of a critical implementation activity that made use of the core data principles. In this context, panelists will refer to their recent experiences evaluating housing interventions for child welfare-involved families. Panelists will focus on three questions: whom to target (triage), when to intervene (timing), and how to monitor the intervention (implementation).
The first panelist will consider the example of triage – the process of locating those who both need the intervention and are likely to benefit from it. Arranged and analyzed appropriately, administrative data can be queried to build predictive models that correctly identify service or risk histories so that an intervention can be directed to individuals who need it and who also stand the best chance of benefiting from it.
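The triage idea above can be sketched in miniature. The following is a hypothetical illustration, not the panelist's actual model: binary service- and risk-history flags drawn from administrative records are combined into a weighted risk score, and the families scoring highest are selected up to program capacity. All field names, weights, and records are invented for the example.

```python
# Hypothetical sketch of triage: rank families by a weighted risk score
# built from administrative-data indicators, then select up to capacity.
# Feature names, weights, and records are illustrative only.

def risk_score(record, weights):
    """Weighted sum of binary service/risk-history indicators."""
    return sum(weights[k] * record.get(k, 0) for k in weights)

def triage(records, weights, capacity):
    """Return the `capacity` highest-scoring families (the triage list)."""
    ranked = sorted(records, key=lambda r: risk_score(r, weights), reverse=True)
    return ranked[:capacity]

# Synthetic administrative records, one per family.
families = [
    {"id": 1, "prior_referral": 1, "housing_instability": 1, "prior_removal": 0},
    {"id": 2, "prior_referral": 0, "housing_instability": 1, "prior_removal": 0},
    {"id": 3, "prior_referral": 1, "housing_instability": 1, "prior_removal": 1},
]
weights = {"prior_referral": 1.0, "housing_instability": 2.0, "prior_removal": 3.0}

top = triage(families, weights, capacity=2)
print([f["id"] for f in top])  # prints [3, 1]: highest-risk families first
```

A production model would replace the hand-set weights with coefficients estimated from historical outcomes, but the shape of the task is the same: score, rank, and direct the intervention to those most likely to need and benefit from it.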
The second panelist will consider timing the delivery of an intervention, a task that requires disciplined consideration of the risk group (denominator) and the window during which change is expected to occur. During an intervention’s design phase, predictive analytics using administrative data can help determine when the treatment should begin in order to optimize the chance that the intervention has its intended effect.
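The timing question can likewise be sketched. This hypothetical example (not the panelist's actual analysis) takes observed days from cohort entry to the outcome of interest among a defined risk group and finds the window that contains a chosen share of events, suggesting how soon after entry the treatment should begin. The cohort values are synthetic.

```python
# Hypothetical sketch of timing analysis: within a defined risk group
# (the denominator), find the window from cohort entry that contains a
# target share of observed events. All numbers are synthetic.

def event_window(days_to_event, coverage=0.8):
    """Smallest number of days from cohort entry containing `coverage`
    share of observed events."""
    ordered = sorted(days_to_event)
    idx = max(0, int(len(ordered) * coverage) - 1)
    return ordered[idx]

# Synthetic days from case opening to housing crisis for the risk group.
cohort = [12, 30, 45, 45, 60, 75, 90, 120, 180, 365]
window = event_window(cohort, coverage=0.8)
print(window)  # prints 120: 80% of events occur within 120 days
```

The implication for design is that an intervention delivered well after this window closes can no longer affect most of the events it is meant to prevent, which is why the denominator and the change window need disciplined definition before launch.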
The final panelist will demonstrate how the implementation of a new intervention was strengthened by a continuous quality improvement process that relied on the ability to synthesize existing and purpose-built databases. The resulting master data set was flexible enough to provide real time evidence about the process of implementation, the extent to which program goals were met, and where adjustments were necessary in order to implement with fidelity.
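The data-synthesis step described above can be sketched as a simple record linkage. In this hypothetical example (field names and values invented for illustration), an existing administrative extract is joined to a purpose-built program-tracking file on a shared case id, and the resulting master records support a fidelity check, here, whether a housing referral was made within 30 days of case opening.

```python
# Hypothetical sketch of building a master data set for continuous
# quality improvement: outer-join an administrative extract and a
# purpose-built program file on case id. Field names are illustrative.

admin = {
    101: {"case_opened": "2017-01-10", "placement": "in_home"},
    102: {"case_opened": "2017-02-03", "placement": "foster"},
}
program = {
    101: {"housing_referral": True, "days_to_referral": 14},
    103: {"housing_referral": True, "days_to_referral": 40},
}

def build_master(admin, program):
    """Outer join on case id; a record missing from either system keeps
    only the fields it has, so gaps stay visible to the CQI process."""
    master = {}
    for cid in set(admin) | set(program):
        master[cid] = {**admin.get(cid, {}), **program.get(cid, {})}
    return master

master = build_master(admin, program)

# Example fidelity check: referral made within 30 days of case opening.
on_time = [cid for cid, rec in master.items()
           if rec.get("days_to_referral") is not None
           and rec["days_to_referral"] <= 30]
print(sorted(on_time))  # prints [101]: cases meeting the 30-day target
```

Because the join is rebuilt whenever either source updates, the same master set can answer implementation questions in close to real time rather than waiting for a summative evaluation.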