Panel Paper: Evaluating Public Health Performance: Equivalency Framing Effects in Quality Improvement Data

Friday, November 9, 2018
Wilson A - Mezz Level (Marriott Wardman Park)


Andrew Ballard, Rutgers University



The study of performance in public health agencies has often focused on factors influencing organizational performance and on managerial reforms such as accreditation (Carman & Timsina, 2015; Erwin, 2008). However, little attention has been paid to how performance information is cognitively processed and interpreted. A vital component of any performance and quality improvement system is the reporting of programmatic data (Moynihan, 2008). Such reporting necessarily involves a transfer of knowledge between the data collector and the decision-maker, leaving room for interpretation errors and bias.

A growing body of literature explores framing effects and data interpretation, both in the public sector and elsewhere (Kühberger, 1998; Levin, Schnittjer, & Thee, 1988; Olsen, 2015; Tversky & Kahneman, 1981); however, little research has applied these behavioral techniques to the way public health professionals view performance and quality improvement. The research that has explored these effects in the public sector has focused on citizen interpretation of performance data. Although interesting, this approach overlooks the population most likely to actually use such information in a decision-making context: bureaucrats (Behn, 2012). This study extends research on equivalency framing effects into the field of public health practice and investigates potential traps in the reporting of performance that could lead to suboptimal decision-making.

Using an experimental survey vignette design, this study randomly assigns participants different information “frames” that present the same underlying fact in either a positive or negative light. Respondents are then asked to evaluate the performance of a hypothetical health agency. The data represent two commonly used performance indicators in public health agencies: the achievement of performance targets and citizen satisfaction with training and outreach activities. Previous research in the fields of citizen interaction and marketing has shown that individuals are susceptible to these framing effects and that the way data are contextualized influences how they evaluate the underlying phenomena.

A national sample of 353 U.S. local public health professionals was used to evaluate the effects of framing on the interpretation of quality improvement data. The results show that professionals working in the field of public health are indeed prone to framing bias in their information evaluation. Interestingly, the effect sizes are comparable to those found in prior research on citizens, which complicates the notion that subject matter expertise reduces the influence of heuristics. This study is an important piece of a larger body of research examining behavioral challenges to data-driven decision-making and may help to encourage increased objectivity and transparency in data reporting.
