Panel Paper:
Judging the Evidence: How Citizens Value Outputs, Outcomes and Costs
Friday, November 9, 2018
Harding - Mezz Level (Marriott Wardman Park)
Policy analysts recommend a focus on outcomes rather than mere outputs as evidence of the true effectiveness and efficiency of government programs. But prior research suggests that citizens may have difficulty with the counterfactual thinking required to interpret outcomes correctly and thus exhibit a bias toward more frequent, lower cost (but ineffective) outputs—with important implications for democratic accountability. In this paper, we report on two survey experiments with US adults (n=840) that probe the public's judgments about real social programs with published evidence about their outputs, outcomes and costs. In the first experiment, we replicate a prior paradigm in which respondents are asked to judge the effectiveness and efficiency of an HIV/AIDS program in California with randomly assigned real information about outputs, outcomes and costs. Results mirror previous findings, suggesting a bias favoring high frequency, low cost (but ineffective) outputs. In the second experiment, we extend the paradigm to a program for high school students with special needs called Check and Connect, again randomly assigning real information about outputs, outcomes and costs. In this second experiment, however, respondents appear to be more persuaded by relatively low frequency, high cost (but effective) outcomes, suggesting a greater willingness to engage in counterfactual thinking. Although differences in the wording of the two vignettes may account for some of these conflicting findings, we believe that a deservingness heuristic could be part of the explanation as well. That is, respondents may be more motivated to engage in the effort of counterfactual thinking when the beneficiaries of a program are seen as more deserving. In both experiments, providing information on the costs to society of the social problem at issue leads people to view the programs as more effective and efficient.
We conclude by discussing implications for future research along these lines, as well as for the practice of performance reporting and democratic accountability.