Panel Paper:
How Transparency and Reproducibility Can Connect Research and Policy: A Case Study of the Minimum Wage Cost Estimates
To better understand how research evidence shapes the estimates produced by policy analysis, we need to know all the details of how those estimates were computed. Until very recently there was neither a clear framework nor a mainstream tool to perform this task in a straightforward manner. I argue that the new practices of transparency and reproducibility in the social and biomedical sciences provide such a framework. One example of these practices is the guidelines on data, analytic methods, and code transparency advanced by the Center for Open Science (TOP Guidelines, 2015). Further, recent developments in dynamic documentation, a reporting technology that combines the narrative, mathematical modeling, and coding components of the analysis in a single document, provide an accessible tool to document all the details of a policy analysis.
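To make the idea concrete, the minimal sketch below (a hypothetical Python illustration, not the tool or numbers used in the paper) shows the core mechanism of a dynamic document: a reported figure is computed by the same script that produces the narrative, so the text and the analysis cannot drift apart.

```python
# Minimal sketch of the dynamic-documentation idea (illustrative only):
# the number reported in the narrative is generated by the analysis code,
# so updating the inputs automatically updates the reported text.
import statistics

wage_changes = [0.8, 1.2, 0.5, 1.0, 0.9]   # illustrative inputs, not real data
avg_change = statistics.mean(wage_changes)

report = (
    f"The average simulated wage change across scenarios is {avg_change:.2f} "
    "dollars per hour; this sentence is regenerated whenever the inputs change."
)
print(report)
```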
As a case study, I reproduce the Congressional Budget Office's 2014 report on the effects of alternative minimum wage policies and trace its connection to the evidence provided by research. I perform an extensive sensitivity analysis of all the inputs to the policy analysis and re-estimate the meta-analysis behind the main piece of research evidence (the elasticity of labor demand) in order to simulate the effects of potential new evidence. I will simulate different elasticity estimates (and different standard deviations) and document how they affect the estimates of the meta-analysis. These analyses will show which components of the policy analysis (e.g., elasticity estimates, characteristics of the population of interest, or other modeling assumptions) have the largest effect on the final estimates and their uncertainty, the potential value of new research, and how to update the policy analysis in a transparent and rigorous fashion.
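The sketch below illustrates the kind of re-estimation exercise described above. The pooling method shown (an inverse-variance-weighted, fixed-effect meta-analysis) and all numbers are assumptions for illustration, not the estimator or data from the CBO report or the paper.

```python
# Hypothetical sketch: re-estimate a fixed-effect (inverse-variance-weighted)
# meta-analysis of labor demand elasticities after adding a simulated new
# study, to see how the pooled estimate and its uncertainty shift.
import math

def pooled(estimates, std_errors):
    """Return the fixed-effect pooled estimate and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]
    est = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return est, se

# Illustrative existing evidence base: (elasticity estimate, standard error).
elasticities = [-0.10, -0.30, -0.05, -0.20]
ses = [0.05, 0.10, 0.08, 0.06]
print("baseline pooled elasticity:", pooled(elasticities, ses))

# Simulate a potential new piece of evidence and re-run the meta-analysis.
new_elasticity, new_se = -0.50, 0.15
print("updated pooled elasticity: ",
      pooled(elasticities + [new_elasticity], ses + [new_se]))
```

Varying the simulated estimate and its standard error in this way traces out how sensitive the pooled elasticity, and in turn the downstream policy estimates, are to new evidence.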