Panel Paper: Open Science for Policy Research: Toward Transparent, Reproducible Workflows

Friday, November 9, 2018
Marriott Balcony A - Mezz Level (Marriott Wardman Park)

*Names in bold indicate Presenter

Sean Grant, RAND Corporation

The scientific community has created a reward system that does not sufficiently incentivize the vital features of science: transparency, openness, and reproducibility (McNutt, 2014). Concerns about research waste, scientific misconduct, and lack of replication are consequently rising (Glasziou et al., 2014; Open Science Collaboration, 2012, 2015; Tajika, Ogawa, Takeshima, Hayasaka, & Furukawa, 2015). For instance, a project to replicate 67 papers in economics replicated fewer than half of them, even with help from the original authors (Chang & Li, 2015). To improve the credibility of the scientific enterprise, researchers have begun to list, develop, and implement “open practices” that can help increase the transparency and reproducibility of research workflows.

This presentation will introduce policy researchers to current best practices in open science. It has two overall goals: (1) to convince attendees to incorporate open science best practices into future projects, and (2) to motivate attendees to promote these practices to other stakeholders in policy research. To achieve these goals, the presentation will discuss a conceptual framework for transparent, reproducible workflows in policy research, based on the work of several groups that focus on research transparency, namely: the Lancet REWARD Campaign (Moher et al., 2015); the Center for Open Science and their Transparency and Openness Promotion (TOP) Guidelines (Nosek et al., 2015); the Berkeley Initiative for Transparency in the Social Sciences (Miguel et al., 2014); the Data Access and Research Transparency (DA-RT) group (Data Access and Research Transparency group, 2015); the Meta-Research Innovation Center at Stanford (Ioannidis, Fanelli, Dunne, & Goodman, 2015); and the Laura and John Arnold Foundation (Preston, 2011).

This framework for transparent, reproducible workflows involves an organized set of practices during study design, conduct, dissemination, and archiving. As part of design, researchers should pre-register their studies; pre-registrations should include pre-analysis plans and details on plans to ensure adequate statistical power for detecting minimally important effect sizes. As part of conduct, researchers should ensure proper management of their data using tools such as version control software and open notebooks. As part of disseminating study results, researchers should use reporting guidelines for transparent reporting and disclosure of study details, disseminate findings more immediately through pre-print servers and publications, and aim to publish findings open access so that research outputs are freely available to interested readers. As part of archiving studies, researchers should share their data (with explanatory metadata) in trusted data repositories and record their analyses using dynamic documents that include all analytical code underpinning reported findings.
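To make the archiving practice concrete, the following is a minimal illustrative sketch (not a tool named by the author) of a dynamic-document-style analysis script: the analytical code, the result, and the provenance metadata a reader would need to reproduce it are produced together in one artifact. All function names here are hypothetical.

```python
import hashlib
import json
import platform
import sys


def analyze(data):
    """Stand-in analysis step: the mean of the input values."""
    return sum(data) / len(data)


def provenance(data):
    """Record metadata needed to reproduce the result: a checksum
    identifying the exact input, plus the software environment."""
    return {
        "input_sha256": hashlib.sha256(json.dumps(data).encode()).hexdigest(),
        "python_version": platform.python_version(),
    }


if __name__ == "__main__":
    # In practice, data would be loaded from the archived dataset
    # referenced in the study's data repository record.
    data = [2.0, 4.0, 6.0]
    result = {"mean": analyze(data), "provenance": provenance(data)}
    # One artifact containing both the finding and its provenance.
    json.dump(result, sys.stdout, indent=2)
```

Committing a script like this to version control alongside the archived data would let a reader re-run the analysis and verify both the result and the exact inputs behind it.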

Policy researchers can use these open practices in each study to make their workflows more transparent and reproducible. In addition, other stakeholders, such as journal editors, peer reviewers, research funders, policy-makers, and practitioners, can facilitate the adoption of open practices by policy researchers. Faculty should incorporate instruction in these practices into their courses and mentoring to build the capacity of the next generation of policy researchers to produce transparent and reproducible workflows.