Poster Paper: Leveraging Evaluation Methodologies for Policy Analysis and Evidence-Based Decision-Making

Thursday, July 13, 2017
Palace Ballroom II (Crowne Plaza Brussels - Le Palace)

*Names in bold indicate Presenter

Jacqueline H. Singh, Qualitative Advantage, LLC
The purpose of any policy analysis should be clearly stated and defined. Much like evaluation, there is no single best way to answer questions about how well a policy or program works, and there is no singular way to navigate the complexities of public programs. As in evaluation, various analytical methods can be used simultaneously; depending upon the intervention’s stage of development, certain methods may be more helpful early on, while others are better suited once the program is stable and mature. For example, when information is needed quickly, rapid data collection or a RealWorld Evaluation (RWE) design may be chosen. At other times, more robust research and evaluation designs can be used to understand program administration and participants’ behaviors.

Another key feature that policy analysis and evaluation share is that both ensure “meaningful” questions are written. The question types chosen have implications for data collection, analysis, and the resources needed. What is most fundamental for analysts and evaluators to understand is the problem a policy aims to address, as well as the programs or actions implemented as a result of the policy. At this poster session, participants will learn about frameworks that speak to purpose, modeling, and how to articulate evaluation questions that inform evaluation designs. Participants are encouraged to engage in discussion and ask questions. Examples and resources will be shared.

The Association for Public Policy Analysis & Management (APPAM) International Conference is an opportune setting to share fundamental evaluation practices that can guide policy analysts in evaluating public programs within regional, multi-governmental, and international policy arenas. This poster session acknowledges that policy analysis is as much an art as it is a science: it is both the process of assessing policies or programs and the product of an analysis. Policy analysts and program implementers engaging in research and evaluation can be more effective if they maintain a basic understanding of effective program evaluation practices and are aware that the field of evaluation continues to evolve, cross-pollinate, and improve its analytic methods.

While there are many theories, approaches, and strategies from which to draw, both policy analysis and evaluation consider the problem being addressed, which involves locating and systematically gathering the evidence needed to answer stakeholders’ questions. Answers to these questions inform future decision-making regarding the policy or program. In an effort to help participants take charge of their next policy analysis and/or evaluation, this poster session displays practical tools, models, and frameworks, and engages participants in conversation to raise awareness of the intersections policy has with program evaluation. These practical evaluation tools help focus key policy and/or evaluation questions, enabling policy analysts to better predict and more confidently evaluate the consequences of alternative policies.

The goals of the poster are to: 1) demystify evaluation; 2) differentiate and recognize the overlaps that policy analysis has with research and program evaluation; 3) demonstrate how a policy’s purpose and questions inform an evaluation’s purpose and questions; and 4) increase participants’ awareness of frameworks, strategies, approaches, and practical tools that build evaluation capacity.