SMALL n IMPACT EVALUATION. Howard White and Daniel Phillips, 3ie


MY PRIORS: Bad evaluation vs. good evaluation


DEFINITIONS I: Impact evaluations answer the question of the extent to which the intervention being evaluated altered the state of the world:
- Outputs or outcomes
- All outcomes and types of outcome, intended and unintended
- All project-affected persons (PAPs)

DEFINITIONS III: Small n impact evaluation is still about attribution, i.e. what difference did the intervention make? Outcome monitoring alone is therefore not enough: before-versus-after comparisons don't work, because something would likely have happened anyway as other forces are at work (see the numerical sketch below). Yet many small n evaluations ignore other forces, even when they are blindingly obvious.

Difference between small and large n:
- Large n establishes causation through statistical means.
- Small n builds up a case based on the weight of evidence, the strength of the argument, and the absence of other plausible explanations.
- In mixed-methods designs, the large n component is the DNA evidence that can usually clinch the causal argument.
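
The attribution point can be made concrete with a toy calculation. The following Python sketch uses made-up numbers (all values are hypothetical, not from the presentation) to show why a before-versus-after comparison overstates impact when other forces are already pushing the outcome upward:

```python
# Hypothetical outcome levels (invented for illustration): an outcome measured
# before and after an intervention, plus the counterfactual level, i.e. what
# we judge would have happened anyway given the other forces at work.
baseline = 40.0        # outcome before the intervention
endline = 55.0         # outcome after the intervention (factual)
counterfactual = 50.0  # estimated outcome without the intervention

# Before-versus-after attributes the whole change to the intervention.
before_after_change = endline - baseline   # 15.0

# Impact is the difference between factual and counterfactual outcomes.
impact = endline - counterfactual          # 5.0

print(f"Before-vs-after change: {before_after_change:.1f}")
print(f"Impact (factual - counterfactual): {impact:.1f}")
print(f"Change that would have happened anyway: {counterfactual - baseline:.1f}")
```

Here before-versus-after credits the intervention with three times its actual impact, because two-thirds of the observed change would have happened anyway.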

THE COUNTERFACTUAL [Figure: outcome over time, factual vs. counterfactual trajectories]

"We would have done it anyway"

THE COUNTERFACTUAL: WOULD HAVE HAPPENED ANYWAY [Figure: outcome over time, counterfactual vs. factual trajectories]

THE COUNTERFACTUAL II [Figure: outcome over time, factual vs. counterfactual trajectories]

What is the counterfactual?

APPROACHES TO SMALL n IMPACT EVALUATION

Explanatory (mechanism-based) approaches:
- Contribution analysis, general elimination methodology, realist evaluation, process tracing
- Based around a theory of change (causal chain or mechanisms)
- Explicit accounting for context and other factors
- Possibly generate testable hypotheses, to be tested using mixed methods

Participatory approaches:
- Method for Impact Assessment of Programs and Projects (MAPP), most significant change, success case method, outcome mapping
- Use participatory data to analyze cause and effect (the role the programme has played in changes at community and individual level)

SO WHERE DOES THIS LEAVE US? Common elements:
- Clear statement of the intervention
- Lay out the theory of change, allowing for context and external factors (e.g. CCTV)
- Document each link in the causal chain
- If things happened as planned and the intended outcome is observed, conclude causation (or not, e.g. Peru microcredit), using triangulation and other evidence (e.g. the Ghana hospital)

But the last step is getting a bit dodgy. Something is missing: what constitutes valid evidence of a link in the causal chain?
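
One way to make the "document each link" step explicit is to treat the theory of change as a list of links, each carrying the evidence gathered for it. The sketch below is a hypothetical illustration (the chain and evidence labels are invented, not part of the presentation) of how such a check might be structured:

```python
# Hypothetical causal chain for an intervention, with the evidence collected
# for each link. A conclusion of causation requires every link to be
# supported, ideally by more than one source (triangulation).
causal_chain = [
    {"link": "inputs -> activities", "evidence": ["budget records", "field visit"]},
    {"link": "activities -> outputs", "evidence": ["monitoring data"]},
    {"link": "outputs -> outcomes", "evidence": []},  # undocumented link
]

for step in causal_chain:
    n = len(step["evidence"])
    status = "triangulated" if n >= 2 else "single source" if n == 1 else "MISSING"
    print(f"{step['link']:<24} {status}")

# Any MISSING link means the causal argument cannot yet be concluded.
undocumented = [s["link"] for s in causal_chain if not s["evidence"]]
print("Conclude causation?", "no" if undocumented else "provisionally yes")
```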

The study will use mixed methods, combining both quantitative and qualitative approaches

WHAT IS VALID CAUSAL EVIDENCE? We want an approach that uncovers the true causal relationship in the study population (internal validity). In large n studies, threats to internal validity can come from sampling error and selection bias. Analogously, in small n studies, bias arises if there is a systematic tendency to over- or underestimate the strength of the causal relationship.
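
The selection-bias threat can be illustrated by simulation. In the sketch below (parameter values are invented for illustration), people self-select into a programme partly on an unobserved trait that also raises outcomes, so a naive treated-versus-untreated comparison overestimates the true effect no matter how large n gets:

```python
import random
from statistics import mean

random.seed(1)
TRUE_EFFECT = 2.0   # hypothetical true programme effect
n = 100_000

treated, untreated = [], []
for _ in range(n):
    motivation = random.gauss(0, 1)       # unobserved trait
    takes_programme = motivation > 0      # self-selection on that trait
    outcome = 10 + 3 * motivation + random.gauss(0, 1)
    if takes_programme:
        treated.append(outcome + TRUE_EFFECT)
    else:
        untreated.append(outcome)

print(f"True effect: {TRUE_EFFECT}")
print(f"Naive estimate: {mean(treated) - mean(untreated):.2f}")
# The naive estimate converges to TRUE_EFFECT + 3 * (E[m | m>0] - E[m | m<0])
# ≈ 2 + 3 * 1.6 ≈ 6.8: biased upward regardless of sample size.
```

This is the large n analogue of the systematic biases discussed next; the small n versions enter through whom we interview and how we interpret what they say.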

IS THERE SYSTEMATIC BIAS IN QUALITATIVE DATA COLLECTION AND ANALYSIS? There are significant sources of bias in both the collection and the analysis of qualitative data. These biases arise both in the responses given and in the way evaluators interpret those data, and they are likely to result in a systematic overestimate of programme impact.

RESPONDENT BIASES
- Courtesy bias
- Political correctness bias
- Fundamental attribution error
- Teleological fallacy
- Self-importance bias
- Self-serving bias

So the role of evaluators is to construct a consistent narrative from disparate voices, understanding each respondent's perspective and biases.

EVALUATOR BIAS
- We fall for courtesy bias, which confirms our self-importance
- Contract renewal bias
- Friendship bias
- "That's not how it was"
- Who we talk to, and above all who we believe
- Priors and own-experience bias

WHAT TO DO
1. Have an explicit methodology
2. Well-structured research with systematic analysis of qualitative data

Define the intervention to be evaluated → theory of change → identify evaluation questions → identify the mix of methods to answer each question → data collection and analysis plan
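
As a closing illustration, these design steps could be captured in a simple plan structure. This is a hypothetical skeleton (the intervention, questions, and methods are invented, not from the presentation), showing one way to hold the mapping from evaluation questions to methods alongside the data plan:

```python
# Hypothetical evaluation design skeleton following the steps above.
evaluation_plan = {
    "intervention": "community health worker programme",  # invented example
    "theory_of_change": ["training -> home visits", "home visits -> care-seeking"],
    "questions_to_methods": {
        "Were workers trained as planned?": ["records review"],
        "Did visits change care-seeking?": ["household survey", "interviews"],
    },
    "data_plan": {"collection": "survey + interviews", "analysis": "triangulation"},
}

for question, methods in evaluation_plan["questions_to_methods"].items():
    print(f"{question} -> {', '.join(methods)}")
```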