What Counts as Credible Evidence in Contemporary Evaluation Practice?

What Counts as Credible Evidence in Contemporary Evaluation Practice? Stewart I. Donaldson, Ph.D. Claremont Graduate University stewart.donaldson@cgu.edu Hawaii-Pacific Evaluation Association Annual Conference September 7, 2007 Hilton Waikiki Prince Kuhio Hotel

Overview Contemporary Evaluation Practice Evidence-based Practice What Counts as Credible Evidence? Discussion of the Gold Standard Practical Program Evaluation

Contemporary Evaluation Evaluation Theory Evaluation Design Evaluation Methods Evaluation Practice The Evaluation Profession Research on Evaluation

An Indicator of the Second Boom in Evaluation Practice: 1980s, Only 3 National and Regional Evaluation Societies; 1990, 5; 2000, More than 50; 2006, More than 70, including a Formal International Cooperation Network.

[Chart: Number of Evaluation Professional Associations by Year: 1980s, 3; 1990, 5; 2000, 50+; 2006, 70+]

The Top Regional Evaluation Association of the Future Hawaii-Pacific Evaluation Association

Evidence-Based Practice Highly Valued Global Multidisciplinary Many Applications

Sample of Applications Evidence-based Medicine Evidence-based Mental Health Evidence-based Management Evidence-based Decision Making Evidence-based Education Evidence-based Coaching

Sample of Applications Evidence-based Social Services Evidence-based Policing Evidence-based Conservation Evidence-based Dentistry Evidence-based Policy Evidence-based Thinking about Health Care

Sample of Applications Evidence-based Occupational Therapy Evidence-based Prevention Science Evidence-based Dermatology Evidence-based Gambling Treatment Evidence-based Sex Education Evidence-based Needle Exchange Programs Evidence-based Prices Evidence-based Education Help Desk

New Formula: Mom + The Flag + Warm Apple Pie = Evidence-based Practice

In God We Trust ALL OTHERS MUST HAVE CREDIBLE EVIDENCE

A Popular Alternative to Evidence-based Practice FAITH-BASED PRACTICE & PROGRAMS

What Counts as Credible Evidence?

Wide Range of Views about Credible Evidence

The Cause-Effect Challenge in Evaluation

The Solution: In Search of a Time Machine

Experimental Design: Gold Standard? Random Assignment Experimental Control Ruling Out Threats to Validity

Supreme Courts of Credible Evidence What Works Clearinghouse Campbell Collaboration Cochrane Collaboration

Bias Control on Judgment Day

Evaluation Design Dominance Single Rigor Hierarchy Versus Situational Variation and Appropriateness (Patton, 2007)

Challenges of the Gold Standard AEA Statement vs. Not AEA Statement Theoretical Practical Methodological Ethical Ideological Political

AEA Statement vs. Not AEA Statement. AEA Opposition to Priority on RCTs: Privileging RCTs is Back to the Dark Ages; Priority Manifests Fundamental Misunderstandings of Causality and Evaluation. AEA Members' Opposition to the AEA Statement: Lack of Input from Key AEA Members; Unjustified, Speciously Argued, Does Not Represent Norms or Many AEA Members' Views; AEA Compared to the Flat Earth Society.

Diverse Prescriptive Theories of Evaluation Practice Social experimentation Science of valuing Results oriented management Utilization focused evaluation Empowerment evaluation Realist evaluation Theory-driven evaluation Inclusive evaluation Fourth generation evaluation

RCTs Not Practical/Feasible Often Impossible to Implement Well Not Cost Effective Very Limited Range of Applications Evidence to the Contrary

RCT Ethical Issues Unethical to Withhold Treatment from Control Groups Why Evaluate if Treatment is Better? Delay Treatment Non Evidence-Based Programs are Unethical

Methodological Challenges Zero Blind vs. Double Blind Experimenter Effects Allegiance Effects Unmasked Assignment Misguided Arguments about Causality External Validity Concerns Recent Developments to Overcome Some Challenges noted in the Past

Political Concerns The RCT Gang has hijacked the term evidence-based for political and financial gain. Evidence, and especially scientific or rigorous evidence, has become code for RCTs. Focusing evaluation around these particular ideas about scientific evidence allows social inquiry to become a tool for institutional control and to advance policy in particular directions.

Political Concerns It is epistemological politics, not the relative merits of RCTs, that underlies federal directives on methodology choice. The demand for evidence advances a master epistemology. The very dangerous claim is that a single epistemology governs all science. Privileging the interests of the elite in evaluation is radically undemocratic.

Ideological Differences: Paradigm Wars The positivists can't believe their luck; they've lost all the arguments of the last 30 years and they've still won the war! The world view underlying the current demand for evidence is, generously speaking, a form of conservative post-positivism, but in many ways is more like a kind of neo-positivism.

Ideological Differences: Paradigm Wars Many of us thought we'd seen the last of this obsolete way of thinking about the causes and meanings of human activity, as it was a consensual casualty of the great quantitative-qualitative debate in the latter part of the 20th century. Human action is not like activity in the physical world. Social knowledge is interpreted, contextual, dynamic or even transient, social or communal, and quite complicated. Privilege and honor complexity.

Ideological Differences: Paradigm Wars Evidence-based evaluation concentrates evaluation resources around one small question, "Does the program work?", and uses but one methodology, despite a considerable richness of options. The result is but one small answer. So what kind of evidence is needed? Not evidence that claims purchase on the truth with but a small answer to a small question, neat and tidy as it may be.

So What Kind of Evidence is Needed? (Greene, in press) Evidence: that provides a window into the messy complexity of human experience; that accounts for history, culture, and context; that respects differences in perspective and values; about experience in addition to consequences; about the responsibilities of government, not just the responsibilities of its citizens; with the potential for democratic inclusion and legitimization of multiple voices; evidence not as proof but as inkling.

Practical Program Evaluation: A Program Theory Approach

Practical Program Evaluation Integrative Framework Contextual: Contingency Perspective Method Neutral Culturally Competent Evaluation Standards Guiding Principles

Gather Credible Evidence Definition: Compiling information that stakeholders perceive as trustworthy and relevant for answering their questions. Such evidence can be experimental or observational, qualitative or quantitative, or it can include a mixture of methods. Adequate data might be available and easily accessed, or it might need to be defined and new data collected. Whether a body of evidence is credible to stakeholders might depend on such factors as how the questions were posed, sources of information, conditions of data collection, reliability of measurement, validity of interpretations, and quality control procedures (CDC Evaluation Framework).

Gather Credible Evidence (Continued) Role: Enhances the evaluation s utility and accuracy; guides the scope and selection of information and gives priority to the most defensible information sources; promotes the collection of valid, reliable, and systematic information that is the foundation of any effective evaluation (CDC Evaluation Framework).

Some Guiding Principles for Evaluation Practice Understand the Diversity of Prescriptive Evaluation Theories; Be Aware of the Diverse Views on What Counts as Credible Evidence; Ongoing Discussions of Stakeholder Expectations about Evidence; Secure Buy-in to the Evaluation Design Before Revealing Results; Be Aware of Potential Standards of Judgment; Be Prepared for Meta-Evaluation; Credible Evidence is Often Key for Evaluation Influence & Positive Change.

An Ounce of Credible Evidence is Worth a Ton of Enthusiasm and Good Intentions