Introduction to Applied Research in Economics

Introduction to Applied Research in Economics
Dr. Kamiljon T. Akramov, IFPRI, Washington, DC, USA
Regional Training Course on Applied Econometric Analysis, June 12-23, 2017, WIUT, Tashkent, Uzbekistan

Why do we need applied economic research?

Introduction
"The beauty of economics rests in its theory, but the power of economics lies in its application to current problems." (Don Ethridge, 2004)
Many see economics as both an art and a science.

Introduction (cont.)
Excitement about applied economic research comes from the opportunity to learn about cause and effect in the real world.
Economic theory suggests important relationships, often with policy implications, but virtually never provides quantitative magnitudes of causal effects.
Applied economics is more vulnerable than the physical sciences to models whose validity will never be clear, because the need for approximation is much stronger.
Nevertheless, economics has an important quantitative side, which cannot be escaped.
The important questions of the day are our questions.

Introduction (cont.)
Will loose monetary policy spark economic growth or just fan the fires of inflation?
What is the effect of a 1 percentage point increase in broad money on inflation?
What is the effect of a 1 percentage point increase in interest rates on output growth?
Will mandatory health insurance really make people healthier?
How does an additional year of education change earnings?
What is the effect of farm size on agricultural productivity?
How does agricultural diversity impact nutritional outcomes?
How does a child's nutritional status affect his or her earning potential later in life?
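Each of these questions can, in principle, be cast as a regression of an outcome on a policy or treatment variable plus controls. As one hedged illustration (standard textbook material, not from the slides), the earnings-and-schooling question is often written as a Mincer-type wage equation, with illustrative variable names:

\[
\ln(\text{earnings}_i) = \alpha + \beta\,\text{schooling}_i + \gamma' X_i + u_i
\]

Here beta measures the approximate proportional change in earnings associated with one additional year of schooling, holding the controls in X_i fixed; whether beta has a causal interpretation depends on the identification strategy discussed later in the course.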

Introduction (cont.)
Economists' use of data and tools to answer cause-and-effect questions constitutes the field of applied econometrics.
The tools of applied econometrics are disciplined data analysis combined with statistical inference.
We are after truth, but truth is not revealed in full, and the messages the data transmit require interpretation.
For example, comparisons made under ceteris paribus conditions may have a causal interpretation, but ceteris paribus comparisons are difficult to engineer.

Objectives of this course
The main objective of the course is to strengthen the capacity of young researchers to conduct empirical research.
The emphasis of the course will be on empirical applications, with a focus on an introduction to applied econometrics.
The course will cover the basics of probability theory, statistics, regression analysis, and Stata.

In this course you will learn
How to analyze data: descriptive statistics of a sample
How to conduct inferential statistical analysis: testing hypotheses and deriving estimates
Population and sample
How to build a regression model
Focus on applications: theory is used only as needed to understand the why of the methods
Learn to understand the empirical (regression) analysis of others
Get some hands-on experience with applied data analysis using Stata (a brief sketch follows)
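As a hedged preview of the hands-on work, the following minimal Stata sketch uses Stata's built-in auto dataset (the dataset and variables are illustrative, not part of the course materials) to show descriptive statistics, a simple hypothesis test, and a regression:

    * Load a built-in example dataset shipped with Stata
    sysuse auto, clear

    * Descriptive statistics of a sample
    summarize price mpg weight

    * Inferential statistics: test whether mean mpg differs
    * between foreign and domestic cars
    ttest mpg, by(foreign)

    * A simple regression model: price on mpg and weight
    regress price mpg weight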

Assumptions about the participants
You will conduct, or will be required to do, research in the future
You have basic knowledge of economic theories
You know basic statistics and social science analytical techniques
You are able to think abstractly and pragmatically
You think critically (but not in its extreme form, cynicism, which is a barrier to understanding)
You have the ability to synthesize from the data, facts, and information in front of you
You are able to distinguish privately held beliefs from concepts supported by evidence (i.e., the need for objectivity)
You are currently initiating a research project

Hypotheses in empirical research
Constructing research hypotheses is an important step in applied economics research.
Hypotheses argue that one phenomenon or behavior causes, or is associated with, another phenomenon or behavior. These phenomena are called constructs.
Various sources of support are routinely used to develop hypotheses:
Theory and logical analysis (intuition)
Past studies: authority and consensus
Real-life experiences and observations

Framework for assessing empirical research
Construct validity: To what extent are the constructs of interest successfully operationalized in the research?
Internal validity: To what extent does the research design permit us to reach causal conclusions about the effect of the independent variable on the dependent variable?
External validity: To what extent can we generalize from our sample and setting to the populations and settings specified in the research hypothesis?

Maximizing construct validity
Constructs are abstractions, and any one construct can be measured in different ways because there are a variety of concrete representations of any abstract idea.
Variables are partial representations of constructs, and we work with them because they are measurable.
Operational definitions specify how to measure a variable.
Reliability of a measure is defined as the extent to which it is free from random error components.
Validity is defined as the extent to which the measure is free from systematic errors.

Three components of a variable
Variable: Achievement test
Construct of interest: Achievement
Constructs of disinterest: Motivation, ability, anxiety, etc.
Random errors: Grading mistakes, recording mistakes
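One way to summarize this decomposition (a hedged sketch based on the classical measurement model, not taken from the slides) is to write the observed score as the sum of the construct of interest, a systematic component driven by constructs of disinterest, and random error:

\[
X_i = T_i + S_i + e_i
\]

where X_i is the observed achievement-test score, T_i is true achievement (the construct of interest), S_i is systematic error from constructs of disinterest such as motivation or anxiety, and e_i is random error from grading or recording mistakes. In these terms, reliability concerns how small e_i is, while validity concerns how small S_i is.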

Maximizing internal validity
Threats to the internal validity of regression studies:
Omitted variable bias (sketched below)
Misspecification or incorrect functional form
Measurement error
Sample selection bias
Simultaneous causality bias
Unobserved heterogeneity
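As a hedged illustration of the first threat (standard textbook material, not from the slides), suppose the true model is

\[
y_i = \beta_0 + \beta_1 x_i + \beta_2 z_i + u_i
\]

but z_i is omitted and y_i is regressed on x_i alone. The OLS slope estimator then converges to

\[
\operatorname{plim}\hat{\beta}_1 = \beta_1 + \beta_2\,\delta
\]

where delta is the slope from a regression of z_i on x_i. The bias vanishes only if the omitted variable is irrelevant (beta_2 = 0) or uncorrelated with the included regressor (delta = 0).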

Maximizing external validity
A population is the aggregate of all the cases that conform to some designated set of specifications (all the people residing in a given country, all households residing in a given state, etc.).
A stratum may be defined by one or more specifications that divide the population into mutually exclusive segments.
Nonprobability versus probability sampling
Probability sampling:
Simple random sampling gives each element in the population an equal chance of being selected
Stratified random sampling draws a random sample within each stratum
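A minimal Stata sketch of the two probability sampling schemes, using the built-in census dataset (the dataset and the stratum variable region are illustrative assumptions, not from the slides):

    * Simple random sampling: draw a 20% sample of all observations
    sysuse census, clear
    set seed 12345
    sample 20

    * Stratified random sampling: draw a 20% sample within each region
    sysuse census, clear
    set seed 12345
    sample 20, by(region)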

Data: sources and types
Experimental versus observational data
Cross-sectional data: data on different entities for a single period of time
Time-series data: data on a single entity collected at multiple time periods
Panel or longitudinal data: data for multiple entities in which each entity is observed at two or more time periods
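In Stata, time-series and panel structures are declared before time-series operators or panel estimators are used; a hedged sketch with hypothetical variable names (id, year) follows:

    * Time-series data: one entity observed over time
    tsset year

    * Panel (longitudinal) data: multiple entities, each observed over time
    xtset id year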

Main questions in empirical research
What is the policy question?
What is the causal relationship of interest?
What is the dependent variable and how is it measured?
What is (are) the key independent variable(s)?
What is the data source?
What is the identification strategy?
What is the mode of statistical inference?
What are the main findings?

Common flaws in methodology
Failure to establish the reason for the research
Failure to provide clear and concise objectives
Failure to provide complete references to prior research on the subject and methods
Lack of understanding of the conceptual and theoretical basis of the research
Selecting an analytical (structural) model for mere empirical convenience (or familiarity)
Presenting conclusions that are merely restatements of analytical findings (i.e., results)

Examples of bad methodology
Unclear about the research problem
Unclear about the objectives
Lack of thorough awareness of previous work
Incomplete conceptualization of the problem
Confusing research means with ends
"Good research is no accident." (Don Ethridge, 2004)

Creating good habits for researchers
Doing research entails planning and designing the research, implementing and completing the analysis, and disseminating the results.
Conducting research that is defensible, useful, and expands our knowledge base is not an accident.

Future readings
Chetty, R. 2013. "Yes, Economics Is a Science." New York Times, October 20, 2013. http://www.nytimes.com/2013/10/21/opinion/yes-economics-is-a-science.html?_r=0
Angrist, J. D., and J. S. Pischke. 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics." Journal of Economic Perspectives 24(2): 3-30.
Angrist, J. D., and J. S. Pischke. 2009. Mostly Harmless Econometrics: An Empiricist's Companion. Princeton University Press. Chapter 1.
Angrist, J. D., and J. S. Pischke. 2015. Mastering Metrics: The Path from Cause to Effect. Princeton University Press. Introduction.
Rodrik, Dani. 2015. Economics Rules: The Rights and Wrongs of the Dismal Science.

Thank you and good luck