Standardized Survey Interviewing

Thank you. Is that Jon introducing me? Thank you, Jon. You'll see how grateful you are when I'm finished, I guess.

I am not going to provide an exhaustive view of the literature or focus on my own work, but I will talk about a few key studies of interviewing as illustrations. I'll be making a few specific points about what we know and what we do not know, give some examples that illustrate one perspective for looking at interviewing, describe a recent project we have been involved in that illustrates some of the challenges we face in studying interviewing, and make some comments about future research on interviewing. 1

There are two different scripts that could be of concern when thinking about deviations from the script: question wording or the principles of standardized interviewing. Although these are related, the second is more general and leads us to think about the issues that arise when we develop new interviewing technologies and practices. 2

Because some researchers are heavily involved in research that uses the web or commercial media, it seems worth making a few comments about why one might spend time thinking about interviewers and interviewing as part of the future of survey research. 3

Looking at the changes in the studies that we have done in the last decade, we might guess the following about future studies: in addition to the sorts of opinion or other studies we do now, there will be a stratum of research studies that are very complex and demanding for both respondents and interviewers. As the cost of reaching sample members increases, from the researcher's point of view it is economically sensible to ask face-to-face interviewers to do many complex tasks once the interviewers have persuaded the sample member to become a respondent. For these complex interviews to be successful, we need to understand more about how measurement is accomplished within interaction and how to motivate respondents. 4

5 This interactional model of the survey response process is a further development of the one presented in Schaeffer and Dykema (2011). The model proposes that the main influences on the behavior of the interviewer and her deviations from the script are:

- Training in the interviewing practices of standardization
- Technology, which has both a direct effect on the interviewer's behavior and an indirect effect through the way technology shapes and limits the characteristics of the survey question
- The characteristics of the survey question (see Schaeffer and Dykema 2012), which affect the interviewer's behavior directly as she reads the question and indirectly through the way the question affects the cognitive processing of the respondent and the respondent's subsequent behavior
- The behavior of the respondent, which may require that the interviewer respond in ways that challenge her compliance with standardization
- Interactional and conversational practices, some of which may be made more or less relevant by characteristics of the question or the behavior of the respondent

Technology -- whether paper or some electronic technology -- presents the script to the interviewer in a way that is often incomplete, so that the interviewer must improvise using the principles of standardization. For example, a paper grid may allow the interviewer to have an overview of the structure of the task and also allow her to enter information that the respondent provides before the interviewer requests it (e.g., the ages of other members of the household). A CAPI instrument, on the other hand, may require that each piece of information be entered on a separate screen, and these constraints may in turn motivate the interviewer to reinforce that standardized order with the respondent.

Interactional and conversational practices are often in tension with standardization. Some of these tensions may have only minor consequences for data quality. For example, interviewers routinely use "okay" both as a receipt for an answer and to announce an upcoming return to the script of the next question. Other tensions may be more consequential. For example, when cognitive assessments or knowledge questions are administered to respondents, the respondents' knowledge that they are being tested may cause discomfort that leads to laughter, self-deprecating remarks by the respondent, or reassurance by the interviewer (Gathman, Maynard, and Schaeffer 2008). These social tensions are not provided for in the rules of standardization, and the improvisations by the interviewer may affect the respondent's willingness to engage in further disclosures or their level of motivation. Understanding better how conversational practices enter into the interaction between interviewer and respondent will help us understand which behaviors might affect the quality of measurement and how to adapt our standardized interviewing practices to changing technology and to maintaining the motivation of the respondent. 6

These intersect. There is a close relationship between the technology we use and interviewing practices, and an important influence on how we integrate technology into our interviews is conversational practices. Thus, we could say that there is a close relationship between technology and interviewing practices -- and conversational practices keep intruding. Interviewing practices are the rules that are oriented to improving the reliability and validity of data. The comments in the next slide draw on the work of others: Jennifer Dykema; Douglas Maynard; Charles Cannell; Lucy Suchman and Brigitte Jordan (1990); Fuchs, Couper, and Hansen (2000); Jeff Moore. 7

The way that technology, question characteristics, interviewing practices, and conversational practices intersect might be easier to think about with a specific example, so I'm going to draw on an example of how interviewers and respondents use grids. This example illustrates the important role of the respondent's behavior in shaping the subsequent behavior of the interviewer. The table is from work by Marek Fuchs (2002), but the comments also draw on related work by Fuchs, Couper, and Hansen (2000). The topic of study was variation in the design of grids, organized by person or by topic. The table shows that the proportion of cases in which respondents provided information for all members of the household at once varies depending on the design of the instrument. We could think of this behavior as a sort of conversational practice that we might want to encourage in order to take advantage of efficient ways of providing information that respondents will be familiar with. Or we might think of this behavior as just the sort of respondent behavior that interferes with standardization and requires intervention by the interviewer. Whichever way we think about it, it does appear that we might be able to evoke or suppress the behavior by the design of the instrument: the behavior was more frequent when a topical organization was used and less frequent when the grid was organized by person. 8
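To make the two grid organizations concrete, here is a minimal sketch of the order in which a person-organized grid and a topic-organized grid would walk through the same household roster questions. This is my own illustration with made-up topics, not the Fuchs (2002) instrument:

```python
# Illustrative only: two ways of ordering the same household roster questions.
people = ["Person 1", "Person 2", "Person 3"]
topics = ["age", "sex", "marital status"]

def person_major(people, topics):
    """Grid organized by person: ask every topic for one person, then move on."""
    return [(person, topic) for person in people for topic in topics]

def topic_major(people, topics):
    """Grid organized by topic: ask one question for every person in turn,
    which is what lets a respondent answer for all household members at once."""
    return [(topic, person) for topic in topics for person in people]

print(person_major(people, topics))
print(topic_major(people, topics))
```

In the topic-major order the same question is repeated across household members, which is what gives the respondent the opening to volunteer answers for everyone at once -- exactly the behavior the grid design appears to evoke or suppress.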

The efficiency of the topical organization is evident here: the roster is completed most quickly using a grid organized by topic. Why? Because the respondent learns the question the first time and then provides information without the question being repeated. The respondent can also volunteer that a characteristic is the same for all members of the household. Notice that the wording of the questions appears to have been the same in both the item and grid formats, but the behavior of the interviewers and respondents was influenced by the implementation of the question on the interviewer's screen. There are main effects: both the grid and the topical organization are faster, and the combination of grid and topic is fastest. 9

Why do interviewers deviate from the script -- either the wording of the question or the practices of standardization? 10

There are different sorts of deviations from question wording or practices of standardized interviewing, and it is important to distinguish among them. 11

The next few slides summarize the results of studies that had either a record check or an estimate of reliability and that examined how the behavior of the interviewer was associated with those indicators of data quality. Note that there are actually very few studies with such strong criteria for evaluating the impact of the interviewer's behavior on data quality. Although these two studies suggest that the behavior of the interviewer does not affect interviewer variance or the index of inconsistency, it is important to remember that these are all standardized interviewers and these are telephone interviews; the range of interviewer behaviors is highly constrained. 12
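As a gloss on the measures mentioned here (my notation, not the talk's): for a dichotomous item measured in an interview and a reinterview, the index of inconsistency is commonly written as

```latex
% GDR = gross difference rate: the proportion of cases classified differently
% in the two interviews. p_1, p_2 = proportions classified positive in the
% two interviews; q_i = 1 - p_i.
I = \frac{\mathrm{GDR}}{p_1 q_2 + p_2 q_1}
```

The denominator is roughly the gross difference rate that independent classifications with the same margins would produce, so values near 0 indicate consistent reporting and values near 1 indicate reports about as inconsistent as chance alone would yield.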

Based on the few available record-check studies, it appears that interviewers sometimes change the question in ways that increase accuracy, sometimes in ways that decrease accuracy, and sometimes in ways that have no effect. Understanding what might account for these differences would require more detailed analyses of the actual recordings; these analyses are time-consuming and have not been done. 13 14

Behaviors of the interviewer other than question reading are also important. Probing is a difficult behavior to identify reliably; some studies refer to probing and others to follow-up. When probing or follow-up occurs, it is almost always associated with lower-quality data, regardless of the adequacy of the interviewer's follow-up. This is presumably because the interviewer's follow-up was occasioned by the inadequacy of the respondent's answer, an inadequacy that the interviewer might not be able to ameliorate. 15 16

Taken together, these studies do not provide a basis for expecting that interviewers freed from the straitjacket of standardization will behave in ways that automatically improve data quality overall, particularly in face-to-face interviews. 17

The importance of question design can be seen in this graph, which is from a face-to-face study designed so that the analysis could identify the proportion of cluster variance that was due to the interviewer. The graph plots the proportion of cluster variance due to the interviewer against an index of question characteristics that might be associated with interviewer variance: being sensitive, nonfactual, open, and difficult. The median for the measure is higher, and the values are more consistently high, when the type of item is more vulnerable to interviewer variance. 18
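As a reminder of the quantity being plotted (my notation, not from the slides), the interviewer's share of the variance is usually summarized as an intraclass correlation from a random-intercept model with interviewers as clusters:

```latex
% Respondent i interviewed by interviewer j; a_j is the interviewer effect,
% e_{ij} the residual.
y_{ij} = \mu + a_j + e_{ij}, \qquad
\rho_{\mathrm{int}} = \frac{\sigma_a^2}{\sigma_a^2 + \sigma_e^2}
```

Because face-to-face interviewer assignments are usually confounded with geographic clusters, a study has to be designed specifically, as this one was, to separate the interviewer component from the rest of the cluster variance.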

19 These deviations from the script are in the service of improving comprehension. This is standardized interviewing with additional training of respondents and ad lib provision of definitions. This method of providing definitions has not been tested against other possible methods of providing definitions, some of which are discussed in the paper. 20

21 The last sort of deviation from the script I will discuss here is feedback. Feedback has not received much study, but it is a site where conversational practices often intrude on standardization. The experiment reported by Dijkstra was face-to-face and used standardized interviewing; the "personal" interviewing style manipulated only the feedback that the interviewer was allowed to provide. Note that the Dijkstra study, unlike many such experiments, included a manipulation check, took the structure of the data into account (e.g., the small number of interviewers in each style, respondents nested within interviewers), and attempted direct assessment of data quality (e.g., with a map task). 22
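To illustrate what taking the nested structure into account can look like, here is a minimal sketch with synthetic data. The variable names, effect sizes, and model are my own illustration, not the analysis Dijkstra reported:

```python
# Sketch: respondents nested within interviewers, with interviewing style
# (formal vs. personal feedback) varying between interviewers.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_interviewers, n_resp = 20, 15                    # small interviewer pool, as in such experiments
style = rng.integers(0, 2, n_interviewers)         # 0 = formal feedback, 1 = personal feedback
interviewer_effect = rng.normal(0, 0.5, n_interviewers)

df = pd.DataFrame({
    "interviewer": np.repeat(np.arange(n_interviewers), n_resp),
    "style": np.repeat(style, n_resp),
})
# Hypothetical data-quality outcome (e.g., accuracy on a map task).
df["accuracy"] = (0.2 * df["style"]
                  + np.repeat(interviewer_effect, n_resp)
                  + rng.normal(0, 1, len(df)))

# Random intercept for interviewer absorbs between-interviewer variation,
# so the style contrast is not overstated by treating respondents as independent.
model = smf.mixedlm("accuracy ~ style", data=df, groups=df["interviewer"])
print(model.fit().summary())
```

The random intercept keeps the few interviewers per style from masquerading as hundreds of independent respondents, which is the point of modeling the nesting.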

The most variable form of interviewing. To the extent that there is no script, deviations from the script are undefined. 23 The most variable form of interviewing. To the extent that there is no script, deviations from the script are undefined. 24

Let's return to the question of deviations from the script by thinking about another approach to grids, one that tries to take into account the close relationship between technology and interviewing practices. 25

In designing a collaborative form of standardization that gives the respondent access to a visual display, we might, in principle, be able to learn from previous experience with visual aids. We have used visual aids before, but perhaps not as successfully as we could have. We have used visual displays for the respondent, but what have we learned about how to use them effectively? 26

Perhaps more of a collision than an intersection! 27 Technology is used to provide a live, dynamic display that helps the respondent see the structure of the task, stimulates recall, and allows the respondent to review the interviewer's entries. 28

29 30

31 Every technology requires that we review and reinvent interviewing practices. When respondents provide information in variable form, interviewers need techniques for verifying answers as they are entered. In this case, we did not ask for gender but handled it by verification when the name was entered. 32

Training interviewers to recognize and clarify potential ambiguities is quite daunting: the range of possible ambiguities is large, and many of them would not be treated as ambiguities in everyday talk. 33 Although many interviewing shops use verification as an authorized deviation from the script, the range of possible structures and practices is large, and it is not clear which will be best for the quality of the data. 34

In this case -- the last meal that was asked about -- interviewers were authorized to use any of the parenthetical text in their verification. 35 36

We plan to investigate these interviews to determine how the technology is actually used, what conversational practices the participants recruit, and what techniques of interviewing are needed. 37 Previous research has described some places where deviations from the script of standardization are likely to occur and what those deviations look like. 38

Deviations from standardization originate with the behavior of the respondent, which in turn deviates from the paradigmatic question-answer sequence because of:

- question design
- state uncertainty -- the respondent is not sure what their answer is
- task uncertainty -- the respondent is not sure how to fit their answer into the structure of the task

39 40

41 These are situations in which the interviewer could confirm or tune but does not. Instead the interviewer engages in what Hak calls immediate coding. Some of these codings are probably appropriate, but some may not be. Interviewers may not be aware that they are engaging in coding, and it is not clear how we should train interviewers to deal with such answers. But in this situation, the interviewer's behavior is governed by conversational practices rather than the rules of standardized interviewing. It is also not clear that the quality of the data is harmed in the examples shown here. 42

Here the respondent repeats enough of the response category for the interviewer to code the answer and go to the next question. 43 The challenge of training interviewers to identify a codable answer can be previewed by considering what has to be done to train coders to identify a codable answer. 44

In this case, the interviewer does not engage in immediate coding but produces a leading probe. Most leading probes produced by interviewers are probably produced in a situation like this one, in which the respondent's answer appears to indicate a direction (Moore and Maynard 2008). This is another example of the intrusion of conversational practices into the standardized interview. 45 46

In this example, the interviewer also produces a leading probe, but with less justification. 47 48

This interviewer probably goes too far in inferring the appropriate response category and produces a leading probe. 49 50

Research desiderata:

- Technology, instrument and question design, and interviewing practices are closely related and interact with conversational practices.
- When we implement inadequate studies, we live with the results for a long time. 51

- Small-scale lab experiments as a site for intensive development of details of method -- for example, Schober and Conrad's (1997) training for respondents about the importance of asking questions
- Obtaining recordings of the method for later development
- Training interviewers
- Methods for monitoring interviewers in production
- A role for single-method studies?
- Large-scale field experiments 52

53 54