How Stable are Responses Over Time

Karen CyBulski 1, Melissa Krakowiecki 2, Larry Vittoriano 3, Matt Potts 4, Tiffany Waits 5, Geraldine Mooney 6, Cathie Alderks 7

1-6 Mathematica Policy Research, P.O. Box 2393, Princeton, NJ 08543-2393
7 The Substance Abuse and Mental Health Services Administration, Center for Behavioral Health Statistics and Quality (CBHSQ/SAMHSA), 1 Choke Cherry Rd., Rm. 2-1069, Rockville, MD 20857

Abstract

The Substance Abuse and Mental Health Services Administration (SAMHSA) is mandated by Congress to collect data on the provision and utilization of substance abuse treatment services. As part of this mandate, SAMHSA (1) maintains a national inventory of facilities that offer substance abuse treatment and prevention services; (2) conducts an annual multi-mode survey of all known substance abuse treatment facilities; and (3) maintains a national database on clients admitted to substance abuse treatment programs. Together, these components form the Drug and Alcohol Services Information System. Mathematica Policy Research partners with the prime contractor, Synectics for Management Decisions, Inc., to collect the data for the annual survey and works with SAMHSA to continue to improve the quality of the data. Although this multi-mode survey has had response rates in the mid-90s, over time the response rate has declined slightly and there has been concern about facility burden. This paper explores the stability of responses over time while responding to a direct request from our recurring respondent base to reduce their burden.

Key Words: Data, Stability of Respondent Response

1. Pre-filling the Web

The National Survey of Substance Abuse Treatment Services (N-SSATS) is an annual multi-mode establishment survey conducted by Mathematica Policy Research on behalf of SAMHSA.
The survey collects information on facility characteristics, the specific treatment services provided, and client counts for more than 17,000 substance abuse facilities across the nation and its territories. The data also provide the information published annually in the National Directory of Drug and Alcohol Abuse Treatment Programs and the Online Treatment Facility Locator. The N-SSATS employs mail, telephone, and web modes to collect data from respondents. Mathematica has conducted this survey for the past 15 years, and the response rate has consistently been above 90 percent.

Over the years that Mathematica has conducted the survey, respondents consistently contacted us to request data from their previous year's survey submission. A significant number of respondents reported that many of their services and data points do not change from year to year; their requests for previously submitted survey responses were an effort to complete the new year's data collection with greater speed and efficiency.

While wanting to be responsive to requests to reduce respondent burden, the study team was concerned that providing previously reported data might have a negative effect on data quality. The two main areas of concern were: (1) respondents might not pay close attention to the answers they were providing and simply accept any prefilled response; and (2) the rate of data changes would likely decrease, creating a trend not based purely on empirical facts.

In 2008, Mathematica conducted an experiment that prefilled responses from the prior year into the web questionnaire. After discussions with SAMHSA, it was determined that only data points for questions that had previously changed very little over time would be prefilled. The results of this initial experiment indicated that the prefilled responses encouraged respondents to complete the online version of the survey, which was the only mode to offer the prefill option. A secondary benefit, in addition to addressing respondent burden, was that more data were collected using the preferred online mode; the study team had extra confidence in the quality of those data because of the web instrument's built-in editing and cleaning process. The experiment also demonstrated that the rate of data changes for prefilled questions was in line with the rate of change for the same questions in the other two modes, where responses were not prefilled. Based on positive feedback from respondents, the web prefill was implemented as standard operating procedure in 2009. For each subsequent year, the data from the prior year have been prefilled into the web questionnaire.
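The carry-forward step described above can be sketched roughly as follows. This is a hypothetical illustration only: the field names, the flagging scheme, and the record layout are invented, not the actual N-SSATS instrument.

```python
# Hypothetical sketch of the web prefill: copy forward only the
# "stable" items (facility characteristics, services offered, payment
# types, etc.) from a facility's prior-year record, leaving all other
# items blank for fresh entry. Field names are invented.
STABLE_FIELDS = {"intake_assessment_referral", "detoxification",
                 "sa_treatment", "hospital_affiliated",
                 "non_english_services", "payment_types"}

def prefill(prior_year_record: dict, questionnaire_fields: list) -> dict:
    """Build the initial web questionnaire, prefilling only stable items.

    Each prefilled item is flagged so the web instrument could display
    an on-screen icon identifying carried-forward answers.
    """
    form = {}
    for field in questionnaire_fields:
        if field in STABLE_FIELDS and field in prior_year_record:
            form[field] = {"value": prior_year_record[field],
                           "prefilled": True}   # shown with the icon
        else:
            form[field] = {"value": None, "prefilled": False}
    return form

prior = {"detoxification": "yes", "client_count": 120}
form = prefill(prior, ["detoxification", "client_count"])
```

Note that a volatile item such as a client count is left blank even when a prior value exists, matching the decision to prefill only questions that historically change very little.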
This methodology was useful in encouraging facilities to complete their annual questionnaires on the web, since responses from the prior year were prefilled in the web questionnaire regardless of the mode used the year before. Given that prior-year responses have been prefilled since that time, a concern remained that facilities may not be reviewing and updating the prefilled information. To address this concern, we looked at differences in response changes from the prior year by web versus mail and telephone respondents over the course of three years: 2009, 2010, and 2011.

1.1 The Sample

The full sample of the N-SSATS is approximately 17,000 facilities each year. Because facilities continually enter and leave the frame, the sample included in any given data collection period fluctuates. Each year about 2,000 facilities are found to be ineligible for the survey because the facility has closed, no longer provides substance abuse treatment, or for other reasons. Approximately 14,000 facilities complete the N-SSATS questionnaire by one of the three modes each year. At the start of each cycle, about 2,000 newly identified facilities are added to the sample frame, and a smaller number of facilities may be added sporadically throughout the six-month data collection period. To look at response change over time, the analysis frame for this paper was limited to the 8,670 facilities that participated and provided data every year from 2008 through 2011. Figure 1 shows how these facilities responded in the three years used in the analysis.
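The restriction to facilities that responded in every year can be sketched as a simple intersection across annual respondent lists. The facility IDs below are invented for illustration; the actual frame is the N-SSATS respondent file.

```python
# Hypothetical sketch: limit the analysis frame to facilities that
# completed the N-SSATS in every year from 2008 through 2011.
# respondents_by_year maps each year to the set of facility IDs that
# completed that year; the IDs are invented for illustration.
respondents_by_year = {
    2008: {"F001", "F002", "F003", "F004"},
    2009: {"F001", "F002", "F004"},
    2010: {"F001", "F002", "F003", "F004"},
    2011: {"F001", "F002", "F004", "F005"},
}

# The analysis base is the intersection: facilities present every year.
# In the actual study this intersection contained 8,670 facilities.
analysis_base = set.intersection(*respondents_by_year.values())
```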

Figure 1: Number of facilities used as the analysis base, by mode, in each year (2009-2011). Facilities responding by web (prefilled) numbered 5,299, 5,562, and 7,291; facilities responding by mail or telephone (not prefilled) numbered 3,371, 3,108, and 1,379. Each year's counts sum to the 8,670 facilities in the analysis base.

1.2 The Data

About 120 fields are prefilled in the web questionnaire with data from the prior year's survey. Only factual questions used to update SAMHSA's Online Locator and National Directory are the focus of this effort, since these data points are the least likely to change. The types of questions ultimately selected for prefill are: facility characteristics, facility contact information, types of services offered, languages available other than English, and types of payment accepted. To clearly distinguish the questions that were prefilled for the respondent, an on-screen icon was added to the web display of each of these questions. We received positive feedback from respondents indicating that this identifier was helpful to them as they completed the questionnaire.

1.3 The Response Results

To analyze the responses, we selected 15 prefilled items and calculated the percent of facilities that changed their response to these items from the prior year. We then compared the change for facilities that completed on the web (prefilled) versus those that completed by mail or telephone (not prefilled). This analysis (t-test) was done for each of the past three years in which the web prefill was the standard protocol. The percent of data changes is illustrated in the figures that follow. The first question in the survey asks the respondent which basic substance abuse services are provided at their specific physical location. Many of our reporting facilities are part of a larger network of facilities that, in some instances, share a common respondent. A key component of the data collection process is to emphasize that these questions are about only one physical location.
There are three main types of substance abuse services referenced: intake (including assessment and referral), detoxification, and substance abuse treatment. In Figures 2-4, we see that there were no significant differences in the percent of responses changed from the prior round between respondents that reported on the web (prefilled) and those that reported by mail or telephone (not prefilled).
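The comparison behind these figures can be sketched as a two-proportion test of the change rates in the prefilled and not-prefilled groups. The counts below are illustrative only, not the actual study data; the source describes the analysis simply as a t-test, so this z-test formulation is an assumption about its form.

```python
import math

def pct_changed(changed: int, total: int) -> float:
    """Percent of facilities whose answer differs from the prior year."""
    return 100.0 * changed / total

def two_proportion_z(changed_a: int, n_a: int,
                     changed_b: int, n_b: int) -> float:
    """Z statistic comparing the change rates of two groups
    (web/prefilled vs. mail-telephone/not prefilled)."""
    p_a, p_b = changed_a / n_a, changed_b / n_b
    p_pool = (changed_a + changed_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Illustrative counts (invented): 150 of 5,299 web respondents vs.
# 95 of 3,371 mail/telephone respondents changed a given item.
z = two_proportion_z(150, 5299, 95, 3371)
# |z| < 1.96 would indicate no significant difference at p < .05.
```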

Figure 2: Percent of data changed from the prior year for intake, assessment, and referral services ("Which of the following substance abuse services are offered by this facility at this location? Intake, assessment, and referral?"). Change rates ranged from about 2.1 to 3.3 percent, with the prefilled (web) and not-prefilled (mail/telephone) groups nearly identical in each year.

Figure 3: Percent of data changed from the prior year for detoxification services ("Which of the following substance abuse services are offered by this facility at this location? Detoxification?"). Change rates ranged from about 3.0 to 5.5 percent, again with similar rates in the two groups each year.

Figure 4: Percent of data changed from the prior year for substance abuse treatment services ("Which of the following substance abuse services are offered by this facility at this location? Substance abuse treatment?"). Change rates were under 1 percent for both groups in every year.

In addition to substance abuse services, which are the primary focus of the survey instrument, some questions reference mental health services. In Figure 5, we see no difference by group in the percent of respondents that changed their response from the prior year. In 2009, slightly more facilities changed their responses when the response was prefilled on the web than when they responded by mail or telephone. In 2010, this pattern shifted, and the percent of facilities that changed their response from the prior year was slightly higher for those that responded by mail or telephone.

Figure 5: Percent of prefilled data changed for comprehensive mental health assessment or diagnosis ("Which of the following services are provided by this facility at this location? Comprehensive mental health assessment or diagnosis?"). Rates for both groups fell between roughly 9.2 and 10.2 percent in all three years.

In addition to the types of services provided at each facility's specific location, other basic data that are often the same from year to year were prefilled into the web survey. Figure 6 details the percent of response changes that occurred as respondents updated whether they were a hospital, or were located in or operated by a hospital. Again, there were no significant differences between the groups, and the respondents that reported on the web showed a slightly greater tendency to change their response than those that reported by mail or telephone.

Figure 6: Percent of prefilled data changed for facilities that are a hospital or are located in or operated by a hospital ("Is this facility a hospital or located in or operated by a hospital?"). Rates for both groups fell between roughly 1.2 and 1.8 percent in all three years.

Figure 7 shows another important facility characteristic: providing substance abuse treatment services in a language other than English. For this variable, significantly (p<.05) more facilities that responded by mail or telephone in 2010 changed their response from the prior year than facilities that responded by web (prefilled). Overall, there were very minor differences in the response patterns of facilities that were presented with prefilled data when they completed by web versus those that did not have prefilled data when they completed by telephone or mail. Any significant differences were minor and could be attributed to the multiple comparisons that were conducted.

Figure 7: Percent of prefilled data changed for facilities providing substance abuse treatment services in a language other than English ("Does this facility provide substance abuse treatment services in a language other than English at this location?"). Rates for both groups fell between roughly 8.0 and 11.8 percent across the three years.

2. Respondent Burden and Perception

2.1 The Respondent Base

Since the web survey was launched, the number of respondents reporting in this mode has steadily increased. To illustrate this point, we can compare the number of respondents over a four-year period by the mode they selected for completion. In 2008, 6,921 facilities completed on the web, representing 47.9 percent of all completes. In 2009, 8,134 facilities completed on the web, representing 57.2 percent of all completes. In 2010, 8,431 facilities completed on the web, representing 59.9 percent of all completes. And in 2011, the number jumped to 11,191 facilities completing on the web, representing 78.4 percent of all completes. As the response rates and modes were analyzed, we found that 8,670 facilities completed the N-SSATS questionnaire every year from 2008 through 2011; of those, 47 percent always completed on the web, 9 percent always completed by mail or telephone, and 44 percent completed some years on the web and some years by a non-web mode.

2.1.1 Time to Complete

The time for a respondent to complete the web instrument varies slightly from year to year and fluctuates to some small degree within a given data collection year. On average, respondents take 36 minutes to complete a web questionnaire, regardless of whether responses are prefilled; the prefilled data did not have much actual impact on reducing the time burden. Even so, feedback from respondents indicated that they were pleased with the historical data provided and that their perception of the experience of completing the survey was positive. We therefore hypothesize that the respondent's perception of a reduced time-to-complete burden was more of a benefit than any actual reduction in time to complete.

2.1.2 Respondent Comments

At the conclusion of each survey completed on the web, respondents are asked if they would like to leave a comment. While not every respondent leaves a comment, some do. The comments can describe a change related to the facility itself, or general facts or requests. Some respondent comments that discussed the prefilled data include:

"The survey was easy to work with. Having the pre-filled information included was helpful. Also having the 'icon' highlighting newly requested information was helpful."

"This questionnaire is so much easier to complete with the previous year's information entered."

"Having last year's information prefilled was VERY helpful!"

"Pre-filling the survey continues to be a great advantage. Also, I like the new items, flagged as 'new'."

"I really liked the ability to review and submit my prior responses, there were only minor changes. The time needed to complete the survey was very reasonable and I appreciate the process."

3. Summary

Overall, concerns about data quality related to prefilling web responses with data from the previous year have been mitigated. Respondents change responses to items from one year to the next at about the same rate whether or not the previous response has been prefilled. This suggests that respondents are not satisficing, but rather are attending to the data being presented and changing it as needed. One goal of prefilling responses was to reduce respondent burden. Although the prefill did not reduce the amount of time spent completing the web questionnaire, respondents' comments suggest that there is less perceived burden, and possibly less cognitive burden.
This is especially important given that more facilities complete the questionnaire by web with each passing year of the survey, and maintaining a high response rate remains essential.
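As a quick arithmetic check on the completion counts in Section 2.1, each year's web count divided by its reported share of all completes implies a total number of completes consistent with the approximately 14,000 annual completes noted in Section 1.1:

```python
# (web completes, reported percent of all completes) by year,
# taken from the figures quoted in Section 2.1.
web_completes = {2008: (6921, 47.9), 2009: (8134, 57.2),
                 2010: (8431, 59.9), 2011: (11191, 78.4)}

# Implied total completes per year: web count / (share / 100).
implied_totals = {year: count / (share / 100.0)
                  for year, (count, share) in web_completes.items()}
# e.g. 2011: 11191 / 0.784, roughly 14,274 total completes.
```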