Quality Assurance, Customer Satisfaction and Follow-Up: Definition and Practical Experience. Micki Thompson, 2-1-1 Tampa Bay Cares, Inc. James B. Luther, Ph.D., Luther Consulting, LLC. May 2009

Overview: brief review of evaluation in I&R; AIRS Standards definitions (lots of 'em): call quality assurance, customer satisfaction survey, call follow-up, in light of the new AIRS Standards; developing and implementing QA measures; funder expectations; what we found; questions.

But before we start: write down for yourself your answers to the following questions. How does your I&R measure the quality of its services? How does your I&R measure the success or failure of the service it provides? (May be the same, may be different.) Be as specific as possible. Do not share aloud right now; we'll come back to these.

What is Program Evaluation? "The systematic collection of information about the activities, characteristics, and/or outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming." - Michael Quinn Patton, Utilization-Focused Evaluation

Program Planning and Evaluation Model (a cycle): Define Problem → Define Objectives → Define Solution → Plan Program → Design Evaluation → Implement Program → Conduct Evaluation → Report Findings → Use Findings → back to Define Problem.

Understanding How the Program Works. AIRS Standard 1 (v6): Assessment and Referral Provision. Program Logic Model: a theoretical picture of how a program operates; it describes logically, step by step, how a program is intended to work to achieve its objectives. Inputs: program resources. Activities: what the program does with those resources. Outputs: program products.

From: University of Wisconsin Extension Website on Program Development and Evaluation http://www.uwex.edu/ces/pdande/evaluation/

Logic Model in I&R. INPUTS: money, staff, I&R software. ACTIVITIES: ask inquirers questions; send out resource DB update survey. OUTPUTS: provide inquirers referrals; publish resource book. OUTCOME: inquirer gets human services help.
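
To make the chain concrete, here is a minimal sketch of the same logic model as plain data (a Python illustration only; the structure is not drawn from any AIRS standard or I&R software schema):

```python
# Minimal sketch of the I&R logic model above as plain data; the dict layout
# is an illustration, not an AIRS or vendor schema.
ir_logic_model = {
    "inputs":     ["Money", "Staff", "I&R software"],
    "activities": ["Ask inquirers questions", "Send out resource DB update survey"],
    "outputs":    ["Provide inquirers referrals", "Publish resource book"],
    "outcomes":   ["Inquirer gets human services help"],
}

# Walking the stages left to right mirrors how the program is intended to work.
for stage in ("inputs", "activities", "outputs", "outcomes"):
    print(stage.upper(), "->", "; ".join(ir_logic_model[stage]))
```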

Program Logic Model: INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES, with Quality Assurance layered over the whole chain.

Program Evaluation: anything related to how good a job you're doing, whether people like you, whether people would use your service again, whether people got the help they needed, or whether your community is better off because I&R was there.

The Definitions Problem: how do we make sure we're all talking about the same things in program evaluation? The new AIRS Standards.

AIRS Standards Version 6.0. New name: AIRS Standards for Professional Information and Referral and Quality Indicators. Quality Indicators have been introduced that can be used to determine the degree of adherence to the standards or achievement of quality goals. These indicators are ideals that support and provide a framework for the actual standards. (AIRS Summary)

AIRS Standards: numerous sections of the standards have introduced or clarified definitions, which helped provide a common language and framework.

Definitions: Follow-Up, Client Survey, Client Outcomes, System Outcomes, Random Sample, Client Satisfaction, Quality Assurance, Database Accuracy, Annual Survey.

What do you think? How would you define: Follow-Up, Customer Satisfaction, Quality Assurance, Resource Database Verification?

Follow-Up: we've been using the term for two DIFFERENT things. (1) Calling a client back to make sure the client received the help they needed, driven by the need/situation of the client, with no target. (2) Calling a client back to gain information on whether they were happy with the service.

Follow-Up in the AIRS Standards: the primary purpose of follow-up is for the benefit of clients/inquirers, to see if their needs were met (Standard 5). Although follow-up may provide useful information on customer satisfaction and service outcomes, this is not why it is being conducted. Does that match your definition? How is it similar or different? What are the consequences of the similarities or differences?

Customer Satisfaction (CS): a measure of how well the services supplied meet or surpass client/inquirer expectations. It MAY occur during the original contact, at follow-up, or in a separate call made for quality assurance purposes (Standard 28). NOTE: every client's expectations are different; it is NOT standardized.

Quality Assurance (QA): a system of procedures, checks, audits and corrective actions that are undertaken to ensure that an organization's products and services meet the expectations and needs of the people they serve. For information and referral programs, quality assurance relates to service delivery, the resource database, reports and measures, disaster preparedness, cooperative relationships and organizational effectiveness. This is your evaluation PLAN. It includes both the AIRS Quality Indicators and your internal standards, as well as how you are going to measure your service against those standards. It includes customer satisfaction but goes beyond it.

Discussion: based on these standards, how do these definitions match what you thought before? How do you see them working together?

Definitions: Resource Database Verification ("Annual Survey" in the AIRS Standards) is the process that the I&R service uses to verify the accuracy of information in the resource database. Various methods are allowed to obtain the data; the goal is a 100% update rate within the 12-month cycle. Each I&R can specify its own target return rate and associated responses.
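
A minimal sketch of the update-rate arithmetic behind that goal (the record list, dates, and cutoff logic are assumptions for illustration only):

```python
from datetime import date, timedelta

# Hedged sketch: count records formally updated within the last 12 months.
today = date(2009, 5, 1)
records = [
    {"id": 1, "last_formal_update": date(2008, 7, 15)},
    {"id": 2, "last_formal_update": date(2007, 11, 2)},   # outside the 12-month cycle
    {"id": 3, "last_formal_update": date(2009, 1, 20)},
]

cutoff = today - timedelta(days=365)
updated = sum(1 for r in records if r["last_formal_update"] >= cutoff)
update_rate = updated / len(records)
print(f"Update rate within the 12-month cycle: {update_rate:.0%}")  # 67% here; the goal is 100%
```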

Random Sample: ideally, we'd want to know every client's satisfaction, every client's outcome, and the accuracy of every record in the resource database. A random sample lets us accomplish that.

Random Sample: "If you don't believe in random sampling, next time you go to the doctor for a blood test, have him take it all." Sampling lets you get basically the same answers as if you had asked everyone AND everyone had responded.

Random Sample: done properly, random sampling enables you to generalize your survey results to the target group (e.g., clients, resource database entries). To achieve this, random sampling has two key requirements. Randomness: an equal chance of selecting any member of the population ("probability sampling"). External selection: respondents are chosen to participate rather than deciding to take the survey themselves.
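
A minimal sketch of those two requirements in practice (the call IDs and sample size are hypothetical, not 2-1-1 TBC figures):

```python
import random

# Every call record has an equal chance of selection, and the surveyor
# (not the caller) decides who is contacted for follow-up.
call_ids = [f"CALL-{n:05d}" for n in range(1, 1001)]   # assumed list of last month's calls
sample_size = 100                                      # assumed target sample size

follow_up_sample = random.sample(call_ids, sample_size)  # simple random sample, no replacement
print(follow_up_sample[:5])
```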

To return to the questions: we're going to share some of the practical considerations from the work we've done. Based on what we have just talked about: how does your I&R measure the quality of its services? How does your I&R measure the success or failure of the service it provides?

Quality Assurance Plan, two components: Resource Database Verification and external rating of calls.

Resource Database Verification. Underlying question: would this error interfere with contacting the agency? A random sample of the database; actually, two samples. Programs were rank-ordered by the number of referrals given to each; one (larger) sample was drawn from the most frequently referred-to programs (the top 50% of referrals) and one (smaller) sample from the other half, to weight more heavily the accuracy of the information that was being given out most frequently.
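
A hedged sketch of that two-sample selection (program counts, referral numbers, and sample sizes below are made up for illustration):

```python
import random

# Rank-order programs by referrals given, split at the top 50% of referral
# volume, then sample more heavily from the frequently referred-to half.
random.seed(0)
programs = [{"id": f"PRG-{i}", "referrals": random.randint(0, 500)} for i in range(1, 401)]

programs.sort(key=lambda p: p["referrals"], reverse=True)
half_volume = sum(p["referrals"] for p in programs) / 2

top, running = [], 0
for p in programs:
    if running >= half_volume:
        break
    top.append(p)
    running += p["referrals"]
bottom = programs[len(top):]

sample_top = random.sample(top, min(40, len(top)))           # larger sample, most-referred programs
sample_bottom = random.sample(bottom, min(20, len(bottom)))  # smaller sample, remaining programs
print(len(sample_top), len(sample_bottom))
```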

Resource Database Verification, two steps. (1) Is the record in the database complete? Examined the Service Point record. (2) Is the information accurate? Called the agency. Rating criteria were established for each of the data elements.

Resource Database Verification, elements tested: name of program; phone number; physical address; mailing address; days and hours of operation; core services provided; Taxonomy codes.
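
A minimal sketch of the completeness check against those elements (the field names and record layout are assumptions, not the actual Service Point schema):

```python
# Flag which of the tested elements are empty or absent in a resource record.
REQUIRED_ELEMENTS = [
    "program_name", "phone_number", "physical_address", "mailing_address",
    "days_hours", "core_services", "taxonomy_codes",
]

def missing_elements(record: dict) -> list:
    """Return the required elements that are empty or absent in a record."""
    return [field for field in REQUIRED_ELEMENTS if not record.get(field)]

record = {"program_name": "Example Food Pantry", "phone_number": "555-0100"}  # hypothetical record
print(missing_elements(record))  # the listed elements would still need verification
```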

External Call Rating: developed procedures for rating calls. Developed a set of criteria based on the AIRS standards and local standards (such as funder requests and United Way expectations); the criteria have to be operationalized. Calls were randomly selected and rated by two independent raters.
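
The presentation does not say how agreement between the two raters was checked; one simple option is percent agreement on each operationalized criterion, sketched below with made-up ratings:

```python
# Compare two independent raters on a single pass/fail criterion.
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # 1 = criterion met, 0 = not met (illustration data)
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = agreements / len(rater_a)
print(f"Percent agreement: {percent_agreement:.0%}")  # 80% in this example
```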

Break. In the second part we'll review the QA plan, Resource Database Verification and External Call Rating, and discuss our findings.

Why Can't We Just Use Follow-Up Surveys? It doesn't measure all of the quality indicators in the AIRS Standards. It captures customer satisfaction and referral outcome, but does not measure whether the specialists: respond to each inquirer in a professional, nonjudgmental, culturally appropriate and timely manner; make an accurate assessment of the inquirer's problems and needs; provide three referral options.

Why Can't We Just Use Follow-Up Surveys? If done by the same specialist, it may be perceived as biased and unscientific by an external funder/regulator, sometimes even if done by the same agency. If NOT done by the same specialist, it undermines the established rapport with the client for the real purpose of follow-up: it is to be done for the benefit of the clients/inquirers.

Why Can't We Just Use Follow-Up Surveys? Results are often biased by the client's experience with the referral agency. They can't be used to determine the specialist's behavior DURING the call (polite, good assessment, etc.). They are based on client recall, and ratings by clients/inquirers are not standardized. Often they are not a true random sample and therefore may not be representative of the actual caller experience.

2-1-1 Tampa Bay Cares, Inc. A Case Study

2-1-1 Tampa Bay Cares, Inc. Info: handling approximately 90,000 calls per year. Regional call center located in Clearwater, Florida, serving Pinellas, Hernando and Citrus Counties (approximately 1.3 million residents). 211 TBC has all eligible staff CIRS certified and one resource specialist who is CRS certified. 211 TBC is AIRS Re-Accredited (Jan. 2009).

2-1-1 Tampa Bay Cares, Inc. Call Handling Follow-Up: The Early Days

2-1-1 TBC Follow-Up Program: one 30-hour-per-week follow-up specialist, funded by one funder, who was one of the Call Center Reps and sat in the call center with the rest of the reps. Customer satisfaction via follow-up: 98%-100% of our callers were satisfied with 211 services. Same methodology for at least 20 years.

2-1-1 TBC Follow-Up Program, questions asked: Did you contact at least one referral? Did you feel the call center rep understood your needs? Was the agency you were referred to able to help you? If not, why not? Were you satisfied with the agency that provided you assistance? If not, why not? Would you recommend 211 to your friends and family? Overall, are you satisfied with 211?

One Funder's Expectations Over the Last 20 Years! 1st: generally felt that follow-up should be used as a measure of call center and call center rep performance, hence a measurement of contract compliance on agency performance. 2nd: felt follow-up was critical to measure the unmet needs of a community, hence funding decisions could be made. 3rd: a monitoring tool regarding agencies, on who was not returning calls, who was out of funds, etc. 4th: callers contacting at least one referral was being used as a method to determine 211 performance. Rarely: support to callers in getting connected to local services.

One Funder's Expectations Over the Last 20 Years! For the last 5 years, on-going discussions with the funder concerning follow-up. 211 TBC felt that: performance of CCRs can't be based on whether a caller contacts at least one referral (they could be calling as research, and isn't that a good thing!); performance of 211 should not be based on whether a caller contacts at least one referral; and unmet needs from 211 follow-up should not be used as a method to determine funding of agencies, since marketing can have an effect on results.

One Funder's Expectations Over the Last 20 Years! The funder felt there were issues with the results: the number of callers contacted (they felt 200 follow-up contacts, actually speaking with a caller, was not enough); the follow-up results were not statistically valid (not asking every Xth caller); a sense that the 211 CCRs were only asking "happy" callers for follow-up; we were following up on our own calls; and the follow-up methodology was not research/evidence based.

So... what DID we do? 1st: suspended follow-up as we knew it! Ran a bid process for an independent evaluator and involved the funder in the process of choosing the vendor. Most challenging: not having enough funding to implement every quality assurance activity, and deciding what areas to focus on first.

Areas of QA Focus: call center compliance with AIRS Standards for handling calls; resource database compliance with AIRS Standards, and accuracy.

What We Learned, March 2008, First Call Handling Report (to establish a baseline). 211 TBC did not tell the Call Center Reps they were being evaluated by an independent evaluator until July 2008. The 211 TBC call center had significant supervision (Call Center Manager and Supervisors) turnover. Past supervision and monitoring: 211 TBC did monitor calls, however CCRs were aware that they were being monitored and scored, and scored 95% overall when they were aware calls were being monitored. These ratings did not include standards for handling calls.

What We Learned: the Call Center Supervisors/Follow-Up Person were biased in their evaluation of calls. The Supervisors/Follow-Up Person sat in the same room, in some cases just a few feet away from each other, knew the personal struggles of their peers and didn't want to get them in trouble.

What We Learned: The First Call Handling Report. FACT: caller satisfaction as reported during follow-up over the last 20 years was 98%-100% (supportive of only asking for follow-up from "happy" callers or on calls that went well). Yet CCRs were not handling calls based on AIRS Standards!

What We Learned: The First Call Handling Report For example: 15% were not using the scripted greeting. More than 50% were not asking all demographics required by contract. 18% interrupted the caller. 12% did not avoid giving personal advice.

What We Learned: The First Call Handling Report. For example: only 22% of the calls provided the organizational name, number, hours of operation, information about the referral, and/or eligibility information. Very few identified the caller's own resources. More than 40% of the time, callers were not given 3 referral options (or an explanation of why 3 options were unavailable or not applicable).

What We Learned: The First Call Handling Report. For example: informed choice - only 30% provided enough information about the organization for informed choice. Locating alternative sources - only 42% provided problem solving on alternative sources when traditional resources had expired. Call closing - 96% did not ask "May I help you with anything else?" and 93% of callers were not encouraged to re-contact 211.

Strategy for Improvement: go back to the fundamentals, the AIRS Standards. A new Call Center Manager was hired in October 2008. 211 TBC had to update the current training manual to be in compliance with AIRS standards and re-train all CCR staff to be in compliance with AIRS standards; 211 TBC invested in Essential Learning for all staff (AIRS provides trainings based on the standards, so staff could learn the language of AIRS). 211 TBC has implemented small-scale individual CCR performance ratings to determine who needs additional training.
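
A hedged sketch of how such individual ratings might be rolled up to flag who needs additional training (the reps, scores, and threshold are hypothetical, not 2-1-1 TBC's actual figures):

```python
from collections import defaultdict

# Average the rated calls per CCR and flag anyone below an assumed cutoff.
rated_calls = [
    {"ccr": "Rep A", "score": 92}, {"ccr": "Rep A", "score": 88},
    {"ccr": "Rep B", "score": 71}, {"ccr": "Rep B", "score": 78},
    {"ccr": "Rep C", "score": 95},
]
TRAINING_THRESHOLD = 80  # assumed cutoff for additional training

scores_by_ccr = defaultdict(list)
for call in rated_calls:
    scores_by_ccr[call["ccr"]].append(call["score"])

for ccr, scores in scores_by_ccr.items():
    avg = sum(scores) / len(scores)
    flag = " -> additional training" if avg < TRAINING_THRESHOLD else ""
    print(f"{ccr}: {avg:.1f}{flag}")
```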

Strategy for Improvement: started checking the data-entry fields against the recorded call. Many data fields were filled in even though the question was never asked by the call center representative.

Documenting Improvement [chart]: Demographics - Luther's Evaluation. Monthly scores, October through April, for Greetings, Age Demographics, and Race/Ethnicity Demographics.

Documenting Improvement [chart]: Active Listening - Luther's Evaluation. Scores across seven monthly rating periods for Use of Open-ended Questions and Closed Questions.

Documenting Improvement [chart]: Active Listening 2 - Luther's Evaluation. Monthly scores, October through April, for Explains Pauses, Reflects Caller's Feelings, Matches Pace, Uses a Pleasant Tone, Culturally Appropriate, Avoids Interrupting Caller, and Uses Proper Grammar and Speaks Clearly.

Documenting Improvement [chart]: Assessing Need - Luther's Evaluation. Monthly scores, October through April, for Interacts in a Non-judgmental Way, Retains Information, Identifies Key Issues, Accurately Assessed Caller's Problem, and Addressed Additional Problems.

Individual Performance Monitoring Tool

2-1-1 Tampa Bay Cares, Inc. Database Quality

What We Learned, The First Database Quality Report: there was a discrepancy between the formal update date and the date stamps from small updates on a record. LET'S BE HONEST... in the current economic climate, keeping up with agency and program updates is challenging with limited resource staff, and expecting agencies to update their own records is extremely unrealistic.

Strategy for Improvement: go back to fundamentals by reviewing the AIRS standards. Review the 211 TBC training manuals and training program for the Resource Department. Implement Essential Learning for 211 TBC Resource Department staff so they learn consistent AIRS standards language.

Strategy for Improvement: review Resource Department policies on updating. Should we update the programs/services we refer to more frequently than the programs and services we do not refer to? (60% database utilization.) What constitutes a formal update vs. calls during the year that involve small changes? Should we change the timing of our updates to coincide with major funder fiscal years, which result in many changes to programs and services in July and October? (We used to do this in May/June of each year.)

2-1-1 Tampa Bay Cares, Inc. Independent Evaluation

Value of Independent Evaluation: gave a realistic picture of what was happening in the call center and resource department; identified gaps in our training; provoked great policy and procedure debates. We challenge all Information and Referral centers to at least try an independent evaluator.

Value of Independent Evaluation: federal funding, such as economic stimulus grants, does require accountability and formal evaluation. Budget for quality assurance programs and independent evaluation ahead of time in your grant applications! Independent evaluations will also add research that is desperately needed in the field of I&R!

Questions? Micki Thompson, Executive Director, 2-1-1 Tampa Bay Cares, Inc., 50 S. Belcher Rd., Suite 116, Clearwater, Florida 33765, (727) 210-4240. James B. Luther, Ph.D., Luther Consulting, LLC, 423 Massachusetts Ave., Indianapolis, IN 46204, (317) 636-0282.