S10: Data Quality Assessment and Control Framework for Secondary Use of Healthcare Data


S10: Data Quality Assessment and Control Framework for Secondary Use of Healthcare Data
Presented by: Meredith Nahm Zozus, Duke University
October 7, 2014, 1:00 pm - 1:45 pm

What do we mean by Secondary Use?
1° use: data generated or used in the care of a patient.
2° use: all uses other than that for which the data were originally collected, e.g.:
- Quality improvement
- Public health
- Performance measurement
- Research

Topics
- Kinds of data generated in healthcare
- Secondary use cases: observational research and characterization; hypothesis-driven research
- Two categories of DQA
- DQA for the Health Care Systems Research Collaboratory
- Accuracy assessment examples: accuracy of EHR data via chart review; accuracy of EHR data via asking patients

Kinds of Data in Healthcare
Nature of the phenomena:
- Anatomic or pathologic, e.g., fracture, lesion
- Physiologic or functional, e.g., ejection fraction, creatinine clearance, blood pressure
- Patients' symptomatic experiences, e.g., pain, nausea, dizziness
- Patients' behaviors or functioning, e.g., food intake, activities of daily living
Method of ascertainment:
- Asking patients
- Observing patient behavior, anatomy, function, or images
- Measurement
- Synthesis and interpretation by a trained clinician

What Happens to Data
Nahm M, Johnson C, Johnson T, Zhang J. What Happens to Data Before Secondary Use? AMIA 2010. A synthesis of four published models.

Example Data Problems
- Records received but never viewed
- Inaccurate statements
- Charting on the wrong patient
- Diagnoses assumed from off-label drug use
- Problems at the registration desk
- Having to fill in something
- Only a limited number of fields, limited space, limited time
- Local workflow and practices affecting what gets charted and how
All of these problems and more are present in routine healthcare data.

Arguments Against 2° Use
"All medical record information should be regarded as suspect; much of it is fiction." - Burnum, 1989
"First law: Data shall be used only for the purpose for which they were collected. Collateral: If no purpose was defined prior to the collection of the data, then the data should not be used." - van der Lei, 1991
Burnum JF. The misinformation era: the fall of the medical record. Ann Intern Med 1989; 110: 482-4.
van der Lei J. Use and Abuse of Computer-Stored Medical Records. Meth Inform Med 1991; 30(2).

Tipping Point for 2° Use in Research
- Many unanswered questions could be answered with 2° data.
- We can't afford randomized controlled trials (RCTs) for all of them.
- RCTs have evolved several significant inefficiencies that double or triple their cost.
- National initiatives built on 2° use:
  - Patient-Centered Outcomes Research network (PCORnet)
  - Health Care Systems Research Collaboratory
  - FDA Mini-Sentinel
  - Clinical & Translational Science Awards (CTSA)

Not Just Claims Data

FRAMEWORKS
Special attention on the Health Care Systems Research Collaboratory DQAC Framework.

Two / Three Varieties
1. Predicated on a common data model (tools, IR, no accuracy asmt.):
   - OMOP: observational research & CER
   - Mini-Sentinel: postmarket surveillance of marketed therapeutics
   - PCORnet: PCOR & CER
2. Space in the middle for metadata-driven frameworks (healthcare is not there yet; it requires documenting metadata first). Promise = tools + no IR! However, no accuracy asmt.
3. Assumes NO common data model (no tools, no IR, accuracy asmt.):
   - Health Care Systems Research Collaboratory: pragmatic clinical trials

Data Quality Assessment Recommendations v1.0
Initiated by:
1. An inventory of data sources and data quality assessment plans proposed in first-round UH2 applications
2. Demonstration project grant review criteria requiring data validation
The work presented here was funded by the Health Care Systems Research Collaboratory Coordinating Center, grant number 1U54AT007748-01, through the National Center for Complementary & Alternative Medicine, a center of the National Institutes of Health.

Guiding Principles
- Need to demonstrate that data are capable of supporting research conclusions
- Should not assume use of a common data model for individual research projects
- Recommendations should be practical and reasonably achievable
Ouch!

Recommendations
1. Three key data quality dimensions to be measured
2. Description of formal assessments
3. Formal impact assessment
4. Reporting data quality assessments with research results

Recommendation 1
Accuracy, completeness, and consistency should be formally assessed for data elements used in subject identification, outcome measures, and important covariates.
Why? These are the most impactful on the ability of the data to support research conclusions.

Recommendation 2
Specifics for measuring accuracy, completeness, and consistency.
Completeness assessment recommendation: a four-part completeness assessment. The same column and data value completeness measures can be employed for monitoring completeness during the study. The completeness assessment applies to both prospectively collected and secondary use data. Additional requirements suggested by the Good Clinical Data Management Practices (GCDMP) document, such as on-screen prompts for missing data where appropriate, apply to data collected prospectively for a study.
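To make the column and data value completeness measures concrete, here is a minimal sketch in Python/pandas; the DataFrame, column names, and helper functions are illustrative assumptions, not part of the recommendation itself.

```python
import pandas as pd

def column_completeness(df: pd.DataFrame) -> pd.Series:
    """Fraction of rows with a non-missing value, per column."""
    return df.notna().mean()

def data_value_completeness(df: pd.DataFrame, required: list) -> float:
    """Fraction of required cells (rows x required columns) that are populated."""
    return df[required].notna().to_numpy().mean()

# Hypothetical extract with deliberately missing values.
records = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "systolic_bp": [120, None, 135, 142],
    "smoking_status": ["never", "former", None, None],
})
print(column_completeness(records))
print(data_value_completeness(records, ["systolic_bp", "smoking_status"]))
```

Run at study start and periodically thereafter, the same two functions double as the monitoring measures the recommendation mentions.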

Recommendation 2 cont.
Accuracy assessment recommendation: the highest practical accuracy assessment in the hierarchy should be used.
The hierarchy of sources for comparison lists possible comparisons ranging (from bottom to top) from those that are achievable in every situation but provide less information about true data accuracy, to the ideal but rarely achievable comparison that provides an actual data error rate. The hierarchy simplifies the selection of a source for comparison: where more than one source for comparison exists, the highest practical comparison in the list should be used.
Hierarchy of sources for comparison (from full accuracy at the top to partial accuracy at the bottom):
- Comparison to a source of truth
- Comparison to an independent measurement
- Comparison to independently managed data
- Comparison to an upstream data source
- Comparison to a known standard
- Comparison to valid values
- Comparison to validated indicators
- Comparison to aggregate statistics
- Discrepancy detection
- Gestalt

Recommendation 2 cont.
Consistency assessment recommendation: identification of:
a) areas where differences in clinical documentation, data collection, or data handling may exist between individuals, units, facilities, sites, or assessors, or over time, and
b) measures to assess consistency and monitor it throughout the project.
A systematic approach to identifying candidate consistency assessments should be used. Such an approach will likely be based on a review of available data sources, accompanied by an approach for systematically identifying and evaluating the likelihood and impact of possible inconsistencies. This recommendation applies to both prospectively collected data and secondary use data.
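One simple way a project might operationalize a candidate consistency check (an illustrative sketch, not a method prescribed by the recommendation) is to compare a documentation rate across facilities and over time:

```python
import pandas as pd

# Hypothetical encounter-level data from two facilities over two quarters.
data = pd.DataFrame({
    "site":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "quarter": ["Q1", "Q1", "Q2", "Q2", "Q1", "Q1", "Q2", "Q2"],
    "smoking_documented": [1, 1, 1, 0, 0, 1, 0, 0],
})

# Documentation rate by site and quarter; large gaps flag candidate
# inconsistencies in charting practice rather than true clinical differences.
rates = data.groupby(["site", "quarter"])["smoking_documented"].mean()
print(rates.unstack("quarter"))
```

Rerun throughout the project, the same tabulation serves as the monitoring measure called for in (b).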

Recommendation 3
Impact assessment recommendation: use of completeness, accuracy, and consistency assessment results by the project statistician to test the sensitivity of the analyses to anticipated or identified data quality problems, including a plan for reassessing based on results of data quality monitoring throughout the project.
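As one illustration of such a sensitivity test (a simple quantitative bias analysis sketched under assumed numbers, not a procedure mandated by the recommendation), a statistician could flip a binary outcome at the error rate measured in the accuracy assessment and observe how much the estimate moves:

```python
import numpy as np

rng = np.random.default_rng(42)
outcome = rng.integers(0, 2, size=1000)   # observed binary outcomes
measured_error_rate = 0.05                # taken from the accuracy assessment

# Repeatedly perturb the data at the measured error rate and re-estimate.
estimates = []
for _ in range(200):
    flips = rng.random(outcome.size) < measured_error_rate
    perturbed = np.where(flips, 1 - outcome, outcome)
    estimates.append(perturbed.mean())

print(f"observed prevalence: {outcome.mean():.3f}")
print(f"range under perturbation: {min(estimates):.3f} to {max(estimates):.3f}")
```

If the conclusion survives the plausible range of data quality problems, the data are capable of supporting it; if not, the affected data elements need remediation or a more robust analysis.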

Recommendation 4
Data quality assessments should be reported with research results.
Note: there is ongoing work on a content and format standard for reporting data quality assessments with research results (Michael Kahn, PI), funded by a PCORI contract.

A little more about ACCURACY

Accuracy Asmt. Hierarchy
[Figure: the hierarchy of sources for comparison repeated from Recommendation 2, annotated to show chart review (medical record abstraction) as the comparison to a source of truth at the top of the hierarchy.]
Comparisons toward the bottom identify only data discrepancies, i.e., items that may or may not represent an actual error. For example, if it has been shown that the percentage of missing values is inversely correlated with data accuracy, then percent missing may be an indicator of low accuracy.
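Because only the top of the hierarchy yields an actual data error rate, a comparison against chart review can be computed directly; a toy version, with hypothetical field values, might look like this:

```python
import pandas as pd

# Hypothetical EHR-extracted values vs. chart review as the source of truth.
ehr = pd.Series(["diabetes", "none", "htn", "copd"])
chart = pd.Series(["diabetes", "htn", "htn", "none"])

errors = int((ehr != chart).sum())    # fields that disagree
error_rate = errors / len(chart)      # actual data error rate
print(f"{errors} errors in {len(chart)} fields ({error_rate:.0%})")
```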

Chart Review Use Cases
1. Data acquisition
2. Phenotype validation: validation of data elements plus the logic/algorithm used to extract data from electronic sources

Issues with Chart Review
- Can be subjective
- Often poorly documented
- Often poorly controlled
- Methods infrequently reported
- Associated with HIGH error rates

Chart Review System
292 unique factors impact the accuracy of data abstracted from medical records.

Abysmal Reporting of Chart Review Methods
[Table of reviewed studies; footnote a: category includes validation of administrative data, performance measures, or indicators (18); data quality assessment (11); and questionnaire validation (1).]

MURDOCK Data Quality Study Problem Statement
- The availability and completeness of the EHR data have not been assessed.
- The accuracy of (1) the self-report data and (2) the EHR data has not been assessed.
- We do not know what types of data discrepancies there are, which data elements are affected, or how the data discrepancies are otherwise distributed.
- We need to know PPV and NPV.
These should be assessed to support continuing use of the data.

Data Collected in Both EHR and Self-report
- 34 medical conditions
- Medications
- 8 procedures
- Hospitalizations
- Smoking status
Data collection points:
- Self-report: baseline and annual follow-up
- EHR: longitudinal, quarterly

Literature: Not much help

Study Comparisons
Goal: to make accuracy assessment more attainable!
Ultimate plan: at the end of the study, we will have the gold standard and can use it to develop and validate indicators of EHR data quality.

Comparison Strategy
Writing phenotypes for self-report and EHR.
Three-way classification:

Self-report  | EHR
-------------|--------------------------
Yes          | Confirmatory
Don't know   | Suggestive but uncertain
No           | No evidence of condition

For the 2x2 table, the middle categories will be dropped and analyzed separately.
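A small sketch of the resulting 2x2 analysis, with made-up counts (the study's actual counts are not shown here): drop the middle categories, then compute PPV and NPV of the EHR phenotype against self-report.

```python
# Hypothetical counts after dropping "Don't know" (self-report) and
# "Suggestive but uncertain" (EHR) for separate analysis.
tp = 80   # EHR confirmatory, self-report yes
fp = 10   # EHR confirmatory, self-report no
fn = 15   # EHR no evidence,  self-report yes
tn = 95   # EHR no evidence,  self-report no

ppv = tp / (tp + fp)   # P(condition present | EHR confirmatory)
npv = tn / (tn + fn)   # P(condition absent  | EHR no evidence)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
```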

Study Structure
[Diagram: study structure, with groups of n = 12,000 and n = 100 and associated percentages.]

Other Sources for Accuracy Asmt.
- Registry data (electronic compare)
- Health plan / Medicare data (electronic compare)
- Inpatient versus outpatient
- Database from a research study
- Recorded or double/repeated observer encounters
- Partial/subset assessment versus the whole dataset
An exercise in creative opportunism!
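An "electronic compare" against one of these sources can be as simple as joining on a patient identifier and flagging disagreements; the tables and field names below are hypothetical:

```python
import pandas as pd

# Hypothetical EHR and registry extracts for the same patients.
ehr = pd.DataFrame({"patient_id": [1, 2, 3],
                    "dx_date": ["2013-01-05", "2013-02-10", "2013-03-15"]})
registry = pd.DataFrame({"patient_id": [1, 2, 3],
                         "dx_date": ["2013-01-05", "2013-02-12", "2013-03-15"]})

merged = ehr.merge(registry, on="patient_id", suffixes=("_ehr", "_reg"))
merged["agree"] = merged["dx_date_ehr"] == merged["dx_date_reg"]
print(merged)
print(f"agreement: {merged['agree'].mean():.0%}")
```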

Thank You!
Meredith Nahm Zozus
meredith.nahm@duke.edu
Please complete the evaluation form.
© 2014 IDQS. All rights reserved.