Fitting the Method to the Question

Quantitative Research Design: Fitting the Method to the Question. Deborah Eldredge, PhD, RN, Director, Nursing Quality, Research & Magnet, Oregon Health & Science University Healthcare; Margo A. Halm, RN, PhD, ACNS-BC, FAHA, Director, Nursing Research, Education & Magnet, Salem Hospital.

Objectives: Describe how research designs differ depending on the questions being asked. Identify concepts of bias, threats to validity, strengths, and limitations as related to observational designs. Identify concepts of bias, threats to validity, strengths, and limitations as related to experimental designs.

Science: The basic aim of science is to explain natural phenomena with generalizable knowledge: identify/understand, describe, explain, predict, control.

Research design: Research design is an attempt to limit variability and minimize complexity. Control: well-designed research increases the chance that findings are real. Generalizability: well-designed research takes time, planning, and resources, and well-designed human science research takes even more of all three.

Research Process: Select a general problem; conduct a literature review (preliminary review at first, later expanded to an exhaustive search); select a specific problem, research question, or hypothesis; decide design and methodology; collect data; analyze and present data (statistical tables, integrative diagrams); interpret findings; state conclusions/generalizations about the problem.

Key concepts in measurement: Reliability: consistency; the likelihood that you'll see the same results from subject to subject, or within the same subject over time. Reduces error variance. Validity: the degree to which the investigator is measuring/describing the intended phenomenon.
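
Reliability is often checked empirically as agreement between two raters scoring the same subjects. A minimal Python sketch, using hypothetical ratings, of percent agreement and Cohen's kappa (which corrects agreement for chance):

    # Hypothetical example: two nurses each rate the same 10 patients as "risk" or "no".
    from collections import Counter

    rater_a = ["risk", "risk", "no", "risk", "no", "no", "risk", "no", "risk", "no"]
    rater_b = ["risk", "no", "no", "risk", "no", "risk", "risk", "no", "risk", "no"]

    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # percent agreement

    # Agreement expected by chance, from each rater's category frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))

    kappa = (observed - expected) / (1 - expected)  # Cohen's kappa
    print(f"Agreement: {observed:.2f}  Kappa: {kappa:.2f}")  # 0.80, 0.60

Kappa values closer to 1 indicate more consistent measurement; low values signal error variance that the design should address.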

Key concepts in design: Internal validity: the likelihood that the results obtained in a study are due to the treatment, and not to some other factor. Good research designs = strong internal validity. External validity: aspects of design that make it more likely the results from one study can be applied to a different sample in a different setting. Similar to generalizability.

Key concepts: Bias: anything that could distort the results of the study, reducing the likelihood that the findings are true. Different kinds of bias can reduce internal or external validity.

Research designs: The task of the investigator is to maximize internal and external validity and, to the extent possible, eliminate or account for possible sources of bias. Lack of internal validity = lack of confidence in the result (strength of the evidence). Choice of design is contingent upon the study question and on ethics and pragmatics.

Hierarchy of Evidence: Internal validity increases with each step up. From lowest to highest: single case report, case series report, case-control studies, cohort studies (observational studies); non-random trials, randomized controlled trials, meta-analyses and evidence reviews (intervention studies). (2008 Susan E. Shapiro)

What do levels indicate? Increasing probability that results reflect some objective reality Limit investigator-induced bias in measuring intervention or outcome Reduce threats to internal validity

Generation of practice knowledge spans a continuum: exploratory/qualitative, descriptive, quasi-experimental, experimental (clinical trials), evaluation research, and utilization in practice, with corresponding methods: correlational, regression, time series, and path models; meta-analysis; efficacy, utility, cost-benefit, and feasibility studies; practice dissemination.

Design dichotomies: Qualitative vs. quantitative; descriptive vs. analytical; experimental/quasi-experimental vs. non-experimental; hypothesis-generating vs. hypothesis-testing; cross-sectional vs. longitudinal; retrospective vs. prospective; observational vs. interventional. Design to match the question being asked.

Nursing research designs: Observational: identify, describe, and explain characteristics of nurses, patients, and processes. Experimental: evaluate interventions (predict/control); establish causation (predict/control).

Observational Designs: Identify, Describe, Explain. Identify subjects; observe & record characteristics; data readily obtained. Subject to bias: selection (how subjects are selected or assigned to groups), measurement (how outcomes are measured), performance (how subjects are exposed to the factor of interest), and attrition (how participants are lost through dropout, non-response, withdrawal, or protocol deviation, creating groups unequal in regard to exposure and/or outcome).

Descriptive Studies: Measure and report on selected subject characteristics and relationships between characteristics. Types: case report/case series, survey designs, case control, cohort.

Case Report/Case Series Identify and describe an unusual patient care situation Retrospective or prospective Includes patient presentation, interventions, outcomes Identifies patterns; raises awareness Example: Pyelonephritis and urosepsis in pregnancy

Case Report/Case Series. Strengths: relatively inexpensive to design and analyze; describes phenomena as they naturally occur; an initial step in understanding phenomena. Limitations: no causation can be inferred; minimal control over threats to internal & external validity; sample subject to non-random assignment/selection bias.

Survey Research: Describe or explain almost anything! Nurse satisfaction surveys; behavioral health risk surveys. Survey results can be used as measures of predictor or outcome variables. Cross-sectional vs. longitudinal (one moment in time vs. a series of observations over time).

Survey Research. Strengths: flexible; broad in scope (can survey for anything). Limitations: data are superficial and self-reported; information on how the survey was developed is important; repeated measures bring testing effects and attrition (dropouts).

Survey Research. Internal validity: reliability of measurement; response biases in surveys and questionnaires (e.g., selective recall, social acquiescence). External validity: sampling biases; return rates (70% is the gold standard, and difficult to obtain); reporting errors in data sets (uploading results).

Case Control & Cohort: Look at relationships between predictors (independent variables) and outcomes (dependent variables). Intervention/exposure = independent variable; presence/absence of the outcome = dependent variable.

Case Control Study: Usually retrospective; grouping depends on the presence/absence of the outcome. Example: 1. Identify patients who fell during a hospital stay, versus those who did not (controls). 2. Analyze the groups for the presence of predictors that explain fall risk: age, mobility problems (balance, weakness), confusion/delirium, medications, urgency.

Cohort Study: Usually prospective; the cohort depends on the presence/absence of the predictor. Example: 1. Identify a cohort of patients at risk for HAPU (hospitalized patients >65). 2. Follow the cohort to see who develops a HAPU. 3. Analyze for the influence of the presence or absence of predictors.

Case Control & Cohort Studies. Strengths: useful when the outcome of interest is rare or takes a long time to develop; useful for initial studies; case-control & cross-sectional studies generally require small samples and are relatively inexpensive. Limitations: exposures are not manipulated; does NOT establish causality, only levels of risk and association between risk and outcome.
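
As an illustration of the "levels of risk and association" these designs report, here is a minimal Python sketch, with hypothetical counts, of an odds ratio (the usual case-control measure) and a risk ratio (the usual cohort measure):

    # Hypothetical case-control counts (e.g., exposure = sedating medication, cases = fallers)
    cases_exposed, cases_unexposed = 40, 60
    controls_exposed, controls_unexposed = 20, 80
    odds_ratio = (cases_exposed / cases_unexposed) / (controls_exposed / controls_unexposed)

    # Hypothetical cohort counts (e.g., outcome = HAPU among hospitalized patients >65)
    exposed_with_outcome, exposed_total = 15, 100
    unexposed_with_outcome, unexposed_total = 5, 100
    risk_ratio = (exposed_with_outcome / exposed_total) / (unexposed_with_outcome / unexposed_total)

    print(f"Odds ratio: {odds_ratio:.1f}  Risk ratio: {risk_ratio:.1f}")  # 2.7, 3.0

Either measure quantifies the strength of association between exposure and outcome; neither, on its own, establishes causality.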

Case Control & Cohort Studies. Internal validity: reliability of measures for predictor & outcome variables (e.g., inter-rater reliability); quality of recorded data on exposures & outcomes. External validity: defining cases & controls (careful selection criteria); exhausting all possible predictors.

Design & Methods for Getting Started: Descriptive studies; chart review & other measurements. Need precise variable definitions and inter-rater reliability. Data are limited by what was recorded. What about this: discovering something you weren't looking for?

Experimental Designs: Identify subjects; place them in a common context; intervene; observe the effects of the intervention. Hard to do well; answer a narrow question definitively.

Pre- & post-test intervention trial: May or may not involve a control group; participants are rarely randomized. Prominent in nursing studies (example: most studies involving educational interventions). More likely to estimate effectiveness than efficacy.

Randomized clinical trials: The gold standard to predict or control. Participants are randomized to intervention or control; all parties are blinded (participant, investigator, analyst); a control group is present that is similar in every way except for the intervention.

Quasi-Experimental Design: Used when it is not possible to meet the gold standard to predict or control, that is, when one or more of the following cannot be achieved: participants randomized to intervention or control; all parties blinded (participant, investigator, analyst); a control group similar in every way except for the intervention.

Strengths and limitations. Strengths: least opportunity for bias; greatest likelihood that outcomes are caused by the intervention. Limitations: dependent on the integrity of the investigator for randomization; fidelity to the intervention is critical; measures efficacy, which may not translate directly to the real world.

Internal and external validity. Internal validity: extent to which the investigator is blinded; integrity of the control; effective randomization with respect to hypothesized covariates. External validity: sampling biases; generalizability limited by the complexity of the intervention and sample selection.

Criteria for causation Preponderance of the evidence Need reasonable explanation for relationships Need consistency across time and populations Caution: the basic science may change!

Design & methods for getting started Quick intervention Time for intervention to work Completeness of intervention Influences external to research project

Criteria of (good) Research Design: Answer the research question. Does the design test the hypotheses? The research question/hypotheses need to be consistent with the research design. Caution: lack of congruence.

Criteria of (good) Research Design: Control extraneous independent variables. Does the design adequately control independent variables? Solution: RANDOMIZE. Select participants at random; assign participants to groups at random; assign experimental treatments to groups at random.
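
A minimal Python sketch of what random assignment might look like for a simple two-arm study; the participant IDs and group sizes are hypothetical:

    import random

    participants = [f"P{i:03d}" for i in range(1, 41)]  # 40 hypothetical participant IDs
    random.shuffle(participants)                        # put participants in random order
    half = len(participants) // 2
    intervention_group = participants[:half]            # first half to intervention
    control_group = participants[half:]                 # second half to control
    print(len(intervention_group), len(control_group))  # 20 20

Random assignment of this kind distributes both known and unknown covariates across the groups by chance, which is what gives randomization its power to control extraneous variables.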

Criteria of (good) Research Design: Generalizability. Can we generalize the results of a study to other participants, other groups, and other conditions? Basic research (adds knowledge to the field of study); applied research (generalizability is the primary concern).

Two sources of research weakness: (1) Intrinsically poor designs: inability to manipulate independent variables; lack of power to randomize; risk of improper interpretation. (2) Good designs, poorly executed.

Research and design: Research is the basic work of science. Careful design helps reduce bias, improves internal and external validity, and contributes to the scientific basis for nursing practice.

Research design: There is no perfect design! The choice of design depends on the question and the pragmatics of the project. The investigator's responsibilities are to: conduct the study ethically; report results honestly; identify limitations of the study, in both design and conduct.
