Overview of Experimentation
Randall O'Neal
The Basics of Experimentation

Overview of Experiments. IVs & DVs. Operational Definitions. Reliability. Validity. Internal vs. External Validity. Classic Threats to Internal Validity. Lab: FP Overview; Help on OBS Project &/or Assign 8.

Overview of Experimentation

Experiments are the most powerful research methods in science because, if done correctly (if high in internal validity), they allow us to establish cause & effect. Establishing cause & effect is the first step toward explanation & control.

The Main Components of Any Experiment:
1. Statement of a hypothesis.
2. Random assignment of participants to conditions.
3. Manipulation of antecedent conditions (IVs).
4. Measurement of behavior (DVs).
5. (Statistical) analysis of results.

A hypothesis is a (concrete, testable, falsifiable) statement about the relation between two or more variables [If IV, then DV].

Independent Variable (IV): What the E manipulates. Examples include degree of anxiety, noise level, amount of training, time between study & test, color of survey, position of product, type of therapy, amount of counseling, etc.

Levels of an IV: The different values of the variable. All IVs have at least two levels and often more.
- 1 IV: Degree of Anxiety; 2 Levels: Calm, Anxious.
- 1 IV: Degree of Anxiety; 3 Levels: Low, Moderate, High.
- 1 IV: Color of Survey; 4 Levels: White, Red, Blue, Yellow.
- 1 IV: Noise Level; 5 Levels: 20dB, 40dB, 60dB, 80dB, 100dB.
- 1 IV: Type of Therapy; 2 Levels: Drug, Cognitive-Behavioral.
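Component 2, random assignment of participants to conditions, can be sketched in a few lines of Python. This is a minimal illustration (the function name and the balanced round-robin scheme are my own, not from the lecture): shuffle the participant pool, then deal participants out to the conditions so group sizes stay equal.

```python
import random

def random_assignment(participants, conditions, seed=None):
    """Randomly assign participants to experimental conditions.

    Shuffles the participant list, then deals participants out to
    the conditions in round-robin order so group sizes stay balanced.
    """
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    groups = {c: [] for c in conditions}
    for i, p in enumerate(shuffled):
        groups[conditions[i % len(conditions)]].append(p)
    return groups

# One IV (Degree of Anxiety) with 2 levels, 20 participants:
groups = random_assignment(list(range(1, 21)), ["Calm", "Anxious"], seed=42)
print({c: len(ps) for c, ps in groups.items()})  # {'Calm': 10, 'Anxious': 10}
```

Because assignment is random, any pre-existing differences among participants are (on average) spread evenly across the conditions, which is what licenses causal conclusions later.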
Dependent Variable (DV): What the E measures. Examples include % errors, % correct responses, exam score, degree of affiliation, level of extroversion, # of aggressive acts, change in academic performance, sales, etc. DVs do not have levels (although one might measure more than one thing, or one thing in more than one way).

Operational Definitions: Defining variables in terms of specific/concrete operations or procedures, leaving no room for guesswork or interpretation, so that they can be reproduced by another (naive) E.
- Experimental Operational Definitions: The precise procedures involved in implementing and/or manipulating an IV.
- Measured Operational Definitions: The precise procedures involved in measuring a DV.

Reliability & Validity

Reliability & Validity are at the heart of measurement.

Reliability refers to whether a measure is consistent. Is my measuring instrument (test, survey, scale, etc.) dependable? Can I count on it to give the same scores over people, items, & time? In general, reliability is assessed via correlational measures that are interpreted like Pearson correlations.

Validity refers to whether an instrument (test) truly measures the variable we think it measures (or want it to measure). Is my measure (of IQ, Depression, Memory, etc.) really getting at the construct I'm interested in? In general, validity is also assessed via correlational measures, but usually in more sophisticated (theory-based) ways than those associated with reliability.

Without reliable & valid measures there's no reason to do experiments.

Reliability

Inter-rater Reliability: Do two or more people produce the same ratings or measurements? Often assessed via (a) average % agreement, (b) Cohen's Kappa, or (c) average r over raters.

Inter-item Reliability (aka Internal Consistency): Do different parts of a test produce similar measurements?
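The two inter-rater indices mentioned above, percent agreement and Cohen's Kappa, are easy to compute directly. A minimal sketch (the rating data are hypothetical, not from the lecture): Kappa corrects observed agreement for the agreement two raters would reach by chance given their marginal rating rates.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of items on which two raters give the same rating."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's Kappa: agreement corrected for chance.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected from each rater's marginal rates.
    """
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two observers coding 10 acts as aggressive (1) or not (0)
rater1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
rater2 = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0]
print(percent_agreement(rater1, rater2))  # 0.8
print(round(cohens_kappa(rater1, rater2), 2))  # 0.6
```

Note how Kappa (0.6) is lower than raw agreement (0.8): with two roughly 50/50 raters, half the agreement here is expected by chance alone.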
- Split-half technique: Correlation between two halves of a test or survey (e.g., 1st & 2nd half; odd & even items).
- Cronbach's Alpha (α): Average r among all possible split-halves.

Test-Retest Reliability: Does a test produce similar measurements over time (e.g., at Time 1 & Time 2)? Treated as a special case of split-half; assessed as a correlation between Test 1 & Test 2 scores.
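Cronbach's α can also be computed from item variances via the standard formula α = k/(k−1) · (1 − Σ var(item) / var(total)), which is equivalent to the average-of-split-halves interpretation above. A minimal sketch with hypothetical survey data (item scores are invented for illustration):

```python
def cronbachs_alpha(items):
    """Cronbach's alpha for a list of k item-score columns.

    items: list of k sequences, one per item, each holding one score per
    respondent. Uses alpha = k/(k-1) * (1 - sum(item variances) /
    variance of total scores), with sample (n-1) variances throughout.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total score across all k items
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# 3 survey items scored 1-5 by 5 respondents (hypothetical data)
item_scores = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [3, 3, 4, 2, 5],
]
print(round(cronbachs_alpha(item_scores), 2))  # 0.87
```

An α this high says the three items move together across respondents, i.e., they appear to tap the same underlying construct.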
Face & Content Validity

Face & Content Validity are non-empirical (non-data-based) forms of validity, based on argument and theory.

Face Validity: Does my instrument (test, survey, question, etc.) look or feel like it's measuring what it's supposed to measure? Measures with high face validity may not always provide truly valid indices of the construct we are trying to measure (e.g., survey questions about racism/prejudice; crossword puzzles as IQ tests). Measures with low face validity can provide excellent measures of the construct we are after (e.g., Perceptual-Motor Speed or Raven's Progressive Matrices as indices of IQ).

Content Validity: Is my instrument capturing the entirety of the construct I'm trying to measure? The extent to which the measure (a) includes relevant aspects of the construct and (b) excludes irrelevant aspects of the construct. E.g., does the SAT measure more than academic achievement?

Construct Validity

Construct Validity looks at how well a measure (or factor) captures the relevant aspects of a construct and excludes irrelevant aspects. It can be thought of as the hard-core, data-based version of content validity.

- Convergent Validity: Does my instrument correlate with other instruments designed to measure the same construct? Does my measure of Intelligence (Raven's, crossword puzzle ability) correlate with other measures of intelligence (WAIS)?
- Divergent Validity: Does my instrument not correlate with other instruments designed to measure different constructs? Does my measure of Intelligence (Raven's, crossword puzzle ability) correlate with measures of memory, personality, athletic ability, etc.?

Criterion Validity

Criterion Validity looks at the degree to which a measure (or factor) is related to other (often real-world) outcomes. It can be thought of as the hard-core, data-based version of face validity.

- Concurrent Validity: Does my instrument predict (correlate with) a currently available outcome related to the construct?
Does my measure of Intelligence (Raven's, SAT, crossword puzzle ability) correlate with current GPA?

- Predictive Validity: Does my instrument predict (correlate with) a future outcome related to the construct? Does my measure of Intelligence (Raven's, SAT, crossword puzzle ability) correlate with GPA at graduation (or future job/pay level)?
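All of these data-based validity checks bottom out in a Pearson correlation between the instrument and the criterion. A minimal sketch of a concurrent-validity check (the test scores and GPAs are hypothetical, invented for illustration):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two paired lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Concurrent validity: new intelligence measure vs. current GPA
test_scores = [95, 110, 102, 130, 88, 120]
gpas = [2.8, 3.1, 3.0, 3.8, 2.5, 3.4]
print(round(pearson_r(test_scores, gpas), 2))  # 0.99
```

For convergent validity you would want r near the reliability ceiling of the two instruments; for divergent validity (say, the same test against athletic ability) you would want r near zero.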
Different Kinds of Validity

Validity
- Non-Data-Based: Face, Content
- Data-Based:
  - Construct: Convergent, Divergent
  - Criterion: Concurrent, Predictive

Internal & External Validity

Up until now we have been discussing the validity of measurements. Internal & External Validity are concerned with the validity of an experiment as a whole.

Internal Validity: The degree to which a research design allows you to make causal statements (or draw firm conclusions). Well-designed experiments have high internal validity. Experiments with confounds have low internal validity.

External Validity: The degree to which research findings generalize to people or situations outside the research setting. As a general rule, high external validity (generalization) will depend on high internal validity (a well-designed experiment).

Classic Threats to Internal Validity

The internal validity of an experiment can be lowered by many things (e.g., bad design, noisy room, etc.). However, there are eight "classic" threats that should be especially guarded against.

1. History: Any outside event that affects the DV. Especially a problem when multiple measures are taken over time, or when different groups of subjects are tested at different points in time.
2. Maturation: Any physical or psychological change that affects the DV. Mainly a problem when measures are taken over time (pre-/post-testing). Examples include fatigue, boredom, increased knowledge, etc.
3. Testing: Any change in performance due to prior test experience. Problematic when measures are taken over time (e.g., Within-Ss Designs). Examples include practice effects & test pre-sensitization.
4. Instrumentation: Changes in a measurement device, or in the criteria used by observers for recording behavioral events. E.g.: changes in the sensitivity of a button; different criteria for "violence".
5. Regression to the mean: Movement away from an extreme value toward the mean value. Mainly a measures-over-time (pre/post) problem.
6. Selection: Anything that results in non-equivalent groups being exposed to different treatment conditions. Ex Post Facto designs; non-random assignment; self-selection; etc.
7. Attrition (aka Mortality): Differential loss of subjects from particular treatment groups or conditions. Often a problem in drug or aging studies, or when more difficult or boring conditions are compared to easier or more interesting conditions.
8. Selection Interactions: When any of the above threats affects one treatment group more than another. E.g., History or Testing effects on Males vs. Females or Young vs. Old.

Final Project Overview I

The Final Project for this course is a Paper [50 points] & Presentation [20 points] based on a 2x2 factorial experiment [2 IVs with 2 levels each]. Experiments will be designed & conducted by teams of 4 students. Topic ideas will be handed out in class.

I strongly encourage each team to (a) get started early in designing and setting up their experiment, and (b) conduct test runs of their experiment before actual data is collected. Two class periods have been set aside for data collection (see course Schedule). You may also collect data outside of these periods, but all experiments must be reviewed & cleared by me before any data is collected.

Final Project Overview II

Paper: A complete report of your experiment. Papers should (a) strictly follow APA format, (b) use the hourglass method, & (c) include at least three references of papers you have read. Grading will be based on the format of your report as well as its content (including stats). More detailed guidelines for grading will be posted on the web.

Presentation: Each team will give a brief (10-15 minute) presentation of their experiment to the class on Dec. 4.
Presentations should be in PowerPoint and include an Intro, Method, Results & Discussion, with each member presenting one section. To receive full credit, PowerPoint presentations must be mailed to MethodsTA@yahoo.com by 5pm on Dec. 4, and hard copies of your final report must be in my mailbox by 5pm on Dec. 6.
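Classic threat #5 above, regression to the mean, is worth seeing in action before you run your own pre/post design. A minimal simulation (all numbers here are invented for illustration): select the lowest pretest scorers, apply no treatment at all, and their retest mean still drifts back toward the group mean, because extreme pretest scores are partly measurement noise.

```python
import random

rng = random.Random(0)

# 1000 people with stable true ability; pretest and posttest each add
# independent measurement noise on top of that true ability.
true_ability = [rng.gauss(100, 10) for _ in range(1000)]
pretest = [t + rng.gauss(0, 10) for t in true_ability]
posttest = [t + rng.gauss(0, 10) for t in true_ability]

# "Remedial group": the 100 lowest pretest scorers, given NO treatment
worst = sorted(range(1000), key=lambda i: pretest[i])[:100]
pre_mean = sum(pretest[i] for i in worst) / 100
post_mean = sum(posttest[i] for i in worst) / 100
print(pre_mean < post_mean)  # True: scores rose with no treatment at all
```

An untreated control group selected the same way is the standard defense: only the improvement beyond what the controls show can be credited to the treatment.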
More informationReliability & Validity Dr. Sudip Chaudhuri
Reliability & Validity Dr. Sudip Chaudhuri M. Sc., M. Tech., Ph.D., M. Ed. Assistant Professor, G.C.B.T. College, Habra, India, Honorary Researcher, Saha Institute of Nuclear Physics, Life Member, Indian
More informationLimitations of 2-cond Designs
Design Conditions & Variables Limitations of 2-group designs Kinds of Treatment & Control conditions Kinds of Causal Hypotheses Explicating Design Variables Kinds of IVs Identifying potential confounds
More informationGeorgina Salas. Topics EDCI Intro to Research Dr. A.J. Herrera
Homework assignment topics 32-36 Georgina Salas Topics 32-36 EDCI Intro to Research 6300.62 Dr. A.J. Herrera Topic 32 1. Researchers need to use at least how many observers to determine interobserver reliability?
More informationIntroduction to Reliability
Reliability Thought Questions: How does/will reliability affect what you do/will do in your future job? Which method of reliability analysis do you find most confusing? Introduction to Reliability What
More informationThreats to Validity in Experiments. The John Henry Effect
Threats to Validity in Experiments Office Hours: M 11:30-12:30, W 10:30-12:30 SSB 447 The John Henry Effect Housekeeping Homework 3, due Tuesday A great paper, but far from an experiment Midterm review
More informationTypes of Tests. Measurement Reliability. Most self-report tests used in Psychology and Education are objective tests :
Measurement Reliability Objective & Subjective tests Standardization & Inter-rater reliability Properties of a good item Item Analysis Internal Reliability Spearman-Brown Prophesy Formla -- α & # items
More informationExperimental Design. Dewayne E Perry ENS C Empirical Studies in Software Engineering Lecture 8
Experimental Design Dewayne E Perry ENS 623 Perry@ece.utexas.edu 1 Problems in Experimental Design 2 True Experimental Design Goal: uncover causal mechanisms Primary characteristic: random assignment to
More informationMeeting-5 MEASUREMENT 8-1
Meeting-5 MEASUREMENT 8-1 Measurement Measurement Process: 1. Selecting observable empirical events 2. Using numbers or symbols to represent aspects of the events being measured 3. Applying a mapping rule
More informationHow do we construct Intelligence tests? Tests must be: Standardized Reliable Valid
Test Construction How do we construct Intelligence tests? Tests must be: Standardized Reliable Valid Standardization The test must be pre-tested to a representative sample of people and form a normal distribution
More informationThe reality.1. Project IT89, Ravens Advanced Progressive Matrices Correlation: r = -.52, N = 76, 99% normal bivariate confidence ellipse
The reality.1 45 35 Project IT89, Ravens Advanced Progressive Matrices Correlation: r = -.52, N = 76, 99% normal bivariate confidence ellipse 25 15 5-5 4 8 12 16 2 24 28 32 RAVEN APM Score Let us examine
More informationRESEARCH METHODS. Winfred, research methods, ; rv ; rv
RESEARCH METHODS 1 Research Methods means of discovering truth 2 Research Methods means of discovering truth what is truth? 3 Research Methods means of discovering truth what is truth? Riveda Sandhyavandanam
More informationVariables Research involves trying to determine the relationship between two or more variables.
1 2 Research Methodology Week 4 Characteristics of Observations 1. Most important know what is being observed. 2. Assign behaviors to categories. 3. Know how to Measure. 4. Degree of Observer inference.
More informationResearch Methodology. Introduction 10/18/2016. Introduction. User name: cp6691 Password: stats.
Research Methodology 1 http://spectrum.troy.edu/drsmall/ Web Page User name: cp6691 Password: stats 2 Pieces of the Puzzle Chapter 2 Literature Review Chapter 1 Chapter 3 Methodology No Data Collection
More information4 Diagnostic Tests and Measures of Agreement
4 Diagnostic Tests and Measures of Agreement Diagnostic tests may be used for diagnosis of disease or for screening purposes. Some tests are more effective than others, so we need to be able to measure
More information