STATISTICS: INFORMED DECISIONS USING DATA


STATISTICS: INFORMED DECISIONS USING DATA, Fifth Edition. Chapter 4: Describing the Relation between Two Variables

4.1 Scatter Diagrams and Correlation Learning Objectives
1. Draw and interpret scatter diagrams
2. Describe the properties of the linear correlation coefficient
3. Compute and interpret the linear correlation coefficient
4. Determine whether a linear relation exists between two variables
5. Explain the difference between correlation and causation

4.1 Scatter Diagrams and Correlation 4.1.1 Draw and Interpret Scatter Diagrams (1 of 6) The response variable is the variable whose value can be explained by the value of the explanatory or predictor variable. A scatter diagram is a graph that shows the relationship between two quantitative variables measured on the same individual. Each individual in the data set is represented by a point in the scatter diagram. The explanatory variable is plotted on the horizontal axis, and the response variable is plotted on the vertical axis.

4.1 Scatter Diagrams and Correlation 4.1.1 Draw and Interpret Scatter Diagrams (2 of 6) EXAMPLE Drawing and Interpreting a Scatter Diagram The data shown below are based on a study for drilling rock. The researchers wanted to determine whether the time it takes to dry drill a distance of 5 feet in rock increases with the depth at which the drilling begins. So, depth at which drilling begins is the explanatory variable, x, and time (in minutes) to drill five feet is the response variable, y. Draw a scatter diagram of the data. Source: Penner, R., and Watts, D.G. Mining Information. The American Statistician, Vol. 45, No. 1, Feb. 1991, p. 6.

Depth at Which Drilling Begins, x (in feet)   Time to Drill 5 Feet, y (in minutes)
35     5.88
50     5.99
75     6.74
95     6.1
120    7.47
130    6.93
145    6.42
155    7.97
160    7.92
175    7.62
185    6.89
190    7.9
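The scatter diagram itself did not transcribe. A minimal sketch of drawing it in Python (matplotlib is an assumed tool choice; the slides do not name any software):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Drilling data from the slide (Penner and Watts, 1991)
depth = [35, 50, 75, 95, 120, 130, 145, 155, 160, 175, 185, 190]               # x, feet
time = [5.88, 5.99, 6.74, 6.1, 7.47, 6.93, 6.42, 7.97, 7.92, 7.62, 6.89, 7.9]  # y, minutes

fig, ax = plt.subplots()
ax.scatter(depth, time)  # explanatory variable on the horizontal axis
ax.set_xlabel("Depth at Which Drilling Begins (feet)")
ax.set_ylabel("Time to Drill 5 Feet (minutes)")
fig.savefig("drilling_scatter.png")
```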

4.1 Scatter Diagrams and Correlation 4.1.1 Draw and Interpret Scatter Diagrams (3 of 6)

4.1 Scatter Diagrams and Correlation 4.1.1 Draw and Interpret Scatter Diagrams (4 of 6) Various Types of Relations in a Scatter Diagram

4.1 Scatter Diagrams and Correlation 4.1.1 Draw and Interpret Scatter Diagrams (5 of 6) Two variables that are linearly related are positively associated when above-average values of one variable are associated with above-average values of the other variable and below-average values of one variable are associated with below-average values of the other variable. That is, two variables are positively associated if, whenever the value of one variable increases, the value of the other variable also increases.

4.1 Scatter Diagrams and Correlation 4.1.1 Draw and Interpret Scatter Diagrams (6 of 6) Two variables that are linearly related are negatively associated when above-average values of one variable are associated with below-average values of the other variable. That is, two variables are negatively associated if, whenever the value of one variable increases, the value of the other variable decreases.

4.1 Scatter Diagrams and Correlation 4.1.2 Describe the Properties of the Linear Correlation Coefficient (1 of 6) The linear correlation coefficient or Pearson product moment correlation coefficient is a measure of the strength and direction of the linear relation between two quantitative variables. The Greek letter ρ (rho) represents the population correlation coefficient, and r represents the sample correlation coefficient. We present only the formula for the sample correlation coefficient.

4.1 Scatter Diagrams and Correlation 4.1.2 Describe the Properties of the Linear Correlation Coefficient (2 of 6) Sample Linear Correlation Coefficient
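The formula on this slide is an image that did not transcribe. Reconstructed from the textbook's standard presentation (not from the slide itself), the sample linear correlation coefficient is:

```latex
r = \frac{\displaystyle\sum_{i=1}^{n}\left(\frac{x_i - \bar{x}}{s_x}\right)\left(\frac{y_i - \bar{y}}{s_y}\right)}{n-1}
```

where x̄ and ȳ are the sample means and s_x and s_y the sample standard deviations of the two variables.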

4.1 Scatter Diagrams and Correlation 4.1.2 Describe the Properties of the Linear Correlation Coefficient (3 of 6) Properties of the Linear Correlation Coefficient
1. The linear correlation coefficient is always between −1 and 1, inclusive. That is, −1 ≤ r ≤ 1.
2. If r = +1, then a perfect positive linear relation exists between the two variables.
3. If r = −1, then a perfect negative linear relation exists between the two variables.
4. The closer r is to +1, the stronger the evidence is of a positive association between the two variables.
5. The closer r is to −1, the stronger the evidence is of a negative association between the two variables.

4.1 Scatter Diagrams and Correlation 4.1.2 Describe the Properties of the Linear Correlation Coefficient (4 of 6)
6. If r is close to 0, then little or no evidence exists of a linear relation between the two variables. So r close to 0 does not imply no relation, just no linear relation.
7. The linear correlation coefficient is a unitless measure of association. So the unit of measure for x and y plays no role in the interpretation of r.
8. The correlation coefficient is not resistant. Therefore, an observation that does not follow the overall pattern of the data could affect the value of the linear correlation coefficient.

4.1 Scatter Diagrams and Correlation 4.1.2 Describe the Properties of the Linear Correlation Coefficient (5 of 6)

4.1 Scatter Diagrams and Correlation 4.1.2 Describe the Properties of the Linear Correlation Coefficient (6 of 6)

4.1 Scatter Diagrams and Correlation 4.1.3 Compute and Interpret the Linear Correlation Coefficient (1 of 5) EXAMPLE Determining the Linear Correlation Coefficient Determine the linear correlation coefficient of the drilling data.

Depth at Which Drilling Begins, x (in feet)   Time to Drill 5 Feet, y (in minutes)
35     5.88
50     5.99
75     6.74
95     6.1
120    7.47
130    6.93
145    6.42
155    7.97
160    7.92
175    7.62
185    6.89
190    7.9
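The worked computation on the following slides did not transcribe. A pure-Python sketch of the calculation (the tool choice is an assumption; the slides likely use technology):

```python
from math import sqrt

# Drilling data from the slide
depth = [35, 50, 75, 95, 120, 130, 145, 155, 160, 175, 185, 190]
time = [5.88, 5.99, 6.74, 6.1, 7.47, 6.93, 6.42, 7.97, 7.92, 7.62, 6.89, 7.9]

n = len(depth)
mean_x = sum(depth) / n
mean_y = sum(time) / n
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(depth, time))
sxx = sum((x - mean_x) ** 2 for x in depth)
syy = sum((y - mean_y) ** 2 for y in time)
r = sxy / sqrt(sxx * syy)
print(round(r, 3))  # 0.773, the value quoted later in the slides
```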

4.1 Scatter Diagrams and Correlation 4.1.3 Compute and Interpret the Linear Correlation Coefficient (2 of 5)

4.1 Scatter Diagrams and Correlation 4.1.3 Compute and Interpret the Linear Correlation Coefficient (3 of 5)

4.1 Scatter Diagrams and Correlation 4.1.3 Compute and Interpret the Linear Correlation Coefficient (4 of 5) IN CLASS ACTIVITY Correlation Randomly select six students from the class and have them determine their at-rest pulse rates and then discuss the following:
1. When determining each at-rest pulse rate, would it be better to count beats for 30 seconds and multiply by 2 or count beats for 1 full minute? Explain. What are some other ways to find the at-rest pulse rate? Do any of these methods have an advantage?
2. What effect will physical activity have on pulse rate?
3. Do you think the at-rest pulse rate will have any effect on the pulse rate after physical activity? If so, how? If not, why not?
Have the same six students jog in place for 3 minutes and then immediately determine their pulse rates using the same technique as for the at-rest pulse rates.

4.1 Scatter Diagrams and Correlation 4.1.3 Compute and Interpret the Linear Correlation Coefficient (5 of 5) 4. Draw a scatter diagram for the pulse data using the at-rest data as the explanatory variable. 5. Comment on the relationship, if any, between the two variables. Is this consistent with your expectations? 6. Based on the graph, estimate the linear correlation coefficient for the data. Then compute the correlation coefficient and compare it to your estimate.

4.1 Scatter Diagrams and Correlation 4.1.4 Determine whether a Linear Relation Exists between Two Variables (1 of 2) Testing for a Linear Relation
Step 1: Determine the absolute value of the correlation coefficient.
Step 2: Find the critical value in Table II for the given sample size.
Step 3: If the absolute value of the correlation coefficient is greater than the critical value, we say a linear relation exists between the two variables. Otherwise, no linear relation exists.

4.1 Scatter Diagrams and Correlation 4.1.4 Determine whether a Linear Relation Exists between Two Variables (2 of 2) EXAMPLE Does a Linear Relation Exist? Determine whether a linear relation exists between time to drill five feet and depth at which drilling begins. Comment on the type of relation that appears to exist between time to drill five feet and depth at which drilling begins.

Table II Critical Values for Correlation Coefficient
n    Critical Value
3    0.997
4    0.950
5    0.878
6    0.811
7    0.754
8    0.707
9    0.666
10   0.632
11   0.602
12   0.576
13   0.553
14   0.532

The correlation between drilling depth and time to drill is 0.773. The critical value for n = 12 observations is 0.576. Since 0.773 > 0.576, there is a positive linear relation between time to drill five feet and depth at which drilling begins.

4.1 Scatter Diagrams and Correlation 4.1.5 Explain the Difference between Correlation and Causation (1 of 8) According to data obtained from the Statistical Abstract of the United States, the correlation between the percentage of the female population with a bachelor's degree and the percentage of births to unmarried mothers since 1990 is 0.940. Does this mean that a higher percentage of females with bachelor's degrees causes a higher percentage of births to unmarried mothers?

4.1 Scatter Diagrams and Correlation 4.1.5 Explain the Difference between Correlation and Causation (2 of 8) Certainly not! The correlation exists only because both percentages have been increasing since 1990. It is this relation that causes the high correlation. In general, time series data (data collected over time) may have high correlations because each variable is moving in a specific direction over time (both going up, both going down, or one increasing while the other decreases). When data are observational, we cannot claim a causal relation exists between two variables. We can only claim causality when the data are collected through a designed experiment.

4.1 Scatter Diagrams and Correlation 4.1.5 Explain the Difference between Correlation and Causation (3 of 8) Another way that two variables can be related even though there is not a causal relation is through a lurking variable. A lurking variable is related to both the explanatory and response variable. For example, ice cream sales and crime rates have a very high correlation. Does this mean that local governments should shut down all ice cream shops? No! The lurking variable is temperature. As air temperatures rise, both ice cream sales and crime rates rise.

4.1 Scatter Diagrams and Correlation 4.1.5 Explain the Difference between Correlation and Causation (4 of 8) EXAMPLE Lurking Variables in a Bone Mineral Density Study Because colas tend to replace healthier beverages and colas contain caffeine and phosphoric acid, researchers Katherine L. Tucker and associates wanted to know whether cola consumption is associated with lower bone mineral density in women. The table lists the typical number of cans of cola consumed in a week and the femoral neck bone mineral density for a sample of 15 women. The data were collected through a prospective cohort study.

Table 4
Number of Colas per Week   Bone Mineral Density (g/cm²)
0   0.893
0   0.882
1   0.891
1   0.881
2   0.888
2   0.871
3   0.868
3   0.876
4   0.873
5   0.875
5   0.871
6   0.867
7   0.862
7   0.872
8   0.865

4.1 Scatter Diagrams and Correlation 4.1.5 Explain the Difference between Correlation and Causation (5 of 8) EXAMPLE Lurking Variables in a Bone Mineral Density Study The figure on the next slide shows the scatter diagram of the data. The correlation between number of colas per week and bone mineral density is −0.806. The critical value for correlation with n = 15 from Table II in Appendix A is 0.514. Because |−0.806| = 0.806 > 0.514, we conclude a negative linear relation exists between number of colas consumed and bone mineral density. Can the authors conclude that an increase in the number of colas consumed causes a decrease in bone mineral density? Identify some lurking variables in the study.

4.1 Scatter Diagrams and Correlation 4.1.5 Explain the Difference between Correlation and Causation (6 of 8)

4.1 Scatter Diagrams and Correlation 4.1.5 Explain the Difference between Correlation and Causation (7 of 8) EXAMPLE Lurking Variables in a Bone Mineral Density Study In prospective cohort studies, data are collected on a group of subjects through questionnaires and surveys over time. Therefore, the data are observational. So the researchers cannot claim that increased cola consumption causes a decrease in bone mineral density. Some lurking variables in the study that could confound the results are:
- body mass index
- height
- smoking
- alcohol consumption
- calcium intake
- physical activity

4.1 Scatter Diagrams and Correlation 4.1.5 Explain the Difference between Correlation and Causation (8 of 8) EXAMPLE Lurking Variables in a Bone Mineral Density Study The authors were careful to say that increased cola consumption is associated with lower bone mineral density because of potential lurking variables. They never stated that increased cola consumption causes lower bone mineral density.

4.2 Least-squares Regression Learning Objectives
1. Find the least-squares regression line and use the line to make predictions
2. Interpret the slope and the y-intercept of the least-squares regression line
3. Compute the sum of squared residuals

4.2 Least-squares Regression EXAMPLE Finding an Equation that Describes Linearly Related Data (1 of 2) Using the following sample data:
x: 0    2    3    5    6    6
y: 5.8  5.7  5.2  2.8  1.9  2.2

4.2 Least-squares Regression EXAMPLE Finding an Equation that Describes Linearly Related Data (2 of 2) (b) Graph the equation on the scatter diagram.

4.2 Least-squares Regression 4.2.1 Find the Least-Squares Regression Line and Use the Line to Make Predictions (1 of 7) The difference between the observed value of y and the predicted value of y is the error, or residual. Using the line from the last example, the predicted value at x = 3 is 4.75, so:
residual = observed y − predicted y = 5.2 − 4.75 = 0.45

4.2 Least-squares Regression 4.2.1 Find the Least-Squares Regression Line and Use the Line to Make Predictions (2 of 7) Least-Squares Regression Criterion

4.2 Least-squares Regression 4.2.1 Find the Least-Squares Regression Line and Use the Line to Make Predictions (3 of 7) The Least-Squares Regression Line The equation of the least-squares regression line is given by

4.2 Least-squares Regression 4.2.1 Find the Least-Squares Regression Line and Use the Line to Make Predictions (4 of 7) The Least-Squares Regression Line
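The formulas on these two slides are images that did not transcribe. Reconstructed from standard least-squares notation (an assumption, not the slide content): the least-squares line minimizes the sum of squared residuals, and its slope and intercept are

```latex
\min_{b_0,\,b_1}\ \sum_{i=1}^{n}\bigl(y_i - \hat{y}_i\bigr)^2,
\qquad
\hat{y} = b_1 x + b_0,
\qquad
b_1 = r\,\frac{s_y}{s_x},
\qquad
b_0 = \bar{y} - b_1\bar{x}.
```

Here r is the sample correlation coefficient, x̄ and ȳ the sample means, and s_x and s_y the sample standard deviations.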

4.2 Least-squares Regression 4.2.1 Find the Least-Squares Regression Line and Use the Line to Make Predictions (5 of 7) EXAMPLE Finding the Least-Squares Regression Line Using the drilling data:
(a) Find the least-squares regression line.
(b) Predict the drilling time if drilling starts at 130 feet.
(c) Is the observed drilling time at 130 feet above, or below, average?
(d) Draw the least-squares regression line on the scatter diagram of the data.

Depth at Which Drilling Begins, x (in feet)   Time to Drill 5 Feet, y (in minutes)
35     5.88
50     5.99
75     6.74
95     6.1
120    7.47
130    6.93
145    6.42
155    7.97
160    7.92
175    7.62
185    6.89
190    7.9

4.2 Least-squares Regression 4.2.1 Find the Least-Squares Regression Line and Use the Line to Make Predictions (6 of 7) (c) The observed drilling time is 6.93 minutes. The predicted drilling time is 7.035 minutes. The drilling time of 6.93 minutes is below average.

4.2 Least-squares Regression 4.2.1 Find the Least-Squares Regression Line and Use the Line to Make Predictions (7 of 7)

4.2 Least-squares Regression 4.2.2 Interpret the Slope and the y-intercept of the Least-Squares Regression Line (1 of 3) Interpretation of Slope: The slope of the regression line is 0.0116. For each additional foot of depth we start drilling, the time to drill five feet increases by 0.0116 minutes, on average.

4.2 Least-squares Regression 4.2.2 Interpret the Slope and the y-intercept of the Least-Squares Regression Line (2 of 3) Interpretation of the y-intercept: The y-intercept of the regression line is 5.5273. To interpret the y-intercept, we must first ask two questions:
1. Is 0 a reasonable value for the explanatory variable?
2. Do any observations near x = 0 exist in the data set?
A value of 0 is reasonable for the drilling data (this indicates that drilling begins at the surface of Earth). The smallest observation in the data set is x = 35 feet, which is reasonably close to 0. So, interpretation of the y-intercept is reasonable. The time to drill five feet when we begin drilling at the surface of Earth is 5.5273 minutes.
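The slope (0.0116) and y-intercept (5.5273) quoted above can be reproduced from the raw drilling data. A pure-Python sketch:

```python
# Drilling data from the slide
depth = [35, 50, 75, 95, 120, 130, 145, 155, 160, 175, 185, 190]
time = [5.88, 5.99, 6.74, 6.1, 7.47, 6.93, 6.42, 7.97, 7.92, 7.62, 6.89, 7.9]

n = len(depth)
mean_x = sum(depth) / n
mean_y = sum(time) / n
b1 = sum((x - mean_x) * (y - mean_y) for x, y in zip(depth, time)) / sum(
    (x - mean_x) ** 2 for x in depth)   # slope
b0 = mean_y - b1 * mean_x               # y-intercept
print(round(b1, 4), round(b0, 4))       # 0.0116 5.5273
# Prediction at 130 feet, using the rounded coefficients as the slides do:
print(round(0.0116 * 130 + 5.5273, 3))  # 7.035
```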

4.2 Least-squares Regression 4.2.2 Interpret the Slope and the y-intercept of the Least-Squares Regression Line (3 of 3) If the least-squares regression line is used to make predictions based on values of the explanatory variable that are much larger or much smaller than the observed values, we say the researcher is working outside the scope of the model. Never use a least-squares regression line to make predictions outside the scope of the model because we can't be sure the linear relation continues to exist.
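The scope-of-the-model warning can be enforced mechanically. A sketch (the guard function is illustrative, not from the text):

```python
# Range of depths observed in the drilling data
OBSERVED_MIN, OBSERVED_MAX = 35, 190

def predict_drill_time(depth_ft):
    """Predict time to drill 5 feet, refusing values outside the scope of the model."""
    if not (OBSERVED_MIN <= depth_ft <= OBSERVED_MAX):
        raise ValueError(f"depth {depth_ft} ft is outside the scope of the model")
    return 5.5273 + 0.0116 * depth_ft   # fitted least-squares line from the slides

print(round(predict_drill_time(100), 3))  # 6.687
# predict_drill_time(300) would raise: 300 ft is an extrapolation
```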

4.2 Least-squares Regression 4.2.3 Compute the Sum of Squared Residuals To illustrate the fact that the sum of squared residuals for a least-squares regression line is less than the sum of squared residuals for any other line, use the regression by eye applet.

4.3 Diagnostics on the Least-squares Regression Line Learning Objectives
1. Compute and interpret the coefficient of determination
2. Perform residual analysis on a regression model
3. Identify influential observations

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (1 of 18) The coefficient of determination, R2, measures the proportion of total variation in the response variable that is explained by the least-squares regression line. The coefficient of determination is a number between 0 and 1, inclusive. That is, 0 ≤ R2 ≤ 1. If R2 = 0, the line has no explanatory value. If R2 = 1, the line explains 100% of the variation in the response variable.

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (2 of 18) The data below are based on a study for drilling rock. The researchers wanted to determine whether the time it takes to dry drill a distance of 5 feet in rock increases with the depth at which the drilling begins. So, depth at which drilling begins is the predictor variable, x, and time (in minutes) to drill five feet is the response variable, y. Source: Penner, R., and Watts, D.G. Mining Information. The American Statistician, Vol. 45, No. 1, Feb. 1991, p. 6.

Depth at Which Drilling Begins, x (in feet)   Time to Drill 5 Feet, y (in minutes)
35     5.88
50     5.99
75     6.74
95     6.1
120    7.47
130    6.93
145    6.42
155    7.97
160    7.92
175    7.62
185    6.89
190    7.9

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (3 of 18)

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (4 of 18)

Sample Statistics
        Mean    Standard Deviation
Depth   126.2   52.2
Time    6.99    0.781

Correlation Between Depth and Time: 0.773

Regression Analysis
The regression equation is Time = 5.53 + 0.0116 Depth

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (5 of 18) Suppose we were asked to predict the time to drill an additional 5 feet, but we did not know the current depth of the drill. What would be our best guess?

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (6 of 18) Suppose we were asked to predict the time to drill an additional 5 feet, but we did not know the current depth of the drill. What would be our best guess? ANSWER: The mean time to drill an additional 5 feet: 6.99 minutes

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (7 of 18) Now suppose that we are asked to predict the time to drill an additional 5 feet if the current depth of the drill is 160 feet. What would be our best guess? ANSWER: Use the regression equation: Time = 5.53 + 0.0116(160) = 7.386 minutes.

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (8 of 18)

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (9 of 18) The difference between the observed value of the response variable and the mean value of the response variable is called the total deviation and is equal to y − ȳ.

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (10 of 18) The difference between the predicted value of the response variable and the mean value of the response variable is called the explained deviation and is equal to ŷ − ȳ.

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (11 of 18) The difference between the observed value of the response variable and the predicted value of the response variable is called the unexplained deviation and is equal to y − ŷ.

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (12 of 18) Total Deviation = Unexplained Deviation + Explained Deviation

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (13 of 18) Total Deviation = Unexplained Deviation + Explained Deviation

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (14 of 18) Total Variation = Unexplained Variation + Explained Variation
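The deviation formulas on the preceding slides are images; in standard notation (reconstructed, not transcribed from the slides) the identities are:

```latex
\underbrace{y - \bar{y}}_{\text{total}}
= \underbrace{y - \hat{y}}_{\text{unexplained}}
+ \underbrace{\hat{y} - \bar{y}}_{\text{explained}},
\qquad
\sum_{i}(y_i - \bar{y})^2 = \sum_{i}(y_i - \hat{y}_i)^2 + \sum_{i}(\hat{y}_i - \bar{y})^2,
\qquad
R^2 = \frac{\text{explained variation}}{\text{total variation}}.
```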

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (15 of 18) To determine R2 for the linear regression model, simply square the value of the linear correlation coefficient.

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (16 of 18) EXAMPLE Determining the Coefficient of Determination Find and interpret the coefficient of determination for the drilling data. Because the linear correlation coefficient, r, is 0.773, we have that R2 = (0.773)^2 = 0.5975 = 59.75%. So, 59.75% of the variability in drilling time is explained by the least-squares regression line.
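R2 can be checked both ways on the drilling data: by squaring r, and by computing explained variation over total variation. A sketch:

```python
from math import sqrt

# Drilling data from the slide
depth = [35, 50, 75, 95, 120, 130, 145, 155, 160, 175, 185, 190]
time = [5.88, 5.99, 6.74, 6.1, 7.47, 6.93, 6.42, 7.97, 7.92, 7.62, 6.89, 7.9]

n = len(depth)
mean_x = sum(depth) / n
mean_y = sum(time) / n
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(depth, time))
sxx = sum((x - mean_x) ** 2 for x in depth)
syy = sum((y - mean_y) ** 2 for y in time)   # total variation
r = sxy / sqrt(sxx * syy)

b1 = sxy / sxx
b0 = mean_y - b1 * mean_x
sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(depth, time))  # unexplained variation
r2_from_variation = 1 - sse / syy

assert abs(r ** 2 - r2_from_variation) < 1e-9  # the two routes agree
print(round(0.773 ** 2, 4))  # 0.5975, matching the slide's rounded calculation
```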

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (17 of 18)

DATA SET A      DATA SET B      DATA SET C
X     Y         X     Y         X     Y
3.6   8.9       3.1   8.9       2.8   8.9
8.3   15.0      9.4   15.0      8.1   15.0
0.5   4.8       1.2   4.8       3.0   4.8
1.4   6.0       1.0   6.0       8.3   6.0
8.2   14.9      9.0   14.9      8.2   14.9
5.9   11.9      5.0   11.9      1.4   11.9
4.3   9.8       3.4   9.8       1.0   9.8
8.3   15.0      7.4   15.0      7.9   15.0
0.3   4.7       0.1   4.7       5.9   4.7
6.8   13.0      7.5   13.0      5.0   13.0

Draw a scatter diagram for each of these data sets. For each data set, the variance of y is 17.49.

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Compute and Interpret the Coefficient of Determination (18 of 18)
Data Set A: 99.99% of the variability in y is explained by the least-squares regression line
Data Set B: 94.7% of the variability in y is explained by the least-squares regression line
Data Set C: 9.4% of the variability in y is explained by the least-squares regression line

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Perform Residual Analysis on a Regression Model (1 of 14) Residuals play an important role in determining the adequacy of the linear model. In fact, residuals can be used for the following purposes:
- To determine whether a linear model is appropriate to describe the relation between the predictor and response variables.
- To determine whether the variance of the residuals is constant.
- To check for outliers.

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Perform Residual Analysis on a Regression Model (2 of 14) If a plot of the residuals against the predictor variable shows a discernible pattern, such as a curve, then the response and predictor variable may not be linearly related.

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Perform Residual Analysis on a Regression Model (3 of 14)

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Perform Residual Analysis on a Regression Model (4 of 14) EXAMPLE Is a Linear Model Appropriate? A chemist has a 1000-gram sample of a radioactive material. She records the amount of radioactive material remaining in the sample every day for a week and obtains the following data.

Day   Weight (in grams)
0     1000.0
1     897.1
2     802.5
3     719.8
4     651.1
5     583.4
6     521.7
7     468.3

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Perform Residual Analysis on a Regression Model (5 of 14) Linear correlation coefficient: 0.994

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Perform Residual Analysis on a Regression Model (6 of 14)

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Perform Residual Analysis on a Regression Model (7 of 14) Linear model not appropriate
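The residual plot on the slide is an image, but the curved pattern it shows can be verified numerically: a line fitted to the decay data leaves residuals that are positive at both ends and negative in the middle (a U shape), the signature of a nonlinear relation. A sketch:

```python
# Radioactive decay data from the slide
day = list(range(8))
weight = [1000.0, 897.1, 802.5, 719.8, 651.1, 583.4, 521.7, 468.3]

n = len(day)
mx = sum(day) / n
my = sum(weight) / n
b1 = sum((x - mx) * (y - my) for x, y in zip(day, weight)) / sum(
    (x - mx) ** 2 for x in day)
b0 = my - b1 * mx
residuals = [y - (b0 + b1 * x) for x, y in zip(day, weight)]
signs = ["+" if e > 0 else "-" for e in residuals]
print("".join(signs))  # ++----++  : a U-shaped pattern, so a linear model is not appropriate
```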

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Perform Residual Analysis on a Regression Model (8 of 14) If a plot of the residuals against the explanatory variable shows the spread of the residuals increasing or decreasing as the explanatory variable increases, then a strict requirement of the linear model is violated. This requirement is called constant error variance. The statistical term for constant error variance is homoscedasticity.

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Perform Residual Analysis on a Regression Model (9 of 14)

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Perform Residual Analysis on a Regression Model (10 of 14) A plot of residuals against the explanatory variable may also reveal outliers. These values will be easy to identify because the residual will lie far from the rest of the plot.

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Perform Residual Analysis on a Regression Model (11 of 14)

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Perform Residual Analysis on a Regression Model (12 of 14) EXAMPLE Residual Analysis Draw a residual plot of the drilling time data. Comment on the appropriateness of the linear least-squares regression model.

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Perform Residual Analysis on a Regression Model (13 of 14)

4.3 Diagnostics on the Least-squares Regression Line 4.3.1 Perform Residual Analysis on a Regression Model (14 of 14) Boxplot of Residuals for the Drilling Data

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Identify Influential Observations (1 of 8) An influential observation is an observation that significantly affects the least-squares regression line's slope and/or y-intercept, or the value of the linear correlation coefficient.

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Identify Influential Observations (2 of 8) Influential observations typically exist when the point is an outlier relative to the values of the explanatory variable. So, Case 3 is likely influential.

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Identify Influential Observations (3 of 8) Influence is affected by two factors: (1) the relative vertical position of the observation (its residual) and (2) the relative horizontal position of the observation (its leverage).
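The interplay of residual and leverage can be illustrated with a small hypothetical data set (not the drilling data): four points lying exactly on y = x plus one point with an extreme x value and a poorly fitting y value. Removing that single point changes the slope dramatically, which is exactly what makes it influential.

```python
import numpy as np

# Hypothetical data: four points on y = x, plus one high-leverage,
# large-residual point at (10, 2)
x = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
y = np.array([1.0, 2.0, 3.0, 4.0, 2.0])

slope_all, _ = np.polyfit(x, y, 1)            # about 0.04
slope_without, _ = np.polyfit(x[:4], y[:4], 1)  # exactly 1.0

print(round(slope_all, 2), round(slope_without, 2))  # 0.04 1.0
```

A point with high leverage but a small residual (one that falls near the trend of the other data) would barely move the line; it takes both factors together to produce a swing like this.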

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Identify Influential Observations (4 of 8) EXAMPLE Influential Observations Suppose an additional data point is added to the drilling data. At a depth of 300 feet, it took 12.49 minutes to drill 5 feet. Is this point influential?

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Identify Influential Observations (5 of 8)

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Identify Influential Observations (6 of 8)

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Identify Influential Observations (7 of 8)

4.3 Diagnostics on the Least-squares Regression Line 4.3.2 Identify Influential Observations (8 of 8) As with outliers, influential observations should be removed only if there is justification for doing so. When an influential observation occurs in a data set and its removal is not warranted, there are two courses of action: (1) collect more data so that additional points near the influential observation are obtained, or (2) use techniques that reduce the influence of the influential observation (such as a transformation or a different method of estimation, e.g., minimizing absolute deviations). These techniques are beyond the scope of this text.

4.4 Contingency Tables and Association Learning Objectives 1. Compute the marginal distribution of a variable 2. Use the conditional distribution to identify association among categorical data 3. Explain Simpson's Paradox

4.4 Contingency Tables and Association Example: Data Information A professor at a community college in New Mexico conducted a study to assess the effectiveness of delivering an introductory statistics course via three methods: traditional lecture, online delivery (no classroom instruction), and hybrid instruction (an online course with weekly meetings). The grades students received under each delivery method were tallied.

             A    B    C    D    F
Traditional  36   52   57   46   46
Online       39   55   68   38   54
Hybrid       24   66   90   41   31

4.4 Contingency Tables and Association 4.4.1 Compute the Marginal Distribution of a Variable (1 of 3) A marginal distribution of a variable is a frequency or relative frequency distribution of either the row or column variable in the contingency table.

4.4 Contingency Tables and Association 4.4.1 Compute the Marginal Distribution of a Variable (2 of 3) EXAMPLE Determining Frequency Marginal Distributions Using the data from the delivery-method study, find the frequency marginal distributions for course grade and delivery method.

             A    B    C    D    F    Total
Traditional  36   52   57   46   46   237
Online       39   55   68   38   54   254
Hybrid       24   66   90   41   31   252
Total        99   173  215  125  131  743

4.4 Contingency Tables and Association 4.4.1 Compute the Marginal Distribution of a Variable (3 of 3) EXAMPLE Determining Relative Frequency Marginal Distributions Determine the relative frequency marginal distribution for course grade and for delivery method.

             A      B      C      D      F      Total
Traditional  36     52     57     46     46     0.319
Online       39     55     68     38     54     0.342
Hybrid       24     66     90     41     31     0.339
Total        0.133  0.233  0.289  0.168  0.176  1.000
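These marginal distributions are just row and column sums of the contingency table divided by the grand total. A short NumPy sketch (not part of the text) reproduces the numbers above:

```python
import numpy as np

# Grade columns: A, B, C, D, F; rows: Traditional, Online, Hybrid
counts = np.array([[36, 52, 57, 46, 46],
                   [39, 55, 68, 38, 54],
                   [24, 66, 90, 41, 31]])

grade_marginal = counts.sum(axis=0)    # [99, 173, 215, 125, 131]
method_marginal = counts.sum(axis=1)   # [237, 254, 252]
total = counts.sum()                   # 743

print((grade_marginal / total).round(3))   # 0.133 0.233 0.289 0.168 0.176
print((method_marginal / total).round(3))  # 0.319 0.342 0.339
```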

4.4 Contingency Tables and Association 4.4.2 Use the Conditional Distribution to Identify Association among Categorical Data (1 of 4) A conditional distribution lists the relative frequency of each category of the response variable, given a specific value of the explanatory variable in the contingency table.

4.4 Contingency Tables and Association 4.4.2 Use the Conditional Distribution to Identify Association among Categorical Data (2 of 4) EXAMPLE Determining a Conditional Distribution Construct a conditional distribution of course grade by method of delivery. Comment on any type of association that may exist between course grade and delivery method.

             A      B      C      D      F
Traditional  0.152  0.219  0.241  0.194  0.194
Online       0.154  0.217  0.268  0.150  0.213
Hybrid       0.095  0.262  0.357  0.163  0.123

It appears that students in the hybrid course are more likely to pass (A, B, or C) than students in either of the other two delivery methods.
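The conditional distribution is obtained by dividing each row of the contingency table by its row total, giving the distribution of grades within each delivery method. A sketch of the computation (not from the text):

```python
import numpy as np

counts = np.array([[36, 52, 57, 46, 46],    # Traditional
                   [39, 55, 68, 38, 54],    # Online
                   [24, 66, 90, 41, 31]])   # Hybrid

# Divide each row by its total: relative frequency of each grade,
# given the delivery method. Each row now sums to 1.
conditional = counts / counts.sum(axis=1, keepdims=True)
print(conditional.round(3))
# Hybrid row: 0.095 0.262 0.357 0.163 0.123
```

Summing the A, B, and C entries of each row confirms the observation in the slide: the hybrid row places the most mass on passing grades.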

4.4 Contingency Tables and Association 4.4.2 Use the Conditional Distribution to Identify Association among Categorical Data (3 of 4) EXAMPLE Drawing a Bar Graph of a Conditional Distribution Using the results of the previous example, draw a bar graph that represents the conditional distribution of method of delivery by grade earned.

4.4 Contingency Tables and Association 4.4.2 Use the Conditional Distribution to Identify Association among Categorical Data (4 of 4) The following contingency table shows the survival status and demographics of passengers on the ill-fated Titanic.

          Men   Women  Boys  Girls
Survived  334   318    29    27
Died      1360  104    35    18

Draw a conditional bar graph of survival status by demographic characteristic.
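Before drawing the bar graph, it helps to compute the conditional survival rate for each demographic group (each column divided by its column total). A sketch of that computation (not part of the text):

```python
import numpy as np

# Columns: Men, Women, Boys, Girls
survived = np.array([334, 318, 29, 27])
died = np.array([1360, 104, 35, 18])

# Conditional distribution of survival status, given demographic group
rate = survived / (survived + died)
print(rate.round(3))  # 0.197 0.754 0.453 0.6
```

The rates make the association plain: women were the most likely to survive, men the least, with children in between.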

4.4 Contingency Tables and Association 4.4.3 Explain Simpson's Paradox (1 of 6) EXAMPLE Illustrating Simpson's Paradox Insulin-dependent (or Type 1) diabetes is a disease that results in the permanent destruction of the insulin-producing beta cells of the pancreas. Type 1 diabetes is lethal unless treatment with insulin injections replaces the missing hormone. Individuals with insulin-independent (or Type 2) diabetes can produce insulin internally. The data shown in the table below represent the survival status of 902 patients with diabetes by type over a 5-year period.

          Type 1  Type 2  Total
Survived  253     326     579
Died      105     218     323
Total     358     544     902

4.4 Contingency Tables and Association 4.4.3 Explain Simpson's Paradox (2 of 6) EXAMPLE Illustrating Simpson's Paradox

          Type 1  Type 2  Total
Survived  253     326     579
Died      105     218     323
Total     358     544     902

Overall, 253/358 ≈ 70.7% of the Type 1 patients survived, compared with 326/544 ≈ 59.9% of the Type 2 patients, so Type 1 patients appear to have the higher survival rate.

4.4 Contingency Tables and Association 4.4.3 Explain Simpson's Paradox (3 of 6) However, Type 2 diabetes usually develops after the age of 40. If we account for the variable age and divide our patients into two groups (those 40 or younger and those over 40), we obtain the data in the table below.

          Type 1          Type 2          Total
          ≤ 40    > 40    ≤ 40    > 40
Survived  129     124     15      311     579
Died      1       104     0       218     323
Total     130     228     15      529     902

4.4 Contingency Tables and Association 4.4.3 Explain Simpson's Paradox (4 of 6)

          Type 1          Type 2          Total
          ≤ 40    > 40    ≤ 40    > 40
Survived  129     124     15      311     579
Died      1       104     0       218     323
Total     130     228     15      529     902

Among patients 40 or younger, 129/130 ≈ 99.2% of the Type 1 patients survived, while 15/15 = 100% of the Type 2 patients survived.

4.4 Contingency Tables and Association 4.4.3 Explain Simpson's Paradox (5 of 6)

          Type 1          Type 2          Total
          ≤ 40    > 40    ≤ 40    > 40
Survived  129     124     15      311     579
Died      1       104     0       218     323
Total     130     228     15      529     902

Among patients over 40, 124/228 ≈ 54.4% of the Type 1 patients survived, while 311/529 ≈ 58.8% of the Type 2 patients survived. Within each age group, the Type 2 patients have the higher survival rate, reversing the overall comparison.

4.4 Contingency Tables and Association 4.4.3 Explain Simpson's Paradox (6 of 6) Simpson's Paradox describes a situation in which an association between two variables reverses or disappears when a third variable is introduced into the analysis.
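The reversal in the diabetes example can be verified directly from the counts in the tables above. The sketch below (not part of the text) computes survival rates overall and within each age group:

```python
# Counts from the diabetes example, split by type and age group
survived = {("T1", "young"): 129, ("T1", "old"): 124,
            ("T2", "young"): 15,  ("T2", "old"): 311}
died     = {("T1", "young"): 1,   ("T1", "old"): 104,
            ("T2", "young"): 0,   ("T2", "old"): 218}

def rate(diabetes_type, ages):
    """Survival rate for a diabetes type, aggregated over the given age groups."""
    s = sum(survived[(diabetes_type, a)] for a in ages)
    d = sum(died[(diabetes_type, a)] for a in ages)
    return s / (s + d)

both = ("young", "old")
print(rate("T1", both) > rate("T2", both))              # True: 70.7% vs 59.9%
print(rate("T1", ("young",)) < rate("T2", ("young",)))  # True: 99.2% vs 100%
print(rate("T1", ("old",)) < rate("T2", ("old",)))      # True: 54.4% vs 58.8%
```

Aggregated, Type 1 looks better; within each age group, Type 2 is better. Age is the lurking variable driving the paradox: most Type 2 patients are over 40, the group with the lower survival rate.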