Simple Linear Regression: Correlation, Coefficients, and Assumptions (Statistics Lecture Notes)

An overview of simple linear regression, focusing on the concepts of scatter diagram, correlation coefficient, and the calculation of intercept and slope. It also covers the assumptions of the model, including homoscedasticity and independence of errors. Students will learn how to test hypotheses on the slope of the regression line and calculate confidence intervals.


Chapter 14. Correlation and Simple Linear Regression

  • Scatter Diagram: A two-dimensional graph on which each observation in the sample of bivariate data is represented by a point (dot).
  • Negative (inverse) Relationship: A relationship between two variables, X and Y, in which large values of X are associated with small values of Y.
  • Positive (direct) Relationship: A relationship between two variables, X and Y, in which large values of X are associated with large values of Y.
  • Coefficient of Correlation (r): A statistical measure of the strength of the linear relationship between two variables, calculated from a sample of bivariate data. It is an estimate of the population coefficient of correlation.

$$ r = \frac{\sum (x-\bar{x})(y-\bar{y})}{\sqrt{\sum (x-\bar{x})^2 \sum (y-\bar{y})^2}} = \frac{SCP_{XY}}{\sqrt{SS_X \, SS_Y}}, \qquad -1 \le r \le 1. $$

The larger $|r|$ is, the stronger the linear relationship. $r \approx 0$ indicates that there is no linear relationship between X and Y; $r = 1$ or $-1$ implies that a perfect linear pattern exists between the two variables; $r > 0$ indicates a positive relationship and $r < 0$ a negative one.
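As a quick numerical check of this formula, here is a minimal Python sketch; the data are hypothetical, chosen only to keep the arithmetic easy to follow:

```python
import numpy as np

# Hypothetical bivariate sample, chosen only to keep the arithmetic simple.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 5.0])

dx, dy = x - x.mean(), y - y.mean()   # deviations from the sample means
SSX = np.sum(dx**2)                   # sum of squares of X
SSY = np.sum(dy**2)                   # sum of squares of Y
SCP = np.sum(dx * dy)                 # sum of cross products

r = SCP / np.sqrt(SSX * SSY)
print(r)                              # 0.8 for this sample
print(np.corrcoef(x, y)[0, 1])        # agrees with NumPy's built-in correlation
```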

  • Sum of Squares: $SS_X = \sum (x-\bar{x})^2$, $SS_Y = \sum (y-\bar{y})^2$
  • Sum of Cross Product: $SCP_{XY} = \sum (x-\bar{x})(y-\bar{y})$
  • Standard Deviation of X: $S_X = \sqrt{SS_X / (n-1)}$
  • Standard Deviation of Y: $S_Y = \sqrt{SS_Y / (n-1)}$
  • Covariance, cov(X,Y): A measure of the joint variation of the two variables. $\mathrm{cov}(X,Y) = \frac{1}{n-1}\sum (x-\bar{x})(y-\bar{y}) = \frac{1}{n-1}\, SCP_{XY}$.

  • Note: cov(X′, Y′) = correlation between X and Y = r, where $X' = \frac{X-\bar{X}}{S_X}$ and $Y' = \frac{Y-\bar{Y}}{S_Y}$ (standardized variables).
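Continuing with the same hypothetical sample, this sketch verifies the standardization identity above, along with the sample-correlation formula restated in the next bullet:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # same hypothetical sample as above
y = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
n = len(x)

SCP = np.sum((x - x.mean()) * (y - y.mean()))
cov_xy = SCP / (n - 1)                     # cov(X, Y) = SCP_XY / (n - 1)
sx, sy = x.std(ddof=1), y.std(ddof=1)      # sample standard deviations S_X, S_Y

x_std = (x - x.mean()) / sx                # standardized variable X'
y_std = (y - y.mean()) / sy                # standardized variable Y'
cov_std = np.sum(x_std * y_std) / (n - 1)  # cov(X', Y')

print(cov_std)                             # 0.8: cov of standardized variables is r
print(cov_xy / (sx * sy))                  # 0.8: same value via r = cov(X,Y)/(S_X S_Y)
```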

  • Sample Correlation, r: $r = \frac{\mathrm{cov}(X,Y)}{S_X S_Y}$.
  • Least Squares Line: The best line through a sample of bivariate data; it minimizes the sum of the squares of the vertical distances from each point to the line.
  • Sum of Squares of Error: The residual sum of squares is denoted by
$$ SSE = \sum d^2 = \sum (y-\hat{y})^2 = SS_Y - \frac{(SCP_{XY})^2}{SS_X}, $$
where $\hat{y} = b_0 + b_1 x$, with $b_1 = \frac{SCP_{XY}}{SS_X}$ and $b_0 = \bar{y} - b_1 \bar{x}$. (Note: $\bar{y} = \frac{\sum y}{n}$, $\bar{x} = \frac{\sum x}{n}$.)
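The following sketch fits the least squares line to the same hypothetical sample and confirms the shortcut formula for SSE:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # same hypothetical sample
y = np.array([2.0, 1.0, 4.0, 3.0, 5.0])

SSX = np.sum((x - x.mean())**2)
SSY = np.sum((y - y.mean())**2)
SCP = np.sum((x - x.mean()) * (y - y.mean()))

b1 = SCP / SSX                             # slope: b_1 = SCP_XY / SS_X
b0 = y.mean() - b1 * x.mean()              # intercept: b_0 = ybar - b_1 * xbar
y_hat = b0 + b1 * x                        # fitted values

SSE = np.sum((y - y_hat)**2)               # residual sum of squares
print(b0, b1)                              # 0.6 and 0.8 for this sample
print(SSE, SSY - SCP**2 / SSX)             # both 3.6: the shortcut formula agrees
```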

  • Dependent Variable: In simple linear regression, the variable (Y) that is predicted or explained using a single independent variable (X).
  • Independent Variable: In simple linear regression, the variable (X) used to predict or explain values of the dependent variable (Y).
  • Intercept ($b_0$): The distance from the origin to the point where the least squares line crosses the vertical axis. It is calculated from the sample of bivariate data and estimates the population intercept, $\beta_0$.
  • Slope ($b_1$): A value specifying the steepness (slant) of the least squares line. It is calculated from the sample of bivariate data and estimates the population slope, $\beta_1$.
  • Residual: For each observation in the sample, the actual value of the dependent variable minus the estimated value.
  • Standardized Residual: A value that can be used to identify sample observations that have an unusually large or small value of the dependent variable, Y.
  • Regression Analysis: A method of studying the relationship between two or more variables. In “simple linear regression”, we use only one predictor variable, X, to describe the behavior of the dependent variable, Y.
  • Statistical Model (when a straight-line predictor is selected): $Y = \beta_0 + \beta_1 X + e$ (see the simulation sketch after this list).
  • Deterministic Portion of the Model: In simple linear regression, the assumed line, $\beta_0 + \beta_1 x$, about which all points will fall.
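To illustrate the statistical model, here is a minimal simulation sketch; the parameter values and the normal error distribution are assumptions made purely for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed population parameters and error distribution, purely for illustration.
beta0, beta1, sigma = 2.0, 0.5, 1.0
x = np.linspace(0.0, 10.0, 200)
e = rng.normal(0.0, sigma, size=x.size)    # random error term
y = beta0 + beta1 * x + e                  # Y = beta_0 + beta_1 * X + e

# The least squares estimates should land near the true parameters.
SCP = np.sum((x - x.mean()) * (y - y.mean()))
SSX = np.sum((x - x.mean())**2)
b1 = SCP / SSX
b0 = y.mean() - b1 * x.mean()
print(b0, b1)                              # close to 2.0 and 0.5
```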

**Danger of Assuming Causality**: High statistical correlation does not imply causality. Even if the correlation between X and Y is extremely high, a unit increase in X doesn't necessarily cause an increase in Y.

  • Coefficient of Determination ($r^2$): The percentage of total variation in the sample Y values (measured by $SS_Y$) that has been explained using the simple linear regression model. $r^2 = 1 - \frac{SSE}{SS_Y}$, where $\frac{SSE}{SS_Y}$ is the percentage of unexplained variation.
  • Total Variation: With the relation SS(total) = SS(factor) + SS(error) and
$$ \sum (y-\bar{y})^2 = \sum (\hat{y}-\bar{y})^2 + \sum (y-\hat{y})^2, $$
we can derive $SS_Y = SSR + SSE$. Then, since $SSE = SS_Y - \frac{(SCP_{XY})^2}{SS_X}$, we get the sum of squares of regression, $SSR = \frac{(SCP_{XY})^2}{SS_X}$.
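A short numerical check of this decomposition, again on the hypothetical sample used earlier:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # same hypothetical sample
y = np.array([2.0, 1.0, 4.0, 3.0, 5.0])

SSX = np.sum((x - x.mean())**2)
SSY = np.sum((y - y.mean())**2)
SCP = np.sum((x - x.mean()) * (y - y.mean()))
b1 = SCP / SSX
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

SSR = np.sum((y_hat - y.mean())**2)        # explained (regression) variation
SSE = np.sum((y - y_hat)**2)               # unexplained (error) variation
print(SSR + SSE, SSY)                      # both 10.0: SS_Y = SSR + SSE
print(1 - SSE / SSY, SCP**2 / (SSX * SSY)) # r^2 = 0.64 computed both ways
```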

  • Autocorrelation: A term used to describe residuals in a regression analysis that are not independent, whereby neighboring residuals have roughly the same value (positive autocorrelation) or neighboring residuals are generally very unequal in size (negative autocorrelation).
  • Influential Observation: An observation that has a large impact on the calculation of the least squares line. If such an observation were removed from the sample, there would be a dramatic shift in the least squares regression line. These observations can be detected using Cook’s distance measure.
  • Cook's Distance Measure: A statistic that is calculated for each sample observation in order to detect influential observations.
  • Leverage: A statistic that is computed to identify observations in the sample that have unusual values of the independent variable, X.
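Leverage and Cook's distance have standard closed forms in simple linear regression (leverage $h_i = \frac{1}{n} + \frac{(x_i-\bar{x})^2}{SS_X}$; Cook's distance built from the standardized residuals and leverage). The sketch below computes them directly; the data, with one deliberately extreme X value, are hypothetical:

```python
import numpy as np

# Hypothetical sample; the last point has a deliberately unusual X value.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 10.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 5.0, 2.0])
n, p = len(x), 2                           # p = number of fitted parameters

SSX = np.sum((x - x.mean())**2)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / SSX
b0 = y.mean() - b1 * x.mean()
resid = y - (b0 + b1 * x)                  # raw residuals

h = 1.0 / n + (x - x.mean())**2 / SSX      # leverage in simple linear regression
s2 = np.sum(resid**2) / (n - p)            # estimated error variance
r_std = resid / np.sqrt(s2 * (1.0 - h))    # internally standardized residuals
cooks_d = r_std**2 * h / (p * (1.0 - h))   # Cook's distance for each observation

print(np.round(h, 3))                      # the x = 10 point has the largest leverage
print(np.round(cooks_d, 3))                # ...and clearly stands out by Cook's distance
```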