Will Monroe
CS 109
Lecture Notes #15
July 28, 2017
Covariance and Correlation
Based on a chapter by Chris Piech
Covariance and Correlation
Consider the two plots shown below. In both images I have plotted one thousand samples drawn
from an underlying joint distribution. Clearly the two distributions are different. However, the mean
and variance are the same in both the x and the y dimension. What is different?
Covariance is a quantitative measure of the extent to which the deviation of one variable from its
mean matches the deviation of the other from its mean. It is a mathematical relationship that is
defined as:
Cov(X, Y ) = E[(X − E[X])(Y − E[Y ])]
The meaning of this mathematical definition may not be obvious at first glance. If X and Y are both
above their respective means, or if X and Y are both below their respective means, the expression
inside the outer expectation will be positive. If one is above its mean and the other is below, the
term is negative. If this expression is positive on average, the two random variables will have a
positive correlation. We can rewrite the above equation to get an equivalent equation:
Cov(X, Y ) = E[XY ] − E[Y ]E[X]
Using this equation (and the fact that the expectation of the product of two independent random
variables is equal to the product of the expectations), it is easy to see that if two random variables
are independent, their covariance is 0. The reverse is not true in general: if the covariance of two
random variables is 0, they can still be dependent!
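Both forms of the definition can be checked numerically. The sketch below (not part of the original notes) estimates covariance from paired samples by using sample means in place of expectations, and also shows a dependent pair with zero covariance:

```python
# Illustrative sketch: estimating Cov(X, Y) from paired samples using both
# forms of the definition (sample means stand in for expectations).

def cov(xs, ys):
    """Cov(X, Y) = E[(X - E[X])(Y - E[Y])], estimated from samples."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def cov_alt(xs, ys):
    """Equivalent form: Cov(X, Y) = E[XY] - E[Y]E[X]."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    mxy = sum(x * y for x, y in zip(xs, ys)) / len(xs)
    return mxy - mx * my

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]                 # y grows with x: positive covariance
print(cov(xs, ys))                # 2.5
print(cov_alt(xs, ys))            # 2.5, same value via the other form

# Zero covariance does not imply independence: Y = X^2 with X symmetric
xs2 = [-2, -1, 1, 2]
ys2 = [x * x for x in xs2]        # Y is fully determined by X
print(cov(xs2, ys2))              # 0.0
```

The last example illustrates the warning above: Y is a deterministic function of X, yet their sample covariance is exactly 0.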

Properties of Covariance

Say that X and Y are arbitrary random variables:

Cov(X, Y ) = Cov(Y, X )
Cov(X, X ) = E[X^2] − E[X]E[X] = Var(X )
Cov(aX + b, Y ) = a Cov(X, Y )
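These properties are easy to sanity-check on sample data. The sketch below is illustrative only (the data and the constants a = 3, b = 7 are arbitrary), using the sample-based estimator for Cov:

```python
# Illustrative sketch: checking the three listed covariance properties on
# small samples, with arbitrary data and arbitrary constants a = 3, b = 7.

def cov(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

xs = [1, 2, 3, 4]
ys = [2, 1, 4, 3]

print(cov(xs, ys) == cov(ys, xs))          # True: symmetry
print(cov(xs, xs))                         # 1.25, which is Var(X)
print(cov([3 * x + 7 for x in xs], ys))    # 2.25: the shift b drops out...
print(3 * cov(xs, ys))                     # 2.25: ...and a scales the result
```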

Let X = X1 + X2 + · · · + Xn and let Y = Y1 + Y2 + · · · + Ym. The covariance of X and Y is:

Cov(X, Y ) = Σ_{i=1}^{n} Σ_{j=1}^{m} Cov(Xi, Yj)

Cov(X, X ) = Var(X ) = Σ_{i=1}^{n} Σ_{j=1}^{n} Cov(Xi, Xj)

That last property gives us a third way to calculate variance.
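As a sketch of that third way (illustrative, not from the notes): with X = X1 + X2, the variance of the sum should equal the sum of all pairwise sample covariances:

```python
# Illustrative sketch: with X = X1 + X2, Var(X) equals the sum of all
# pairwise sample covariances Cov(Xi, Xj). The data here is arbitrary.

def cov(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

x1 = [1, 2, 3, 4]
x2 = [1, 3, 2, 4]
total = [a + b for a, b in zip(x1, x2)]     # samples of X = X1 + X2

var_direct = cov(total, total)              # Var(X) computed directly
var_pairwise = sum(cov(a, b) for a in (x1, x2) for b in (x1, x2))
print(var_direct)                           # 4.5
print(var_pairwise)                         # 4.5, the same value
```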

Correlation

Covariance is interesting because it is a quantitative measurement of the relationship between two variables. The correlation between two random variables, ρ(X, Y ), is the covariance of the two variables normalized by the product of their standard deviations. This normalization cancels the units out and bounds the measure so that it always lies in the range [−1, 1]:

ρ(X, Y ) = Cov(X, Y ) / √(Var(X ) Var(Y ))

Correlation measures linearity between X and Y.

ρ(X, Y ) = 1      Y = aX + b, where a = σy/σx
ρ(X, Y ) = −1     Y = aX + b, where a = −σy/σx
ρ(X, Y ) = 0      absence of a linear relationship

If ρ(X, Y ) = 0 we say that X and Y are “uncorrelated.” If two variables are independent, then their correlation will be 0. However, as with covariance, it doesn’t go the other way: a correlation of 0 does not imply independence.
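A minimal sketch of the correlation formula (the names and data are my own, not from the notes), including a pair that is uncorrelated yet dependent:

```python
# Illustrative sketch: Pearson correlation as covariance divided by the
# product of the standard deviations, all estimated from samples.
import math

def cov(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def corr(xs, ys):
    """rho(X, Y) = Cov(X, Y) / sqrt(Var(X) * Var(Y))."""
    return cov(xs, ys) / math.sqrt(cov(xs, xs) * cov(ys, ys))

xs = [1, 2, 3, 4]
print(corr(xs, [2 * x + 1 for x in xs]))    # 1.0: exact positive linear fit
print(corr(xs, [-2 * x + 1 for x in xs]))   # -1.0: exact negative linear fit
print(corr([-2, -1, 1, 2], [4, 1, 1, 4]))   # 0.0: uncorrelated yet dependent
```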

When people use the term correlation, they are usually referring to a specific type of correlation called “Pearson” correlation. It measures the degree to which there is a linear relationship between two variables. An alternative measure is “Spearman” correlation, which has a formula almost identical to the one defined above, except that the underlying random variables are first transformed into their ranks. Spearman correlation is outside the scope of CS 109.
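Although Spearman correlation is out of scope, the rank-transform idea can be sketched in a few lines. The `ranks` helper below is my own and assumes the values are distinct (no ties):

```python
# Illustrative sketch (Spearman is outside the scope of these notes):
# Spearman correlation is Pearson correlation computed on the ranks.
import math

def cov(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def pearson(xs, ys):
    return cov(xs, ys) / math.sqrt(cov(xs, xs) * cov(ys, ys))

def ranks(xs):
    """Rank of each value, 1 = smallest (distinct values assumed)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    return pearson(ranks(xs), ranks(ys))

xs = [1, 2, 3, 4]
ys = [1, 8, 27, 64]               # y = x^3: monotone but not linear
print(pearson(xs, ys))            # below 1: the relationship is not linear
print(spearman(xs, ys))           # 1.0: the ranks line up perfectly
```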