
Understanding Expectations: Mean and Variance of Random Variables, Lecture notes of Mathematical Statistics

The concept of expectations in probability theory, focusing on the mean and variance of discrete and continuous random variables. It covers the definition, calculation methods, and rules for expectations, as well as examples and problem-solving techniques.


Typology: Lecture notes

2021/2022

Uploaded on 09/12/2022

sadayappan



Expectations

Expectations. (See also Hays, Appendix B; Harnett, ch. 3).

A. The expected value of a random variable is the arithmetic mean of that variable, i.e. E(X) = μ. As Hays notes, the idea of the expectation of a random variable began with probability theory in games of chance. Gamblers wanted to know their expected long-run winnings (or losses) if they played a game repeatedly. This term has been retained in mathematical statistics to mean the long-run average for any random variable over an indefinite number of trials or samplings.

B. Discrete case: The expected value of a discrete random variable, X, is found by multiplying each X-value by its probability and then summing over all values of the random variable. That is, if X is discrete,

μX = ∑ x·p(x) = E(X), where the sum is taken over all values of X.
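The discrete definition translates directly into code. As a quick computational sketch (not part of the original notes; the function name is chosen here for illustration):

```python
# Sketch of E(X) for a discrete random variable:
# multiply each value by its probability, then sum.
def expected_value(values, probs):
    """E(X) = sum of x * p(x) over all values of X."""
    return sum(x * p for x, p in zip(values, probs))

# Example: one roll of a fair six-sided die
values = [1, 2, 3, 4, 5, 6]
probs = [1/6] * 6
print(expected_value(values, probs))  # ≈ 3.5
```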

C. Continuous case: For a continuous variable X ranging over all the real numbers, the expectation is defined by

μX = ∫ x·f(x) dx = E(X), where the integral runs from -∞ to +∞.
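In the continuous case the integral can be approximated numerically. A minimal sketch using a midpoint Riemann sum over a finite interval (the function name and example density are chosen here for illustration; for a density supported on all of the real line one would truncate the range):

```python
# Approximate E(X) = integral of x*f(x) dx over [a, b] by a midpoint Riemann sum.
def expected_value_continuous(f, a, b, n=10_000):
    dx = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx  # midpoint of the i-th subinterval
        total += x * f(x) * dx
    return total

# Example: X uniform on [0, 1], so f(x) = 1 and E(X) = 1/2
print(expected_value_continuous(lambda x: 1.0, 0.0, 1.0))  # ≈ 0.5
```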

D. Variance of X: The variance of a random variable X is defined as the expected (average) squared deviation of the values of this random variable about their mean. That is,

σX^2 = V(X) = E[(X - μX)^2] = E(X^2) - μX^2

In the discrete case, this is equivalent to

V(X) = σ^2 = ∑ (x - μ)^2·P(x), where the sum is taken over all values of X.

E. Standard deviation of X: The standard deviation is the positive square root of the variance, i.e.

SD(X) = σ = √(σ^2)

F. Examples.

1. Hays (p. 96) gives the probability distribution for the number of spots appearing on two fair dice. Find the mean and variance of that distribution.

x     p(x)    xp(x)    (x - μx)^2    (x - μx)^2·p(x)
2     1/36    2/36     25            25/36
3     2/36    6/36     16            32/36
4     3/36    12/36    9             27/36
5     4/36    20/36    4             16/36
6     5/36    30/36    1             5/36
7     6/36    42/36    0             0
8     5/36    40/36    1             5/36
9     4/36    36/36    4             16/36
10    3/36    30/36    9             27/36
11    2/36    22/36    16            32/36
12    1/36    12/36    25            25/36

∑ xp(x) = 252/36 = 7 = μx. The variance σ^2 = 210/36 = 35/6 = 5 5/6. (NOTE: There is a simpler solution to this problem, which takes advantage of the independence of the two tosses.)
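The table above can be checked by machine. A short verification sketch (not part of the original notes), using exact fractions so the results come out as 7 and 35/6 rather than rounded decimals:

```python
from fractions import Fraction

# Distribution of the sum of two fair dice: p(x) = (6 - |x - 7|)/36 for x = 2..12.
probs = {x: Fraction(6 - abs(x - 7), 36) for x in range(2, 13)}

mean = sum(x * p for x, p in probs.items())               # E(X) = sum of x*p(x)
var = sum((x - mean) ** 2 * p for x, p in probs.items())  # E[(X - mu)^2]
print(mean, var)  # 7 35/6
```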

2. Consider our earlier coin tossing experiment. If we toss a coin three times, how many times do we expect it to come up heads? And, what is the variance of this distribution?

x    p(x)    xp(x)    (x - μx)^2    (x - μx)^2·p(x)
0    1/8     0        2.25          2.25/8
1    3/8     3/8      0.25          0.75/8
2    3/8     6/8      0.25          0.75/8
3    1/8     3/8      2.25          2.25/8

∑ xp(x) = 12/8 = 1.5. So (not surprisingly) if we toss a coin three times, we expect 1.5 heads. And, the variance = 6/8 = 3/4.
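The same check works for the coin example (again a verification sketch, not part of the original notes):

```python
from fractions import Fraction

# Number of heads in three tosses of a fair coin.
probs = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

mean = sum(x * p for x, p in probs.items())               # E(X)
var = sum((x - mean) ** 2 * p for x, p in probs.items())  # V(X)
print(mean, var)  # 3/2 3/4
```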

PROBLEMS: HINT. Keep in mind that μX and σX are constants.

  1. Prove that V(X) = E[(X - μX)^2] = E(X^2) - μX^2. HINT: Rules 4, 5, and 8 are especially helpful here.

Solution.

Equation                              Explanation
E[(X - μX)^2] =                       Original formula for the variance.
E(X^2 - 2X·μX + μX^2) =               Expand the square.
E(X^2) - E(2μX·X) + E(μX^2) =         Rule 8: E(X + Y) = E(X) + E(Y). That is, the expectation of a sum = the sum of the expectations.
E(X^2) - 2μX·E(X) + μX^2 =            Rule 5: E(aX) = a·E(X), i.e. the expectation of a constant times a variable = the constant times the expectation of the variable; and Rule 4: E(a) = a, i.e. the expectation of a constant = the constant.
E(X^2) - μX^2                         Remember that E(X) = μX, hence 2μX·E(X) = 2μX^2. QED.
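The identity can also be verified numerically. A sketch on a small three-point distribution (the distribution itself is hypothetical, chosen only for illustration):

```python
from fractions import Fraction

# Check V(X) = E[(X - mu)^2] = E(X^2) - mu^2 on a small discrete distribution.
probs = {1: Fraction(1, 2), 2: Fraction(1, 3), 3: Fraction(1, 6)}

mu = sum(x * p for x, p in probs.items())
lhs = sum((x - mu) ** 2 * p for x, p in probs.items())    # E[(X - mu)^2]
rhs = sum(x * x * p for x, p in probs.items()) - mu ** 2  # E(X^2) - mu^2
print(lhs == rhs)  # True
```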

  2. Prove that V(aX) = a^2·V(X). HINT: Rules 3 and 5 are especially helpful.

Solution. Let Y = aX. Then,

Equation                              Explanation
V(Y) = E(Y^2) - E(Y)^2 =              Rule 3: V(X) = E[(X - E(X))^2] = E(X^2) - E(X)^2 = σX^2, i.e. definition of the variance.
E(a^2·X^2) - E(aX)^2 =                Substitute for Y. Since Y = aX, Y^2 = a^2·X^2.
a^2·E(X^2) - a^2·E(X)^2 =             Rule 5: E(aX) = a·E(X), i.e. the expectation of a constant times a variable = the constant times the expectation of the variable.
a^2·(E(X^2) - E(X)^2) =               Factor out a^2.
a^2·V(X)                              Rule 3: definition of the variance, i.e. V(X) = E(X^2) - E(X)^2. QED.
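Again, a numerical spot-check of V(aX) = a^2·V(X) (the distribution and the scale factor a = 3 are hypothetical, chosen only for illustration):

```python
from fractions import Fraction

# Check V(aX) = a^2 * V(X) by computing both sides directly.
a = 3
probs = {1: Fraction(1, 4), 2: Fraction(1, 2), 4: Fraction(1, 4)}

mu = sum(x * p for x, p in probs.items())
var_x = sum((x - mu) ** 2 * p for x, p in probs.items())

mu_ax = sum(a * x * p for x, p in probs.items())
var_ax = sum((a * x - mu_ax) ** 2 * p for x, p in probs.items())
print(var_ax == a ** 2 * var_x)  # True
```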

  3. Let Z = (X - μX)/σX. Find E(Z) and V(Z). HINT: Apply rules 7b and 14.

Solution. In this problem, a = -μX, b = 1/σX.

Equation                              Explanation
E(Z) = E[(X - μX)/σX]                 Definition of Z.
= (E(X) - μX)/σX                      Rule 7b: E[(a ± X)·b] = (a ± E(X))·b. Remember, a = -μX, b = 1/σX.
= 0                                   Remember E(X) = μX, so the numerator = 0. QED.

Intuitively, the above makes sense; subtract the mean from every case and the new mean becomes zero. Now, for the variance,

Equation                              Explanation
V(Z) = V[(X - μX)/σX]                 Definition of Z.
= (1/σX^2)·V(X)                       Rule 14: V(a ± bX) = b^2·V(X). Remember, b = 1/σX.
= 1                                   Remember, V(X) = σX^2, hence σX^2 appears in both the numerator and denominator. QED.

NOTE: This is called a z-score transformation. As we will see, such a transformation is extremely useful. Note that, if Z = 1, the score is one standard deviation above the mean.
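The z-score transformation is easy to see numerically. A small sketch (the data values are hypothetical, chosen only for illustration), using the population mean and standard deviation as in the notes:

```python
# z-score transformation: Z = (X - mu)/sigma.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mu = sum(data) / len(data)                                      # population mean
sigma = (sum((x - mu) ** 2 for x in data) / len(data)) ** 0.5   # population SD
z = [(x - mu) / sigma for x in data]

# The transformed scores have mean 0 and standard deviation 1.
z_mu = sum(z) / len(z)
z_sigma = (sum((v - z_mu) ** 2 for v in z) / len(z)) ** 0.5
print(round(z_mu, 10), round(z_sigma, 10))  # 0.0 1.0
```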