Practice Problems for Final Exam - Mathematical Statistics I | STAT 510, Study notes of Mathematical Statistics

Material Type: Notes; Class: Mathematical Statistics I; Subject: Statistics; University: University of Illinois - Urbana-Champaign; Term: Summer 2006;


NAME

STAT 510 Practice Problems for the Final

December, 2006

The final will be comprehensive, but will be weighted somewhat towards the material since the last midterm. So you should study all the homework and the previous exams. The final will have about 6-7 problems. Also, the final will give the necessary pdf’s. It is closed book & notes.

10 points 1. Suppose X1, ..., Xn are iid with pdf

f(xi | λ) = λ e^{−λxi}, for xi > 0,

where λ > 0. Then E(Xi) = 1/λ and Var(Xi) = 1/λ^2.

(a) Find a method-of-moments estimator of λ.
(b) Find the variance of the estimator in (a).
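
Since the problem gives E(Xi) = 1/λ, matching the first moment to the sample mean suggests an estimator for (a) that is easy to sanity-check by simulation. A minimal sketch (the value λ = 2 is hypothetical, not from the problem):

```python
import random

def mom_lambda(xs):
    """Method-of-moments estimator: set the sample mean equal to E(X) = 1/lambda."""
    return len(xs) / sum(xs)

random.seed(0)
true_lam = 2.0
# random.expovariate(lam) samples from the pdf lam * exp(-lam * x), x > 0
xs = [random.expovariate(true_lam) for _ in range(100_000)]
est = mom_lambda(xs)
print(round(est, 2))  # should land close to 2.0
```

The variance asked for in (b) can be compared against the spread of `est` over repeated simulated samples.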

10 points 2. Suppose X1, ..., Xn represent the weights (in ounces) of n randomly chosen (independent) rocks, where the distribution of the weights of the population of rocks is N(θ, 1). There are two possible experiments:

I. The rocks are weighed on a scale that gives the exact weight of the rock. So the data are the regular X1, ..., Xn.

II. The rocks are weighed on a scale that only goes up to 100 ounces. It gives the exact weight if the weight is less than 100; otherwise it gives 100. So the data are Y1, ..., Yn, where Yi = min{Xi, 100}.

(a) For experiment I, is X̄ an unbiased estimator of θ? (This is not a trick question.)
(b) Give the distribution of Yi for experiment II.
(c) Show that for experiment II,

E[Yi] = θ − (θ − 100)Φ(θ − 100) − φ(θ − 100),

where Φ is the distribution function and φ is the pdf for a N(0, 1). Is Ȳ an unbiased estimator of θ?
(d) Suppose n = 5, and the Xi’s are 70, 73, 74, 75, 79. Find the MLE of θ from experiment I.

(e) Using the same data as in part (d), find the MLE of θ from experiment II.
(f) Are the estimates in (d) and (e) the same? Is the estimate in (d) biased or unbiased? Is the estimate in (e) biased or unbiased? Is anything funny?
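
The identity in (c) can be checked numerically before proving it. A sketch (θ = 99 is a hypothetical value chosen near the 100-ounce cap so the censoring actually matters; Φ is computed from the error function):

```python
import math
import random

def phi(z):
    """Standard normal pdf."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def Phi(z):
    """Standard normal cdf, via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def censored_mean(theta):
    """The formula from part (c): E[Y] = theta - (theta-100)*Phi(theta-100) - phi(theta-100)."""
    d = theta - 100
    return theta - d * Phi(d) - phi(d)

random.seed(1)
theta = 99.0  # hypothetical, close to the cap
ys = [min(random.gauss(theta, 1), 100.0) for _ in range(200_000)]
mc = sum(ys) / len(ys)
print(round(censored_mean(theta), 3), round(mc, 3))  # the two numbers should agree
```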

10 points 3. Suppose X1, ..., Xn are iid Beta(α, β), where (α, β) ∈ (0, ∞) × (0, ∞).

(a) Find a method of moments estimator of (α, β).
(b) Why would it be difficult to find the MLE of (α, β) in closed form?

10 points 4. Suppose X | Λ = λ ∼ Poisson(λ), and Λ ∼ Gamma(α, 1).

(a) Write down the joint pdf of (X, Λ).
(b) Without doing any integrations explicitly, find the conditional distribution of Λ | X = x.
(c) Find E[Λ | X = x].
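
A derived conditional distribution like the one in (b) can be checked by simulation: draw (Λ, X) pairs from the joint model, keep only the draws with X = x, and average the kept Λ’s. A sketch (α = 3 and x = 4 are hypothetical values; the Poisson sampler uses Knuth’s method, since the standard library has none):

```python
import math
import random

def poisson_draw(lam):
    """Knuth's method: count uniforms until their running product drops below e^(-lam)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

random.seed(2)
alpha, x_obs = 3.0, 4  # hypothetical alpha and observed x
kept = []
for _ in range(200_000):
    lam = random.gammavariate(alpha, 1.0)  # Lambda ~ Gamma(alpha, 1)
    if poisson_draw(lam) == x_obs:         # condition on X = x_obs
        kept.append(lam)
cond_mean = sum(kept) / len(kept)
print(round(cond_mean, 2))  # empirical E[Lambda | X = x]; compare with your answer to (c)
```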

10 points 5. Suppose X1, ..., Xn are iid Uniform(θ, θ + 1), where θ ∈ R. (Assume n > 2.)

(a) Find the likelihood function L(θ; x1, ..., xn).
(b) Suppose the data are 1.1, 1.3, 1.2, 1.5.
(i) Sketch the likelihood based on these data.
(ii) Is there a unique value of θ that maximizes this likelihood?

10 points 6. Suppose gk(u) is a pdf for each k, and U has the pdf

g(u) = ∑_{k=0}^∞ pk gk(u)

for some constants pk ≥ 0.

(a) Show that ∑_{k=0}^∞ pk = 1. [Hint: Integrate both sides.]
(b) Suppose the mean of the random variable with pdf gk(u) is μk. Show that

E(U) = ∑_{k=0}^∞ pk μk.

(You can assume the summation converges.)
(c) Suppose now that U ∼ χ^2_ν(∆), so that gk is the pdf of a χ^2_{ν+2k}, and

pk = exp(−∆/2) (∆/2)^k / k!,

which is the Poisson(∆/2) pmf. Show that E(U) = ν + ∆. [You can use the facts that the mean of a χ^2_a is a, and the mean of a Poisson(λ) is λ.]
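
Part (c) says the Poisson mixture of central chi-squares has mean ν + ∆, and a direct mixture simulation can confirm it: draw the mixing index K from the Poisson(∆/2) weights, then draw U from the corresponding χ^2 with ν + 2K df. A sketch with hypothetical ν = 4 and ∆ = 6:

```python
import numpy as np

rng = np.random.default_rng(3)
nu, delta = 4.0, 6.0                 # hypothetical df and noncentrality
n = 500_000
k = rng.poisson(delta / 2, size=n)   # mixing index K ~ Poisson(delta/2)
u = rng.chisquare(nu + 2 * k)        # given K = k, draw a central chi^2 with nu + 2k df
print(round(u.mean(), 2))            # should be close to nu + delta
```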

10 points 7. Let X ∼ N(0, 1), Y ∼ Uniform(0, 1), where X and Y are independent. [So X has pdf

(c) Show that the marginal pmf of W is

fW(0) = fW(1) = 1/2.

[So it is Bernoulli(1/2).]
(d) Show that V and W are independent.
(e) What is the moment generating function of V?
(f) What is the moment generating function of W?
(g) What is the moment generating function of V + W? It is the mgf of U, so call it MU(t).
(h) Write MU(t) as MU(t) = p0 + p1 e^t + p2 e^{2t} + p3 e^{3t}. What are the pi’s?
(i) From MU(t), can you see what the pmf of U must be? (What is the mgf of the random variable with pmf given by the pi’s?)

10 points 9. Suppose X1, X2, X3, and Z are independent, with Xi ∼ N(0, σ_X^2), i = 1, 2, 3, and Z ∼ N(0, σ_Z^2), where σ_X^2 > 0 and σ_Z^2 > 0.

(a) What is the distribution of the vector (X1, X2, X3, Z)′?
(b) Let Y1 = X1 + Z, Y2 = X2 + Z, and Y3 = X3 + Z. Find the matrix A so that

(Y1, Y2, Y3)′ = A (X1, X2, X3, Z)′.

(c) Show that

Cov((Y1, Y2, Y3)′) = σ^2 ·
[ 1  ρ  ρ ]
[ ρ  1  ρ ]
[ ρ  ρ  1 ]

for some σ^2 and ρ, and give σ^2 and ρ in terms of σ_X^2 and σ_Z^2.
(d) Are Y1 and Y2 independent? Why or why not?
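
The covariance structure in (c) can be estimated empirically and compared with whatever σ^2 and ρ you derive. A sketch with hypothetical σ_X^2 = 2 and σ_Z^2 = 1:

```python
import numpy as np

rng = np.random.default_rng(4)
sx2, sz2 = 2.0, 1.0                             # hypothetical sigma_X^2 and sigma_Z^2
n = 300_000
X = rng.normal(0.0, np.sqrt(sx2), size=(n, 3))  # columns are X1, X2, X3
Z = rng.normal(0.0, np.sqrt(sz2), size=(n, 1))  # one shared Z per row
Y = X + Z                                       # Y_i = X_i + Z; the common Z correlates them
S = np.cov(Y, rowvar=False)                     # empirical covariance of (Y1, Y2, Y3)
print(np.round(S, 2))  # read sigma^2 off the diagonal, sigma^2 * rho off the off-diagonal
```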

10 points 10. Let (X, Y) have pdf fX,Y(x, y) = 2 and space

W = {(x, y) | x > 0, y > 0, and x + y < 1}.

(a) Give Yx, the conditional space of Y given X = x.
(b) Find the marginal space and pdf of X.
(c) Find the conditional pdf of Y | X = x.
(d) Are X and Y independent? Why or why not?

10 points 11. Use the same (X, Y) as in problem 10, and set

U = X / (1 − Y) and V = Y,

so that g(x, y) = (x/(1 − y), y).

(a) What is the space of (U, V)?
(b) Find g^{−1}(u, v).
(c) Find the Jacobian of the transformation.
(d) Find the joint pdf of (U, V).
(e) Are U and V independent? Why or why not?
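
For (e), independence of U and V can be probed empirically: sample (X, Y) uniformly on the triangle by rejection, apply the transformation, and compare P(U < 1/2, V < 1/2) with the product of the marginal probabilities. A sketch (the 1/2 cutoffs are arbitrary choices):

```python
import random

random.seed(5)
us, vs = [], []
while len(us) < 200_000:
    x, y = random.random(), random.random()
    if x + y < 1:                  # rejection step: keep points in the triangle (pdf = 2)
        us.append(x / (1 - y))     # U = X / (1 - Y)
        vs.append(y)               # V = Y

n = len(us)
pu = sum(u < 0.5 for u in us) / n
pv = sum(v < 0.5 for v in vs) / n
pboth = sum((u < 0.5) and (v < 0.5) for u, v in zip(us, vs)) / n
print(round(pboth, 3), round(pu * pv, 3))  # equal (up to noise) if the events are independent
```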

10 points 12. The Double Exponential distribution with parameters 0 and θ > 0 (denoted DE(0, θ)) has space R and pdf

f(x | θ) = (1/(2θ)) e^{−|x|/θ}.

The mgf is

MDE(t) = 1/(1 − θ^2 t^2) for |t| < 1/θ.

(a) Find the mean and variance of a DE(0, θ).
(b) The mgf of an Exponential(1) random variable is M(t) = 1/(1 − t) for t < 1. Suppose X1 and X2 are independent Exponential(1)’s. Let Y = X1 − X2. Find MY(t), the mgf of Y, and say for which values of t it is finite.
(c) Show that the Y in part (b) is DE(0, θ), and give the θ.
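
Parts (b) and (c) can be previewed by simulating Y = X1 − X2 and checking its first two moments against whatever you get in (a). A sketch:

```python
import random

random.seed(6)
n = 300_000
# Y = X1 - X2 with X1, X2 independent Exponential(1)
ys = [random.expovariate(1.0) - random.expovariate(1.0) for _ in range(n)]
mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n
print(round(mean, 2), round(var, 2))  # compare with the DE(0, theta) moments from (a)
```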

10 points 13. Suppose Y | X = x ∼ N (2x + 1, σ^2 ) and X ∼ N (μ, 1). What are the mean and variance of Y? What is the distribution of Y?
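
The iterated-expectation answer here is straightforward to check by simulation; μ = 1.5 and σ = 2 below are hypothetical values (the problem leaves them symbolic):

```python
import random

random.seed(7)
mu, sigma = 1.5, 2.0  # hypothetical mu and sigma
n = 400_000
ys = []
for _ in range(n):
    x = random.gauss(mu, 1.0)                  # X ~ N(mu, 1)
    ys.append(random.gauss(2 * x + 1, sigma))  # Y | X = x ~ N(2x + 1, sigma^2)
mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n
print(round(mean, 2), round(var, 2))  # compare with your E(Y) and Var(Y)
```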

10 points 14. Suppose X ∼ Uniform(−1, 1), so that it has pdf f(x) = 1/2 for −1 < x < 1, and 0 elsewhere. Let Y = X^2.

(a) Find E(X) and E(Y).
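
A quick simulation gives approximate values of E(X) and E(Y) to compare with your exact answers in (a):

```python
import random

random.seed(8)
n = 400_000
xs = [random.uniform(-1.0, 1.0) for _ in range(n)]  # X ~ Uniform(-1, 1)
ex = sum(xs) / n
ey = sum(x * x for x in xs) / n                     # Y = X^2
print(round(ex, 2), round(ey, 2))                   # compare with your answers to (a)
```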