

Material Type: Assignment; Class: GENERALIZED LIN MOD; Subject: STATISTICS; University: University of Florida; Term: Spring 2004;
Consider independent observations $y_1, \ldots, y_n$ with means $\mu_i = E(y_i)$ satisfying
\[
\log \mu_i = x_i \beta \quad (x_i \text{ univariate}), \qquad \operatorname{Var}(y_i) = \phi \mu_i.
\]
(a) Give the quasi-likelihood estimating equation for $\beta$ and find the asymptotic variance of $\tilde\beta$, the "maximum quasi-likelihood estimator" (MQLE) of $\beta$.

(b) Suppose that $y_1, \ldots, y_n$ are normally distributed. Derive the likelihood equations for $\beta$ and $\phi$ and the asymptotic variance of $\hat\beta$, the MLE of $\beta$.

(c) Calculate the ratio of the asymptotic variance of $\tilde\beta$ to that of $\hat\beta$. For concreteness, assume that $n/2$ of the observations have $x_i = 5$ and the remaining $n/2$ have $x_i = 10$. Do the calculations for $\beta$ equal to 0.1, 1, and 10.
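As a numerical companion to part (a) (not a substitute for the derivation), the general quasi-likelihood estimating equation $\sum_i (\partial\mu_i/\partial\beta)\,(y_i - \mu_i)/V(\mu_i) = 0$ can be solved by Newton's method on simulated data. The sketch below is Python rather than R, and the values $\beta = 0.2$, $\phi = 0.5$, $n = 400$ are illustrative assumptions; the design is the half-fives/half-tens layout of part (c).

```python
import math
import random

# Illustrative sketch: solve the quasi-likelihood estimating equation
#   sum_i (dmu_i/dbeta) * (y_i - mu_i) / V(mu_i) = 0,
# with mu_i = exp(x_i * beta) and V(mu) = mu, by Newton's method.
# Data are simulated under ASSUMED values beta = 0.2, phi = 0.5, n = 400,
# using normal responses (consistent with part (b)'s normal model).
random.seed(1)
beta_true, phi, n = 0.2, 0.5, 400
x = [5.0] * (n // 2) + [10.0] * (n // 2)
y = [random.gauss(math.exp(xi * beta_true),
                  math.sqrt(phi * math.exp(xi * beta_true))) for xi in x]

def score(beta):
    # With log link and V(mu) = mu, the terms dmu/dbeta = x*mu and V = mu
    # partially cancel; phi drops out of the root-finding problem entirely.
    return sum(xi * (yi - math.exp(xi * beta)) for xi, yi in zip(x, y))

def score_deriv(beta):
    return -sum(xi ** 2 * math.exp(xi * beta) for xi in x)

beta_hat = 0.0
for _ in range(100):
    beta_hat -= score(beta_hat) / score_deriv(beta_hat)

print(beta_hat)  # close to the assumed beta_true = 0.2
```

The score is concave and strictly decreasing in $\beta$, so Newton's iteration converges from any starting value; with $n = 400$ the estimate lands within a few thousandths of the assumed truth.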
Suppose $Y \sim \mathrm{Binomial}\bigl(m,\, e^\lambda/(1 + e^\lambda)\bigr)$. Show that $Y' = m - Y$ is also binomially distributed, and that the induced parameter is $\lambda' = -\lambda$. Consider
\[
\tilde\lambda = \log \frac{Y + c_1}{m - Y + c_2}
\]
as an estimator of $\lambda$. Show that in order for the estimator to be equivariant under the transformation $Y \mapsto m - Y$, we must have $c_1 = c_2$.
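The equivariance requirement is easy to probe numerically before proving it: with $c_1 = c_2$ the estimator changes sign under $Y \mapsto m - Y$, while unequal constants break the symmetry. A small Python check ($m = 10$ and the constants $0.5$ and $0.2, 0.8$ are arbitrary illustrative choices):

```python
import math

# lambda_tilde(Y) = log((Y + c1)/(m - Y + c2)); check whether
# lambda_tilde(m - Y) == -lambda_tilde(Y) over all m + 1 outcomes.
def lam_tilde(y, m, c1, c2):
    return math.log((y + c1) / (m - y + c2))

m = 10
equal = all(math.isclose(lam_tilde(m - y, m, 0.5, 0.5),
                         -lam_tilde(y, m, 0.5, 0.5))
            for y in range(m + 1))            # c1 == c2: equivariant
unequal = all(math.isclose(lam_tilde(m - y, m, 0.2, 0.8),
                           -lam_tilde(y, m, 0.2, 0.8))
              for y in range(m + 1))          # c1 != c2: symmetry fails
print(equal, unequal)  # True False
```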
First show that, for large $m$ (writing $\pi = e^\lambda/(1 + e^\lambda)$ for the success probability),
\[
E\{\log(Y + c)\} = \log(m\pi) + \frac{c}{m\pi} - \frac{1 - \pi}{2m\pi} + O(m^{-3/2}).
\]
Find the corresponding expansion for $\log(m - Y + c)$, and use these results to deduce that
\[
E(\tilde\lambda) = \lambda + \frac{(1 - 2\pi)(c - 1/2)}{m\pi(1 - \pi)} + O(m^{-3/2}).
\]
This shows that choosing $c = 1/2$ makes $\tilde\lambda$ approximately unbiased.

Hint: Write $Y = m\pi + \sqrt{m\pi(1 - \pi)}\,Z$, where $Z = (Y - m\pi)/\sqrt{m\pi(1 - \pi)}$, and note that $Z = O_p(1)$ as $m \to \infty$, since $Z \xrightarrow{d} N(0, 1)$. Now use a Taylor expansion to approximate $\log\{(Y + c)/(m\pi)\}$. To make this completely rigorous, you will have to show explicitly that the expectation of the error term is $O(m^{-3/2})$. This takes some work but can be done using Hoeffding's inequality, which can be found, e.g., on page 75 of Serfling (1980), Approximation Theorems of Mathematical Statistics.
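Because $Y$ takes only $m + 1$ values, the bias expansion can be checked exactly by enumerating the binomial distribution rather than by simulation. In the Python sketch below, $m = 20$, $\pi = 0.3$, and the comparison value $c = 0.1$ are arbitrary illustrative choices:

```python
from math import comb, log

# Exact bias of lambda_tilde = log((Y + c)/(m - Y + c)) for Y ~ Binomial(m, pi),
# computed by summing over all m + 1 outcomes.  The expansion above predicts
# bias ~ (1 - 2*pi)*(c - 1/2)/(m*pi*(1 - pi)), which vanishes at c = 1/2.
def exact_bias(m, pi, c):
    lam = log(pi / (1 - pi))
    mean = sum(comb(m, y) * pi ** y * (1 - pi) ** (m - y)
               * log((y + c) / (m - y + c)) for y in range(m + 1))
    return mean - lam

m, pi = 20, 0.3
bias_tenth = exact_bias(m, pi, 0.1)  # c = 0.1: the O(1/m) bias term survives
bias_half = exact_bias(m, pi, 0.5)   # c = 1/2: the O(1/m) bias term cancels
print(bias_tenth, bias_half)
```

The $c = 1/2$ bias is an order of magnitude smaller, as the expansion predicts; only higher-order terms remain.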
(a) Referring to equation (9.4), p. 327 of McCullagh and Nelder, derive the form of the quasi-deviance function corresponding to this choice of variance function.

(b) Modify the "quasi" function in R to accept this new variance function and use the result to reproduce the second analysis of McCullagh and Nelder, and in particular the table at the bottom of p. 330 and the residual plot given in Figure 9. on p. 332. Turn in a printout of the part of your output that contains the results given in the table (the output of summary() should do) and a printout of the residual plot. Also email your code for this example (including your modified quasi function) to the TA in a form that he can easily use to reproduce your calculations.
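For part (a), one sanity check on whatever closed form you derive is to compare it against direct numerical integration of the one-observation quasi-deviance, which in Wedderburn's convention is $2\int_\mu^y (y - t)/V(t)\,dt$. The sketch below is Python rather than R, and uses the Poisson variance $V(\mu) = \mu$ as a stand-in (the assignment's variance function is not reproduced in this excerpt); swap in the relevant $V$ and your derived formula.

```python
import math

# Generic quasi-deviance via composite Simpson's rule:
#   D(y; mu) = 2 * integral from mu to y of (y - t)/V(t) dt.
# V(mu) = mu (Poisson) is a STAND-IN variance function, for which the
# closed form is D(y; mu) = 2*(y*log(y/mu) - (y - mu)).
def quasi_deviance(y, mu, V, steps=2000):
    f = lambda t: (y - t) / V(t)
    h = (y - mu) / steps
    total = 0.0
    for i in range(steps):
        t0 = mu + i * h
        t1 = t0 + h
        total += (h / 6.0) * (f(t0) + 4.0 * f(0.5 * (t0 + t1)) + f(t1))
    return 2.0 * total

y, mu = 3.0, 2.0
numeric = quasi_deviance(y, mu, V=lambda t: t)
closed = 2.0 * (y * math.log(y / mu) - (y - mu))
print(numeric, closed)  # the two agree to numerical precision
```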