STA 7249: Generalized Linear Models
Spring 2004: Assignment 2
More problems will be added to this assignment. These are given now
so that you can get started on them.
1. (Exercise 5.10, McCulloch and Searle, 2001) Suppose that $y_1, \ldots, y_n$ are independent with $\mu_i = E(y_i)$ satisfying
\[ \log \mu_i = x_i \beta \qquad (x_i \text{ univariate}) \]
and
\[ \mathrm{Var}(y_i) = \phi \mu_i. \]
(a) Give the quasi-likelihood estimating equation for $\beta$ and find the asymptotic variance of $\tilde{\beta}$, the “maximum quasi-likelihood estimator” (MQLE) of $\beta$.
(b) Suppose that $y_1, \ldots, y_n$ are normally distributed. Derive the likelihood equations for $\beta$ and $\phi$ and the asymptotic variance of $\hat{\beta}$, the MLE of $\beta$.
(c) Calculate the ratio of the asymptotic variance of $\tilde{\beta}$ to that of $\hat{\beta}$. For concreteness, assume that $n/2$ of the observations have $x_i = 5$ and the remaining $n/2$ have $x_i = 10$. Do the calculations for $\beta$ equal to 0.1, 1, and 10. (A model-setup sketch in R follows this problem.)
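For orientation (this sketch is our addition, not part of the exercise): the model above is what R fits with a quasi family, log link, and variance proportional to the mean. A minimal sketch on simulated data, where the seed, sample size, and true $\beta$ are arbitrary illustrative choices:

    ## Minimal sketch of the Problem 1 model on simulated data; the
    ## design matches part (c) (half the x's at 5, half at 10).
    set.seed(1)
    n    <- 100
    x    <- rep(c(5, 10), each = n / 2)
    beta <- 0.1                        # illustrative true value
    mu   <- exp(x * beta)              # log mu_i = x_i * beta
    y    <- rpois(n, mu)               # Poisson gives Var(y_i) = mu_i (phi = 1)
    ## glm() solves the quasi-likelihood estimating equation of part (a);
    ## the reported dispersion estimate plays the role of phi:
    fit <- glm(y ~ x - 1, family = quasi(link = "log", variance = "mu"))
    summary(fit)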
2. (Exercise 4.14, McCullagh and Nelder, 1989) Suppose that $Y \sim \mathrm{Bin}\bigl(m, e^\lambda/(1 + e^\lambda)\bigr)$. Show that $Y' = m - Y$ is also binomially distributed, and that the induced parameter is $\lambda' = -\lambda$. Consider
\[ \tilde{\lambda} = \log \frac{Y + c_1}{m - Y + c_2} \]
as an estimator of $\lambda$. Show that in order that the estimator be equivariant under the transformation $Y \mapsto m - Y$, we must have $c_1 = c_2$. (The equivariance condition is spelled out below.)
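To make the symmetry concrete (our restatement, not part of the original exercise): relabeling successes as failures sends $Y$ to $m - Y$ and $\lambda$ to $-\lambda$, so equivariance asks that the estimator respect that relabeling:
\[ \tilde{\lambda}(m - Y) = -\tilde{\lambda}(Y) \quad \text{for all } Y \in \{0, 1, \ldots, m\}, \qquad \text{i.e.,} \quad \log \frac{m - Y + c_1}{Y + c_2} = -\log \frac{Y + c_1}{m - Y + c_2}. \]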
3. (Exercise 4.15, McCullagh and Nelder, 1989) Suppose that $Y \sim \mathrm{Bin}(m, \pi)$, $0 < \pi < 1$. Show that for $c > 0$,
\[ E\{\log(Y + c)\} = \log(m\pi) + \frac{c}{m\pi} - \frac{1 - \pi}{2m\pi} + O(m^{-3/2}). \tag{$*$} \]
Find the corresponding expansion for $\log(m - Y + c)$, and use these results to deduce that
\[ E(\tilde{\lambda}) = \lambda + \frac{(1 - 2\pi)(c - 1/2)}{m\pi(1 - \pi)} + O(m^{-3/2}). \]
This shows that choosing $c = 1/2$ makes $\tilde{\lambda}$ approximately unbiased.
Hint: Write $Y = m\pi + \sqrt{m\pi(1 - \pi)}\,Z$, where $Z = (Y - m\pi)/\sqrt{m\pi(1 - \pi)}$, and note that $Z = O_p(1)$ as $m \to \infty$, since $Z \xrightarrow{d} N(0, 1)$. Now use a Taylor expansion to approximate $\log\{(Y + c)/(m\pi)\}$ (the expansion is set up below). To make this completely rigorous, you will have to explicitly show that the expectation of the error term is $O(m^{-3/2})$. This takes some work but can be done using Hoeffding's inequality, which can be found, e.g., on page 75 of Serfling (1980), Approximation Theorems of Mathematical Statistics.
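One way to start the hint's expansion (our addition; write $\sigma = \sqrt{m\pi(1 - \pi)}$ for brevity):
\[ \log \frac{Y + c}{m\pi} = \log\!\left(1 + \frac{c + \sigma Z}{m\pi}\right) = \frac{c + \sigma Z}{m\pi} - \frac{(c + \sigma Z)^2}{2(m\pi)^2} + R_m. \]
Taking expectations term by term, with $E(Z) = 0$ and $E(Z^2) = 1$, gives $(*)$ once $E(R_m) = O(m^{-3/2})$ is verified.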
4. Refer to the “incidence of leaf-blotch on barley” example in Section 9.2.4 of Generalized Linear Models, by McCullagh and Nelder (1989). In the second part of this example, McCullagh and Nelder fit a quasi-likelihood model with logit link and variance function $V(\mu) = \mu^2(1 - \mu)^2$.
(a) Referring to equation (9.4), p. 327 of McCullagh and Nelder, derive the form of the quasi-deviance function corresponding to this choice of variance function.
(b) Modify the “quasi” function in R to accept this new variance function and use the result to reproduce the second analysis of McCullagh and Nelder, in particular the table at the bottom of p. 330 and the residual plot given in Figure 9. on p. 332. Turn in a printout of the part of your output that contains the results given in the table (the output of summary() should do) and a printout of the residual plot. Also email your code for this example (including your modified quasi function) to the TA in a form that he can easily use to reproduce your calculations. (A sketch of the mechanics appears below.)
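For the mechanics of part (b), one common approach is to start from the family object that quasi() returns and overwrite the components that depend on $V(\mu)$. The sketch below is a starting point, not a solution: it computes the quasi-deviance numerically from its defining integral rather than from the closed form you are asked to derive in part (a), and the data set name is a placeholder, not a built-in object.

    ## Sketch: adapting R's quasi() family to V(mu) = mu^2 * (1 - mu)^2.
    ## Start from a stock quasi family, then replace the variance-dependent
    ## pieces.
    fam <- quasi(link = "logit", variance = "mu(1-mu)")
    fam$variance <- function(mu) mu^2 * (1 - mu)^2
    fam$dev.resids <- function(y, mu, wt) {
      ## Quasi-deviance contribution d(y, mu) = 2 * int_mu^y (y - t)/V(t) dt,
      ## evaluated numerically.  Note the integral diverges when y is exactly
      ## 0 or 1, so boundary observations need the special handling discussed
      ## in Section 9.2.4 before this runs on the leaf-blotch data.
      d1 <- function(y1, mu1)
        2 * integrate(function(t) (y1 - t) / (t^2 * (1 - t)^2),
                      lower = mu1, upper = y1)$value
      wt * mapply(d1, y, mu)
    }
    ## Hypothetical usage ("leafblotch" is a placeholder for the data of
    ## Table 9.2, with incidence, site, and variety columns):
    ## fit <- glm(incidence ~ site + variety, family = fam, data = leafblotch)
    ## summary(fit)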