



Econometrics Exam
Dept. of Economics, UC Berkeley
Instructions. You have 180 minutes to answer THREE out of the following four questions. Please make your answers elegant, that is, clear, concise, and, above all, correct. Good luck!
Question 1. Suppose $\{(y_t, x_t)' : 1 \le t \le T\}$ is an observed time series generated by the cointegrated system
$$y_t = \beta x_t + u_t,$$
where $(u_t, \Delta x_t)'$ is i.i.d. bivariate normal, with initial condition $x_0 = 0$. A researcher wants to estimate the scalar parameter $\beta$. As estimators of $\beta$, consider
$$\hat\beta = \frac{\sum_{t=2}^{T} \Delta x_t\, \Delta y_t}{\sum_{t=2}^{T} (\Delta x_t)^2} \qquad \text{and} \qquad \tilde\beta = \frac{\sum_{t=1}^{T} \Delta x_t\, y_t}{\sum_{t=1}^{T} \Delta x_t\, x_t}.$$
Derive the limiting distributions of $\hat\beta - \beta$ and $\tilde\beta - \beta$, appropriately normalized, and comment on the relative efficiency of the two estimators.
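For intuition, the following is a minimal simulation sketch of the two estimators. The innovation design (independent standard normals for $u_t$ and $\Delta x_t$) and the values of $T$ and $\beta$ are illustrative assumptions, since the covariance matrix of the system's normal innovations is not legible in the statement above.

```python
import numpy as np

# Minimal simulation sketch for the cointegrated-system question (illustrative only).
# Assumption: u_t and Delta x_t are independent standard normals; the innovation
# covariance matrix in the exam statement is not legible, so this is a stand-in.
rng = np.random.default_rng(0)
T, beta = 500, 2.0

u = rng.standard_normal(T)      # u_1, ..., u_T
dx = rng.standard_normal(T)     # Delta x_1, ..., Delta x_T
x = np.cumsum(dx)               # x_t with x_0 = 0
y = beta * x + u                # y_t = beta * x_t + u_t

dy = np.diff(y)                 # Delta y_t for t = 2, ..., T
dx2 = dx[1:]                    # Delta x_t for t = 2, ..., T

beta_hat = np.sum(dx2 * dy) / np.sum(dx2 ** 2)    # regression in first differences
beta_tilde = np.sum(dx * y) / np.sum(dx * x)      # Delta x_t as instrument for x_t in levels
print(beta_hat, beta_tilde)
```

Repeating the draw many times and rescaling the estimation errors by candidate rates is a quick way to check the normalizations that appear in the limiting distributions.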
Question 2. Let $(W_t)_t$ be an i.i.d. sequence of real-valued random variables drawn from $P$. Let $\theta_0 \in \Theta \subseteq \mathbb{R}$ be the (true) parameter of a model, and let $\tau_0 \in \mathcal{T}$ be a "nuisance" parameter, where $(\mathcal{T}, \|\cdot\|)$ is a normed space. Both are identified by
$$E[m(W, \theta_0, \tau_0)] = 0,$$
where $m : \mathbb{R} \times \mathbb{R} \times \mathcal{T} \to \mathbb{R}$ is a known function; e.g., $m$ is the score of a maximum-likelihood model, or a moment condition. Suppose $\theta_0$ is in the interior of $\Theta$; and, for now, let us consider the case where $\tau_0$ is known to the econometrician, and let $m(\cdot, \cdot) \equiv m(\cdot, \cdot, \tau_0)$. Consider the following estimator $\hat\theta$, where
$$\bar m_T(\hat\theta) \equiv T^{-1} \sum_{t=1}^{T} m(W_t, \hat\theta) = 0.$$
Assume:
(a) $\hat\theta$ is consistent, i.e., for all $\epsilon > 0$, $\lim_{T \to \infty} P\!\left(|\hat\theta - \theta_0| > \epsilon\right) = 0$.
(b) $\sqrt{T}\,\bar m_T(\theta_0) \Rightarrow N(0, V)$, with $V > 0$.
(c) $\sup_{|\theta - \theta_0| < \delta} \left| \dfrac{d \bar m_T(\theta)}{d\theta} - \dfrac{d E[m(W, \theta_0)]}{d\theta} \right| = o_P(1)$, with $\Gamma \equiv \dfrac{d E[m(W, \theta_0)]}{d\theta} > 0$.

Show that $\sqrt{T}\,(\hat\theta - \theta_0) \Rightarrow N(0, \Gamma^{-1} V \Gamma^{-1})$.
Now suppose that, instead of (c), the following two conditions hold:
(c′) $\sup_{|\theta - \theta_0| < \delta} \left| \dfrac{d E[\bar m_T(\theta)]}{d\theta} - \dfrac{d E[m(W, \theta_0)]}{d\theta} \right| = o_P(1)$, with $\Gamma \equiv \dfrac{d E[m(W, \theta_0)]}{d\theta} > 0$.
(d) $\{\nu_T(\theta) : \theta \in \Theta\}$ is stochastically equicontinuous, i.e., $\sup_{|\theta - \theta_0| < \delta} |\nu_T(\theta) - \nu_T(\theta_0)| = o_P(1)$, where $\nu_T(\cdot) \equiv \frac{1}{\sqrt{T}} \sum_{t=1}^{T} \{m(W_t, \cdot) - E[m(W, \cdot)]\}$.
Show that the same limiting distribution obtains.

¹ If you think you need more regularity conditions to show the results, feel free to add them. However, adding unnecessary regularity conditions will be penalized.
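As a reminder of where the sandwich form comes from, here is a sketch of the standard expansion argument under the smooth conditions (a)-(c); it is an outline only, and differentiability of $\bar m_T$ in a neighborhood of $\theta_0$ is taken for granted where needed. By a mean-value expansion around $\theta_0$,
$$0 = \bar m_T(\hat\theta) = \bar m_T(\theta_0) + \frac{d \bar m_T(\bar\theta)}{d\theta}\,(\hat\theta - \theta_0)$$
for some $\bar\theta$ between $\hat\theta$ and $\theta_0$, so that
$$\sqrt{T}\,(\hat\theta - \theta_0) = -\left[\frac{d \bar m_T(\bar\theta)}{d\theta}\right]^{-1} \sqrt{T}\,\bar m_T(\theta_0).$$
Consistency (a) places $\bar\theta$ inside the $\delta$-neighborhood with probability approaching one, the uniform convergence in (c) then gives $\frac{d \bar m_T(\bar\theta)}{d\theta} \xrightarrow{p} \Gamma$, and combining this with (b) via Slutsky's theorem yields $\sqrt{T}\,(\hat\theta - \theta_0) \Rightarrow N(0, \Gamma^{-1} V \Gamma^{-1})$.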
Question 3. Consider the following panel data relationship for a sample of $N$ individuals over $T = 2$ time periods:
$$E[y_{it} \mid x_{i1}, x_{i2}] \equiv \mu_t(x_{i1}, x_{i2}) = \alpha(x_{i1}, x_{i2}) + \delta \cdot 1\{t = 2\} + g(x_{it})$$
for $i = 1, \ldots, N$ and $t = 1, 2$. Here $y_{it}$ is a scalar dependent variable, $x_{it}$ is a scalar explanatory variable, $\alpha(x_{i1}, x_{i2}) \equiv \alpha_i$ is an individual-specific "fixed effect," $\delta$ is a constant "time effect" for period $t = 2$, and $g(x_{it})$, the object of interest, embodies the contemporaneous effect of $x_{it}$ on the conditional mean of $y_{it}$. Assume the functions $\alpha(\cdot)$ and $g(\cdot)$ are very smooth (i.e., are continuously differentiable of arbitrarily high order), that $\alpha_i$ and $g(x_{it})$ have all moments finite, that $x_{i1}$ and $x_{i2}$ are jointly continuously distributed with positive (and very smooth) density on $\mathbb{R}^2$, and that the data are i.i.d. over the index $i$.
If $N^{1/3}\,(\hat\mu(x_1, x_2) - \mu(x_1, x_2)) = O_p(1)$, what is the rate of convergence of the constructed $\hat\delta$ and $\hat g(x)$?
Show that a $\sqrt{N}$-consistent estimator of $\delta$ exists under this normalization and the moment restriction. Also, construct a kernel estimator of $g(x)$ that converges at the one-dimensional rate $N^{2/5}$, and derive its asymptotic distribution. (You do not need to verify any regularity conditions for your estimator, but should correctly cite existing results on kernel regression.)
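For reference, below is a minimal Nadaraya-Watson sketch of one-dimensional kernel regression with a bandwidth of order $N^{-1/5}$, the choice that delivers the $N^{2/5}$ rate mentioned above. It is a generic illustration on a made-up data-generating process, not the specific construction the question asks for (which has to remove the fixed effect $\alpha_i$ first).

```python
import numpy as np

def nw_regression(x_eval, x, y, h):
    """Nadaraya-Watson estimator of E[y | x = x_eval] with a Gaussian kernel."""
    k = np.exp(-0.5 * ((x - x_eval) / h) ** 2)   # kernel weights K((x_i - x_eval) / h)
    return np.sum(k * y) / np.sum(k)

# Illustrative use on simulated data (hypothetical DGP, not taken from the exam).
rng = np.random.default_rng(0)
N = 2000
x = rng.standard_normal(N)
y = np.sin(x) + 0.5 * rng.standard_normal(N)   # stand-in for g(x) plus noise
h = 1.06 * x.std() * N ** (-1 / 5)             # rule-of-thumb bandwidth, h of order N^(-1/5)
print(nw_regression(0.0, x, y, h))             # pointwise estimate at x = 0
```

With $h \propto N^{-1/5}$, standard kernel-regression results give a pointwise rate of $\sqrt{Nh} \propto N^{2/5}$, which is the benchmark the question refers to.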
Question 4. Suppose $\{y_t : 1 \le t \le T\}$ is an observed time series generated by the model
$$y_t = \rho y_{t-1} + \varepsilon_t,$$
where $y_0 = 0$, $\varepsilon_t \sim$ i.i.d. $N(0, 1)$, and $\rho$ is an unknown parameter of interest. Consider the (unit root) testing problem
$$H_0: \rho = 1 \quad \text{vs.} \quad H_1: \rho < 1.$$
Consider the sample moments
$$\sum_{t=1}^{T} y_{t-1}\,\Delta y_t \qquad \text{and} \qquad \frac{1}{T}\sum_{t=1}^{T} y_{t-1}^{2},$$
and the test statistic
$$\tau_T = \frac{\hat\rho - 1}{1\Big/\sqrt{\sum_{t=1}^{T} y_{t-1}^{2}}}, \qquad \text{where} \quad \hat\rho = \frac{\sum_{t=1}^{T} y_{t-1}\, y_t}{\sum_{t=1}^{T} y_{t-1}^{2}}.$$
Is the converse true?
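For orientation, the following is a minimal Monte Carlo sketch of the null distribution of $\hat\rho$ and $\tau_T$ when $\rho = 1$. The sample size $T = 200$ and the number of replications are illustrative choices, not part of the question.

```python
import numpy as np

# Monte Carlo sketch: null distribution of rho_hat and tau_T under H0: rho = 1.
# T and the number of replications are illustrative assumptions.
rng = np.random.default_rng(0)
T, reps = 200, 5000
rho_hat = np.empty(reps)
tau = np.empty(reps)

for r in range(reps):
    eps = rng.standard_normal(T)
    y = np.cumsum(eps)                         # random walk y_t with y_0 = 0
    ylag = np.concatenate(([0.0], y[:-1]))     # y_{t-1} for t = 1, ..., T
    s_yy = np.sum(ylag ** 2)
    rho_hat[r] = np.sum(ylag * y) / s_yy
    tau[r] = (rho_hat[r] - 1.0) * np.sqrt(s_yy)    # tau_T, using sigma^2 = 1

# Approximate 5% critical value for the one-sided test against H1: rho < 1
print(np.quantile(tau, 0.05))
```

The simulated left-tail quantiles of $\tau_T$ approximate the Dickey-Fuller critical values; under $H_0$ the statistic does not converge to a standard normal.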