Topics: Random Effects Linear Model, Error Components Model, Notation, Regression Model Orthogonality, Convergence of Moments, Mechanics, Cornwell and Rupert Data, Feasible GLS (Econometric Analysis of Panel Data).
5. Random Effects Linear Model
The Random Effects Model
y_it = x_it′β + c_i + ε_it,   observation for person i at time t
y_i  = X_i β + c_i i + ε_i,   T_i observations in group i
     = X_i β + c_i + ε_i,     note c_i = (c_i, c_i, ..., c_i)′
y    = Xβ + c + ε,            Σ_{i=1}^N T_i observations in the sample
       c = (c_1′, c_2′, ..., c_N′)′,  a Σ_{i=1}^N T_i by 1 vector
E[c_i | X_i] = 0
E[ε_it | X_i, c_i] = 0
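As a concrete illustration of this error components structure, here is a minimal numpy sketch that simulates a balanced panel under the model above; the sample sizes, coefficient vector, and variance parameters (N, T, beta, sigma_u, sigma_e) are arbitrary choices, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, K = 100, 5, 3                  # groups, periods per group, regressors (incl. constant)
beta = np.array([1.0, 0.5, -0.25])   # hypothetical coefficient vector
sigma_u, sigma_e = 0.8, 0.4          # assumed std. deviations of c_i and of eps_it

X = np.column_stack([np.ones(N * T), rng.normal(size=(N * T, K - 1))])
c = np.repeat(rng.normal(scale=sigma_u, size=N), T)   # c_i repeated over the T rows of group i
eps = rng.normal(scale=sigma_e, size=N * T)
y = X @ beta + c + eps               # y = X beta + c + eps, with E[c_i | X_i] = 0 by construction
```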
Notation
y_1 = X_1 β + c_1 i + ε_1 = X_1 β + u_1,   T_1 observations
y_2 = X_2 β + c_2 i + ε_2 = X_2 β + u_2,   T_2 observations
...
y_N = X_N β + c_N i + ε_N = X_N β + u_N,   T_N observations
y   = Xβ + u,   Σ_{i=1}^N T_i observations
    = Xβ + w
In all that follows, except where explicitly noted, X, X_i and x_it contain a constant term as the first element. To avoid notational clutter, in those cases, x_it etc. will simply denote the counterpart without the constant term. Use of the symbol K for the number of variables will thus be context specific but will usually include the constant term.
Notation
Var[c_i i + ε_i] = | σ_ε²+σ_u²   σ_u²        ...   σ_u²      |
                   | σ_u²        σ_ε²+σ_u²   ...   σ_u²      |
                   | ...                                     |
                   | σ_u²        σ_u²        ...   σ_ε²+σ_u² |
                 = σ_ε² I_{T_i} + σ_u² ii′ = Ω_i

Var[w | X] = | Ω_1   0     ...   0   |
             | 0     Ω_2   ...   0   |
             | ...                   |
             | 0     0     ...   Ω_N |  = Ω
(Note the blocks Ω_i differ only in the dimension T_i.)
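A short numpy sketch of this covariance structure: it builds one equicorrelated block Ω_i and stacks identical blocks into Ω for a balanced panel. The dimensions and variance values are illustrative assumptions.

```python
import numpy as np

N, T = 4, 5                          # illustrative panel dimensions
sigma_u, sigma_e = 0.8, 0.4          # assumed variance components (std. deviations)
i = np.ones((T, 1))
Omega_i = sigma_e**2 * np.eye(T) + sigma_u**2 * (i @ i.T)   # T x T block: sigma_e^2 I + sigma_u^2 ii'
Omega = np.kron(np.eye(N), Omega_i)                         # block-diagonal Var[w | X], all T_i equal
```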
Convergence of Moments
X′X / Σ_i T_i  = Σ_{i=1}^N f_i (X_i′X_i / T_i),      a weighted sum of individual moment matrices
X′ΩX / Σ_i T_i = Σ_{i=1}^N f_i (X_i′Ω_i X_i / T_i),  a weighted sum of individual moment matrices
               = σ_ε² Σ_{i=1}^N f_i (X_i′X_i / T_i) + σ_u² Σ_{i=1}^N f_i T_i x̄_i x̄_i′
where f_i = T_i / Σ_i T_i.
Note asymptotics are with respect to N. Each matrix X_i′X_i / T_i is the moments for the T_i observations. Should be 'well behaved' in micro level data. The average of N such matrices should be likewise. T or T_i is assumed to be fixed (and small).
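The weighted-sum identity for the moment matrix can be checked numerically; the sketch below uses an unbalanced panel with made-up group sizes and regressors.

```python
import numpy as np

rng = np.random.default_rng(1)
T_i = np.array([4, 6, 5, 7])        # unbalanced, made-up group sizes
X_blocks = [np.column_stack([np.ones(t), rng.normal(size=(t, 2))]) for t in T_i]

sum_T = T_i.sum()
lhs = sum(Xi.T @ Xi for Xi in X_blocks) / sum_T                           # X'X / sum_i T_i
rhs = sum((t / sum_T) * (Xi.T @ Xi / t) for Xi, t in zip(X_blocks, T_i))  # sum_i f_i (X_i'X_i / T_i)
assert np.allclose(lhs, rhs)
```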
Random vs. Fixed Effects
Estimating the Variance for OLS
Var[b | X] = (1/Σ_i T_i) (X′X/Σ_i T_i)^{-1} (X′ΩX/Σ_i T_i) (X′X/Σ_i T_i)^{-1}
X′ΩX/Σ_i T_i = Σ_{i=1}^N f_i (X_i′Ω_i X_i / T_i),  where Ω_i = E[w_i w_i′ | X_i]
In the spirit of the White estimator, use
X′ΩX/Σ_i T_i ≈ Σ_{i=1}^N f_i (X_i′ŵ_i ŵ_i′ X_i / T_i),   ŵ_i = y_i - X_i b
Hypothesis tests are then based on Wald statistics.
THIS IS THE 'CLUSTER' ESTIMATOR.
Est.Var[b | X] = (X′X)^{-1} [Σ_{i=1}^N (X_i′ŵ_i)(ŵ_i′X_i)] (X′X)^{-1}
ŵ_i = set of T_i OLS residuals for individual i
X_i = T_i × K data on exogenous variables for individual i
X_i′ŵ_i = K × 1 vector of products
(X_i′ŵ_i)(ŵ_i′X_i) = K × K matrix (rank 1, outer product)
Σ_{i=1}^N (X_i′ŵ_i)(ŵ_i′X_i) = sum of N rank 1 matrices; rank ≤ K.
We could compute this as Σ_{i=1}^N X_i′ ŵ_i ŵ_i′ X_i = Σ_{i=1}^N X_i′ Ω̂_i X_i. Why not do it that way?
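A minimal numpy version of this cluster estimator, computed group by group exactly as laid out above, using simulated data; no finite-sample (degrees-of-freedom) correction is applied, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, K = 50, 5, 3
beta = np.array([1.0, 0.5, -0.25])                     # hypothetical coefficients
X_blocks = [np.column_stack([np.ones(T), rng.normal(size=(T, K - 1))]) for _ in range(N)]
y_blocks = [Xi @ beta + rng.normal(scale=0.8) + rng.normal(scale=0.4, size=T)
            for Xi in X_blocks]                        # group effect + idiosyncratic noise

X = np.vstack(X_blocks)
y = np.concatenate(y_blocks)
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ (X.T @ y)                                # pooled OLS coefficients

meat = np.zeros((K, K))
for Xi, yi in zip(X_blocks, y_blocks):
    w_i = yi - Xi @ b                                  # T_i OLS residuals for group i
    g_i = Xi.T @ w_i                                   # K x 1 vector X_i' w_i
    meat += np.outer(g_i, g_i)                         # rank-1 outer product, summed over groups
cluster_vcov = XtX_inv @ meat @ XtX_inv                # Est.Var[b | X], the 'cluster' estimator
```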
OLS Results
+----------------------------------------------------+
| Residuals   Sum of squares       =    522.2008     |
|             Standard error of e  =    .3544712     |
| Fit         R-squared            =    .4112099     |
|             Adjusted R-squared   =    .4100766     |
+----------------------------------------------------+
+---------+--------------+----------------+--------+---------+-----------+
|Variable | Coefficient  | Standard Error |b/St.Er.|P[|Z|>z] | Mean of X |
+---------+--------------+----------------+--------+---------+-----------+
|Constant |  5.40159723  |  .04838934     | 111.628|  .0000  |           |
|EXP      |   .04084968  |  .00218534     |  18.693|  .0000  |           |
|EXPSQ    |  -.00068788  |  .480428D-04   | -14.318|  .0000  | 514.405042|
|OCC      |  -.13830480  |  .01480107     |  -9.344|  .0000  |  .51116447|
|SMSA     |   .14856267  |  .01206772     |  12.311|  .0000  |           |
|MS       |   .06798358  |  .02074599     |   3.277|  .0010  |  .81440576|
|FEM      |  -.40020215  |  .02526118     | -15.843|  .0000  |           |
|UNION    |   .09409925  |  .01253203     |   7.509|  .0000  |           |
|ED       |   .05812166  |  .00260039     |  22.351|  .0000  | 12.8453782|
+---------+--------------+----------------+--------+---------+-----------+
Panel Data Algebra (1)
Ω_i = σ_ε² I + σ_u² ii′,  depends on 'i' because it is T_i × T_i
    = σ_ε² [I + ρ ii′],   ρ = σ_u² / σ_ε²
    = σ_ε² [A + bb′],     A = I,  b = √ρ i
Using (A-66) in Greene (p. 822),  [A + bb′]^{-1} = A^{-1} - (1/(1 + b′A^{-1}b)) A^{-1}bb′A^{-1},  so
Ω_i^{-1} = (1/σ_ε²) [I - (ρ/(1 + T_i ρ)) ii′]
         = (1/σ_ε²) [I - (σ_u²/(σ_ε² + T_i σ_u²)) ii′]
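The inverse formula is easy to verify numerically; the check below uses arbitrary values for T, σ_u, and σ_ε.

```python
import numpy as np

T, sigma_u, sigma_e = 5, 0.8, 0.4    # arbitrary illustrative values
i = np.ones((T, 1))
Omega = sigma_e**2 * np.eye(T) + sigma_u**2 * (i @ i.T)
Omega_inv = (1.0 / sigma_e**2) * (
    np.eye(T) - (sigma_u**2 / (sigma_e**2 + T * sigma_u**2)) * (i @ i.T)
)
assert np.allclose(Omega @ Omega_inv, np.eye(T))       # confirms the closed-form inverse
```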
Panel Data Algebra (2)
(Based on Wooldridge p. 286.)
Ω_i = σ_ε² I + σ_u² ii′.   Let P_D = i(i′i)^{-1}i′ = ii′/T and M_D = I - P_D.
(Note P_D P_D = P_D, M_D M_D = M_D, P_D M_D = 0.)
Ω_i = σ_ε² M_D + (σ_ε² + Tσ_u²) P_D
    = (σ_ε² + Tσ_u²) [P_D + η M_D],   η = σ_ε² / (σ_ε² + Tσ_u²)
Ω_i^{-1} = (σ_ε² + Tσ_u²)^{-1} [P_D + (1/η) M_D]   (Prove by multiplying.)
Ω_i^{-1/2} = (σ_ε² + Tσ_u²)^{-1/2} S_i,   S_i = P_D + (1/√η) M_D,   S_i S_i = P_D + (1/η) M_D
σ_ε Ω_i^{-1/2} = I - θ (ii′/T),   θ = 1 - √η = 1 - σ_ε / √(σ_ε² + Tσ_u²)
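A numerical check, with illustrative parameter values, that σ_ε Ω_i^{-1/2} = I - θ ii′/T; here Ω_i^{-1/2} is computed by eigendecomposition rather than through the P_D/M_D identity it is meant to confirm.

```python
import numpy as np

T, sigma_u, sigma_e = 5, 0.8, 0.4    # arbitrary illustrative values
i = np.ones((T, 1))
P_D = (i @ i.T) / T                  # projection onto the group mean
Omega = sigma_e**2 * np.eye(T) + sigma_u**2 * (i @ i.T)

# Symmetric inverse square root of Omega via eigendecomposition
evals, evecs = np.linalg.eigh(Omega)
Omega_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T

theta = 1.0 - sigma_e / np.sqrt(sigma_e**2 + T * sigma_u**2)
assert np.allclose(sigma_e * Omega_inv_sqrt, np.eye(T) - theta * P_D)
```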
GLS (cont.)
GLS is equivalent to OLS regression of
   y_it* = y_it - θ_i ȳ_i.   on   x_it* = x_it - θ_i x̄_i. ,
where θ_i = 1 - √( σ_ε² / (σ_ε² + T_i σ_u²) ).
Asy.Var[β̂] = [X′Ω^{-1}X]^{-1} = σ_ε² [X*′X*]^{-1}
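A sketch of this equivalence for a balanced panel with known variance components: quasi-demean y and x with θ and run OLS on the transformed data. The simulated data and parameter values are illustrative, and in practice θ would be built from estimated variances (feasible GLS).

```python
import numpy as np

rng = np.random.default_rng(3)
N, T, K = 100, 5, 3                  # balanced panel, illustrative sizes
sigma_u, sigma_e = 0.8, 0.4          # treated as known here; FGLS would estimate them
beta = np.array([1.0, 0.5, -0.25])
groups = np.repeat(np.arange(N), T)

X = np.column_stack([np.ones(N * T), rng.normal(size=(N * T, K - 1))])
y = X @ beta + rng.normal(scale=sigma_u, size=N)[groups] + rng.normal(scale=sigma_e, size=N * T)

theta = 1.0 - sigma_e / np.sqrt(sigma_e**2 + T * sigma_u**2)
ybar = np.array([y[groups == g].mean() for g in range(N)])          # group means of y
xbar = np.array([X[groups == g].mean(axis=0) for g in range(N)])    # group means of x
y_star = y - theta * ybar[groups]                                   # quasi-demeaned y
X_star = X - theta * xbar[groups]                                   # quasi-demeaned x (incl. constant)
beta_gls = np.linalg.lstsq(X_star, y_star, rcond=None)[0]           # OLS on transformed data = GLS
```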
Estimators for the Variances
y_it = x_it′β + ε_it + u_i
With a consistent estimator of β, say b_OLS,
   Σ_{i=1}^N Σ_{t=1}^T (y_it - x_it′b_OLS)²  estimates  Σ_{i=1}^N Σ_{t=1}^T (σ_u² + σ_ε²).
Divide by something to estimate σ² = σ_u² + σ_ε².
With the LSDV estimates, a_i and b_LSDV,
   Σ_{i=1}^N Σ_{t=1}^T (y_it - a_i - x_it′b_LSDV)²  estimates  Σ_{i=1}^N Σ_{t=1}^T σ_ε².
Divide by something to estimate σ_ε².
Estimate σ_u² with the estimate of (σ_u² + σ_ε²) minus σ̂_ε².
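A simple numpy sketch of this two-step recipe on simulated data: pooled OLS residuals estimate σ_u² + σ_ε², within-group (LSDV) residuals estimate σ_ε², and the difference estimates σ_u². The degrees-of-freedom choices used for "divide by something" are one reasonable option, not the only one.

```python
import numpy as np

rng = np.random.default_rng(4)
N, T, K = 200, 5, 3                  # illustrative panel dimensions
sigma_u, sigma_e = 0.8, 0.4          # true values, used only to simulate the data
beta = np.array([1.0, 0.5, -0.25])
groups = np.repeat(np.arange(N), T)

X = np.column_stack([np.ones(N * T), rng.normal(size=(N * T, K - 1))])
y = X @ beta + rng.normal(scale=sigma_u, size=N)[groups] + rng.normal(scale=sigma_e, size=N * T)

# Pooled OLS: residual variance estimates sigma_u^2 + sigma_e^2
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
e_ols = y - X @ b_ols
s2_total = e_ols @ e_ols / (N * T - K)

# Within (LSDV) regression: demean within groups; residual variance estimates sigma_e^2
ybar = np.array([y[groups == g].mean() for g in range(N)])
xbar = np.array([X[groups == g].mean(axis=0) for g in range(N)])
y_w = y - ybar[groups]
X_w = (X - xbar[groups])[:, 1:]      # the demeaned constant column is zero, so drop it
b_w = np.linalg.lstsq(X_w, y_w, rcond=None)[0]
e_w = y_w - X_w @ b_w
s2_e = e_w @ e_w / (N * (T - 1) - (K - 1))

s2_u = max(s2_total - s2_e, 0.0)     # difference estimates sigma_u^2 (truncated at zero)
```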