
Random Variable - Econometrics - Past Exam, Exams of Econometrics and Mathematical Economics

Random Variable, Continuously Distributed, Additively or Multiplicatively Separable, Strictly Positive, Continuous Derivatives, Density Function, True Function, Separability Restriction, Random Sample, Kernel Estimator. This exam paper is for Econometrics course.

Typology: Exams

2011/2012

Uploaded on 12/04/2012 by devpad

Econometrics Field Examination
Department of Economics, University of California, Berkeley
August 2010

Instructions: Answer THREE of the following four questions (one hour each).
1. Let $Y$ be a scalar random variable with many finite moments and $X = (X_1, X_2)'$ be a two-dimensional random variable that is continuously distributed. Suppose that economic theory suggests that the conditional expectation $g(x_1, x_2) \equiv E[Y \mid X_1 = x_1, X_2 = x_2]$ is separable in the two components, but you don't know if it is additively or multiplicatively separable, i.e., one of the following two models is correct:

$$M_1:\quad g(x_1, x_2) = g_1(x_1) + g_2(x_2),$$
$$M_2:\quad g(x_1, x_2) = g_1(x_1) \cdot g_2(x_2),$$

where both $g_1$ and $g_2$ are strictly positive with many continuous derivatives. The object of interest is $g_1(x)$ (or perhaps its logarithm) in either case.
(a) For model $M_1$, suppose you are given the function
$$h(x) \equiv \int g(x, u) f(u)\,du,$$
where $f(u)$ is some known density function (with support strictly inside the support of $X_2$). Under what additional restriction(s) on the true function $g_2(x)$ is the function $g_1(x)$ identified? Is this just a normalization, that is, are there always $g_1$ and $g_2$ functions satisfying the restriction(s) for any $g(x_1, x_2)$, or do they rule out certain possible $g(x_1, x_2)$ functions that satisfy the separability restriction?
(b) Repeat the exercise in part (a) for the model $M_2$.
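As a quick numerical illustration of the identification logic in parts (a) and (b) (a sketch, not part of the exam; the component functions, the uniform choice of $f$, and the grid are all our own assumptions): under $M_1$, $h(x) = g_1(x) + \int g_2(u) f(u)\,du$, while under $M_2$, $h(x) = g_1(x)\int g_2(u) f(u)\,du$, so one candidate restriction is to fix the constant $c = \int g_2(u) f(u)\,du$ at a known value, after which $g_1$ is recovered from $h$ by subtraction or division.

```python
import numpy as np

# Hypothetical component functions (our choice, not from the exam);
# both are strictly positive, as the exam requires.
g1 = lambda x: 1.0 + x**2
g2 = lambda u: np.exp(u)

# Known weighting density f: uniform on [0, 1]; midpoint grid for integrals.
m = 100_000
u = (np.arange(m) + 0.5) / m

c = g2(u).mean()   # c = int g2(u) f(u) du, the constant the restriction fixes

for x in (0.3, 1.7):
    h_add = (g1(x) + g2(u)).mean()    # h(x) under M1
    h_mult = (g1(x) * g2(u)).mean()   # h(x) under M2
    assert abs((h_add - c) - g1(x)) < 1e-10   # M1: g1(x) = h(x) - c
    assert abs(h_mult / c - g1(x)) < 1e-10    # M2: g1(x) = h(x) / c

print("identification check passed")
```

Whether fixing $c$ is "just a normalization" or a substantive restriction on the admissible $g_2$ functions is exactly what the exam asks you to argue.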
(c) Suppose that, for a random sample of size $n$, you are given a kernel estimator $\hat{g}(x_1, x_2)$ of $g(x_1, x_2)$, based on a bivariate kernel $K(u_1, u_2)$ and bandwidth sequence $h_n$, that is asymptotically normal, converging at the (undersmoothed) rate $\sqrt{nh^2}$, i.e.,
$$\sqrt{nh^2}\,(\hat{g} - g) \stackrel{d}{\rightarrow} N(0, V_0)$$
for the usual $V_0 > 0$. For model $M_1$ and the restriction(s) derived in part (a), propose a consistent estimator of $g_1(x)$, and indicate its rate of convergence under "suitable regularity conditions." You should not attempt to rigorously derive this convergence rate, but should give some justification of your claim.
(d) Again, repeat the exercise in part (c) for model $M_2$ and the restrictions derived for part (b).
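To make part (c) concrete, here is a hedged simulation sketch of one natural plug-in route under $M_1$; every concrete choice below (the data-generating functions, the Gaussian product kernel, the bandwidth $0.15$, and the uniform $f$ on $[-0.5, 0.5]$) is an illustrative assumption, not the exam's. The idea: average the bivariate kernel estimator against $f$ to estimate $h(x)$, then subtract the normalized constant from part (a).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated additive design (model M1); all choices illustrative.
n = 4000
X1 = rng.uniform(-1.0, 1.0, n)
X2 = rng.uniform(-1.0, 1.0, n)
g1 = lambda x: 1.0 + x**2
g2 = lambda u: np.exp(u)
Y = g1(X1) + g2(X2) + 0.1 * rng.standard_normal(n)

def g_hat(x1, x2, h=0.15):
    """Bivariate Nadaraya-Watson estimate of g(x1, x2), Gaussian product kernel."""
    w = np.exp(-0.5 * ((X1 - x1) ** 2 + (X2 - x2) ** 2) / h**2)
    return (w * Y).sum() / w.sum()

# Known density f: uniform on [-0.5, 0.5], strictly inside the support of X2.
m = 200
u = (np.arange(m) + 0.5) / m - 0.5    # midpoint grid for int ... f(u) du

c0 = g2(u).mean()   # int g2(u) f(u) du -- treated as known here via the
                    # identifying normalization from part (a)

x0 = 0.3
h_hat = np.mean([g_hat(x0, ui) for ui in u])  # plug-in estimate of h(x0)
g1_hat = h_hat - c0                           # estimate of g1(x0) under M1
print(g1_hat, g1(x0))
```

Heuristically, integrating $\hat{g}(x, \cdot)$ against $f$ averages out the estimation noise in the second argument, which is the source of the faster-than-bivariate convergence rate the question asks you to justify; the sketch does not derive that rate.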

2. Suppose $\{(y_t, x_t)' : 1 \le t \le T\}$ is an observed time series generated by the cointegrated system
$$y_t = \beta x_t + u_t,$$
where $(u_t, \Delta x_t)'$ is i.i.d. normal, with initial condition $x_0 = 0$. A researcher wants to estimate the scalar parameter $\beta$, which is assumed to be non-zero. Let $\hat{\beta}$ denote the OLS estimator of $\beta$.

(a) Characterize the limiting distribution (after appropriate centering and rescaling) of $\hat{\beta}$.

(b) Characterize the limiting distribution of the reverse regression estimator
$$\hat{\gamma} = \Big( \sum_{t=1}^{T} y_t^2 \Big)^{-1} \sum_{t=1}^{T} y_t x_t.$$

Let $\tilde{\beta} = 1/\hat{\gamma}$.

(c) Is $\tilde{\beta}$ a consistent estimator of $\beta$?

(d) Characterize the limiting distribution (after appropriate centering and rescaling) of $\tilde{\beta}$. Is $\tilde{\beta}$ asymptotically equivalent to $\hat{\beta}$?
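A small simulation sketch of this setup (assuming, for concreteness, independent standard-normal innovations and $\beta = 2$; the preview does not fully specify the joint error distribution, so these are our own choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed special case: independent standard-normal innovations, beta = 2.
T = 200_000
beta = 2.0
u = rng.standard_normal(T)
x = np.cumsum(rng.standard_normal(T))  # random walk with x_0 = 0, so x_t is I(1)
y = beta * x + u

beta_hat = (x * y).sum() / (x * x).sum()    # direct OLS of y on x
gamma_hat = (y * x).sum() / (y * y).sum()   # reverse regression of x on y
beta_tilde = 1.0 / gamma_hat

# Heuristic for what the simulation shows: sum x_t^2 grows like T^2 while
# sum x_t u_t grows like T, so both estimators converge quickly to beta.
print(beta_hat, beta_tilde)
```

With $T = 200{,}000$ both estimates agree with $\beta = 2$ to several decimal places, consistent with the super-consistency the question is probing; the interesting exam content is the non-standard limiting distribution after rescaling, which the sketch does not derive.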

3. Let $(Y_i, X_i)_{i=1}^{n}$ be a sequence of i.i.d. random variables such that $Y \in \mathbb{R}$ and $X \in \mathbb{R}$. Consider the following model:
$$Y = X\beta(\tau) + U, \qquad (1)$$
$$\Pr_{Y|X}(Y \le X\beta(\tau) \mid X) = \tau, \qquad (2)$$
for some $\tau \in (0, 1)$ and $\beta(\tau) \in \Theta = \mathbb{R}$. Cast this problem as an M-estimation problem $\sum_{i=1}^{n} \rho_\tau(Y_i - X_i\beta)$ using $\rho_\tau(u) = u(\tau - 1\{u \le 0\})$, and let
$$\hat{\beta}(\tau) = \arg\min_{\beta \in \Theta} \sum_{i=1}^{n} \rho_\tau(Y_i - X_i\beta). \qquad (3)$$

To solve the questions below, you might need to assume: $E[X^2] < \infty$; $F_{U|X}(u) \equiv \Pr_{U|X}(U \le u \mid X)$ is continuously differentiable with bounded second derivative; and $\infty > C \ge f_{U|X}(0) \ge c > 0$ for all $U$ and $X$.

(a) Define $Z_n(\delta) \equiv \sum_{i=1}^{n} \rho_\tau(U_i - X_i\delta/\sqrt{n}) - \rho_\tau(U_i)$. Show that (a) $Z_n(\delta)$ is convex, and (b) it has minimizer $\hat{\delta}_n = \sqrt{n}(\hat{\beta}(\tau) - \beta(\tau))$.

(b) Show that, for all $\delta$,
$$Z_{1n}(\delta) \equiv -\frac{1}{\sqrt{n}} \sum_{i=1}^{n} \delta X_i \psi_\tau(U_i) \Rightarrow N\big(0, \delta^2 \tau(1-\tau) D_0\big),$$
with $\psi_\tau(u) = \tau - 1\{u \le 0\}$ and $D_0 = E[X^2]$.

(c) Show that, for all $\delta$,
$$Z_{2n}(\delta) \equiv \sum_{i=1}^{n} \int_0^{X_i\delta/\sqrt{n}} \big(1\{U_i \le s\} - 1\{U_i \le 0\}\big)\,ds = 0.5\, f_{U|X}(0)\,\delta^2 E[X^2] + o_p(1).$$

Hint: Show that the expectation converges to $0.5\, f_{U|X}(0)\,\delta^2 E[X^2]$ and then show that the variance of $Z_{2n}(\delta)$ goes to zero.

(d) Show that
$$\sqrt{n}\,(\hat{\beta}(\tau) - \beta(\tau)) \Rightarrow N\!\left(0, \frac{\tau(1-\tau)}{f_{U|X}^2(0)\, D_0}\right).$$

Hints: (1) You may use Knight's identity:
$$\rho_\tau(u - v) - \rho_\tau(u) = -v\,\psi_\tau(u) + \int_0^{v} \big(1\{u \le s\} - 1\{u \le 0\}\big)\,ds.$$
(2) You may use the following fact without showing it: if, for all $\delta$, $Z_n(\delta) \Rightarrow Z_0(\delta)$, then $\arg\inf Z_n(\delta) \Rightarrow \arg\inf Z_0(\delta)$, provided $Z_n(\delta)$ is convex.
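Two numerical sanity checks (a sketch; the data-generating choices, including the normal-quantile shift $\Phi^{-1}(0.25) \approx -0.6745$, are our own assumptions): Knight's identity from the hint, verified pointwise by midpoint quadrature, and the estimator in (3) computed exactly in this scalar case, where the objective is piecewise linear in $\beta$ with kinks at $Y_i/X_i$, so the minimum is attained at one of those kink points.

```python
import numpy as np

rng = np.random.default_rng(2)

def rho(u, tau):
    """Check function: rho_tau(u) = u * (tau - 1{u <= 0})."""
    return u * (tau - (u <= 0.0))

# --- Knight's identity, verified numerically at a few points (v > 0) ---
def knight_gap(u, v, tau, m=200_000):
    """|rho(u-v) - rho(u) - (-v*psi(u) + int_0^v (1{u<=s} - 1{u<=0}) ds)|."""
    psi = tau - (u <= 0.0)
    ds = v / m
    s = (np.arange(m) + 0.5) * ds                       # midpoint grid on (0, v)
    integral = ((u <= s).astype(float) - float(u <= 0.0)).sum() * ds
    return abs(rho(u - v, tau) - rho(u, tau) - (-v * psi + integral))

for u0, v0, t0 in [(0.7, 1.3, 0.25), (-0.4, 0.9, 0.5), (0.2, 0.1, 0.8)]:
    assert knight_gap(u0, v0, t0) < 1e-4

# --- Scalar quantile regression via the kink points of the objective ---
n, tau, beta_true = 2000, 0.25, 1.0
X = rng.uniform(0.5, 1.5, n)
U = rng.standard_normal(n) + 0.6744897501960817  # shifts tau-quantile of U to 0
Y = X * beta_true + U

b_cand = Y / X                                   # all kink points of the objective
obj = rho(Y[None, :] - X[None, :] * b_cand[:, None], tau).sum(axis=1)
beta_hat = b_cand[np.argmin(obj)]
print(beta_hat)
```

The estimate lands close to $\beta(\tau) = 1$ at the $\sqrt{n}$ rate the question derives; the convexity of each $\rho_\tau(Y_i - X_i\beta)$ in $\beta$, used in part (a), is what guarantees the kink-point search finds the global minimum.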