14_GMM - Amine Ouazad
Generalized Method of Moments:
Introduction
Amine Ouazad
Assistant Professor of Economics
Outline
1. Introduction:
Moments and moment conditions
2. Generalized method of moments estimator
3. Consistency and asymptotic normality
4. Test for overidentifying restrictions: J stat
5. Implementation (next session).
Next session: a leading application of GMM,
dynamic panel data.
Moments
• A moment of a random variable is the expected
value of a function of that random variable.
– The mean, the standard deviation, the skewness,
and the kurtosis are moments.
– A moment can be a function of multiple
parameters.
• Insight:
– All of the estimation techniques we have seen so
far rely on a moment condition.
Moment conditions
• Estimation of the mean:
– μ satisfies E(yi − μ) = 0
• Estimation of the OLS coefficients:
– Coefficient β satisfies E(xi′(yi − xi β)) = 0
• Estimation of the IV coefficients:
– Coefficient β satisfies E(zi′(yi − xi β)) = 0
• Estimation of the ML parameters:
– Parameter θ satisfies the score equation
E(∂ ln L(yi; θ)/∂θ) = 0
• As many moment conditions as there are
parameters to estimate.
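The four moment conditions above, restated in standard notation (my transcription; E denotes the population expectation):

```latex
% Moment conditions underlying the mean, OLS, IV, and ML estimators.
\begin{align*}
\text{Mean:} \quad & E\!\left[ y_i - \mu \right] = 0 \\
\text{OLS:}  \quad & E\!\left[ x_i'(y_i - x_i\beta) \right] = 0 \\
\text{IV:}   \quad & E\!\left[ z_i'(y_i - x_i\beta) \right] = 0 \\
\text{ML (score):} \quad & E\!\left[ \partial \ln L(y_i;\theta)/\partial\theta \right] = 0
\end{align*}
```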
Method of moments
• The method of moments estimator of μ is the
estimator μ that satisfies the empirical
moment condition:
– (1/N) Σi (yi − μ) = 0
• The method of moments estimator of β in
OLS is the β that satisfies the empirical
moment condition:
– (1/N) Σi xi′(yi − xi β) = 0
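As a minimal illustration (my own sketch, not from the slides; the simulated data and the names `y`, `X`, `mu_mm`, `b_mm` are assumptions), solving these two empirical moment conditions in NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000
X = np.column_stack([np.ones(N), rng.normal(size=N)])  # intercept + one regressor
y = X @ np.array([1.0, 2.0]) + rng.normal(size=N)

# Mean: (1/N) * sum_i (y_i - mu) = 0   =>   mu = sample average
mu_mm = y.mean()

# OLS: (1/N) * sum_i x_i'(y_i - x_i b) = 0   =>   b = (X'X)^{-1} X'y
b_mm = np.linalg.solve(X.T @ X, X.T @ y)
```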
Method of moments
• Similarly for IV and ML.
• The method of moments estimator of β in the
instrumental variables model is the vector β
that satisfies:
– (1/N) Σi zi′(yi − xi β) = 0 (empirical moment condition)
• The method of moments estimator of θ in
maximum likelihood is the vector θ such that:
– (1/N) Σi ∂ ln L(yi; θ)/∂θ = 0.
– The likelihood is maximized at that point.
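A similar sketch for the exactly identified IV case (my own illustration; `Z` is assumed to have the same number of columns as `X`):

```python
import numpy as np

def iv_method_of_moments(y, X, Z):
    """Solve the empirical moment condition (1/N) * sum_i z_i'(y_i - x_i b) = 0
    when Z has as many columns as X: b = (Z'X)^{-1} Z'y."""
    return np.linalg.solve(Z.T @ X, Z.T @ y)
```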
Framework and estimator
• i.i.d. observations (yi, xi, zi).
• K parameters to estimate: θ = (θ1, …, θK).
• L ≥ K moment conditions.
• Empirical moment conditions (written out below).
• The GMM estimator of θ minimizes the GMM
criterion.
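The display for the empirical moment conditions is not reproduced in the transcript; in standard notation (my restatement, with g the L-vector of moment functions) it is:

```latex
% Empirical moments: sample analogue of E[ g(y_i, x_i, z_i; \theta) ] = 0.
\bar{g}_N(\theta) \;=\; \frac{1}{N} \sum_{i=1}^{N} g(y_i, x_i, z_i; \theta)
```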
GMM Criterion
• The GMM estimator minimizes the GMM criterion
shown below,
• or any criterion of the same form,
• where WN is a symmetric, positive definite
weighting matrix.
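The criterion itself is not reproduced in the transcript; in standard notation (my restatement) it is the quadratic form:

```latex
% GMM criterion: quadratic form in the empirical moments, weighted by W_N.
Q_N(\theta) \;=\; \bar{g}_N(\theta)' \, W_N \, \bar{g}_N(\theta),
\qquad
\hat{\theta}_{GMM} \;=\; \arg\min_{\theta} Q_N(\theta)
```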
Assumptions
• Convergence of the empirical moments.
• Identification.
• Asymptotic distribution of the empirical
moments.
Convergence of the
empirical moments
• Satisfied in most cases: mean, OLS, IV, ML.
• Some distributions do not have means, e.g. the Cauchy
distribution. Hence the parameters of a Cauchy
distribution cannot be estimated by the method of moments.
Identification
• Lack of identification if:
– There are fewer moment conditions than parameters.
– There are more moment conditions than parameters and at
least two mutually inconsistent equations.
– There are as many moment conditions as parameters but two
of the equations are equivalent (see the example below).
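A small example of the last case (my own illustration, not from the slides): with two parameters and two equivalent conditions, only the sum of the parameters is identified.

```latex
% Two equivalent moment conditions (the second is twice the first):
E[\, y_i - \theta_1 - \theta_2 \,] = 0,
\qquad
E[\, 2\,(y_i - \theta_1 - \theta_2) \,] = 0
% Any pair with \theta_1 + \theta_2 = E[y_i] satisfies both conditions,
% so \theta_1 and \theta_2 are not separately identified.
```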
Asymptotic distribution
• Satisfied for sample averages such as the OLS
moment and the IV moment, and also for the score
equation in ML (see the session on maximum
likelihood).
GMM estimator is CAN
• Consistent and asymptotically normal (CAN): the
same property as for OLS, IV, and ML.
• Its variance-covariance matrix VGMM is
determined by the variance-covariance matrix
of the moments.
Variance of GMM
• The variance of the GMM estimator is given below.
• Hansen (1982) shows that the weighting matrix that
provides an efficient GMM estimator is also given below.
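The two displays on this slide are not reproduced in the transcript. The standard results (my restatement, writing G = E[∂g/∂θ′] for the Jacobian of the moments and S = Var(g) for their variance) are:

```latex
% Asymptotic variance of the GMM estimator for a given weighting matrix W:
V_{GMM} \;=\; (G'WG)^{-1} \, G'W S W G \, (G'WG)^{-1}
% Hansen (1982): the efficient choice is W = S^{-1}, which reduces this to
V_{GMM}^{\text{eff}} \;=\; (G' S^{-1} G)^{-1}
```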
Two step GMM
• The matrix W is unknown (both for practical
reasons, and because it depends on the
unknown parameters).
1. Estimate the parameter vector θ using
W = the identity matrix.
2. Re-estimate the parameter vector θ using
W = the inverse of the estimated variance-covariance
matrix of the empirical moments.
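A minimal sketch of the two-step procedure for a linear model with instruments (my own implementation, not from the slides; it assumes i.i.d. data so the variance of the moments is estimated by the average outer product of zi times the residual, and all names are hypothetical):

```python
import numpy as np

def two_step_gmm_linear_iv(y, X, Z):
    """Two-step GMM for y = X b + e with instruments Z (L >= K columns).
    Moment functions: g_i(b) = z_i * (y_i - x_i b)."""
    N = len(y)

    # Step 1: W = identity. Minimizing the GMM criterion has a closed form:
    # b1 = argmin_b [Z'(y - Xb)/N]' [Z'(y - Xb)/N]
    b1 = np.linalg.solve(X.T @ Z @ Z.T @ X, X.T @ Z @ Z.T @ y)

    # Step 2: W = inverse of the estimated variance of the moments at b1.
    u = y - X @ b1
    S = (Z * u[:, None]).T @ (Z * u[:, None]) / N  # (1/N) sum_i g_i g_i'
    W = np.linalg.inv(S)
    b2 = np.linalg.solve(X.T @ Z @ W @ Z.T @ X, X.T @ Z @ W @ Z.T @ y)
    return b2
```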
Overidentifying restrictions
• Examples:
– More instruments than endogenous variables.
– More than one moment for the Poisson
distribution (parameterized by the mean only).
– More than 2 moments for the normal distribution
(parameterized by the mean and s.d. only).
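For instance (my own illustration), a Poisson(λ) variable has mean and variance both equal to λ, so two moment conditions are available for a single parameter:

```latex
% Overidentification: two moments, one parameter, for the Poisson(\lambda) model.
E[\, y_i - \lambda \,] = 0,
\qquad
E[\, (y_i - \lambda)^2 - \lambda \,] = 0
```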
Testing for overidentifying restrictions
• With more moments than parameters, if the
moment conditions are all satisfied
asymptotically, then the vector of empirical
moments evaluated at the GMM estimate
converges to 0 in probability, and
• the quadratic form shown below has a χ2
distribution. The number of degrees of freedom
is the rank of the variance-covariance matrix of
these moments, which equals L − K, the number
of overidentifying restrictions.
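In standard notation (my restatement), with θ̂ the efficient two-step GMM estimate and Ŝ the estimated variance of the moments, the statistic is:

```latex
% Hansen's J statistic for overidentifying restrictions.
J \;=\; N \, \bar{g}_N(\hat{\theta})' \, \hat{S}^{-1} \, \bar{g}_N(\hat{\theta})
\;\xrightarrow{d}\; \chi^2(L - K)
```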
Testing for overidentifying restrictions
• With more conditions than parameters, this
gives a test statistic and a p-value.
• Sometimes called the J Statistic.
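A small sketch of how the p-value would be computed (my own snippet; it assumes the statistic `J`, the number of moments `L`, and the number of parameters `K` are already available):

```python
from scipy.stats import chi2

def j_test_pvalue(J, L, K):
    """p-value of the overidentification (J) test: under the null that all
    moment conditions hold, J is asymptotically chi-squared with L - K d.o.f."""
    return chi2.sf(J, df=L - K)
```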