Extremum Estimation and Numerical Derivatives

Han Hong (Stanford University)

May 10, 2011, 15:30–17:00

Toulouse

Room MF323

Econometrics Seminar

Abstract

Empirical researchers routinely rely on finite-difference approximations to evaluate derivatives of estimated functions. For instance, commonly used optimization routines implicitly use finite-difference formulas for gradient calculations. This paper investigates the statistical properties of numerically evaluated gradients and of extremum estimators computed using numerical gradients. We find, first, that the step size or tolerance parameter needs to be adjusted as a function of the sample size. Second, higher-order finite-difference formulas reduce the asymptotic bias, analogously to higher-order kernels. Third, we provide weak sufficient conditions for uniform consistency of finite-difference approximations to gradients and directional derivatives. Fourth, we analyze numerical-gradient-based extremum estimators and find that the asymptotic distribution of the resulting estimators may depend on the sequence of step sizes. Fifth, we state conditions under which the numerical derivative estimator is consistent and asymptotically normal. Sixth, we generalize our results to semiparametric estimation problems. Finally, we show that the theory is also useful in a range of nonstandard estimation procedures.
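As background for the objects the abstract refers to (not part of the original announcement): the two-sided finite-difference approximation to a derivative is (f(x+h) - f(x-h)) / (2h), with bias of order h^2, while the four-point formula (-f(x+2h) + 8f(x+h) - 8f(x-h) + f(x-2h)) / (12h) has bias of order h^4, analogous to moving from a second-order to a higher-order kernel. The Python sketch below illustrates the idea of tying the step size to the sample size; the function names and the rate n^(-1/6) are illustrative assumptions, not taken from the paper, which derives the admissible rates.

    import numpy as np

    def numerical_gradient(f, theta, h):
        # Two-sided finite-difference gradient of f at theta, step size h.
        theta = np.asarray(theta, dtype=float)
        grad = np.empty_like(theta)
        for i in range(theta.size):
            e = np.zeros_like(theta)
            e[i] = h
            grad[i] = (f(theta + e) - f(theta - e)) / (2.0 * h)
        return grad

    # Illustrative only: shrink the step size with the sample size n.
    n = 10_000
    h_n = n ** (-1.0 / 6.0)
    g = numerical_gradient(lambda t: np.sum(t ** 2), np.array([1.0, 2.0]), h_n)
    # g is approximately [2.0, 4.0], the analytic gradient of sum(t**2).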

JEL codes

  • C14: Semiparametric and Nonparametric Methods: General
  • C52: Model Evaluation, Validation, and Selection