In least squares estimation problems, sometimes one or more regressors specified in the model are not observable. One way to circumvent this issue is to estimate or generate regressors from observable data.[1] This generated regressor method is also applicable to unobserved instrumental variables. Under some regularity conditions, consistency and asymptotic normality of least squares estimator is preserved, but asymptotic variance has a different form in general.
Suppose the model of interest is the following:

[math]\displaystyle{ y_{i}=g(x_{1i},x_{2i},\beta)+u_{i} }[/math]

where g is a conditional mean function whose form is known up to the finite-dimensional parameter β. Here [math]\displaystyle{ x_{2i} }[/math] is not observable, but we know that [math]\displaystyle{ x_{2i}=h(w_{i},\gamma) }[/math] for some function h known up to the parameter [math]\displaystyle{ \gamma }[/math], and a random sample [math]\displaystyle{ \{y_{i},x_{1i},w_{i}\}_{i=1}^{n} }[/math] is available. Suppose also that we have a consistent estimator [math]\displaystyle{ \hat\gamma }[/math] of [math]\displaystyle{ \gamma }[/math] constructed from the observations [math]\displaystyle{ w_{i} }[/math]. Then β can be estimated by (non-linear) least squares using the generated regressor [math]\displaystyle{ \hat{x}_{2i}=h(w_{i},\hat\gamma) }[/math]. Some examples of the above setup include Anderson et al. (1976)[2] and Barro (1977).[3]
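As a rough illustration (not from the source), the two-step procedure can be sketched in Python for a linear special case. The data-generating process, the auxiliary noisy measurement used to estimate γ in the first step, and all numerical values below are assumptions of this sketch:

```python
import numpy as np

# Minimal sketch of two-step estimation with a generated regressor.
# Hypothetical DGP: x2 = h(w, gamma) = gamma * w is unobserved, and
# y = b1*x1 + b2*x2 + u. All numbers are illustrative assumptions.
rng = np.random.default_rng(0)
n = 5000
gamma_true, b1_true, b2_true = 0.5, 1.0, 2.0

w = rng.normal(size=n)
x1 = rng.normal(size=n)
x2 = gamma_true * w                  # x2 = h(w, gamma); unobserved in practice
y = b1_true * x1 + b2_true * x2 + rng.normal(size=n)

# Step 1: consistent estimate of gamma. Here we assume an auxiliary noisy
# measurement of x2 is available and regress it on w (an assumption of
# this sketch; any consistent first-step estimator would do).
x2_noisy = x2 + 0.1 * rng.normal(size=n)
gamma_hat = (w @ x2_noisy) / (w @ w)

# Step 2: form the generated regressor and run least squares.
x2_hat = gamma_hat * w               # \hat{x}_{2i} = h(w_i, \hat{gamma})
X = np.column_stack([x1, x2_hat])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)                      # close to (b1_true, b2_true)
```

With a consistent first step, the second-step estimates converge to the true coefficients, which is the consistency claim of the text in this simple linear case.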
This problem falls into the framework of the two-step M-estimator, so consistency and asymptotic normality of the estimator can be verified using the general theory of two-step M-estimators.[4] As in the general two-step M-estimator problem, the asymptotic variance of a generated-regressor estimator usually differs from that of the estimator with all regressors observed. Yet, in some special cases, the asymptotic variances of the two estimators are identical. To give one such example, consider the setting in which the regression function is linear in the parameter and the unobserved regressor is a scalar. Denoting the coefficient on the unobserved regressor by [math]\displaystyle{ \delta }[/math], if [math]\displaystyle{ \delta=0 }[/math] and [math]\displaystyle{ E[\nabla_{\gamma} h(W,\gamma) U]=0 }[/math], then the asymptotic variance is the same whether or not the regressor is observed.[4]
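The δ = 0 case can be checked with a small Monte Carlo sketch (the data-generating process and all numbers are assumptions of this illustration, not from the source). Since the first-step error is mean-independent of U here, the condition [math]\displaystyle{ E[\nabla_{\gamma} h(W,\gamma) U]=0 }[/math] holds, and the sampling variability of the coefficient on the observed regressor should be roughly the same whether x2 is observed or generated:

```python
import numpy as np

# Monte Carlo sketch (hypothetical DGP): with delta = 0, the spread of the
# estimated coefficient on x1 should match across the "observed x2" and
# "generated x2" regressions. All values are illustrative assumptions.
rng = np.random.default_rng(1)
n, B = 500, 2000
gamma_true = np.array([0.7, -0.4])
delta = 0.0                           # coefficient on the unobserved regressor

b1_obs, b1_gen = [], []
for _ in range(B):
    w = rng.normal(size=(n, 2))
    x1 = rng.normal(size=n)
    x2 = w @ gamma_true               # x2 = h(w, gamma), unobserved in practice
    y = 1.0 * x1 + delta * x2 + rng.normal(size=n)

    # First step: gamma estimated from a noisy auxiliary measurement of x2
    # (an assumption made purely for this illustration).
    gamma_hat = np.linalg.lstsq(w, x2 + 0.5 * rng.normal(size=n), rcond=None)[0]

    for x2_used, store in ((x2, b1_obs), (w @ gamma_hat, b1_gen)):
        X = np.column_stack([x1, x2_used])
        b = np.linalg.lstsq(X, y, rcond=None)[0]
        store.append(b[0])            # coefficient on x1

print(np.std(b1_obs), np.std(b1_gen))  # approximately equal when delta = 0
```

Repeating the exercise with a nonzero δ would generally show the generated-regressor estimator with a different (typically larger) spread, in line with the general case described above.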
With minor modifications in the model, the above formulation is also applicable to instrumental variable estimation. Suppose the model of interest is linear in the parameter, the error term is correlated with some of the regressors, and the model specifies instrumental variables that are not observable but have the representation [math]\displaystyle{ z_{i}=h(w_{i},\gamma) }[/math]. If a consistent estimator [math]\displaystyle{ \hat\gamma }[/math] of [math]\displaystyle{ \gamma }[/math] is available, the parameter of interest can be estimated by IV using [math]\displaystyle{ \hat z_{i}= h(w_{i},\hat\gamma) }[/math] as instruments. As in the case above, consistency and asymptotic normality follow under mild conditions, and the asymptotic variance generally has a different form than in the observed-IV case. Yet there are cases in which the two estimators have the same asymptotic variance; one such case occurs if [math]\displaystyle{ E[\nabla_{\gamma} h(W,\gamma)]=0 }[/math].[4] In this special case, inference on the estimated parameter can be conducted with the usual IV standard error estimator.
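A generated-instrument sketch, under the same kind of illustrative assumptions as before (the DGP, the noisy first-step measurement of z, and all numbers are hypothetical), might look as follows. The regressor x is endogenous, the valid instrument z = h(w, γ) is unobserved, and the IV estimator uses the generated instrument ẑ:

```python
import numpy as np

# Sketch of IV estimation with a generated instrument. The valid instrument
# is z = h(w, gamma) = w @ gamma, unobserved; gamma is estimated first.
# All values below are illustrative assumptions, not from the source.
rng = np.random.default_rng(2)
n = 5000
gamma_true = np.array([0.8, 0.5])
beta_true = 1.5

w = rng.normal(size=(n, 2))
z = w @ gamma_true                    # unobserved instrument
v = rng.normal(size=n)
u = 0.7 * v + rng.normal(size=n)      # error correlated with v -> endogeneity
x = z + v                             # regressor correlated with the error u
y = beta_true * x + u

# First step: estimate gamma from a noisy measurement of z
# (an assumption of this sketch).
gamma_hat = np.linalg.lstsq(w, z + 0.3 * rng.normal(size=n), rcond=None)[0]
z_hat = w @ gamma_hat                 # generated instrument

beta_iv = (z_hat @ y) / (z_hat @ x)   # IV with the generated instrument
beta_ols = (x @ y) / (x @ x)          # OLS, inconsistent here
print(beta_iv, beta_ols)              # beta_iv near 1.5; beta_ols biased up
```

OLS is inconsistent because cov(x, u) > 0, while the generated instrument remains valid: ẑ is a function of w alone, so it is uncorrelated with u while still being correlated with x.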
Original source: https://en.wikipedia.org/wiki/Generated_regressor.