As described in the short tutorial, we can obtain standard errors for the GATEs by estimating appropriate linear models via OLS. Then, under an "honesty" condition, we can use the estimated standard errors to conduct valid inference about the GATEs as usual, e.g., by constructing conventional confidence intervals. In this article, we discuss which linear models are estimated by the inference_aggtree function.
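For concreteness, a conventional 95% confidence interval takes the usual textbook form (this is the standard construction, nothing specific to the package; $\hat{\beta}_l$ denotes the OLS estimate of the GATE in the $l$-th leaf, with the notation introduced below):

$$\hat{\beta}_l \pm 1.96 \cdot \widehat{\mathrm{SE}}(\hat{\beta}_l).$$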
As mentioned above, we require an honesty condition to achieve valid inference about the GATEs. Honesty is a sample-splitting technique that requires that different observations be used to form the subgroups and to estimate the GATEs. For this reason, inference_aggtree always uses the honest sample to estimate the linear models below, unless we called the build_aggtree function without allocating any observations to the honest sample (e.g., by setting honest_frac = 0 or by using a vector of FALSEs for the is_honest argument).
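As a minimal sketch of how the honest sample is controlled, assuming the build_aggtree interface from the short tutorial (outcome Y, binary treatment D, and covariates X as first arguments; the simulated data and the exact call pattern below are assumptions for illustration):

```r
library(aggTrees)

set.seed(1989)

## Toy data: randomized treatment with heterogeneous effects.
n <- 1000
X <- matrix(rnorm(n * 3), ncol = 3)
D <- rbinom(n, 1, 0.5)
Y <- X[, 1] + D * (1 + X[, 2]) + rnorm(n)

## Default: half of the observations go to the honest sample, which
## inference_aggtree then uses to estimate the linear models below.
groupings <- build_aggtree(Y, D, X, honest_frac = 0.5)

## No honest sample: set honest_frac = 0 ...
groupings_adaptive <- build_aggtree(Y, D, X, honest_frac = 0)

## ... or flag no observations as honest via is_honest.
groupings_adaptive2 <- build_aggtree(Y, D, X, is_honest = rep(FALSE, n))
```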
When calling the build_aggtree function, the user can control the GATE estimator employed by the routine by setting the method argument. The inference_aggtree function inherits this argument and selects the model to be estimated accordingly.
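A hedged sketch of how the two estimators are selected at the build step and then inherited at the inference step (reusing the simulated Y, D, X from the sketch above; the n_groups argument of inference_aggtree is an assumption based on the short tutorial):

```r
## GATE estimator chosen at the build step via `method`.
groupings_raw  <- build_aggtree(Y, D, X, method = "raw")   # differences in means
groupings_aipw <- build_aggtree(Y, D, X, method = "aipw")  # doubly-robust scores

## inference_aggtree inherits `method` from the object it receives and
## estimates the corresponding linear model (see below).
results_raw  <- inference_aggtree(groupings_raw,  n_groups = 4)
results_aipw <- inference_aggtree(groupings_aipw, n_groups = 4)
```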
If method is set to "raw", the inference_aggtree function estimates via OLS the following linear model:

$$Y_i = \sum_{l = 1}^{|\mathcal{T}|} L_{i, l} \, \gamma_l + \sum_{l = 1}^{|\mathcal{T}|} L_{i, l} \, D_i \, \beta_l + \epsilon_i,$$

with $|\mathcal{T}|$ the number of leaves of a particular tree $\mathcal{T}$, and $L_{i, l}$ a dummy variable equal to one if the $i$-th unit falls in the $l$-th leaf of $\mathcal{T}$. Exploiting the random assignment to treatment, we can show that each $\beta_l$ identifies the GATE in the $l$-th leaf. Under honesty, the OLS estimator of $\beta_l$ is root-$n$ consistent and asymptotically normal.
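To see what this regression looks like in practice, here is a hand-rolled sketch of the "raw" specification using lm() on a toy honest sample; the leaf assignments are simulated rather than taken from an actual tree, so this only mirrors the form of the model above, not the package's internal code:

```r
set.seed(1989)

## Toy honest sample with three leaves and leaf-specific effects.
n_hon <- 500
leaf  <- factor(sample(1:3, n_hon, replace = TRUE))   # L_{i,l}: leaf dummies
D_hon <- rbinom(n_hon, 1, 0.5)                        # randomized treatment
tau   <- c(0.5, 1, 2)[leaf]                           # true GATEs by leaf
Y_hon <- 1 + tau * D_hon + rnorm(n_hon)

## Y_i = sum_l L_{i,l} gamma_l + sum_l L_{i,l} D_i beta_l + e_i.
## The coefficients on the leaf:treatment interactions are the GATEs.
fit_raw <- lm(Y_hon ~ 0 + leaf + leaf:D_hon)
summary(fit_raw)$coefficients
```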
If method is set to "aipw", the inference_aggtree function estimates via OLS the following linear model:

$$\widehat{\Gamma}_i = \sum_{l = 1}^{|\mathcal{T}|} L_{i, l} \, \beta_l + \epsilon_i,$$

where $\widehat{\Gamma}_i$ writes as:

$$\widehat{\Gamma}_i = \hat{\mu}(X_i, 1) - \hat{\mu}(X_i, 0) + \frac{D_i \left[ Y_i - \hat{\mu}(X_i, 1) \right]}{\hat{\pi}(X_i)} - \frac{(1 - D_i) \left[ Y_i - \hat{\mu}(X_i, 0) \right]}{1 - \hat{\pi}(X_i)},$$

with $\mu(X_i, d) = \mathbb{E}[Y_i \mid X_i, D_i = d]$ the conditional mean of $Y_i$ and $\pi(X_i) = \mathbb{P}(D_i = 1 \mid X_i)$ the propensity score.
The doubly-robust scores $\widehat{\Gamma}_i$ are inherited from the output of the build_aggtree function.
As before, we can show that each $\beta_l$ identifies the GATE in the $l$-th leaf, this time even in observational studies. Under honesty, the OLS estimator of $\beta_l$ is root-$n$ consistent and asymptotically normal, provided that the $\widehat{\Gamma}_i$ are cross-fitted in the honest sample and that the product of the convergence rates of the estimators of the nuisance functions $\mu(\cdot, \cdot)$ and $\pi(\cdot)$ is faster than $n^{1/2}$.
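Finally, a hand-rolled sketch of the "aipw" specification on a toy observational sample, with the doubly-robust scores built from simple plug-in nuisance fits; in the package the scores come from build_aggtree and are cross-fitted in the honest sample, which the illustration below does not do:

```r
set.seed(1989)

## Toy observational sample: two leaves split on a single covariate.
n_hon <- 500
X1    <- rnorm(n_hon)
leaf  <- factor(ifelse(X1 < 0, 1, 2))
D_hon <- rbinom(n_hon, 1, plogis(0.5 * X1))                # confounded treatment
Y_hon <- X1 + D_hon * ifelse(X1 < 0, 0.5, 2) + rnorm(n_hon)

## Plug-in nuisance estimates (no cross-fitting, for illustration only).
mu_fit <- lm(Y_hon ~ D_hon * X1)                           # conditional mean mu(X, d)
pi_fit <- glm(D_hon ~ X1, family = binomial)               # propensity score pi(X)

mu1 <- predict(mu_fit, newdata = data.frame(D_hon = 1, X1 = X1))
mu0 <- predict(mu_fit, newdata = data.frame(D_hon = 0, X1 = X1))
ps  <- predict(pi_fit, type = "response")

## Doubly-robust scores Gamma_i as in the formula above.
gamma_hat <- mu1 - mu0 +
  D_hon * (Y_hon - mu1) / ps -
  (1 - D_hon) * (Y_hon - mu0) / (1 - ps)

## Gamma_i = sum_l L_{i,l} beta_l + e_i: the leaf means of the scores
## estimate the leaf-specific GATEs.
fit_aipw <- lm(gamma_hat ~ 0 + leaf)
summary(fit_aipw)$coefficients
```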