Regularized DeepIV with Model Selection
arXiv (2024)
Abstract
In this paper, we study nonparametric estimation of instrumental variable
(IV) regressions. While recent advancements in machine learning have introduced
flexible methods for IV estimation, they often encounter one or more of the
following limitations: (1) restricting the IV regression to be uniquely
identified; (2) requiring a minimax optimization oracle, which is highly
unstable in practice; (3) the absence of a model selection procedure. In this
paper, we present the first method and analysis that avoids all three
limitations while still enabling general function approximation. Specifically,
we propose a minimax-oracle-free method called Regularized DeepIV (RDIV)
regression that converges to the least-norm IV solution. Our method consists of
two stages: first, we learn the conditional distribution of covariates, and
then, utilizing the learned distribution, we learn the estimator by minimizing
a Tikhonov-regularized loss function. We further show that our method admits
model selection procedures that achieve the oracle rates in the misspecified
regime. When extended to an iterative estimator, our method matches the current
state-of-the-art convergence rate. Our method is a Tikhonov-regularized variant
of the popular DeepIV method with a nonparametric MLE first-stage estimator,
and our results provide the first rigorous guarantees for this empirically used
method, showcasing the importance of regularization, which was absent from the
original work.
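To make the two-stage structure concrete, here is a minimal linear sketch (not the paper's implementation): stage one estimates the conditional mean of the endogenous regressor given the instrument, and stage two minimizes a Tikhonov-regularized (ridge) squared loss on the plugged-in conditional mean. All variable names, the simulated data-generating process, and the regularization weights `lam1`/`lam2` are illustrative assumptions; the actual method uses flexible function classes and a nonparametric MLE first stage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated IV data: instrument Z, endogenous X, unobserved confounder U.
n = 5000
Z = rng.normal(size=n)
U = rng.normal(size=n)                      # confounder (hidden)
X = Z + U + 0.1 * rng.normal(size=n)        # X depends on Z and U
Y = 2.0 * X + U + 0.1 * rng.normal(size=n)  # true structural effect: 2.0

# Stage 1 (linear analogue of learning E[X | Z]): ridge-regress X on Z.
lam1 = 1e-3
w1 = (Z @ X) / (Z @ Z + lam1)
X_hat = w1 * Z                              # plug-in conditional mean

# Stage 2: minimize the Tikhonov-regularized loss
#   (1/n) * sum_i (Y_i - beta * X_hat_i)^2 + lam2 * beta^2,
# which has the closed-form ridge solution below.
lam2 = 1e-3
beta = (X_hat @ Y) / (X_hat @ X_hat + lam2 * n)

print(beta)  # close to the structural coefficient 2.0, despite confounding
```

A naive regression of Y on X would be biased upward here because the confounder U drives both; the instrument-based two-stage estimate recovers the structural coefficient, and the ridge terms play the role of the Tikhonov regularization the paper argues is essential.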