
Minimax rates for conditional density estimation via empirical entropy

Annals of Statistics (2023)

Abstract
We consider the task of estimating a conditional density using i.i.d. samples from a joint distribution, which is a fundamental problem with applications in both classification and uncertainty quantification for regression. For joint density estimation, minimax rates have been characterized for general density classes in terms of uniform (metric) entropy, a well-studied notion of statistical capacity. When applying these results to conditional density estimation, the use of uniform entropy (which is infinite when the covariate space is unbounded and suffers from the curse of dimensionality) can lead to suboptimal rates. Consequently, minimax rates for conditional density estimation cannot be characterized using these classical results.

We resolve this problem for well-specified models, obtaining matching (within logarithmic factors) upper and lower bounds on the minimax Kullback-Leibler risk in terms of the empirical Hellinger entropy for the conditional density class. The use of empirical entropy allows us to appeal to concentration arguments based on local Rademacher complexity, which, in contrast to uniform entropy, leads to matching rates for large, potentially nonparametric classes and captures the correct dependence on the complexity of the covariate space. Our results require only that the conditional densities are bounded above, and do not require that they are bounded below or otherwise satisfy any tail conditions.
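For concreteness, the minimax Kullback-Leibler risk discussed above can be sketched as follows; the notation here is a standard formulation and is ours, not taken from the paper. Given a class $\mathcal{F}$ of conditional densities $f(y \mid x)$ and $n$ i.i.d. samples $(X_i, Y_i)$ from a joint distribution with $Y_i \mid X_i \sim f^\star(\cdot \mid X_i)$ for some $f^\star \in \mathcal{F}$ (the well-specified setting), the risk of an estimator $\hat f$ and the minimax risk are

\[
\mathcal{R}_n(\hat f) \;=\; \mathbb{E}\,\Big[ D_{\mathrm{KL}}\big( f^\star(\cdot \mid X) \,\big\|\, \hat f(\cdot \mid X) \big) \Big],
\qquad
\mathcal{R}_n^\star \;=\; \inf_{\hat f}\, \sup_{f^\star \in \mathcal{F}}\, \mathcal{R}_n(\hat f),
\]

where the expectation is over both the training sample and a fresh covariate $X$. The abstract's contribution is to bound $\mathcal{R}_n^\star$ above and below, up to logarithmic factors, by a quantity driven by the empirical Hellinger entropy of $\mathcal{F}$ rather than its uniform metric entropy.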
Key words
Conditional density estimation, empirical entropy, logarithmic loss, nonparametric estimation, minimax rates