Monte Carlo with kernel-based Gibbs measures: Guarantees for probabilistic herding
CoRR (2024)
Abstract
Kernel herding belongs to a family of deterministic quadratures that seek to
minimize the worst-case integration error over a reproducing kernel Hilbert
space (RKHS). Despite strong experimental support, it has proved difficult to
show that this worst-case error decreases faster than the standard rate of one
over the square root of the number of quadrature nodes, at least in the usual
case where the RKHS is infinite-dimensional. In this theoretical paper, we
study a joint probability distribution over quadrature nodes whose support
tends to minimize the same worst-case error as kernel herding. We prove that it
outperforms i.i.d. Monte Carlo, in the sense that it comes with a tighter
concentration inequality on the worst-case integration error. While this does
not yet improve the rate, it demonstrates that the mathematical tools used in
the study of Gibbs measures can help explain to what extent kernel herding and
its variants improve on computationally cheaper methods. Moreover, we provide
early experimental evidence that a faster rate of convergence, though not in
the worst case, is likely.
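
For readers unfamiliar with the setting, here is a minimal sketch of what "minimizing the worst-case integration error over an RKHS" looks like in practice: greedy kernel herding on a one-dimensional Gaussian target, compared against i.i.d. Monte Carlo nodes at the same budget, with the worst-case error measured by the maximum mean discrepancy (MMD). The Gaussian kernel, the candidate grid, and the sampled approximation of the target's mean embedding are illustrative assumptions, not taken from the paper, which instead studies a Gibbs measure over whole node configurations rather than this greedy construction.

```python
# Minimal kernel-herding sketch (illustrative assumptions: Gaussian kernel,
# N(0,1) target, grid search for new nodes, sampled mean embedding).
import numpy as np

rng = np.random.default_rng(0)

def k(x, y, bandwidth=0.5):
    """Gaussian (RBF) kernel matrix between two 1D point arrays."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * bandwidth ** 2))

# Target pi = N(0, 1); its kernel mean embedding is approximated on a grid
# of candidate nodes using a large i.i.d. sample from pi.
pi_sample = rng.standard_normal(2000)
candidates = np.linspace(-4.0, 4.0, 801)
mu_pi = k(candidates, pi_sample).mean(axis=1)  # approx. mean embedding on grid

def herding(n_nodes):
    """Greedy kernel herding: each new node maximizes the current witness function."""
    nodes = []
    for t in range(n_nodes):
        if nodes:
            # repulsion from the nodes already selected
            repulsion = k(candidates, np.array(nodes)).sum(axis=1) / (t + 1)
        else:
            repulsion = 0.0
        nodes.append(candidates[np.argmax(mu_pi - repulsion)])
    return np.array(nodes)

def mmd(nodes):
    """Worst-case RKHS integration error of the equal-weight quadrature on `nodes`."""
    term_pp = k(pi_sample, pi_sample).mean()
    term_np = k(nodes, pi_sample).mean()
    term_nn = k(nodes, nodes).mean()
    return np.sqrt(max(term_pp - 2 * term_np + term_nn, 0.0))

# Compare herding against plain i.i.d. Monte Carlo nodes at the same budget.
for n in (10, 50, 200):
    print(f"n={n:4d}  herding MMD={mmd(herding(n)):.4f}  "
          f"i.i.d. MMD={mmd(rng.standard_normal(n)):.4f}")
```

In this toy setup the herded nodes typically exhibit a visibly smaller MMD than i.i.d. sampling at the same number of nodes, which is the empirical behavior that motivates the search for rates faster than the Monte Carlo one.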