Convergence Of The Unadjusted Langevin Algorithm For Discontinuous Gradients

arXiv (2023)

Abstract
We demonstrate that for strongly log-concave densities whose potentials have gradients that are discontinuous on manifolds, the ULA algorithm converges with stepsize bias of order $1/2$ in Wasserstein-p distance. Our resulting bound is then of the same order as the convergence of ULA for potentials with Lipschitz gradients. Additionally, we show that so long as the gradient of the potential obeys a growth bound (thus imposing no regularity condition), the algorithm has stepsize bias of order $1/4$. We therefore unite two active areas of research: i) the study of numerical methods for SDEs with discontinuous coefficients and ii) the study of the non-asymptotic bias of the ULA algorithm (and variants). In particular, this is the first result of the former kind we are aware of on an unbounded time interval.
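For context, the ULA iteration studied in the paper is the Euler–Maruyama discretisation of the Langevin SDE: $X_{k+1} = X_k - \gamma \nabla U(X_k) + \sqrt{2\gamma}\,\xi_{k+1}$ with $\xi_{k+1} \sim \mathcal{N}(0, I)$ and stepsize $\gamma$. The following is a minimal Python sketch of this iteration on a strongly convex potential whose gradient is discontinuous; the test potential $U(x) = |x| + x^2/2$ and all names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ula(grad_U, x0, step, n_steps, rng):
    """Unadjusted Langevin Algorithm:
    X_{k+1} = X_k - step * grad_U(X_k) + sqrt(2*step) * N(0, I)."""
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps,) + x.shape)
    for k in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * noise
        samples[k] = x
    return samples

# Illustrative potential (not from the paper): U(x) = |x| + x^2 / 2
# is strongly convex, but its gradient sign(x) + x jumps at x = 0.
grad_U = lambda x: np.sign(x) + x

rng = np.random.default_rng(0)
samples = ula(grad_U, x0=[1.0], step=0.01, n_steps=100_000, rng=rng)
print("sample mean:", samples[20_000:].mean())  # discard burn-in
```

Here the gradient is discontinuous on the manifold $\{x = 0\}$, so no global Lipschitz-gradient assumption holds; this is precisely the regime in which the abstract's order-$1/2$ stepsize-bias bound applies.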