Non-geodesically-convex optimization in the Wasserstein space
arXiv (2024)
Abstract
We study a class of optimization problems in the Wasserstein space (the space
of probability measures) where the objective function is nonconvex along
generalized geodesics. When the regularization term is the negative entropy,
the optimization problem becomes a sampling problem: it amounts to minimizing the
Kullback-Leibler divergence between a probability measure (the optimization
variable) and a target probability measure whose log-density is a nonconvex
function. We derive several convergence results for a novel semi
Forward-Backward Euler scheme under various nonconvex (and possibly nonsmooth)
regimes. Notably, the semi Forward-Backward Euler scheme is only a slight
modification of the Forward-Backward Euler scheme, whose convergence is, to our
knowledge, still unknown in our very general non-geodesically-convex setting.
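
To illustrate the sampling connection described above (this is not the authors' semi Forward-Backward Euler scheme, whose details are in the paper), the following is a minimal sketch: when the entropy term of the KL objective is discretized by adding Gaussian noise, the forward gradient step on a nonconvex potential yields the well-known unadjusted Langevin algorithm. The double-well potential `V(x) = (x^2 - 1)^2` is a hypothetical example of a nonconvex negative log-density; all function names and parameters here are illustrative.

```python
import numpy as np

def grad_V(x):
    """Gradient of the nonconvex double-well potential V(x) = (x^2 - 1)^2."""
    return 4.0 * x * (x**2 - 1.0)

def langevin_sample(n_particles=2000, n_steps=2000, step=0.01, seed=0):
    """Unadjusted Langevin iteration: x <- x - h * grad_V(x) + sqrt(2h) * xi.

    This targets (approximately) the measure with density proportional to
    exp(-V), i.e., it approximately minimizes KL(mu || pi) over mu.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)  # initial particle cloud
    for _ in range(n_steps):
        noise = rng.standard_normal(n_particles)
        x = x - step * grad_V(x) + np.sqrt(2.0 * step) * noise
    return x

samples = langevin_sample()
# The particle cloud should concentrate near the two wells at x = -1 and x = +1.
```

Because V is nonconvex (two wells), classical geodesic-convexity arguments do not apply to this KL objective, which is the regime the abstract refers to.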