Convergence properties of stochastic proximal subgradient method in solving a class of composite optimization problems with cardinality regularizer

Journal of Industrial and Management Optimization (2024)

Abstract
In this paper, we study a class of composite optimization problems whose objective function is the sum of a large number of nonsmooth nonconvex loss functions and a cardinality regularizer. We first investigate the optimality conditions of these problems and then propose a stochastic proximal subgradient method (SPSG) for solving them. We establish the almost sure subsequence convergence of SPSG under mild assumptions, and we emphasize that these assumptions are satisfied by a wide range of problems arising from training neural networks. Finally, we report preliminary numerical experiments that demonstrate the effectiveness and efficiency of SPSG on this class of problems.
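To make the method concrete, the following is a minimal sketch (not the authors' implementation) of a stochastic proximal subgradient iteration for an objective of the form (1/n)·Σᵢ fᵢ(x) + λ‖x‖₀. It uses the standard fact that the proximal operator of η·λ‖·‖₀ is componentwise hard thresholding at level √(2ηλ); the step size η, the penalty λ, and the callback `subgrad_batch` (which returns a stochastic subgradient of the loss term on a random mini-batch) are all assumptions introduced here for illustration.

```python
import numpy as np

def prox_cardinality(v, lam, eta):
    """Proximal operator of eta * lam * ||x||_0 (hard thresholding):
    keeps entries whose magnitude exceeds sqrt(2 * eta * lam)."""
    thresh = np.sqrt(2.0 * eta * lam)
    return np.where(np.abs(v) > thresh, v, 0.0)

def spsg(x0, subgrad_batch, lam, eta, n_iters=1000, batch_size=32, rng=None):
    """Hypothetical stochastic proximal subgradient loop:
        x_{k+1} = prox_{eta * lam * ||.||_0}(x_k - eta * g_k),
    where g_k is a stochastic subgradient of the averaged loss term
    evaluated on a random mini-batch (supplied by subgrad_batch)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iters):
        g = subgrad_batch(x, batch_size, rng)   # stochastic subgradient of the loss part
        x = prox_cardinality(x - eta * g, lam, eta)  # subgradient step + hard thresholding
    return x
```

In this sketch a constant step size is used for simplicity; the convergence analysis in the paper may rely on different step-size conditions and on conservative-field calculus for the nonsmooth loss functions.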
Keywords
Nonsmooth optimization, cardinality regularizer, proximal subgradient method, global convergence, conservative field