Backtracking linesearch for conditional gradient sliding

arXiv (2020)

Abstract
We present a modification of the conditional gradient sliding (CGS) method that was originally developed in \cite{lan2016conditional}. While the CGS method is a theoretical breakthrough in the theory of projection-free first-order methods, being the first to reach the theoretical performance limit, its implementation requires knowledge of the Lipschitz constant $L$ of the gradient of the objective function and the total number of gradient evaluations $N$. Such requirements impose difficulties in practice, not only because it can be difficult to choose values of $L$ and $N$ that satisfy the conditions for convergence, but also because conservative choices of $L$ and $N$ can deteriorate the practical numerical performance of the CGS method. Our proposed method, called the conditional gradient sliding method with linesearch (CGS-ls), requires knowledge of neither $L$ nor $N$, and is able to terminate early, before the theoretically required number of iterations. While more practical in numerical implementation, the theoretical performance of the CGS-ls method is still as good as that of the CGS method. We present numerical experiments that show the efficiency of our proposed method in practice.
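
To make the role of the linesearch concrete, the following is a minimal Python sketch of the standard backtracking estimate of a local Lipschitz constant used by adaptive first-order methods; the function name, constants, and acceptance test are illustrative assumptions, not the paper's exact CGS-ls procedure.

import numpy as np

def estimate_lipschitz(f, grad_f, x, d, L0=1.0, growth=2.0, max_tries=60):
    # Backtracking search for a local Lipschitz constant: increase the
    # trial value L until the smoothness upper bound
    #   f(x + t*d) <= f(x) + t*<grad_f(x), d> + (L*t^2/2)*||d||^2
    # holds with step t = 1/L along the direction d.
    fx = f(x)
    gd = grad_f(x) @ d      # directional derivative <grad f(x), d>
    dd = d @ d              # squared norm of the direction
    L = L0
    for _ in range(max_tries):
        t = 1.0 / L
        if f(x + t * d) <= fx + t * gd + 0.5 * L * t * t * dd:
            return L        # bound satisfied: accept this estimate
        L *= growth         # bound violated: increase L and retry
    return L

# Example on a quadratic f(x) = (a/2)*||x||^2, whose true constant is a.
a = 4.0
f = lambda x: 0.5 * a * (x @ x)
grad_f = lambda x: a * x
x0 = np.array([1.0, -2.0])
print(estimate_lipschitz(f, grad_f, x0, -grad_f(x0)))  # returns 4.0 here

For an $L$-smooth objective the upper bound holds as soon as the trial constant exceeds the true $L$, so the doubling loop always terminates; this is what lets a linesearch-based method proceed without knowing $L$ in advance.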
Keywords
conditional gradient, linesearch