Do Deep Neural Network Solutions Form a Star Domain?
arXiv (2024)

Abstract
Entezari et al. (2022) conjectured that the set of neural network solutions
reachable via stochastic gradient descent (SGD) is convex, modulo permutation
invariances. This means that two independent solutions can be connected by a
linear path with low loss, provided that one of them is appropriately
permuted. However, current methods to test this conjecture often fail to
eliminate loss barriers between two independent solutions (Ainsworth et al.,
2022; Benzing et al., 2022). In this work, we conjecture that a more relaxed
claim holds: the SGD solution set is a star domain, containing a star model
that is linearly connected to all other solutions via low-loss paths, modulo
permutations. We propose the Starlight algorithm, which finds a star model for
a given learning task. We validate our claim by showing that this star model
is linearly connected to other independently found solutions. As an
additional benefit of our study, we demonstrate improved uncertainty estimates
from Bayesian model averaging over the obtained star domain. Code is available at
https://github.com/aktsonthalia/starlight.
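The linear connectivity claim underlying both conjectures can be made concrete with a loss-barrier computation: evaluate the loss along the straight line between two parameter vectors and measure how far the worst point on the path rises above the worse endpoint. The sketch below illustrates this on a toy logistic-regression model; it is an illustrative assumption of the standard barrier definition, not the Starlight algorithm or the paper's experimental setup, and all names (`bce_loss`, `loss_barrier`, the synthetic data) are hypothetical.

```python
import numpy as np

def bce_loss(w, X, y):
    # Binary cross-entropy of a linear classifier sigmoid(X @ w).
    p = 1.0 / (1.0 + np.exp(-X @ w))
    eps = 1e-12  # avoid log(0)
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def loss_barrier(w_a, w_b, X, y, n_points=21):
    # Evaluate the loss along the linear path (1 - t) * w_a + t * w_b.
    ts = np.linspace(0.0, 1.0, n_points)
    path_losses = np.array(
        [bce_loss((1 - t) * w_a + t * w_b, X, y) for t in ts]
    )
    # Barrier: worst loss on the path minus the worse endpoint loss.
    # Non-negative by construction, since the endpoints lie on the path.
    return path_losses.max() - max(path_losses[0], path_losses[-1])

# Toy synthetic data and two nearby "solutions" (perturbed generating weights).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = (X @ true_w > 0).astype(float)

w_a = true_w + 0.1 * rng.normal(size=5)
w_b = true_w + 0.1 * rng.normal(size=5)

barrier = loss_barrier(w_a, w_b, X, y)
print(f"loss barrier: {barrier:.4f}")
```

In the star-domain setting, one would apply this check between a candidate star model and each independently trained solution (after aligning permutations) and call the pair linearly connected when the barrier is close to zero.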