Infinite-width limit of deep linear neural networks

Communications on Pure and Applied Mathematics (2024)

Abstract
This paper studies the infinite-width limit of deep linear neural networks (NNs) initialized with random parameters. We show that, as the number of parameters diverges, the training dynamics converge (in a precise sense) to the dynamics obtained from gradient descent on an infinitely wide deterministic linear NN. Moreover, even though the weights remain random, we characterize their law along the training dynamics and prove a quantitative convergence result for the linear predictor in terms of the number of parameters. Finally, we study the continuous-time limit obtained for infinitely wide linear NNs and show that the linear predictors of the NN converge at an exponential rate to the minimal $\ell_2$-norm minimizer of the risk.
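To make the objects in the abstract concrete, here is a minimal NumPy sketch (not the paper's construction): gradient descent on the factored predictor beta = W_L ... W_1 for an underdetermined least-squares risk, with a numerical check of how close the end-to-end predictor gets to the minimum $\ell_2$-norm risk minimizer, in the spirit of the convergence result stated above. All dimensions, the initialization scale, the learning rate, and the step count are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n, d, width, depth = 20, 50, 100, 3        # n < d: the risk has many minimizers
X = rng.normal(size=(n, d)) / np.sqrt(d)
y = rng.normal(size=n)

dims = [d] + [width] * (depth - 1) + [1]
scale = 0.1                                # small, roughly balanced initialization
Ws = [scale * rng.normal(size=(dims[i + 1], dims[i])) / np.sqrt(dims[i])
      for i in range(depth)]

def end_to_end(Ws):
    """End-to-end linear predictor: the 1 x d matrix beta = W_L ... W_1."""
    beta = Ws[0]
    for W in Ws[1:]:
        beta = W @ beta
    return beta

def layer_grads(Ws, G):
    """dRisk/dW_i for beta = W_L ... W_1, given G = dRisk/dbeta (1 x d)."""
    prefix = [np.eye(Ws[0].shape[1])]      # prefix[i]: product of layers below i
    for W in Ws[:-1]:
        prefix.append(W @ prefix[-1])
    suffix = [np.eye(Ws[-1].shape[0])]     # suffix[i]: product of layers above i
    for W in reversed(Ws[1:]):
        suffix.append(suffix[-1] @ W)
    suffix.reverse()
    return [suffix[i].T @ G @ prefix[i].T for i in range(len(Ws))]

beta_min = (np.linalg.pinv(X) @ y)[None, :]   # minimum-l2-norm risk minimizer

lr = 0.5
for step in range(30_000):
    beta = end_to_end(Ws)
    resid = (X @ beta.T).ravel() - y          # risk = 0.5/n * ||X beta^T - y||^2
    G = (resid @ X)[None, :] / n              # gradient of the risk w.r.t. beta
    for W, dW in zip(Ws, layer_grads(Ws, G)):
        W -= lr * dW

beta = end_to_end(Ws)
print("empirical risk       :", 0.5 * np.mean(((X @ beta.T).ravel() - y) ** 2))
print("distance to min-norm :", np.linalg.norm(beta - beta_min))

With a small, roughly balanced initialization one expects the learned predictor to land close to beta_min, consistent with the continuous-time result described in the abstract; the number of steps needed depends strongly on the initialization scale, since training starts near the saddle at zero.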