Fast max-affine regression via stochastic gradient descent

2023 59th Annual Allerton Conference on Communication, Control, and Computing (Allerton)

Abstract
We consider regression for the max-affine model, which combines multiple affine models via the max function. The max-affine model arises ubiquitously in applications such as multiclass classification and auction problems, and it generalizes both the forward model in phase retrieval and the rectified linear unit (ReLU) activation function. We present a non-asymptotic convergence analysis of mini-batch stochastic gradient descent (SGD) for max-affine regression when the model is observed at random locations drawn from a sub-Gaussian distribution satisfying an anti-concentration condition. Under these assumptions, a suitably initialized SGD converges linearly to the ground truth. Due to its low per-iteration cost, SGD converges faster in run time than alternating minimization and gradient descent. Our numerical results corroborate the presented theoretical results.
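To make the setup concrete, the following is a minimal NumPy sketch of mini-batch SGD for max-affine regression under the standard model y = max_j (⟨a_j, x⟩ + b_j) + noise. The dimensions, step size, batch size, and near-truth initialization are illustrative assumptions, not the paper's prescribed choices; the key point is that the gradient of the squared loss flows only through the maximizing affine piece of each sample.

```python
import numpy as np

# Minimal sketch of mini-batch SGD for max-affine regression, assuming the
# model y = max_j (<a_j, x> + b_j) + noise with k affine pieces.
# Step size, batch size, and initialization are illustrative placeholders,
# not the choices analyzed in the paper.

rng = np.random.default_rng(0)
n, d, k = 5000, 10, 3          # samples, ambient dimension, number of pieces

# Ground-truth parameters and sub-Gaussian (here Gaussian) covariates
A_true = rng.standard_normal((k, d))
b_true = rng.standard_normal(k)
X = rng.standard_normal((n, d))
y = (X @ A_true.T + b_true).max(axis=1) + 0.01 * rng.standard_normal(n)

# A random perturbation of the truth stands in for the paper's
# "suitable initialization".
A = A_true + 0.5 * rng.standard_normal((k, d))
b = b_true + 0.5 * rng.standard_normal(k)

lr, batch = 0.1, 64
for t in range(2000):
    idx = rng.choice(n, size=batch, replace=False)
    Xb, yb = X[idx], y[idx]
    scores = Xb @ A.T + b                  # (batch, k) affine values
    j = scores.argmax(axis=1)              # active (maximizing) piece per sample
    resid = scores[np.arange(batch), j] - yb
    # The squared-loss gradient only touches each sample's active piece.
    for m in range(k):
        mask = (j == m)
        if mask.any():
            A[m] -= lr * (resid[mask] @ Xb[mask]) / batch
            b[m] -= lr * resid[mask].sum() / batch

print("parameter error:", np.linalg.norm(A - A_true))
```

Since each iteration touches only a small batch, the per-iteration cost is O(batch · k · d), which is the source of the run-time advantage over full-batch alternating minimization and gradient descent noted in the abstract.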