Improving Consistency-Based Semi-Supervised Learning with Weight Averaging

arXiv: Learning (2018)

Abstract
Recent advances in deep unsupervised learning have renewed interest in semi-supervised methods, which can learn from both labeled and unlabeled data. Presently, the most successful approaches to semi-supervised learning are based on consistency regularization, whereby a model is trained to be robust to small perturbations of its inputs and parameters. We show that consistency regularization leads to flatter but narrower optima. We also show that the test error surface for these methods is approximately convex in regions of weight space traversed by SGD. Inspired by these observations, we propose to train consistency-based semi-supervised models with stochastic weight averaging (SWA), a recent method which averages weights along the trajectory of SGD. We also develop fast-SWA, which further accelerates convergence by averaging multiple points within each cycle of a cyclical learning rate schedule. With fast-SWA we achieve the best known semi-supervised results on CIFAR-10 and CIFAR-100 over many different numbers of observed training labels. For example, we achieve 95.0% accuracy on CIFAR-10 with only 4000 labels, compared to the previous best result in the literature of 93.7%. We also improve the best known accuracy for domain adaptation from CIFAR-10 to STL from 80% to 83%. Finally, we show that with fast-SWA the simple $\Pi$ model becomes state-of-the-art for large labeled settings.
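
As a rough illustration of the ideas in the abstract, the sketch below trains a toy $\Pi$-model-style consistency objective with a cyclical learning rate and maintains a running average of the weights. Everything here (the two-layer model, synthetic batches, loss weight, and schedule constants) is an illustrative assumption, not the authors' implementation: plain SWA would fold weights into the average only at the end of each learning-rate cycle, while fast-SWA, as shown, averages several points within each cycle.

    # Minimal sketch of fast-SWA on a Pi-model-style consistency loss.
    # Model, data, and hyperparameters are illustrative placeholders.
    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)
    model = torch.nn.Sequential(torch.nn.Linear(10, 32), torch.nn.ReLU(),
                                torch.nn.Dropout(0.5), torch.nn.Linear(32, 3))
    opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

    # Synthetic labeled and unlabeled batches (placeholders for CIFAR data).
    x_lab, y_lab = torch.randn(16, 10), torch.randint(0, 3, (16,))
    x_unl = torch.randn(64, 10)

    steps_per_cycle = 20      # one cyclical learning-rate period
    avg_every = 5             # fast-SWA: average several points per cycle
    swa_state, n_avg = None, 0

    for step in range(200):
        # Cyclical learning rate: decay from 0.1 to 0.001 within each cycle.
        t = (step % steps_per_cycle) / steps_per_cycle
        for g in opt.param_groups:
            g["lr"] = 0.1 * (1 - t) + 0.001 * t

        # Pi-model consistency: two stochastic forward passes (dropout acts
        # as the perturbation here) should agree on unlabeled data.
        model.train()
        sup_loss = F.cross_entropy(model(x_lab), y_lab)
        cons_loss = F.mse_loss(F.softmax(model(x_unl), dim=1),
                               F.softmax(model(x_unl), dim=1).detach())
        loss = sup_loss + 10.0 * cons_loss

        opt.zero_grad()
        loss.backward()
        opt.step()

        # fast-SWA: fold the current weights into the running average several
        # times per cycle (SWA would average only at each cycle's end).
        if step % avg_every == avg_every - 1:
            sd = {k: v.detach().clone() for k, v in model.state_dict().items()}
            if swa_state is None:
                swa_state = sd
            else:
                for k in swa_state:
                    swa_state[k] += (sd[k] - swa_state[k]) / (n_avg + 1)
            n_avg += 1

    # Evaluate with the averaged weights.
    model.load_state_dict(swa_state)

For networks with batch normalization, the statistics of the averaged model would additionally need to be recomputed with a forward pass over the training data before evaluation.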
Keywords
weight averaging, learning, consistency-based, semi-supervised