A Convergence Path to Deep Learning on Noisy Labels

IEEE Transactions on Neural Networks and Learning Systems (2024)

Abstract
In many real-world machine learning classification applications, the performance of models based on deep neural networks (DNNs) often suffers from label noise. Various methods have been proposed in the literature to address this issue, primarily by designing noise-tolerant loss functions, cleaning noisy labels, or correcting the objective loss. However, noise-tolerant loss functions face challenges as the noise level increases. This article aims to reveal the convergence path of a trained model in the presence of label noise, where the convergence path depicts the evolution of the trained model over epochs. We first propose a theorem demonstrating that any surrogate loss function can be used to learn DNNs from noisy labels. Next, theories on the general convergence path of deep models under label noise are presented and verified through a series of experiments. In addition, we design an algorithm based on the proposed theorems that makes efficient corrections to noisy labels and achieves strong robustness in DNN models. We designed several experiments on benchmark datasets to assess noise tolerance and verify the theorems presented in this article. The comprehensive experimental results firmly confirm our theoretical findings and clearly validate the effectiveness of our method under various levels of label noise.
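To make the setting concrete, below is a minimal NumPy sketch of two ingredients the abstract refers to: injecting symmetric label noise into a training set, and a simple confidence-based label-correction step (relabeling a sample to the model's prediction when the model is confident and disagrees with the given label). This is an illustrative assumption on our part, not the authors' actual algorithm; the function names, the `threshold` parameter, and the correction rule are all hypothetical.

```python
import numpy as np

def inject_symmetric_noise(labels, noise_rate, num_classes, rng):
    """Flip each label to a uniformly random *other* class with prob. noise_rate.

    This is the standard symmetric-noise model used in noisy-label benchmarks.
    """
    labels = labels.copy()
    flip = rng.random(labels.shape[0]) < noise_rate
    for i in np.where(flip)[0]:
        # choose any class except the current one
        choices = [c for c in range(num_classes) if c != labels[i]]
        labels[i] = rng.choice(choices)
    return labels

def correct_labels(probs, noisy_labels, threshold=0.9):
    """Hypothetical correction rule: relabel a sample to the model's predicted
    class when the max softmax probability is at least `threshold` and the
    prediction disagrees with the (possibly noisy) given label.
    """
    preds = probs.argmax(axis=1)
    confident = probs.max(axis=1) >= threshold
    corrected = noisy_labels.copy()
    mask = confident & (preds != noisy_labels)
    corrected[mask] = preds[mask]
    return corrected
```

In practice, such a correction step would be interleaved with training epochs, relabeling only once the model's predictions have begun to stabilize.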