Self-Paced Broad Learning System

IEEE Transactions on Cybernetics (2023)

Abstract
Broad learning system (BLS), an efficient neural network with a flat structure, has attracted considerable attention due to its advantages in training speed and network extensibility. However, the conventional BLS adopts the least-squares loss, which treats each sample equally and is thus sensitive to noise and outliers. To address this concern, in this article we propose a self-paced BLS (SPBLS) model by incorporating the self-paced learning (SPL) strategy into the network for noisy data regression. With the assistance of the SPL criterion, the model output is used as feedback to learn an appropriate priority weight that readjusts the importance of each sample. Such a reweighting strategy helps SPBLS distinguish samples from "easy" to "difficult" during training, making the model robust to noise and outliers while preserving the characteristics of the original system. Moreover, two incremental learning algorithms associated with SPBLS have also been developed, with which the system can be updated quickly and flexibly without retraining the entire model when new training samples are added or the network needs to be expanded. Experiments conducted on various datasets demonstrate that the proposed SPBLS achieves satisfactory performance for noisy data regression.
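The reweighting idea described in the abstract can be illustrated with a small sketch. The code below is a minimal illustration, not the authors' SPBLS algorithm: it assumes a hard SPL regularizer (weight 1 for samples whose loss falls below an age parameter, weight 0 otherwise) and a ridge-regression readout over a fixed feature matrix standing in for the concatenated BLS feature and enhancement nodes. The function names (spl_weights, self_paced_ridge) and parameters (lam0, growth, ridge) are illustrative assumptions.

import numpy as np

def spl_weights(residuals, lam):
    """Hard self-paced weights: samples with squared loss below the
    age parameter lam are treated as 'easy' (weight 1), the rest as
    'difficult' (weight 0)."""
    losses = residuals ** 2
    return (losses < lam).astype(float)

def self_paced_ridge(A, y, lam0=1.0, growth=1.3, ridge=1e-3, iters=10):
    """Alternate between (i) weighted ridge regression on the fixed
    feature matrix A, standing in for the BLS feature/enhancement
    nodes, and (ii) updating the self-paced sample weights.
    Illustrative sketch only."""
    n, d = A.shape
    v = np.ones(n)            # start with every sample included
    lam = lam0
    for _ in range(iters):
        # Weighted least squares with ridge penalty:
        # W = (A^T V A + ridge*I)^{-1} A^T V y
        V = np.diag(v)
        W = np.linalg.solve(A.T @ V @ A + ridge * np.eye(d), A.T @ (v * y))
        residuals = y - A @ W
        v = spl_weights(residuals, lam)
        lam *= growth         # gradually admit harder samples
    return W, v

# Toy usage: regression targets with a few injected outliers.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 20))
w_true = rng.standard_normal(20)
y = A @ w_true + 0.1 * rng.standard_normal(200)
y[:5] += 10.0                 # outliers that SPL should down-weight
W, v = self_paced_ridge(A, y)

Growing the age parameter over iterations mirrors the "easy to difficult" curriculum described above: early rounds fit only low-loss samples, and harder samples are admitted as the model stabilizes.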
Keywords
Training, Noise measurement, Data models, Feature extraction, Learning systems, Robustness, Manifold learning, Broad learning system (BLS), importance reweighting strategy, noisy data regression, self-paced learning (SPL)