
Improving Deep Forest by Screening.

IEEE Transactions on Knowledge and Data Engineering (2022)

Abstract
Most studies of deep learning are based on neural network models, in which many layers of parameterized, differentiable nonlinear modules are trained by backpropagation. Recently, it has been shown that deep learning can also be realized with non-differentiable modules trained without backpropagation, in a model called deep forest. We identify that deep forest incurs high time costs and memory requirements, which have inhibited its use on large-scale datasets. In this paper, we propose a simple and effective approach with three main strategies for efficient learning of deep forest. First, it substantially reduces the number of instances that need to be processed by redirecting instances with high predictive confidence straight to the final level for prediction, bypassing all the intermediate levels. Second, many non-informative features are screened out, and only the informative ones are used for learning at each level. Third, an unsupervised feature transformation procedure is proposed to replace the supervised multi-grained scanning procedure. Our theoretical analysis supports the proposed approach in varying the model complexity from low to high as the number of levels in deep forest increases. Experiments show that our approach achieves highly competitive predictive performance while reducing time cost and memory requirement by one to two orders of magnitude.
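The first strategy, confidence screening, can be illustrated with a minimal sketch. This is a hypothetical illustration of the idea described in the abstract, not the authors' implementation: at each cascade level, instances whose highest predicted class probability exceeds a threshold are predicted immediately, and only the remaining "hard" instances are forwarded to the next level.

```python
import numpy as np

def confidence_screen(proba, threshold=0.9):
    """Split instance indices by predictive confidence.

    proba: (n_instances, n_classes) class probabilities from one cascade level.
    Returns (confident, passed_on): confident instances are predicted at this
    level; the rest continue to the next level. Hypothetical helper, not from
    the paper.
    """
    max_conf = proba.max(axis=1)              # highest class probability per instance
    confident = np.where(max_conf >= threshold)[0]
    passed_on = np.where(max_conf < threshold)[0]
    return confident, passed_on

# Toy probabilities for 4 instances and 2 classes at one cascade level.
proba = np.array([[0.95, 0.05],   # confident -> predict now
                  [0.60, 0.40],   # uncertain -> next level
                  [0.10, 0.90],   # confident -> predict now
                  [0.55, 0.45]])  # uncertain -> next level
confident, passed_on = confidence_screen(proba, threshold=0.9)
print(confident.tolist())   # -> [0, 2]
print(passed_on.tolist())   # -> [1, 3]
```

Because deeper levels only process the screened-in subset, the per-level training and inference cost shrinks as the cascade grows, which is the source of the reported time and memory savings.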
Key words
Ensemble methods, deep forest, confidence screening, feature screening