Dropout effect on probabilistic neural network

2017 International Conference on Electrical, Computer and Communication Engineering (ECCE)

Abstract
Real-life datasets often contain noisy, skewed, correlated, imbalanced, and unnecessary features, which makes it difficult for feature subset selection combined with a learning algorithm to identify the relevant ones. Factors such as skewness, high kurtosis, and dependence or correlation among features affect both the features themselves and classifier performance. In this work, the Dropout technique is used as a feature subset selection method to retain relevant and important features and is combined with a Probabilistic Neural Network (PNN) classifier. The study applies several feature-omission strategies (removing skewed, high-kurtosis, and dependent or correlated features, as well as applying Dropout) together with PNN on several datasets, and observes that the overall performance is significantly improved or unchanged for Dropout with PNN (DPNN), with DPNN consistently outperforming the other approaches.
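A minimal sketch (not the authors' implementation) of the kind of pipeline the abstract describes: drop problematic features by simple thresholds on skewness, kurtosis, and pairwise correlation, apply a dropout-style random mask, then classify with a Gaussian-kernel PNN. All function names, thresholds, the dropout probability, and the smoothing parameter sigma are illustrative assumptions.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def dropout_feature_mask(X, skew_thresh=1.0, kurt_thresh=3.0, corr_thresh=0.9,
                         drop_prob=0.2, seed=None):
    """Return a boolean mask of features to keep (thresholds are illustrative)."""
    rng = np.random.default_rng(seed)
    keep = np.ones(X.shape[1], dtype=bool)
    keep &= np.abs(skew(X, axis=0)) < skew_thresh        # drop heavily skewed features
    keep &= np.abs(kurtosis(X, axis=0)) < kurt_thresh    # drop high-kurtosis features
    corr = np.corrcoef(X, rowvar=False)
    for j in range(X.shape[1]):                          # drop one of each highly correlated pair
        if keep[j] and np.any(np.abs(corr[j, :j]) > corr_thresh):
            keep[j] = False
    keep &= rng.random(X.shape[1]) > drop_prob           # dropout-style random feature removal
    return keep

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """PNN: Parzen-window class density estimate with a Gaussian kernel per class."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        d2 = np.sum((X_train - x) ** 2, axis=1)            # squared distances to training points
        k = np.exp(-d2 / (2.0 * sigma ** 2))               # Gaussian kernel activations
        scores = [k[y_train == c].mean() for c in classes]  # mean activation per class
        preds.append(classes[np.argmax(scores)])
    return np.array(preds)

if __name__ == "__main__":
    # Synthetic demo data; the paper's datasets are not reproduced here.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    mask = dropout_feature_mask(X, seed=0)
    y_hat = pnn_predict(X[:150, mask], y[:150], X[150:, mask])
    print("accuracy:", (y_hat == y[150:]).mean())
```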
Key words
Classification, Feature selection method, Correlation, Dropout, Probabilistic Neural Network