FeLU: A Fractional Exponential Linear Unit

Ke Zhang, Xinhao Yang, Jianhui Zang, Ze Li

2021 33rd Chinese Control and Decision Conference (CCDC) (2021)

Abstract
Appropriate activation functions must be selected to train complex neural networks. The Exponential Linear Unit (ELU) was proposed to address the dead-zone problem of the ReLU function. In this paper, by observing the curve characteristics of different activation functions and comparing network training behavior across them, we add to the ELU activation function several features that benefit network performance, such as non-monotonicity, a limit tending to zero, and preservation of the learned weights, and thereby propose a Fractional Exponential Linear Unit (FeLU). Experiments show that FeLU outperforms ELU across models of different depths and training data sets, and that it is an activation function that can effectively improve network performance.
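The abstract contrasts FeLU with the baseline ELU but does not state the FeLU formula itself. For reference, below is a minimal NumPy sketch of the standard ELU (Clevert et al.) and of the ReLU dead-zone behavior it is meant to fix; the FeLU-specific modification is not reproduced here, since it is not given in this listing.

```python
import numpy as np

def relu(x):
    """ReLU: zero for x <= 0, which causes the 'dead zone' (zero gradient) the paper cites."""
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    """Standard ELU: x for x > 0, alpha * (exp(x) - 1) otherwise.
    Negative inputs keep a non-zero gradient and saturate toward -alpha."""
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

# Quick comparison on a few negative and positive inputs.
x = np.linspace(-3.0, 3.0, 7)
print("x   :", x)
print("ReLU:", relu(x))   # all negatives mapped to 0 (dead zone)
print("ELU :", elu(x))    # negatives stay informative, bounded below by -alpha
```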
Keywords
Activation function, Rectified linear unit, Sigmoid function, Tanh function