A Caputo-Type Fractional-Order Gradient Descent Learning of Deep BP Neural Networks
2019 IEEE 3rd Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC), 2019
Abstract
In recent years, the application of fractional calculus to artificial neural networks has emerged as a promising research area. In this paper, we combine the Caputo operator of fractional calculus with the conventional gradient descent method to optimize deep backpropagation neural networks, and we prove the monotonicity and weak convergence of the proposed method in detail. We run simulations on a large dataset to compare the performance of the proposed fractional-order deep BP neural networks against integer-order BP neural networks.
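To make the idea concrete, here is a minimal sketch of a Caputo-type fractional-order gradient descent step. It assumes the first-order truncation commonly used in the fractional gradient descent literature, D_c^α f(w) ≈ f′(w)·|w − c|^(1−α)/Γ(2−α), where c is the lower terminal of the Caputo derivative; the choice of c as the previous iterate, the quadratic test loss, and all parameter values below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from math import gamma

def caputo_fgd_step(w, grad, c, alpha, lr):
    """One fractional-order gradient descent step (Caputo, first-order truncation).

    Approximates the Caputo derivative of order alpha in (0, 1) as
    f'(w) * |w - c|^(1 - alpha) / Gamma(2 - alpha), with lower terminal c.
    """
    return w - lr * grad * np.abs(w - c) ** (1.0 - alpha) / gamma(2.0 - alpha)

# Illustrative demo: minimize f(w) = (w - 3)^2 with alpha = 0.9.
# Taking the terminal c as the previous iterate (a common heuristic)
# keeps the fractional factor from collapsing to zero early on.
alpha, lr = 0.9, 0.1
w_prev = 0.0
w = w_prev - lr * 2 * (w_prev - 3.0)  # ordinary GD step to initialize
for _ in range(500):
    grad = 2 * (w - 3.0)
    w_prev, w = w, caputo_fgd_step(w, grad, w_prev, alpha, lr)

print(w)  # approaches the minimizer w* = 3
```

When α → 1, Γ(2 − α) → 1 and the |w − c|^(1−α) factor tends to 1, so the update recovers ordinary (integer-order) gradient descent, which is the baseline the paper compares against.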
Keywords
fractional calculus, backpropagation neural network, Caputo derivative, deep learning, MNIST