SpinalNet: Deep Neural Network With Gradual Input

IEEE Transactions on Artificial Intelligence (2023)

Cited 23 | Views 76
Abstract
Deep neural networks (DNNs) have achieved state-of-the-art (SOTA) performance in numerous fields. However, DNNs require long computation times, and better performance at lower computational cost is always desirable. We therefore study the human somatosensory system and design a neural network (SpinalNet) that achieves higher accuracy with fewer computations. Hidden layers (HLs) in traditional NNs receive inputs from the previous layer, apply an activation function, and then transfer the outcomes to the next layer. In the proposed SpinalNet, each layer is split into three parts: 1) input split, 2) intermediate split, and 3) output split. The input split of each layer receives a portion of the inputs. The intermediate split of each layer receives the outputs of the previous layer's intermediate split and the outputs of the current layer's input split. The number of incoming weights thus becomes significantly lower than in traditional DNNs. The SpinalNet can also serve as the fully connected or classification layer of a DNN and supports both traditional learning and transfer learning. We observe significant error reductions with lower computational costs in most of the DNNs. Traditional learning on the VGG-5 network with SpinalNet classification layers provided SOTA performance on the QMNIST, Kuzushiji-MNIST, and EMNIST (Letters, Digits, and Balanced) datasets. Traditional learning with ImageNet-pretrained initial weights and SpinalNet classification layers provided SOTA performance on the STL-10, Fruits 360, Bird225, and Caltech-101 datasets. The scripts for training the proposed SpinalNet are available at the following link: https://github.com/dipuk0506/SpinalNet .
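The gradual-input idea described in the abstract can be sketched in a few lines of NumPy: the input vector is cut into halves, and each hidden split sees only one half of the input plus the previous split's output, so the incoming weight count per split stays small. The widths, the number of splits, and the random weights below are illustrative assumptions, not the paper's exact configuration (see the linked repository for the authors' implementation).

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def spinal_head(x, hidden_width=8, num_splits=4, num_classes=10):
    """Sketch of a SpinalNet-style classification head with gradual input.

    Each hidden split receives one half of the input vector (alternating
    halves) concatenated with the previous split's output; the output
    split collects all hidden splits. Sizes are illustrative only.
    """
    half = x.shape[0] // 2
    halves = [x[:half], x[half:2 * half]]
    outputs = []
    prev = np.zeros(0)                      # first split has no predecessor
    for i in range(num_splits):
        inp = np.concatenate([halves[i % 2], prev])
        W = rng.standard_normal((hidden_width, inp.shape[0])) * 0.1
        h = relu(W @ inp)                   # intermediate split output
        outputs.append(h)
        prev = h
    # output split: every hidden split feeds the classifier
    feats = np.concatenate(outputs)
    W_out = rng.standard_normal((num_classes, feats.shape[0])) * 0.1
    return W_out @ feats
```

Note that after the first split, each weight matrix has only `half + hidden_width` input columns rather than the full feature width, which is the source of the parameter savings the abstract claims.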
Keywords
Classification, regression, ResNet, transfer learning (TL), transferred initialization, VGG