A Deep Bidirectional LSTM-GRU Network Model for Automated Ciphertext Classification

IEEE Access (2022)

Abstract
Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks are a class of Recurrent Neural Networks (RNNs) suited to sequential data processing. A Bidirectional LSTM (BLSTM) captures context more fully by processing the sequence in both the forward and backward directions. The GRU, in turn, uses reset and update gates in its hidden layer, making it computationally more efficient than a conventional LSTM. This paper proposes an efficient network model based on a deep BLSTM-GRU architecture for ciphertext classification, i.e., identifying the category to which a ciphertext belongs. The model was evaluated with well-known metrics on two publicly available datasets encrypted with various classical ciphers, and its performance was compared against a one-dimensional convolutional neural network (1D-CNN) and other deep learning approaches. The experimental results show that the BLSTM-GRU model achieves a classification accuracy of up to 95.8%. To the best of our knowledge, this is the first time an RNN-based model has been applied to ciphertext classification.
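The stack described in the abstract — an embedding layer, a bidirectional LSTM, a GRU over the concatenated forward/backward features, and a classification head — can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' implementation; all layer sizes, the vocabulary size, and the number of cipher classes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class BLSTMGRUClassifier(nn.Module):
    """Hypothetical sketch of a deep BLSTM-GRU ciphertext classifier.

    Hyperparameters below are assumptions for illustration, not values
    from the paper.
    """
    def __init__(self, vocab_size=64, embed_dim=32, hidden=64, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM reads the ciphertext in both directions,
        # producing 2*hidden features per time step.
        self.blstm = nn.LSTM(embed_dim, hidden,
                             batch_first=True, bidirectional=True)
        # GRU (reset/update gates) over the concatenated BLSTM features.
        self.gru = nn.GRU(2 * hidden, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, num_classes)

    def forward(self, x):
        h, _ = self.blstm(self.embed(x))
        h, _ = self.gru(h)
        # Classify from the final time step of the GRU output.
        return self.fc(h[:, -1, :])

model = BLSTMGRUClassifier()
# A batch of 8 ciphertexts, each 100 symbols long.
logits = model(torch.randint(0, 64, (8, 100)))
print(tuple(logits.shape))  # (8, 5): one score per cipher class
```

Classifying from the last GRU time step is one common design choice; pooling over all time steps would be an equally plausible alternative.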
Keywords
Logic gates, Ciphers, Recurrent neural networks, Task analysis, Encryption, Convolutional neural networks, Feature extraction, Bidirectional long short-term memory, Gated recurrent unit, Ciphertext classification, 1D-convolutional neural networks