A Novel Dynamic Attack on Classical Ciphers Using an Attention-Based LSTM Encoder-Decoder Model

IEEE Access (2021)

Abstract
Information security has become an intrinsic part of data communication. Cryptanalysis using deep learning-based methods to identify weaknesses in ciphers has not been thoroughly studied. Recently, long short-term memory (LSTM) networks have shown promising performance on sequential data by modeling dependencies and data dynamics. Given an encrypted ciphertext sequence and the corresponding plaintext, an LSTM can exploit this sequential processing to adaptively discover the decryption function regardless of its complexity, substantially outperforming traditional methods. However, a lengthy ciphertext sequence causes an LSTM to lose important information along the sequence, degrading network performance. To tackle this problem, we propose adding an attention mechanism to enhance the LSTM's sequential processing power. This paper presents a novel, dynamic way to attack classical ciphers by using an attention-based LSTM encoder-decoder across different ciphertext sequence lengths. The proposed approach takes a ciphertext sequence as input and outputs the corresponding plaintext sequence. The effectiveness and flexibility of the proposed model were evaluated on different classical ciphers. We achieved close to 100% accuracy in breaking all types of classical ciphers in both character-level and word-level attacks. We empirically provide further insights into our results on two datasets with short and long ciphertext lengths. In addition, we compare the performance of the proposed method against state-of-the-art methods. The proposed approach has the potential to attack modern ciphers. To the best of our knowledge, this is the first time an attention-based LSTM encoder-decoder has been applied to attack classical ciphers.
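To make the described architecture concrete, the following is a minimal sketch (not the authors' published implementation) of an attention-based LSTM encoder-decoder that maps a ciphertext character sequence to a plaintext character sequence. The vocabulary size, hidden dimension, and the additive attention scoring are illustrative assumptions chosen for readability.

```python
# Minimal sketch of an attention-based LSTM encoder-decoder for
# ciphertext-to-plaintext sequence mapping. Hyperparameters and the
# additive attention form are assumptions, not the paper's exact setup.
import torch
import torch.nn as nn

class AttentionSeq2Seq(nn.Module):
    def __init__(self, vocab_size=64, emb_dim=32, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Decoder consumes the previous token embedding plus the attention context.
        self.decoder = nn.LSTM(emb_dim + hidden_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)  # additive attention score
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, cipher_ids, plain_ids):
        # Encode the full ciphertext sequence: enc_out is (B, S, H).
        enc_out, hidden = self.encoder(self.embed(cipher_ids))
        dec_emb = self.embed(plain_ids)  # teacher forcing with gold plaintext
        logits = []
        for t in range(dec_emb.size(1)):
            # Score every encoder position against the current decoder state,
            # so long ciphertexts do not have to fit into one fixed vector.
            query = hidden[0][-1].unsqueeze(1)                      # (B, 1, H)
            scores = self.attn(torch.cat(
                [enc_out, query.expand_as(enc_out)], dim=-1))       # (B, S, 1)
            weights = torch.softmax(scores, dim=1)
            context = (weights * enc_out).sum(dim=1, keepdim=True)  # (B, 1, H)
            step_in = torch.cat([dec_emb[:, t:t + 1], context], dim=-1)
            dec_out, hidden = self.decoder(step_in, hidden)
            logits.append(self.out(dec_out))
        return torch.cat(logits, dim=1)  # (B, T, vocab) per-character scores
```

Such a model would typically be trained with per-character cross-entropy between the predicted logits and the known plaintext, which matches the known-plaintext attack setting the abstract describes.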
Keywords
Ciphers, Logic gates, Cryptography, Recurrent neural networks, Decoding, Computer architecture, Feature extraction, Cryptanalysis, classical ciphers, attention-based LSTM encoder-decoder, recurrent neural network