Context-Aware Linguistic Steganography Model Based on Neural Machine Translation

IEEE/ACM Transactions on Audio, Speech, and Language Processing (2024)

Abstract
Linguistic steganography based on text generation is a hot topic in the field of text information hiding. Previous studies have improved the syntactic quality of steganographic texts using deep-learning-based natural language processing techniques, but their steganography models still cannot control the semantic and contextual characteristics of the texts because of the limited information available to them, which greatly reduces the imperceptibility of the steganographic texts. To address this problem, we propose a context-aware linguistic steganography method based on neural machine translation, called NMT-Stega. The model generates translations that contain secret messages, building on a neural machine translation model equipped with semantic fusion and language model reference units; in this way, the semantics and context of the translation are controlled by the additional semantic and contextual features acquired from the text to be translated. We also propose a new encoding method that combines arithmetic coding with a waiting mechanism. It overcomes the low embedding capacity of the waiting mechanism while ensuring that the semantic and contextual characteristics of the steganographic text are only slightly modified. Experimental results show that our model outperforms previous models and encoding methods in semantic correlation, embedding capacity, and imperceptibility.
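To illustrate the general idea, the Python sketch below shows how an arithmetic-coding-style encoder combined with a simple waiting rule could pick the next token from a translation model's probability distribution. It is a minimal illustration under assumed names and values (`embed_step`, `TOP_P_WAIT`, `PRECISION` are all illustrative), not the authors' exact algorithm.

```python
import numpy as np

TOP_P_WAIT = 0.95   # assumed threshold: if the top token is this likely, wait (embed no bits)
PRECISION = 16      # assumed number of secret bits used to address the interval per step


def embed_step(probs, bits):
    """Choose the next token id from `probs` using up to PRECISION secret bits.

    Returns (token_id, bits_consumed). When the distribution is very peaked,
    the waiting rule outputs the most likely token and consumes no bits.
    """
    order = np.argsort(-probs)      # token ids sorted by descending probability
    sorted_p = probs[order]

    # Waiting mechanism: skip embedding when one token dominates, so the
    # translation's semantics are barely disturbed at this position.
    if sorted_p[0] >= TOP_P_WAIT:
        return int(order[0]), 0

    # Read the next PRECISION secret bits as a binary fraction in [0, 1).
    window = (bits[:PRECISION] + [0] * PRECISION)[:PRECISION]
    point = sum(b << (PRECISION - 1 - i) for i, b in enumerate(window)) / (1 << PRECISION)

    # Arithmetic-coding-style selection: emit the token whose cumulative
    # probability interval [low, high) contains the point.
    cdf = np.cumsum(sorted_p)
    idx = min(int(np.searchsorted(cdf, point, side="right")), len(order) - 1)
    low = float(cdf[idx - 1]) if idx > 0 else 0.0
    high = float(cdf[idx])

    # Only the leading bits shared by every point of the chosen interval can be
    # recovered by the receiver, so only those count as embedded.
    scale = 1 << PRECISION
    lo_int = int(low * scale)
    hi_int = max(int(high * scale) - 1, lo_int)
    consumed = min(PRECISION - (lo_int ^ hi_int).bit_length(), len(bits))

    return int(order[idx]), consumed
```

A decoding loop would call `embed_step` with the translation model's next-token distribution at each position, append the returned token to the output, and drop the consumed bits from the secret bitstream; extraction would rerun the same model on the stego translation and read off which probability interval each emitted token occupies.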
Keywords
Neural networks, linguistic steganography, neural machine translation, arithmetic coding