A Transformer-based approach for Fake News detection using Time Series Analysis

2023 7th International Multi-Topic ICT Conference (IMTIC)(2023)

Abstract
Fake news is a growing problem in the digital age, spreading misinformation and shaping public opinion. Existing fake news detection relies on either style analysis or analysis of the news generator's behavior; the former fails when the content is generated from an existing news corpus, while the latter requires a very specific dataset. In this research, we address fake news detection with deep learning-based time series analysis (TSA). We propose a TSA method that gauges the authenticity of a news item from previously available news content of a similar genre. We employ pre-trained models for encoding, including GloVe and BERT, and a transformer-based sequence-to-sequence (Seq2Seq) model for TSA. The results demonstrate 98% accuracy with pre-trained encoders such as GloVe and BERT, compared to 77%–93% for traditional encoding approaches. Our study also compares the effectiveness of several deep learning methods, including Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), LSTM with attention, GRU with attention, and Transformers with 8 and 16 attention heads. Transformers with 8 and 16 heads achieve 98% and 97% accuracy respectively, compared to LSTM (87%) and GRU (88%). Our work informs future research on TSA-based fake news detection and recommends GloVe- and BERT-based encoding with a multi-head Transformer architecture.
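The abstract's core architectural choice is multi-head self-attention over sequences of encoded news items (GloVe or BERT vectors). As a rough illustration of that mechanism only, here is a minimal NumPy sketch of scaled dot-product multi-head self-attention with 8 heads; the projection weights are random placeholders standing in for learned parameters, and the input shapes and dimensions are assumptions, not the paper's actual configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, num_heads=8, seed=0):
    """Scaled dot-product self-attention with `num_heads` heads.

    x: (seq_len, d_model) array, e.g. a time-ordered sequence of news
    items, each encoded as a GloVe/BERT-style embedding vector.
    Weights are random placeholders; a trained model would learn them.
    """
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0, "d_model must divide evenly across heads"
    d_head = d_model // num_heads
    rng = np.random.default_rng(seed)
    # Query/key/value/output projections (randomly initialised stand-ins).
    Wq, Wk, Wv, Wo = (
        rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
        for _ in range(4)
    )
    # Project, then split the model dimension into heads: (heads, seq, d_head).
    q = (x @ Wq).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ Wk).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ Wv).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Attention scores per head, scaled by sqrt(d_head): (heads, seq, seq).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores) @ v                       # (heads, seq, d_head)
    # Re-merge heads and apply the output projection.
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

# Hypothetical input: 10 news items encoded as 64-dimensional vectors.
x = np.random.default_rng(1).standard_normal((10, 64))
y = multi_head_self_attention(x, num_heads=8)
print(y.shape)  # (10, 64): one contextualised vector per news item
```

Each head attends over the whole sequence independently, which is what lets the model relate a candidate article to earlier items of the same genre; comparing 8 versus 16 heads, as the paper does, changes only `num_heads` (with `d_model` still divisible by it).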
Keywords
NLP, Deep Learning, Time Series Analysis, Fake News, Sequence2Sequence, Attention, Word Embedding