Context-Aware Auto-Encoded Graph Neural Model for Dynamic Question Generation using NLP

Suresh Dara, CH. Srinivasulu, Ch. Madhu Babu, Ananda Ravuri, Tirumala Paruchuri, Abhishek Singh Kilak, Ankit Vidyarthi

ACM Transactions on Asian and Low-Resource Language Information Processing (2023)

Abstract
Question generation is an important task in natural language processing that involves generating questions from a given text. This paper proposes a novel approach for dynamic question generation using a context-aware auto-encoded graph neural model. Our approach constructs a graph representation of the input text, where each node corresponds to a word or phrase and the edges represent the relationships between them. We then use an auto-encoder model to learn a compressed representation of the graph that captures the most important information in the input text. Finally, we use the compressed graph representation to generate questions by dynamically selecting nodes and edges based on their relevance to the context of the input text. We evaluate our approach on four benchmark datasets (SQuAD, Natural Questions, TriviaQA, and QuAC) and demonstrate that it outperforms existing state-of-the-art methods for dynamic question generation. In the experiments, four performance metrics are used to evaluate the results: BLEU, ROUGE, F1-score, and accuracy. The proposed approach achieves an accuracy of 92% on the SQuAD dataset, 89% on QuAC, and 84% on TriviaQA, while on the Natural Questions dataset the model reaches 79% accuracy. Our results suggest that the use of graph neural networks and auto-encoder models can significantly improve the accuracy and effectiveness of question generation in NLP. Further research in this area can lead to even more sophisticated models that generate questions that are more contextually relevant and natural-sounding.
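The abstract gives no implementation details, so the following is only a minimal sketch of the graph-plus-auto-encoder idea it describes, written in plain PyTorch. The co-occurrence edge construction, one-hot node features, layer sizes, and inner-product decoder are assumptions made for illustration, not the authors' design; the downstream question decoder is omitted.

```python
# Minimal sketch (not the authors' code): a toy graph auto-encoder over a word graph.
# Assumptions: edges come from a simple co-occurrence window, node features are one-hot,
# and the adjacency is reconstructed with an inner-product decoder.
import torch
import torch.nn as nn

def build_word_graph(tokens, window=2):
    """Build one-hot node features and an adjacency matrix from token co-occurrence."""
    vocab = {w: i for i, w in enumerate(dict.fromkeys(tokens))}
    n = len(vocab)
    adj = torch.eye(n)  # self-loops
    for i, w in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            adj[vocab[w], vocab[tokens[j]]] = 1.0
    feats = torch.eye(n)  # one-hot features as a stand-in for real word embeddings
    return feats, adj, vocab

class GraphAutoEncoder(nn.Module):
    """Encode nodes with normalized-adjacency message passing, then decode the
    adjacency via an inner product -- a standard graph auto-encoder recipe."""
    def __init__(self, in_dim, hid_dim=64, z_dim=32):
        super().__init__()
        self.enc1 = nn.Linear(in_dim, hid_dim)
        self.enc2 = nn.Linear(hid_dim, z_dim)

    def encode(self, x, adj):
        deg = adj.sum(dim=1)
        norm = torch.diag(deg.pow(-0.5)) @ adj @ torch.diag(deg.pow(-0.5))
        h = torch.relu(norm @ self.enc1(x))
        return norm @ self.enc2(h)  # compressed node representations

    def forward(self, x, adj):
        z = self.encode(x, adj)
        return torch.sigmoid(z @ z.t()), z  # reconstructed adjacency, latent codes

tokens = "the model encodes the input text as a graph".split()
x, adj, vocab = build_word_graph(tokens)
model = GraphAutoEncoder(in_dim=x.size(1))
recon, z = model(x, adj)
loss = nn.functional.binary_cross_entropy(recon, (adj > 0).float())
# In the paper's pipeline, the latent codes z would feed a question decoder that
# selects context-relevant nodes and edges; that component is not sketched here.
```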
Keywords
dynamic question generation, graph neural model, NLP, context-aware, auto-encoded