Knowledge graph representation learning and graph neural networks for language understanding.

ACM Conference on Management of Data (2022)

Abstract
As AI technologies mature in natural language processing, speech recognition, and computer vision, "intelligent" user interfaces are emerging to handle complex and diverse tasks that require human-like knowledge and reasoning capability. In Part 1, I will present our recent work on knowledge graph representation learning with Graph Neural Networks (GNNs). The first approach, orthogonal transform embedding (OTE), integrates graph context into the embedding distance scoring function and improves prediction accuracy on complex relations, including the difficult N-to-1, 1-to-N, and N-to-N cases. The second approach, multi-hop attention GNN (MAGNA), is a principled way to incorporate multi-hop context information into every layer of attention computation. MAGNA places a diffusion prior on attention values to efficiently account for all paths between pairs of disconnected nodes. Experimental results on knowledge graph completion and node classification benchmarks show that MAGNA achieves state-of-the-art results. In Part 2, I will present how we take advantage of GNNs for language understanding and reasoning tasks. We show that, combined with large pre-trained language models and knowledge graph embeddings, GNNs are effective for multi-hop reading comprehension across documents, for improving time sensitivity in question answering over temporal knowledge graphs, and for constructing robust syntactic information for aspect-level sentiment analysis.
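To make the diffusion prior concrete, here is a minimal sketch of MAGNA-style attention diffusion, not the authors' released code: the diffused attention sum_i theta_i * A^i with theta_i = alpha * (1 - alpha)^i is approximated by a few personalized-PageRank-style power iterations over the one-hop attention matrix. The function name, alpha, and the number of hops are illustrative assumptions, not values given in the abstract.

```python
import torch

def attention_diffusion(A: torch.Tensor, H: torch.Tensor,
                        alpha: float = 0.1, hops: int = 6) -> torch.Tensor:
    """Approximate (sum_i alpha*(1-alpha)^i * A^i) @ H with `hops` iterations.

    A: [N, N] row-normalized one-hop attention scores (assumed precomputed).
    H: [N, d] node features entering the attention layer.
    """
    Z = H
    for _ in range(hops):
        # One diffusion step: propagate along attention edges, restart at H.
        Z = (1.0 - alpha) * A @ Z + alpha * H
    return Z

# Toy usage: 4 nodes, 8-dim features, random stand-in attention matrix.
N, d = 4, 8
A = torch.softmax(torch.randn(N, N), dim=-1)
H = torch.randn(N, d)
out = attention_diffusion(A, H)
print(out.shape)  # torch.Size([4, 8])
```

The fixed point of this iteration is alpha * (I - (1 - alpha) * A)^{-1} @ H, i.e. the geometric series over powers of A, so a small number of iterations lets each layer attend over multi-hop neighborhoods without materializing higher-order attention matrices.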
Key words
graph, knowledge, language, neural networks