
Distributed Representations For Arithmetic Word Problems

National Conference on Artificial Intelligence (2020)

Abstract
We consider the task of learning distributed representations for arithmetic word problems. We outline the characteristics of the domain of arithmetic word problems that make generic text embedding methods inadequate, necessitating a specialized representation learning method to facilitate the task of retrieval across a wide range of use cases within online learning platforms. Our contribution is two-fold: first, we propose several 'operators' that distil knowledge of the domain of arithmetic word problems and schemas into word problem transformations. Second, we propose a novel neural architecture that combines LSTMs with graph convolutional networks to leverage word problems and their operator-transformed versions to learn distributed representations for word problems. While our target is to ensure that the distributed representations are schema-aligned, we do not make use of schema labels in the learning process, thus yielding an unsupervised representation learning method. Through an evaluation on retrieval over a publicly available corpus of word problems, we illustrate that our framework is able to consistently improve upon contemporary generic text embeddings in terms of schema-alignment.
Keywords
arithmetic word problems, representations
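
To make the architecture described in the abstract concrete, the following is a minimal sketch (not the authors' code) of how an LSTM encoder could be combined with a graph-convolution step over a word problem and its operator-transformed variants. All names (ProblemGraphEncoder, the adjacency construction, the dimensions) are illustrative assumptions, not an API from the paper.

```python
import torch
import torch.nn as nn


class ProblemGraphEncoder(nn.Module):
    """Sketch: LSTM sentence encoder followed by one GCN-style propagation
    over a graph linking a word problem to its operator-transformed variants."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, out_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Shared linear map applied after neighbourhood averaging.
        self.gcn_weight = nn.Linear(hidden_dim, out_dim)

    def forward(self, token_ids, adjacency):
        # token_ids: (num_nodes, seq_len), one row per problem/variant
        # adjacency: (num_nodes, num_nodes), 1 where two problems are linked
        #            by an operator transformation (self-loops included)
        embedded = self.embed(token_ids)
        _, (h_n, _) = self.lstm(embedded)          # h_n: (1, num_nodes, hidden_dim)
        node_states = h_n.squeeze(0)               # (num_nodes, hidden_dim)

        # Row-normalise the adjacency so each node averages its neighbours,
        # then apply the shared linear map: a single GCN-style propagation.
        degree = adjacency.sum(dim=1, keepdim=True).clamp(min=1.0)
        mixed = (adjacency / degree) @ node_states
        return torch.relu(self.gcn_weight(mixed))  # (num_nodes, out_dim)


if __name__ == "__main__":
    # Node 0 is the original problem; nodes 1-2 are operator-transformed
    # variants, each linked to the original (plus self-loops).
    encoder = ProblemGraphEncoder(vocab_size=1000)
    tokens = torch.randint(1, 1000, (3, 20))
    adj = torch.tensor([[1., 1., 1.],
                        [1., 1., 0.],
                        [1., 0., 1.]])
    reps = encoder(tokens, adj)
    print(reps.shape)  # torch.Size([3, 128])
```

In this sketch the graph step pulls each problem's representation toward those of its transformed versions, which is one plausible way to encourage the schema-alignment the abstract describes without using schema labels.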