Language Modeling Using Tensor Trains
CoRR (2024)
Abstract
We propose a novel tensor network language model based on the simplest tensor
network (i.e., tensor trains), called the 'Tensor Train Language Model' (TTLM).
TTLM represents sentences in an exponential space constructed by the tensor
product of words, but computes the probabilities of sentences in a
low-dimensional fashion. We demonstrate that the architectures of Second-order
RNNs, Recurrent Arithmetic Circuits (RACs), and Multiplicative Integration RNNs
are, essentially, special cases of TTLM. Experimental evaluations on real
language modeling tasks show that the proposed variants of TTLM (i.e.,
TTLM-Large and TTLM-Tiny) outperform vanilla Recurrent Neural Networks
(RNNs) when the number of hidden units is small. (The code is available at
https://github.com/shuishen112/tensortrainlm.)
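
To make the "exponential space, low-dimensional computation" idea concrete, below is a minimal sketch of how a tensor train can score a sentence by sequential contraction. This is an illustrative toy, not the paper's exact parameterization: the names `embed`, `core`, and the boundary vectors `h0`/`hT`, as well as the shared-core choice, are assumptions for exposition.

```python
import numpy as np

# Minimal sketch: scoring a sentence with a tensor train.
# Assumed setup (not from the paper): one shared TT core, random parameters.
V, d, r = 1000, 32, 16   # vocab size, embedding dim, TT rank (hidden size)

rng = np.random.default_rng(0)
embed = rng.normal(scale=0.1, size=(V, d))    # word embeddings
core = rng.normal(scale=0.1, size=(r, d, r))  # shared TT core G[alpha, i, beta]
h0 = rng.normal(scale=0.1, size=r)            # left boundary vector
hT = rng.normal(scale=0.1, size=r)            # right boundary vector

def sentence_score(word_ids):
    """Contract the tensor train along the sentence.

    Conceptually this evaluates one entry of the exponential-dimensional
    tensor W[x_1, ..., x_T] = h0^T G[x_1] G[x_2] ... G[x_T] hT without
    materializing it: each step costs only O(d * r^2).
    """
    h = h0
    for w in word_ids:
        # G[x_t]: the r x r transition matrix obtained by contracting
        # the core with the embedding of word x_t.
        G_x = np.einsum('aib,i->ab', core, embed[w])
        h = h @ G_x  # carry an r-dimensional state, much like an RNN
    return h @ hT

print(sentence_score([3, 17, 256, 42]))
```

The RNN-like recurrence over an r-dimensional state is what makes Second-order RNNs, RACs, and Multiplicative Integration RNNs expressible as special cases: each corresponds to a particular structure imposed on the per-word transition.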