Neural Machine Translation with Recurrent Attention Modeling

    Cited by: 23

    Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2017. arXiv: abs/1607.05108


    Knowing which words have been attended to in previous time steps while generating a translation is a rich source of information for predicting which words will be attended to in the future. We improve upon the attention model of Bahdanau et al. (2014) by explicitly modeling the relationship between previous and subsequent attention levels …
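The idea in the abstract can be sketched as an attention step whose scores condition not only on the current decoder state (as in Bahdanau et al., 2014) but also on the previous step's attention weights. Below is a minimal NumPy sketch under assumed shapes and a simple additive parameterization (`W_a`, `w_p` are hypothetical names); it is not the paper's exact model.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def recurrent_attention(enc_states, dec_state, prev_attn, W_a, w_p):
    """One attention step that also conditions on the previous
    step's attention weights (a sketch, not the paper's exact form).

    enc_states: (T, d) encoder hidden states
    dec_state:  (d,)   current decoder state
    prev_attn:  (T,)   attention weights from the previous step
    W_a:        (d, d) content-based score weights (assumed bilinear form)
    w_p:        scalar weight on the previous-attention signal
    """
    # Content-based score for each source position: h_i^T W_a s_t
    content = enc_states @ (W_a @ dec_state)      # (T,)
    # Recurrence: bias each position by how much it was attended before
    scores = content + w_p * prev_attn            # (T,)
    return softmax(scores)

# Toy usage with hypothetical dimensions
rng = np.random.default_rng(0)
T, d = 5, 8
enc = rng.normal(size=(T, d))
dec = rng.normal(size=d)
prev = np.full(T, 1.0 / T)            # uniform attention at the first step
W_a = rng.normal(size=(d, d)) * 0.1
attn = recurrent_attention(enc, dec, prev, W_a, w_p=2.0)
print(attn)
```

With `w_p > 0` this biases the model toward positions it attended to previously; a negative weight would instead discourage re-attending, which is one simple way such a history signal could be used.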