$N$-gram Is Back: Residual Learning of Neural Text Generation with $n$-gram Language Model

October 26, 2022 · Entered Twilight · 🏛 Findings of EMNLP 2022

💤 TWILIGHT: Eternal Rest
Repo abandoned since publication

Repo contents: KNQuery, README.md, fairseq, scripts

Authors: Huayang Li, Deng Cai, Jin Xu, Taro Watanabe
arXiv ID: 2210.14431
Category: cs.CL (Computation & Language)
Cross-listed: cs.AI
Citations: 3
Venue: Findings of EMNLP 2022
Repository: https://github.com/ghrua/NgramRes ⭐ 22
Last checked: 2 months ago
Abstract
$N$-gram language models (LMs) have been largely superseded by neural LMs, as the latter exhibit better performance. However, we find that $n$-gram models can achieve satisfactory performance on a large proportion of test cases, indicating that they have already captured abundant knowledge of the language with relatively low computational cost. With this observation, we propose to learn a neural LM that fits the residual between an $n$-gram LM and the real-data distribution. The combination of $n$-gram and neural LMs not only allows the neural part to focus on a deeper understanding of language but also provides a flexible way to customize an LM by switching the underlying $n$-gram model without changing the neural model. Experimental results on three typical language tasks (i.e., language modeling, machine translation, and summarization) demonstrate that our approach consistently attains additional performance gains over popular standalone neural models. We also show that our approach allows for effective domain adaptation by simply switching to a domain-specific $n$-gram model, without any extra training. Our code is released at https://github.com/ghrua/NgramRes.
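
For the curious: here is one way the residual fusion described above could look in code. This is a minimal sketch of the general idea, not the authors' released fairseq implementation; the function names and the additive log-space fusion rule are illustrative assumptions.

```python
# Sketch of residual learning over a frozen n-gram LM (assumed formulation;
# see https://github.com/ghrua/NgramRes for the authors' actual code).
import torch
import torch.nn.functional as F

def residual_lm_log_probs(ngram_log_probs: torch.Tensor,
                          neural_logits: torch.Tensor) -> torch.Tensor:
    """Fuse a frozen n-gram LM with a neural residual LM.

    Both arguments have shape (batch, vocab). The fused distribution is
        p(w | ctx) proportional to p_ngram(w | ctx) * exp(f_neural(w | ctx)),
    i.e. the n-gram log-probabilities act as a fixed prior and the neural
    logits supply the learned residual correction.
    """
    return F.log_softmax(ngram_log_probs + neural_logits, dim=-1)

def residual_loss(ngram_log_probs: torch.Tensor,
                  neural_logits: torch.Tensor,
                  targets: torch.Tensor) -> torch.Tensor:
    """Cross-entropy on the fused distribution; the n-gram term is
    detached so gradients only update the neural residual."""
    log_probs = residual_lm_log_probs(ngram_log_probs.detach(), neural_logits)
    return F.nll_loss(log_probs, targets)
```

Under this reading, only `neural_logits` carries gradients, so swapping `ngram_log_probs` for the output of a domain-specific $n$-gram model changes the prior without touching the neural weights, which is exactly the switch-without-retraining domain adaptation the abstract claims.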
Community shame: Not yet rated

📜 Similar Papers

In the same crypt · Computation & Language

🌅 🌅 Old Age

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, ... (+6 more)

cs.CL · 🏛 NeurIPS · 📚 166.0K cites · 8 years ago