Title | XLM-D: Decorate Cross-lingual Pre-training Model as Non-Autoregressive Neural Machine Translation
Authors | Yong Wang; Shilin He; Guanhua Chen; et al.
Corresponding Authors | Guanhua Chen; Daxin Jiang
Publication Date | 2022-12-07
Conference | The 2022 Conference on Empirical Methods in Natural Language Processing
Proceedings | Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Pages | 6934–6946
Conference Date | 2022-12-07
Conference Location | Abu Dhabi
Abstract | Pre-trained language models have achieved thriving success in numerous natural language understanding and autoregressive generation tasks, but non-autoregressive generation in applications such as machine translation has not sufficiently benefited from the pre-training paradigm. In this work, we establish the connection between a pre-trained masked language model (MLM) and non-autoregressive generation for machine translation. From this perspective, we present XLM-D, which seamlessly transforms an off-the-shelf cross-lingual pre-training model into a non-autoregressive translation (NAT) model with a lightweight yet effective decorator. Specifically, the decorator ensures the representation consistency of the pre-trained model and brings only one additional trainable parameter. Extensive experiments on typical translation datasets show that our models obtain state-of-the-art performance while realizing an inference speed-up of 19.9x. One striking result is that on WMT14 En-De, our XLM-D obtains 29.80 BLEU points with multiple iterations, which outperforms the previous mask-predict model by 2.77 points. (An illustrative sketch of this mask-predict style decoding loop follows the file listing below.)
University Attribution | Corresponding author
Language | English
Source | Manually submitted
Full-text Link | https://aclanthology.org/2022.emnlp-main.466/
Publication Status | Formally published
Output Type | Conference paper
Item Identifier | http://sustech.caswiz.com/handle/2SGJ60CL/524072
Collection | College of Science_Department of Statistics and Data Science
Author Affiliations | 1. Tencent Corporation; 2. Microsoft Corporation; 3. Southern University of Science and Technology; 4. Shanghai University of Finance and Economics
Corresponding Author Affiliation | Southern University of Science and Technology
Recommended Citation (GB/T 7714) | Yong Wang, Shilin He, Guanhua Chen, et al. XLM-D: Decorate Cross-lingual Pre-training Model as Non-Autoregressive Neural Machine Translation[C], 2022: 6934–6946.
Files in This Item |
File Name/Size | Document Type | Version Type | Access Type | License
2022.emnlp-main.466 (1294KB) | -- | -- | Restricted access | --
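The abstract above describes decoding with a pre-trained masked LM by iterative mask-predict refinement: fill every target slot in parallel, then re-mask and revise the least confident positions over a few rounds. Below is a minimal, hypothetical sketch of such a loop using Hugging Face `transformers` with an off-the-shelf `xlm-roberta-base` checkpoint. It is not the paper's decorator: the source/target concatenation, the fixed target length, the linear re-masking schedule, and the re-prediction of every slot each round are illustrative assumptions, and without the paper's fine-tuning the output will not be a real translation.

```python
# Hypothetical sketch of mask-predict style iterative decoding on top of an
# off-the-shelf cross-lingual masked LM. This is NOT the paper's XLM-D decorator;
# the concatenation scheme, target length, and re-masking schedule are assumptions.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")
model.eval()

def mask_predict(source: str, target_len: int = 8, iterations: int = 4) -> str:
    src_ids = tokenizer(source, return_tensors="pt")["input_ids"][0]
    # Start from an all-[MASK] target appended after the source sequence.
    tgt_ids = torch.full((target_len,), tokenizer.mask_token_id, dtype=torch.long)
    for t in range(iterations):
        input_ids = torch.cat([src_ids, tgt_ids]).unsqueeze(0)
        with torch.no_grad():
            logits = model(input_ids=input_ids).logits[0, src_ids.numel():]
        conf, pred = logits.softmax(dim=-1).max(dim=-1)
        tgt_ids = pred  # commit the argmax prediction in every target slot
        # Linearly decaying re-masking: re-mask the least confident positions
        # so later rounds can revise them (nothing is re-masked in the last round).
        n_mask = target_len * (iterations - 1 - t) // iterations
        if n_mask > 0:
            tgt_ids[conf.argsort()[:n_mask]] = tokenizer.mask_token_id
    return tokenizer.decode(tgt_ids, skip_special_tokens=True)

print(mask_predict("Wie geht es dir?"))
```

For simplicity this sketch re-predicts every slot each round, whereas Mask-Predict proper keeps committed tokens fixed and only re-predicts masked positions; either way, the speed-up of non-autoregressive decoding comes from emitting all target positions in parallel instead of one token at a time.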