Title:

XLM-D: Decorate Cross-lingual Pre-training Model as Non-Autoregressive Neural Machine Translation

Corresponding authors: Guanhua Chen; Daxin Jiang
Publication date: 2022-12-07
Conference: The 2022 Conference on Empirical Methods in Natural Language Processing
Pages: 6934–6946
Conference date: 2022-12-07
Conference venue: Abu Dhabi
Abstract:

Pre-trained language models have achieved thriving success in numerous natural language understanding and autoregressive generation tasks, but non-autoregressive generation in applications such as machine translation has not sufficiently benefited from the pre-training paradigm. In this work, we establish the connection between a pre-trained masked language model (MLM) and non-autoregressive generation for machine translation. From this perspective, we present XLM-D, which seamlessly transforms an off-the-shelf cross-lingual pre-trained model into a non-autoregressive translation (NAT) model with a lightweight yet effective decorator. Specifically, the decorator preserves the representation consistency of the pre-trained model and introduces only one additional trainable parameter. Extensive experiments on typical translation datasets show that our models obtain state-of-the-art performance while achieving an inference speed-up of 19.9x. One striking result is that on WMT14 En-De, our XLM-D obtains 29.80 BLEU points with multiple iterations, outperforming the previous mask-predict model by 2.77 points.
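For context, the mask-predict model that the abstract compares against decodes non-autoregressively by predicting all target tokens at once and then iteratively re-masking and re-predicting the least confident positions. Below is a minimal, generic sketch of that decoding loop, not the paper's XLM-D itself; the `predict` callable and the linear re-masking schedule are illustrative assumptions.

```python
# Sketch of mask-predict iterative decoding (the NAT baseline referenced in
# the abstract). `predict` is a toy stand-in for a real masked-LM translator.
MASK = "<mask>"

def mask_predict(predict, length, iterations):
    """Iteratively refine a fully masked target sequence of fixed `length`.

    `predict(tokens)` must return (tokens, scores): a predicted token and a
    confidence score for every position.
    """
    tokens = [MASK] * length
    for t in range(iterations):
        tokens, scores = predict(tokens)  # re-predict every position
        # Linear decay: re-mask fewer low-confidence tokens each iteration.
        n_mask = int(length * (iterations - t - 1) / iterations)
        if n_mask == 0:
            break
        worst = sorted(range(length), key=lambda i: scores[i])[:n_mask]
        for i in worst:
            tokens[i] = MASK  # re-mask the least confident positions
    return tokens

# Toy model: "translates" position i to f"w{i}" with confidence i.
def toy_predict(tokens):
    return [f"w{i}" for i in range(len(tokens))], list(range(len(tokens)))

print(mask_predict(toy_predict, 4, 3))  # -> ['w0', 'w1', 'w2', 'w3']
```

With a single iteration this reduces to fully parallel one-shot decoding; more iterations trade speed for quality, which is the trade-off behind the paper's reported 19.9x speed-up versus its multi-iteration BLEU results.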

University attribution: Corresponding author
Language: English
Source: Manual submission
Full text: https://aclanthology.org/2022.emnlp-main.466/
Publication status: Published
Output type: Conference paper
Item identifier: http://sustech.caswiz.com/handle/2SGJ60CL/524072
Collection: College of Science, Department of Statistics and Data Science
Author affiliations:
1. Tencent Corporation
2. Microsoft Corporation
3. Southern University of Science and Technology
4. Shanghai University of Finance and Economics
Corresponding author affiliation: Southern University of Science and Technology
Recommended citation (GB/T 7714):
Yong Wang, Shilin He, Guanhua Chen, et al. XLM-D: Decorate Cross-lingual Pre-training Model as Non-Autoregressive Neural Machine Translation[C], 2022: 6934–6946.
Files in this item:
2022.emnlp-main.466 (1294 KB), restricted access

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.