Title

Accelerating Vision-Language Pretraining with Free Language Modeling

Authors
DOI
Publication Date
2023
ISSN
1063-6919
ISBN
979-8-3503-0130-4
Proceedings Title
Volume
2023-June
Pages
23161-23170
Conference Date
17-24 June 2023
Conference Location
Vancouver, BC, Canada
Abstract
The state of the art in vision-language pretraining (VLP) achieves exemplary performance but suffers from high training costs resulting from slow convergence and long training time, especially on large-scale web datasets. An essential obstacle to training efficiency lies in the entanglement of the prediction rate (the percentage of tokens used for reconstruction) and the corruption rate (the percentage of corrupted tokens) in masked language modeling (MLM); that is, a proper corruption rate is achieved at the cost of a large portion of output tokens being excluded from the prediction loss. To accelerate the convergence of VLP, we propose a new pretraining task, free language modeling (FLM), which enables a 100% prediction rate with arbitrary corruption rates. FLM decouples the prediction rate from the corruption rate while allowing the corruption spans to be customized for each token to be predicted. Given the same GPU time, FLM-trained models learn better and faster by exploiting bidirectional contexts more flexibly. Extensive experiments show that FLM achieves an impressive 2.5× reduction in pretraining time compared with MLM-based methods, while keeping competitive performance on both vision-language understanding and generation tasks. Code will be made public at https://github.com/TencentARC/FLM.
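
As a rough illustration of the decoupling described in the abstract, the following minimal sketch contrasts MLM, where only the corrupted tokens contribute to the reconstruction loss, with an FLM-style setup in which every token is predicted against its own corrupted context. This is not the paper's implementation; the span-sampling scheme, variable names, and mask encoding below are assumptions made purely for illustration.

```python
# Minimal sketch (assumed, not the authors' code) of the difference between
# MLM's tied prediction/corruption rates and FLM's 100% prediction rate with
# per-token corrupted contexts.
import numpy as np

rng = np.random.default_rng(0)
seq_len, corruption_rate = 12, 0.4

# MLM-style masking: the prediction rate equals the corruption rate,
# because only the corrupted (masked) tokens enter the loss.
mlm_corrupted = rng.random(seq_len) < corruption_rate
mlm_predicted = mlm_corrupted
print("MLM prediction rate:", mlm_predicted.mean())  # roughly corruption_rate

# FLM-style per-token corruption: every position is predicted (100%
# prediction rate), and each target sees its own corrupted context,
# encoded here as a boolean visibility mask (hypothetical span sampler).
span = max(1, int(round(corruption_rate * seq_len)))
context_visible = np.ones((seq_len, seq_len), dtype=bool)
for i in range(seq_len):
    start = rng.integers(0, seq_len - span + 1)
    context_visible[i, start:start + span] = False  # hide a random span
    context_visible[i, i] = False                   # a target never sees itself

flm_predicted = np.ones(seq_len, dtype=bool)        # all positions predicted
print("FLM prediction rate:", flm_predicted.mean())  # always 1.0
print("FLM mean per-target corruption rate:",
      1.0 - context_visible.mean(axis=1).mean())
```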
Keywords
Institutional Authorship
First
Related Link [IEEE Record]
Indexed By
WOS Accession Number
WOS:001062531307047
EI Accession Number
20234114867548
Source Database
IEEE
Full-Text Link: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10204651
Citation Statistics
Times Cited [WOS]: 1
Document Type: Conference Paper
Identifier: http://sustech.caswiz.com/handle/2SGJ60CL/559186
Collection: Southern University of Science and Technology
Affiliations
1.Southern University of Science and Technology
2.ARC Lab
3.Tencent PCG
4.The University of Hong Kong
First Author Affiliation: Southern University of Science and Technology
First Author's First Affiliation: Southern University of Science and Technology
Recommended Citation
GB/T 7714
Teng Wang, Yixiao Ge, Feng Zheng, et al. Accelerating Vision-Language Pretraining with Free Language Modeling[C], 2023: 23161-23170.
Files in This Item
No related files for this item.
