Title
Unsupervised Domain Adaptation via Bidirectional Cross-Attention Transformer

Authors
Wang, Xiyu ; Guo, Pengxin ; Zhang, Yu (corresponding author)
DOI
Publication Date
2023
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD)
ISSN
2945-9133
EISSN
1611-3349
ISBN
978-3-031-43423-5
Proceedings Title
Volume
14173
Conference Dates
September 18-22, 2023
Conference Location
Turin, Italy
Place of Publication
Gewerbestrasse 11, Cham, CH-6330, Switzerland
Publisher
Springer International Publishing AG
Abstract
Unsupervised Domain Adaptation (UDA) seeks to utilize the knowledge acquired from a source domain, abundant in labeled data, and apply it to a target domain that contains only unlabeled data. The majority of existing UDA research focuses on learning domain-invariant feature representations for both domains by minimizing the domain gap using convolution-based neural networks. Recently, vision transformers have made significant strides in enhancing performance across various visual tasks. In this paper, we introduce a Bidirectional Cross-Attention Transformer (BCAT) for UDA, which is built upon vision transformers with the goal of improving performance. The proposed BCAT employs an attention mechanism to extract implicit source and target mixup feature representations, thereby reducing the domain discrepancy. More specifically, BCAT is designed as a weight-sharing quadruple-branch transformer with a bidirectional cross-attention mechanism, allowing it to learn domain-invariant feature representations. Comprehensive experiments indicate that our proposed BCAT model outperforms existing state-of-the-art UDA methods, both convolution-based and transformer-based, on four benchmark datasets.
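The core idea in the abstract, queries from one domain attending to keys and values from the other domain, with projection weights shared across branches, can be sketched in a few lines. The following is a minimal single-head NumPy illustration of bidirectional cross-attention; all shapes, names, and the single-head simplification are assumptions for exposition, not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_feats, kv_feats, Wq, Wk, Wv):
    # Queries come from one domain; keys/values come from the other.
    Q, K, V = q_feats @ Wq, kv_feats @ Wk, kv_feats @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])  # scaled dot-product attention
    return softmax(scores) @ V

rng = np.random.default_rng(0)
d = 16
# One set of projection weights, shared by both directions
# (mirroring the weight-sharing branches described in the abstract).
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))

src = rng.standard_normal((8, d))  # source-domain token features
tgt = rng.standard_normal((8, d))  # target-domain token features

# Bidirectional: source queries attend to target features, and vice versa,
# yielding implicit source/target "mixup" representations.
src_mix = cross_attention(src, tgt, Wq, Wk, Wv)
tgt_mix = cross_attention(tgt, src, Wq, Wk, Wv)
```

In the paper's setting these mixed representations would feed further transformer layers and a domain-alignment objective; here the sketch only shows how the two attention directions share one set of weights.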
Keywords
Institutional Authorship
First ; Corresponding
Language
English
Indexing Category
Funding
NSFC (62136005, 62076118) ; Shenzhen Fundamental Research Program (JCYJ20210324105000003)
WOS Research Area
Computer Science
WOS Categories
Computer Science, Artificial Intelligence ; Computer Science, Theory & Methods
WOS Accession Number
WOS:001156142300019
Source Database
Web of Science
Document Type
Conference Paper
Item Identifier
http://sustech.caswiz.com/handle/2SGJ60CL/673861
Collection
College of Engineering, Department of Computer Science and Engineering
Author Affiliations
1.Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen, China
2.School of Computer Science, Faculty of Engineering, University of Sydney, Camperdown, Australia
3.Peng Cheng Laboratory, Shenzhen, China
First Author Affiliation: Department of Computer Science and Engineering
Corresponding Author Affiliation: Department of Computer Science and Engineering
First Affiliation of First Author: Department of Computer Science and Engineering
Recommended Citation
GB/T 7714
Wang, Xiyu, Guo, Pengxin, Zhang, Yu. Unsupervised Domain Adaptation via Bidirectional Cross-Attention Transformer[C]. Cham, Switzerland: Springer International Publishing AG, 2023.
Files in This Item
No files associated with this item.

Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.