Title | Unsupervised Domain Adaptation via Bidirectional Cross-Attention Transformer |
Authors | Wang, Xiyu; Guo, Pengxin; Zhang, Yu |
Corresponding Author | Zhang, Yu |
DOI | |
Publication Date | 2023 |
Conference Name | European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD) |
ISSN | 2945-9133 |
EISSN | 1611-3349 |
ISBN | 978-3-031-43423-5 |
Proceedings Title | |
Volume | 14173 |
Conference Dates | SEP 18-22, 2023 |
Conference Location | Turin, Italy |
Place of Publication | GEWERBESTRASSE 11, CHAM, CH-6330, SWITZERLAND |
Publisher | SPRINGER INTERNATIONAL PUBLISHING AG |
Abstract | Unsupervised Domain Adaptation (UDA) seeks to utilize the knowledge acquired from a source domain, abundant in labeled data, and apply it to a target domain that contains only unlabeled data. The majority of existing UDA research focuses on learning domain-invariant feature representations for both domains by minimizing the domain gap using convolution-based neural networks. Recently, vision transformers have made significant strides in enhancing performance across various visual tasks. In this paper, we introduce a Bidirectional Cross-Attention Transformer (BCAT) for UDA, which is built upon vision transformers with the goal of improving performance. The proposed BCAT employs an attention mechanism to extract implicit source and target mixup feature representations, thereby reducing the domain discrepancy. More specifically, BCAT is designed as a weight-sharing quadruple-branch transformer with a bidirectional cross-attention mechanism, allowing it to learn domain-invariant feature representations. Comprehensive experiments indicate that our proposed BCAT model outperforms existing state-of-the-art UDA methods, both convolution-based and transformer-based, on four benchmark datasets. |
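The bidirectional cross-attention the abstract describes can be illustrated in miniature: queries from one domain attend over keys and values from the other domain, in both directions, so each output mixes features across domains. The sketch below is a minimal single-head NumPy illustration of generic cross-attention, not the authors' BCAT implementation; all names (`cross_attention`, the projection matrices, the toy feature arrays) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_feats, kv_feats, Wq, Wk, Wv):
    """Tokens in q_feats attend to keys/values built from kv_feats."""
    Q = q_feats @ Wq
    K = kv_feats @ Wk
    V = kv_feats @ Wv
    scores = (Q @ K.T) / np.sqrt(Q.shape[-1])  # scaled dot-product
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
d = 8
# Weight sharing: both attention directions reuse the same projections.
Wq, Wk, Wv = (0.1 * rng.standard_normal((d, d)) for _ in range(3))
src = rng.standard_normal((4, d))  # toy source-domain token features
tgt = rng.standard_normal((5, d))  # toy target-domain token features

# Bidirectional: source queries attend to target tokens, and vice versa,
# yielding implicit cross-domain "mixup" representations.
src_to_tgt = cross_attention(src, tgt, Wq, Wk, Wv)
tgt_to_src = cross_attention(tgt, src, Wq, Wk, Wv)
print(src_to_tgt.shape, tgt_to_src.shape)  # (4, 8) (5, 8)
```

Each output row is a convex combination of the other domain's value vectors, which is why the paper frames cross-attention outputs as implicit mixup features.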
Keywords | |
University Attribution | First; Corresponding |
Language | English |
Related Links | [Source Record] |
Indexing Category | |
Funding | NSFC [62136005, 62076118]; Shenzhen fundamental research program [JCYJ20210324105000003] |
WOS Research Area | Computer Science |
WOS Categories | Computer Science, Artificial Intelligence; Computer Science, Theory & Methods |
WOS Accession Number | WOS:001156142300019 |
Source Database | Web of Science |
Citation Statistics | |
Output Type | Conference Paper |
Item Identifier | http://sustech.caswiz.com/handle/2SGJ60CL/673861 |
Collection | School of Engineering, Department of Computer Science and Engineering |
Author Affiliations | 1. Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen, China; 2. School of Computer Science, Faculty of Engineering, University of Sydney, Camperdown, Australia; 3. Peng Cheng Laboratory, Shenzhen, China |
First Author Affiliation | Department of Computer Science and Engineering |
Corresponding Author Affiliation | Department of Computer Science and Engineering |
First Author's First Affiliation | Department of Computer Science and Engineering |
Recommended Citation (GB/T 7714) | Wang, Xiyu, Guo, Pengxin, Zhang, Yu. Unsupervised Domain Adaptation via Bidirectional Cross-Attention Transformer[C]. GEWERBESTRASSE 11, CHAM, CH-6330, SWITZERLAND: SPRINGER INTERNATIONAL PUBLISHING AG, 2023. |
Files in This Item | No files associated with this item. |
Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.