Title | On the Importance of Feature Separability in Predicting Out-Of-Distribution Error |
Authors | |
Corresponding Author | Hongxin Wei |
DOI | |
Publication Date | 2023-12-10 |
Conference Name | 37th Conference on Neural Information Processing Systems (NeurIPS 2023) |
Conference Date | 2023-12-10 |
Conference Location | Ernest N. Morial Convention Center, New Orleans |
Abstract | Estimating generalization performance on out-of-distribution (OOD) data is practically challenging without ground-truth labels. While previous methods emphasize the connection between distribution difference and OOD accuracy, we show that a large domain gap does not necessarily lead to low test accuracy. In this paper, we investigate this problem from the perspective of feature separability, both empirically and theoretically. Specifically, we propose a dataset-level score based on feature dispersion to estimate test accuracy under distribution shift. Our method is inspired by desirable properties of features in representation learning: high inter-class dispersion and high intra-class compactness. Our analysis shows that inter-class dispersion is strongly correlated with model accuracy, whereas intra-class compactness does not reflect generalization performance on OOD data. Extensive experiments demonstrate the superiority of our method in both prediction performance and computational efficiency. |
University Attribution | Corresponding author |
Source Database | Manual submission |
Citation Statistics | |
Output Type | Conference paper |
Item Identifier | http://sustech.caswiz.com/handle/2SGJ60CL/646935 |
Collection | College of Science_Department of Statistics and Data Science |
Author Affiliations | 1. School of Computer Science and Engineering, Nanyang Technological University; 2. Department of Statistics and Data Science, Southern University of Science and Technology |
Corresponding Author Affiliation | Department of Statistics and Data Science |
Recommended Citation (GB/T 7714) | Renchunzi Xie, Hongxin Wei, Lei Feng, et al. On the Importance of Feature Separability in Predicting Out-Of-Distribution Error[C], 2023. |
Files in This Item |
File Name/Size | Document Type | Version Type | Access Type | License |
2303.15488.pdf (2873 KB) | -- | -- | Restricted Access | -- |
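
The abstract above describes a dataset-level score built from feature dispersion on unlabeled OOD data. The snippet below is a minimal illustrative sketch of that idea rather than the paper's exact score: it assumes penultimate-layer features have already been extracted, uses k-means pseudo-labels in place of the unavailable ground-truth labels, and measures how far the pseudo-class centroids spread from the global feature centroid. The function name, the pseudo-labeling choice, and the distance measure are assumptions made for illustration.

```python
# Minimal sketch of an inter-class dispersion score for unlabeled OOD
# features. Illustrative only; not the paper's exact formulation.
import numpy as np
from sklearn.cluster import KMeans


def inter_class_dispersion(features: np.ndarray, num_classes: int) -> float:
    """Average distance from pseudo-class centroids to the global centroid.

    features: (N, D) array of penultimate-layer features on the OOD test set.
    num_classes: number of classes the classifier was trained on.
    """
    # Pseudo-label the unlabeled OOD features by clustering (assumption:
    # k-means stands in for the missing ground-truth labels).
    pseudo_labels = KMeans(n_clusters=num_classes, n_init=10).fit_predict(features)

    global_centroid = features.mean(axis=0)
    distances = []
    for k in range(num_classes):
        members = features[pseudo_labels == k]
        if len(members) == 0:
            continue  # skip empty pseudo-classes
        class_centroid = members.mean(axis=0)
        distances.append(np.linalg.norm(class_centroid - global_centroid))

    # A larger spread of class centroids means higher inter-class dispersion,
    # which the abstract reports as strongly correlated with OOD accuracy.
    return float(np.mean(distances))


# Hypothetical usage (feature extraction is user-provided):
# feats = extract_features(model, ood_loader)   # (N, D) numpy array
# score = inter_class_dispersion(feats, num_classes=10)
```

In practice, such a score would be computed on multiple shifted test sets and compared against their measured accuracies (for example, via rank correlation) to check that higher dispersion tracks higher OOD accuracy.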