Title | BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning |
Authors | Zhu, Songling; Shang, Ronghua; Tang, Ke; Xu, Songhua; Li, Yangyang |
Corresponding Author | Shang, Ronghua |
Publication Date | 2023-11-04 |
DOI | |
Journal | Knowledge-Based Systems |
ISSN | 0950-7051 |
EISSN | 1872-7409 |
Volume | 279 |
Abstract | Knowledge distillation guides student networks' training and enhances their performance through excellent teacher networks. However, along with the performance advantages, knowledge distillation also entails a huge computational burden, sometimes tens or even hundreds of times that of traditional training methods. Therefore, this paper proposes book-based knowledge distillation (BookKD) to minimize the costs of knowledge distillation while improving performance. First, a decoupling-based knowledge distillation framework is designed: by decoupling the traditional knowledge distillation process into two independent sub-processes, book-making and book-learning, knowledge distillation can be completed with little resource consumption. Second, a book-making method based on knowledge ensemble and knowledge regularization is developed, which makes books by organizing and processing the knowledge generated by teachers. These books can replace the teachers and provide sufficient knowledge at little distillation cost. Finally, a book-learning method based on entropy dynamic adjustment and label smoothing is designed. The entropy dynamic adjustment optimizes the training loss and mitigates student networks' difficulty in learning from books, while label smoothing alleviates the student network's over-confidence in ground-truth labels, increasing its attention to the class-similarity knowledge in books. BookKD is tested on three image classification datasets, CIFAR100, ImageNet, and ImageNet100, and the object detection dataset PASCAL VOC 2007. The experimental results demonstrate the advantages of BookKD in reducing distillation costs and improving distillation performance. |
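The cost saving comes from generating the teacher's knowledge once and reusing it for every student. Below is a minimal PyTorch sketch of that decoupling; it assumes plain softened-softmax "books", a fixed temperature, and a KL-plus-label-smoothed-cross-entropy student loss, and it does not reproduce the paper's knowledge ensemble, knowledge regularization, or entropy dynamic adjustment, which the record does not detail.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def make_book(teacher, loader, temperature=4.0):
    """Book-making: run the teacher once and cache its soft labels.

    `loader` must iterate in a fixed (unshuffled) order so each cached
    row stays aligned with its sample during book-learning.
    """
    teacher.eval()
    pages = [F.softmax(teacher(x) / temperature, dim=1) for x, _ in loader]
    return torch.cat(pages)  # one row of class probabilities per sample

def book_learning_loss(student_logits, book_probs, targets,
                       temperature=4.0, alpha=0.5, smoothing=0.1):
    """Book-learning: distill from the cached book instead of a live teacher."""
    # KL divergence between the student's softened predictions and the book;
    # the T^2 factor keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(F.log_softmax(student_logits / temperature, dim=1),
                  book_probs, reduction="batchmean") * temperature ** 2
    # Label-smoothed cross-entropy tempers over-confidence in hard labels.
    ce = F.cross_entropy(student_logits, targets, label_smoothing=smoothing)
    return alpha * kd + (1 - alpha) * ce
```

Because `make_book` runs once, its cost is amortized over every student trained from the same book, which is the source of the claimed savings relative to keeping a live teacher in every distillation run.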
Keywords | |
Related Links | [Scopus Record] |
Indexed By | |
Language | English |
University Authorship | Other |
Funding Projects | National Natural Science Foundation of China [61871306]; National Natural Science Foundation of China [62176200] |
WOS Research Area | Computer Science |
WOS Category | Computer Science, Artificial Intelligence |
WOS Accession Number | WOS:001080612800001 |
Publisher | Elsevier |
EI Accession Number | 20233714720027 |
EI Controlled Terms | Classification (of information); Cost reduction; Distillation; Entropy; Learning systems; Object detection; Object recognition; Personnel training; Students |
EI Classification Codes | Thermodynamics: 641.1; Information Theory and Signal Processing: 716.1; Data Processing and Image Processing: 723.2; Chemical Operations: 802.3; Information Sources and Analysis: 903.1; Personnel: 912.4 |
ESI Discipline | COMPUTER SCIENCE |
Scopus ID | 2-s2.0-85170571218 |
Source Database | Scopus |
Citation Statistics | Times Cited [WOS]: 5 |
Publication Type | Journal Article |
Item Identifier | http://sustech.caswiz.com/handle/2SGJ60CL/559487 |
Collection | College of Engineering_Research Institute of Trustworthy Autonomous Systems |
Affiliations | 1. Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, Shaanxi Province, 710071, China 2. The Research Institute of Trustworthy Autonomous Systems, Southern University of Science and Technology, Shenzhen, 518055, China 3. The Institute of Medical Artificial Intelligence, The Second Affiliated Hospital of Xi'an Jiaotong University, Xi'an, 710004, China |
Recommended Citation (GB/T 7714) | Zhu, Songling, Shang, Ronghua, Tang, Ke, et al. BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning[J]. Knowledge-Based Systems, 2023, 279. |
APA | Zhu, Songling, Shang, Ronghua, Tang, Ke, Xu, Songhua, & Li, Yangyang. (2023). BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning. Knowledge-Based Systems, 279. |
MLA | Zhu, Songling, et al. "BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning". Knowledge-Based Systems 279 (2023). |
Files in This Item | There are no files associated with this item. |