Title

DynamicKD: An effective knowledge distillation via dynamic entropy correction-based distillation for gap optimizing

Authors
Corresponding author: Shang, Ronghua
Publication date
2024-09-01
DOI
Journal
Pattern Recognition
ISSN
0031-3203
Volume
153
Abstract
Knowledge distillation uses a high-performance teacher network to guide a student network. However, the performance gap between the teacher and the student can hinder the student's training. This paper proposes a novel knowledge distillation algorithm based on dynamic entropy correction, which adjusts the student rather than the teacher to reduce the gap. First, the effect of changing the student's output entropy (short for output information entropy) on the distillation loss is analyzed theoretically, showing that correcting the output entropy can reduce the gap. A knowledge distillation algorithm based on dynamic entropy correction is then developed, which corrects the output entropy in real time using an entropy controller that is updated dynamically by the distillation loss. The proposed algorithm is validated on CIFAR100, ImageNet, and PASCAL VOC 2007. Comparisons with various state-of-the-art distillation algorithms show impressive results, especially on CIFAR100 with the teacher–student pair resnet32x4–resnet8x4, where the proposed algorithm improves classification accuracy by 2.64 points over the traditional distillation algorithm and by 0.87 points over the state-of-the-art algorithm CRD, demonstrating its effectiveness and efficiency.
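To make the mechanism in the abstract concrete, the following is a minimal PyTorch sketch, not the authors' released code: it assumes the entropy controller can be modeled as a single learnable scalar s applied to the student logits, and the temperature T = 4.0 is an assumed value; the paper's actual controller may differ.

import torch
import torch.nn.functional as F

# Hypothetical entropy controller: a learnable scalar on the student logits.
# s < 1 flattens the student distribution (raises its output entropy);
# s > 1 sharpens it (lowers the entropy).
s = torch.ones(1, requires_grad=True)
T = 4.0  # distillation temperature (assumed value)

def dynamickd_loss(student_logits, teacher_logits):
    corrected = student_logits * s  # entropy-corrected student output
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(corrected / T, dim=1)
    # Standard KD loss on the corrected outputs; backpropagating it also
    # updates s, which is one way to realize a controller that is "updated
    # dynamically by the distillation loss".
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

In a training loop, s would be registered with the optimizer alongside the student's parameters, e.g. torch.optim.SGD(list(student.parameters()) + [s], lr=0.05), so that each batch's distillation loss updates the controller in real time.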
Keywords
Related links: [Scopus record]
Indexed by
SCI; EI
Language
English
University authorship
Other
EI accession number
20241916054022
EI keywords
Convolutional neural networks; Personnel training; Students
EI classification codes
Chemical Operations: 802.3; Personnel: 912.4
ESI discipline
ENGINEERING
Scopus record ID
2-s2.0-85192225231
Source database
Scopus
Document type
Journal article
Item identifier
http://sustech.caswiz.com/handle/2SGJ60CL/760967
Collection
Southern University of Science and Technology
Author affiliations
1. Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, Shaanxi Province, 710071, China
2. Guangdong Provincial Key Laboratory of Brain-inspired Intelligent Computation, Southern University of Science and Technology, Shenzhen, 518055, China
Recommended citation
GB/T 7714
Zhu, Songling, Shang, Ronghua, Yuan, Bo, et al. DynamicKD: An effective knowledge distillation via dynamic entropy correction-based distillation for gap optimizing[J]. Pattern Recognition, 2024, 153.
APA
Zhu, Songling., Shang, Ronghua., Yuan, Bo., Zhang, Weitong., Li, Wenjie., ... & Jiao, Licheng. (2024). DynamicKD: An effective knowledge distillation via dynamic entropy correction-based distillation for gap optimizing. Pattern Recognition, 153.
MLA
Zhu, Songling, et al. "DynamicKD: An effective knowledge distillation via dynamic entropy correction-based distillation for gap optimizing". Pattern Recognition 153 (2024).
Files in this item
No files are associated with this item.
