Title

Neural Network Pruning by Cooperative Coevolution

Authors
Shang, Haopu; Wu, Jia Liang; Hong, Wenjing; et al.
Publication Date
2022
ISSN
1045-0823
Conference Proceedings Title
Pages
4814-4820
Abstract
Neural network pruning is a popular model compression method that can significantly reduce computing cost with negligible loss of accuracy. Recently, filters have often been pruned directly by designing proper criteria or using auxiliary modules to measure their importance, which, however, requires expertise and trial-and-error. Due to the advantage of automation, pruning by evolutionary algorithms (EAs) has attracted much attention, but its performance is limited for deep neural networks because the search space can be quite large. In this paper, we propose CCEP, a new filter pruning algorithm based on cooperative coevolution, which prunes the filters in each layer separately by EAs; that is, CCEP reduces the pruning space through a divide-and-conquer strategy. Experiments show that CCEP achieves performance competitive with state-of-the-art pruning methods, e.g., pruning 63.42% of FLOPs from ResNet56 on CIFAR10 with a −0.24% accuracy drop, and 44.56% of FLOPs from ResNet50 on ImageNet with a 0.07% accuracy drop.
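The abstract describes CCEP's divide-and-conquer strategy: the filters of each layer are evolved by a separate evolutionary subpopulation, and candidate pruning masks are evaluated jointly with the best masks currently held by the other layers. The Python sketch below is only a hypothetical illustration of such a cooperative-coevolution pruning loop, not the paper's implementation; the toy layer sizes, the fitness weights, and the placeholder evaluate_pruned_model surrogate are all assumptions.

# A minimal sketch of layer-wise filter pruning via cooperative coevolution.
# Illustration only, not the authors' CCEP code; layer sizes, fitness weights,
# and the surrogate evaluate_pruned_model below are assumptions.
import random

FILTERS_PER_LAYER = [16, 32, 64]         # prunable filters per layer (toy network)
POP_SIZE, GENERATIONS = 8, 20
ACC_WEIGHT, SPARSITY_WEIGHT = 1.0, 0.3   # accuracy/sparsity trade-off (assumed)


def evaluate_pruned_model(masks):
    """Placeholder for the validation accuracy of the pruned network.

    In practice the binary masks would be applied to each layer's filters and
    the network evaluated on held-out data; here a toy surrogate simply peaks
    when roughly 60% of the filters are kept.
    """
    kept = sum(sum(m) for m in masks) / sum(FILTERS_PER_LAYER)
    return 1.0 - (kept - 0.6) ** 2


def fitness(masks):
    # Reward both accuracy and sparsity (fewer kept filters).
    kept_ratio = sum(sum(m) for m in masks) / sum(FILTERS_PER_LAYER)
    return ACC_WEIGHT * evaluate_pruned_model(masks) + SPARSITY_WEIGHT * (1.0 - kept_ratio)


def mutate(mask, p=0.1):
    # Bit-flip mutation over one layer's binary filter mask.
    return [1 - b if random.random() < p else b for b in mask]


# Cooperative coevolution: one subpopulation of masks per layer, evolved
# separately; each individual is evaluated together with the current best
# masks of all other layers (the shared context).
subpops = [[[random.randint(0, 1) for _ in range(n)] for _ in range(POP_SIZE)]
           for n in FILTERS_PER_LAYER]
context = [pop[0][:] for pop in subpops]   # current best mask per layer

for gen in range(GENERATIONS):
    for layer, pop in enumerate(subpops):
        scored = []
        for individual in pop:
            candidate = context[:layer] + [individual] + context[layer + 1:]
            scored.append((fitness(candidate), individual))
        scored.sort(key=lambda s: s[0], reverse=True)
        context[layer] = scored[0][1][:]                      # best individual becomes this layer's context
        parents = [ind for _, ind in scored[:POP_SIZE // 2]]  # truncation selection
        pop[:] = parents + [mutate(random.choice(parents)) for _ in range(POP_SIZE - len(parents))]

print("kept filters per layer:", [sum(m) for m in context])
print("overall fitness:", round(fitness(context), 4))

In a real setting, evaluate_pruned_model would apply the masks to a trained network and measure validation accuracy (possibly after brief fine-tuning), which is where virtually all of the computational cost of such a search lies.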
Institutional Attribution
Other
Language
English
Related Link [Scopus Record]
Indexed By
EI Accession Number
20223812753694
EI Keywords
Deep neural networks
EI Classification Code
Ergonomics and Human Factors Engineering:461.4
Scopus Record ID
2-s2.0-85136987029
Source Database
Scopus
Document Type
Conference Paper
Identifier
http://sustech.caswiz.com/handle/2SGJ60CL/402417
Collection
College of Engineering - Department of Computer Science and Engineering
Author Affiliations
1. State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, 210023, China
2. Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen, 518055, China
Recommended Citation
GB/T 7714
Shang, Haopu, Wu, Jia Liang, Hong, Wenjing, et al. Neural Network Pruning by Cooperative Coevolution[C], 2022: 4814-4820.
Files in This Item
No files associated with this item.