Title

Efficient Evolutionary Search of Attention Convolutional Networks via Sampled Training and Node Inheritance

Authors
Zhang, Haoyu; Jin, Yaochu; Cheng, Ran; Hao, Kuangrong
Corresponding Author
Jin, Yaochu
Publication Date
2020
DOI
Journal
IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION
ISSN
1089-778X
EISSN
1941-0026
Volume: 25, Issue: 2, Pages: 371-385
Abstract

The performance of deep neural networks depends heavily on their architectures, and various neural architecture search strategies have been developed for automated network architecture design. Recently, evolutionary neural architecture search (EvoNAS) has received increasing attention due to the attractive global optimization capability of evolutionary algorithms. However, EvoNAS suffers from extremely high computational costs, because evolutionary optimization usually requires a large number of performance evaluations and training deep neural networks is itself computationally very expensive. To address this issue, this paper proposes a computationally efficient framework for the evolutionary search of convolutional networks based on a directed acyclic graph, in which parents are randomly sampled and trained on each mini-batch of training data. In addition, a node inheritance strategy is adopted so that the fitness of all offspring individuals can be evaluated without training them. Finally, we encode a channel attention mechanism in the search space to enhance the feature processing capability of the evolved neural networks. We evaluate the proposed algorithm on widely used datasets against 30 state-of-the-art peer algorithms. Our experimental results show that the proposed algorithm is not only computationally much more efficient but also highly competitive in learning performance.
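The node-inheritance idea summarized in the abstract can be sketched minimally: offspring reuse the trained parameters of unchanged parent nodes, so only mutated nodes lack weights and fitness can be estimated without retraining. All names below (`Node`, `mutate`, the operation list) are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import random

class Node:
    """One node of the DAG-encoded network, carrying its trained parameters."""
    def __init__(self, op, weights=None):
        self.op = op                          # e.g. 'conv3x3', 'attention'
        self.weights = weights if weights is not None else {}

def mutate(parent_nodes, ops, rate=0.2):
    """Create an offspring: mutated nodes start untrained, while
    unchanged nodes inherit the parent's trained weights (node inheritance)."""
    child = []
    for node in parent_nodes:
        if random.random() < rate:
            child.append(Node(random.choice(ops)))            # new, untrained
        else:
            child.append(Node(node.op, dict(node.weights)))   # inherited
    return child

ops = ['conv3x3', 'conv5x5', 'attention']
parent = [Node(op, {'trained': True}) for op in ['conv3x3', 'attention']]
child = mutate(parent, ops, rate=0.0)  # rate=0 -> offspring fully inherits
assert all(n.weights.get('trained') for n in child)
```

With a nonzero mutation rate, only the replaced nodes would need training, which is the source of the computational savings the abstract describes.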

Keywords
Related Links: [Scopus Record]
Indexed By
SCI ; EI
Language
English
University Authorship
Other
EI Accession Number
20204909594517
EI Keywords
Directed graphs ; Convolution ; Global optimization ; Evolutionary algorithms ; Network architecture
EI Classification Codes
Ergonomics and Human Factors Engineering:461.4 ; Information Theory and Signal Processing:716.1 ; Combinatorial Mathematics, Includes Graph Theory, Set Theory:921.4 ; Optimization Techniques:921.5
ESI Research Field
COMPUTER SCIENCE
Scopus Record ID
2-s2.0-85097164791
Source Database
Scopus
Full-Text Link: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9268174
Citation Statistics
Times Cited [WOS]: 49
Document Type: Journal Article
Identifier: http://sustech.caswiz.com/handle/2SGJ60CL/209705
Collection: College of Engineering, Department of Computer Science and Engineering
Affiliations
1.Engineering Research Center of Digitized Textile & Apparel Technology, Ministry of Education, College of Information Science and Technology, Donghua University, Shanghai 201620, China.
2.Engineering Research Center of Digitized Textile & Apparel Technology, Ministry of Education, College of Information Science and Technology, Donghua University, Shanghai 201620, China, and also with the Department of Computer Science, University of Surrey, Guildford, Surrey GU2 7XH, United Kingdom. (e-mail: yaochu.jin@surrey.ac.uk)
3.Shenzhen Key Laboratory of Computational Intelligence, University Key Laboratory of Evolving Intelligent Systems of Guangdong Province, Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen 518055, China.
Recommended Citation
GB/T 7714
Zhang, Haoyu, Jin, Yaochu, Cheng, Ran, et al. Efficient Evolutionary Search of Attention Convolutional Networks via Sampled Training and Node Inheritance[J]. IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2020, 25(2): 371-385.
APA
Zhang, Haoyu, Jin, Yaochu, Cheng, Ran, & Hao, Kuangrong. (2020). Efficient Evolutionary Search of Attention Convolutional Networks via Sampled Training and Node Inheritance. IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 25(2), 371-385.
MLA
Zhang, Haoyu, et al. "Efficient Evolutionary Search of Attention Convolutional Networks via Sampled Training and Node Inheritance". IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION 25.2 (2020): 371-385.
Files in This Item
File Name/Size | Document Type | Version | Access | License
Efficient Evolutiona(2166KB) | -- | -- | Restricted Access | --

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.