Title

ENAO: Evolutionary Neural Architecture Optimization in the Approximate Continuous Latent Space of a Deep Generative Model

Authors
Zheng Li, Xuan Rao, Shaojie Liu, et al.
DOI
Publication Date
2024-07-05
ISSN
2161-4393
ISBN
979-8-3503-5932-9
Conference Proceedings
Conference Date
30 June-5 July 2024
Conference Location
Yokohama, Japan
Abstract
Neural architecture search (NAS) has emerged as a transformative approach for automating the design of neural networks, demonstrating exceptional performance across a variety of tasks. Numerous NAS methods aim to optimize neural architectures within discrete or continuous search spaces, but each method possesses its own inherent limitations. Additionally, search efficiency is notably impeded by suboptimal encoding methods, presenting an ongoing challenge. In response to these obstacles, this paper introduces a novel approach, evolutionary neural architecture optimization (ENAO), which optimizes architectures in an approximate continuous search space. ENAO begins by training a deep generative model to embed discrete architectures into a condensed latent space, leveraging unsupervised representation learning. Subsequently, an evolutionary algorithm is employed to refine neural architectures within this approximate continuous latent space. Empirical comparisons against several NAS benchmarks underscore the effectiveness of the ENAO method. Thanks to its foundation in deep unsupervised representation learning, ENAO demonstrates a distinguished ability to identify high-quality architectures with fewer evaluations and achieves state-of-the-art results on the NAS-Bench-201 dataset. Overall, the ENAO method is a promising approach for optimizing neural network architectures in an approximate continuous search space with evolutionary algorithms and may be a useful tool for researchers and practitioners in the field of NAS.
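The abstract describes a two-stage pipeline: a deep generative model first embeds discrete architectures into a continuous latent space, and an evolutionary algorithm then searches that space. The following is only a minimal conceptual sketch of that idea, not the authors' implementation: it uses a plain autoencoder as a stand-in for the paper's generative model, a toy fitness function in place of real architecture evaluation, and assumed NAS-Bench-201-like dimensions (6 edges, 5 candidate operations); all names and hyperparameters are illustrative assumptions.

import torch
import torch.nn as nn

NUM_EDGES, NUM_OPS, LATENT_DIM = 6, 5, 16   # assumed NAS-Bench-201-like cell encoding

class ArchAutoencoder(nn.Module):
    """Toy stand-in for the deep generative model that embeds architectures."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(NUM_EDGES * NUM_OPS, 64), nn.ReLU(),
                                 nn.Linear(64, LATENT_DIM))
        self.dec = nn.Sequential(nn.Linear(LATENT_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, NUM_EDGES * NUM_OPS))

    def encode(self, x):                      # one-hot architecture -> latent vector
        return self.enc(x.flatten(1))

    def decode(self, z):                      # latent vector -> per-edge operation logits
        return self.dec(z).view(-1, NUM_EDGES, NUM_OPS)

def toy_fitness(arch_ops):
    # Stand-in for a benchmark lookup or training run; here it merely rewards
    # architectures that avoid operation index 0 (an assumed "none" op).
    return -float((arch_ops == 0).sum())

model = ArchAutoencoder()
# In ENAO the generative model would first be trained on sampled architectures
# via unsupervised representation learning (training loop omitted here).

population = torch.randn(8, LATENT_DIM)       # random latent parents
with torch.no_grad():
    for gen in range(5):
        children = population + 0.1 * torch.randn_like(population)   # Gaussian mutation
        candidates = torch.cat([population, children])
        arch_ops = model.decode(candidates).argmax(dim=-1)            # discretize decodings
        fitness = torch.tensor([toy_fitness(a) for a in arch_ops])
        population = candidates[fitness.topk(8).indices]              # keep the best 8
        print(f"generation {gen}: best fitness {fitness.max().item():.1f}")

In the actual method, decoded candidates would be evaluated on a NAS benchmark such as NAS-Bench-201, and a trained deep generative model (rather than this untrained autoencoder) would provide the approximate continuous latent space in which the evolutionary search operates.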
University Attribution
First
Related Links [IEEE Record]
Citation Statistics
Output Type: Conference Paper
Item Identifier: http://sustech.caswiz.com/handle/2SGJ60CL/828701
Collections: College of Engineering_Department of Mechanical and Energy Engineering
College of Engineering_School of System Design and Intelligent Manufacturing
Author Affiliations
1.Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen, China
2.School of Systems Science, Beijing Normal University, Beijing, China
3.School of System Design and Intelligent Manufacturing, Southern University of Science and Technology, Shenzhen, China
First Author's Affiliation: Department of Mechanical and Energy Engineering
First Author's Primary Affiliation: Department of Mechanical and Energy Engineering
Recommended Citation
GB/T 7714
Zheng Li, Xuan Rao, Shaojie Liu, et al. ENAO: Evolutionary Neural Architecture Optimization in the Approximate Continuous Latent Space of a Deep Generative Model[C], 2024.
Files in This Item
No files are associated with this item.