Title | Evolutionary Algorithms for Sparse Large-Scale Multi-Objective Optimization |
Authors | |
Publication Date | 2024 |
DOI | |
Journal | |
Abstract | In addition to developing multi-objective evolutionary algorithms (MOEAs) for solving general large-scale multi-objective optimization problems (LSMOPs), it is reasonable to improve convergence speed by incorporating problem-specific information into offspring generation, such as the gradients used in mathematical programming methods. This chapter presents the first three MOEAs for sparse large-scale multi-objective optimization. The most significant feature of these MOEAs is that they search for the sparse Pareto optimal solutions of specific LSMOPs by customizing effective strategies, such as the population initialization strategy in SparseEA, the Pareto optimal subspace learning strategy in MOEA/PSL, and the evolutionary frequent pattern mining method in PM-MOEA. Owing to its wide range of application scenarios, sparse large-scale multi-objective optimization has emerged as an active topic in evolutionary multi-objective optimization, and an increasing number of new methods are being proposed. |
Related Links | [IEEE Record] |
Institutional Attribution | Other |
ISBN | 9781394178421 |
Citation Statistics | |
Publication Type | Journal Article |
Item Identifier | http://sustech.caswiz.com/handle/2SGJ60CL/803245 |
Collection | Southern University of Science and Technology |
Author Affiliations | 1. Anhui University, China; 2. Southern University of Science and Technology, China; 3. Westlake University, China |
Recommended Citation (GB/T 7714) |
Xingyi Zhang, Ran Cheng, Ye Tian, et al. Evolutionary Algorithms for Sparse Large-Scale Multi-Objective Optimization[J]. Evolutionary Large-Scale Multi-Objective Optimization and Applications, 2024.
|
APA |
Xingyi Zhang, Ran Cheng, Ye Tian, & Yaochu Jin. (2024). Evolutionary Algorithms for Sparse Large-Scale Multi-Objective Optimization. Evolutionary Large-Scale Multi-Objective Optimization and Applications.
|
MLA |
Xingyi Zhang, et al. "Evolutionary Algorithms for Sparse Large-Scale Multi-Objective Optimization". Evolutionary Large-Scale Multi-Objective Optimization and Applications (2024).
|
Files in This Item | No files associated with this item. |
|
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.