Title
A novel automatic acne detection and severity quantification scheme using deep learning

Authors
Corresponding Authors: Hou, Muzhou; Zhang, Jianglin; Qi, Min
Publication Date
2023-07-01
DOI
Journal
BIOMEDICAL SIGNAL PROCESSING AND CONTROL
ISSN
1746-8094
EISSN
1746-8108
Volume
84
Abstract
Accurate detection and severity quantification of acne are of great significance in the precise treatment of patients. Because acne lesions of similar severity look alike, it is challenging for dermatologists to grade acne accurately and efficiently. This study proposes an accurate and efficient scheme based on deep learning (DL) to assist dermatologists in acne detection and severity quantification. The proposed framework consists of two steps: a Localization deep learning (Localization-DL) model and a Class segmentation (ClassSeg) model. The first model uses a distilled lightweight convolutional network as its backbone and extracts multi-scale features through a pyramid pooling module for facial region localization and distinction. The second model is a unified framework that combines a Class module, which distinguishes background from facial skin sub-images, with a segmentation (Seg) module, which segments each class to obtain lesion masks. The facial skin segmentation branch of the ClassSeg model is built on a high-resolution network (HRNet) modified with mask-aware attention, shuffle attention, and a conditional channel weight block. Experiments show that the two models achieve promising results and demonstrate effectiveness in lesion detection compared with other methods. The proposed scheme shows excellent results in acne severity quantification and performs comparably to dermatologists (accuracy: 0.9091 for ours, 0.9301 for SDerms, 0.8741 for IDerms, and 0.7483 for JDerms). The assessment performance also outperforms existing approaches. This work opens new avenues for acne severity quantification and provides valuable diagnostic evidence for dermatologists in clinical practice.
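The abstract names a pyramid pooling module for multi-scale feature extraction in the Localization-DL model. Below is a minimal PyTorch sketch of such a module in the common PSPNet style; the pool sizes, channel widths, and the class name PyramidPoolingModule are illustrative assumptions, not the paper's exact configuration.

# Minimal sketch of a PSPNet-style pyramid pooling module, as referenced in
# the abstract for multi-scale feature extraction. Pool sizes and channel
# counts below are assumptions, not the paper's reported settings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidPoolingModule(nn.Module):
    def __init__(self, in_channels=512, pool_sizes=(1, 2, 3, 6)):
        super().__init__()
        out_channels = in_channels // len(pool_sizes)  # keep total width bounded
        self.stages = nn.ModuleList([
            nn.Sequential(
                nn.AdaptiveAvgPool2d(size),  # pool to a fixed grid (1x1 ... 6x6)
                nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(inplace=True),
            )
            for size in pool_sizes
        ])

    def forward(self, x):
        h, w = x.shape[2:]
        # Upsample each pooled map back to the input resolution and concatenate
        # with the original features, yielding a multi-scale representation.
        pyramids = [x] + [
            F.interpolate(stage(x), size=(h, w), mode="bilinear", align_corners=False)
            for stage in self.stages
        ]
        return torch.cat(pyramids, dim=1)

# Example: a 512-channel backbone feature map becomes 1024 channels (512 + 4*128).
feats = torch.randn(1, 512, 32, 32)
print(PyramidPoolingModule()(feats).shape)  # torch.Size([1, 1024, 32, 32])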
Keywords
Indexed By
Language
English
SUSTech Authorship
Corresponding author
Funding Projects
Scientific Research Fund of Hunan Provincial Education Department, China [20C0402]; Hunan First Normal University, China [XYS16N03]; National Natural Science Foundation of China [82073019, 82073018]; Shenzhen Science and Technology Innovation Commission, China (Natural Science Foundation of Shenzhen) [JCYJ20210324113001005]
WOS Research Area
Engineering
WOS Category
Engineering, Biomedical
WOS Accession Number
WOS:000962499100001
Publisher
Source Database
Web of Science
Citation Statistics
Times Cited [WOS]: 4
Document Type
Journal Article
Identifier
http://sustech.caswiz.com/handle/2SGJ60CL/527712
Collection
First Affiliated Hospital of Southern University of Science and Technology
Affiliations
1.Cent South Univ, Sch Math & Stat, Changsha 410083, Peoples R China
2.Cent South Univ, Xiangya Hosp, Dept Dermatol, Changsha 410008, Peoples R China
3.Hunan First Normal Univ, Sch Comp Sci, Changsha 410205, Peoples R China
4.Jinan Univ, Dept Dermatol, Shenzhen Peoples Hosp, Clin Med Coll 2, Shenzhen 518020, Guangdong, Peoples R China
5.Southern Univ Sci & Technol, Affiliated Hosp 1, Shenzhen 518020, Guangdong, Peoples R China
6.Natl Clin Res Ctr Skin Dis, Beijing, Peoples R China
7.Cent South Univ, Xiangya Hosp, Dept Plast Surg, Changsha 410008, Peoples R China
Corresponding Author Affiliation
First Affiliated Hospital of Southern University of Science and Technology
Recommended Citation
GB/T 7714
Wang, Jiaoju, Wang, Chong, Wang, Zheng, et al. A novel automatic acne detection and severity quantification scheme using deep learning[J]. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 84.
APA
Wang, Jiaoju, Wang, Chong, Wang, Zheng, Hounye, Alphonse Houssou, Li, Zhaoying, ... & Qi, Min. (2023). A novel automatic acne detection and severity quantification scheme using deep learning. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 84.
MLA
Wang, Jiaoju, et al. "A novel automatic acne detection and severity quantification scheme using deep learning". BIOMEDICAL SIGNAL PROCESSING AND CONTROL 84 (2023).
Files in This Item
No files are associated with this item.