Title | Close the Sim2real Gap via Physically-based Structured Light Synthetic Data Simulation
Authors |
DOI |
Publication Date | 2024-05-17
ISBN | 979-8-3503-8458-1
Conference Proceedings |
Conference Dates | 13-17 May 2024
Conference Location | Yokohama, Japan
Abstract | Despite the substantial progress in deep learning, its adoption in industrial robotics projects remains limited, primarily due to challenges in data acquisition and labeling. Previous sim2real approaches using domain randomization require extensive scene and model optimization. To address these issues, we introduce an innovative physically-based structured light simulation system, generating both RGB and physically realistic depth images, surpassing previous dataset generation tools. We create an RGBD dataset tailored for robotic industrial grasping scenarios and evaluate it across various tasks, including object detection, instance segmentation, and embedding sim2real visual perception in industrial robotic grasping. By reducing the sim2real gap and enhancing deep learning training, we facilitate the application of deep learning models in industrial settings. Project details are available at https://baikaixin-public.github.io/structured_light_3D_synthesizer/
University Attribution | Other
Related Links | [IEEE Record]
Indexed By |
Citation Statistics |
Document Type | Conference paper
Identifier | http://sustech.caswiz.com/handle/2SGJ60CL/803344
Collection | School of Design
Author Affiliations | 1. Department of Informatics, TAMS (Technical Aspects of Multimodal Systems), Universität Hamburg, Germany; 2. Agile Robots AG, Munich, Germany; 3. School of Design, Southern University of Science and Technology, Shenzhen, China
Recommended Citation (GB/T 7714) | Kaixin Bai, Lei Zhang, Zhaopeng Chen, et al. Close the Sim2real Gap via Physically-based Structured Light Synthetic Data Simulation[C], 2024.
Files in This Item | There are no files associated with this item.