Title

Research on a Joint Calibration Algorithm for Multi-Sensor Systems in Complex Building Environments

Alternative Title
THE JOINT CALIBRATION ALGORITHM WITH A MULTI-SENSOR SYSTEM FOR COMPLEX BUILDING ENVIRONMENTS
Name
葛扬涛
Name (Pinyin)
GE Yangtao
Student ID
12132254
Degree Type
Master's
Degree Discipline
0801Z1 Intelligent Manufacturing and Robotics
Discipline Category
08 Engineering
Supervisor
陈义明
Supervisor's Department
Department of Mechanical and Energy Engineering
Thesis Defense Date
2024-05-10
Thesis Submission Date
2024-06-27
Degree-Granting Institution
Southern University of Science and Technology
Degree Conferral Location
Shenzhen
Abstract

Joint multi-sensor calibration is the prerequisite for a construction robot to exploit the spatially and temporally complementary information of multiple heterogeneous sensors and, through joint optimization, obtain accurate and robust state estimates. This thesis studies a construction quality-inspection robot platform operating in complex environments. On the one hand, the working environments of such robots are complex and varied, and establishing the optimized relationships between sensors often requires substantial manual assistance; on the other hand, building environments are structured and low in texture, and cannot provide effective reference features for cameras and LiDARs. Both factors make calibrating the robot's multi-sensor system challenging, so the main research content of this thesis is as follows:

(1) To meet the construction quality-inspection robot's need to fuse multiple heterogeneous sensors, and to resolve the inaccurate data fusion caused by the lack of time synchronization between sensors, this thesis builds a sensor system comprising a LiDAR, cameras, an IMU, an RTK module, and other sensors. For all sensors other than the structured-light camera, the GNSS receiver on the RTK module is used to align every sensor to the GPS clock, achieving time synchronization in hardware.
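To illustrate the time-synchronization principle described in (1), the Python sketch below relates a sensor's local clock to GPS time by estimating clock drift and offset from paired timestamps (for instance, captured at PPS pulses of a GNSS receiver). This is a minimal sketch under an assumed linear clock model with illustrative names, not the hardware scheme actually used in the thesis.

```python
import numpy as np

def fit_clock_model(local_t: np.ndarray, gps_t: np.ndarray):
    """Least-squares fit of gps_t ~ a * local_t + b,
    where a captures clock drift and b the constant offset."""
    A = np.vstack([local_t, np.ones_like(local_t)]).T
    (a, b), *_ = np.linalg.lstsq(A, gps_t, rcond=None)
    return a, b

def to_gps_time(local_t, a, b):
    """Map a local sensor timestamp onto the GPS time base."""
    return a * local_t + b

# Example: a sensor clock running 50 ppm fast with a 0.3 s offset.
local = np.arange(0.0, 10.0, 1.0)
gps = local * (1.0 + 50e-6) + 0.3
a, b = fit_clock_model(local, gps)
print(to_gps_time(5.0, a, b))  # ~ 5.30025 s in GPS time
```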

(2) To address the special imaging conditions of the structured-light camera, and the inability of existing calibration methods to reduce manual intervention while maintaining calibration accuracy, this thesis proposes a structured-light camera-LiDAR calibration algorithm based on a custom hemispherical calibration board. The method uses the geometric constraints of the board to further refine the indirectly acquired feature correspondences, giving the calibration stronger resistance to disturbances. Moreover, thanks to the board's special design, the method requires only a single board pose to complete the calibration.
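The abstract does not spell out how the hemispherical board is located in the LiDAR frame. A standard building block for spherical targets of this kind is a least-squares sphere fit that recovers the board's center (and checks its known radius) from the points falling on it; the Python sketch below shows this generic step. It is offered as an assumed illustration, not the thesis's actual pipeline.

```python
import numpy as np

def fit_sphere(points: np.ndarray):
    """Algebraic least-squares sphere fit.
    Rewrites |p - c|^2 = r^2 as the linear system
    2 p.c + (r^2 - |c|^2) = |p|^2 and solves for c and r.
    points: (N, 3) LiDAR points lying on the target."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Synthetic check: points on a hemisphere of radius 0.3 m centered at (1, 2, 0.5).
rng = np.random.default_rng(0)
d = rng.normal(size=(500, 3))
d[:, 2] = np.abs(d[:, 2])                       # keep the upper hemisphere only
d /= np.linalg.norm(d, axis=1, keepdims=True)   # unit directions
pts = np.array([1.0, 2.0, 0.5]) + 0.3 * d
center, radius = fit_sphere(pts)
print(center, radius)  # ~ [1. 2. 0.5] and 0.3
```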

(3) To cope with the rigid displacements that inevitably occur among a robot's sensors, and to further reduce the manual assistance required for calibration, this thesis proposes an online camera-IMU-LiDAR calibration algorithm based on the geometric information of the environment, improving efficiency through a two-step coarse-then-fine procedure. Each sensor first estimates its own pose from its observations, and initial extrinsics are obtained via the hand-eye calibration principle; correspondences between image edge features and boundary point clouds are then established and optimized to obtain more accurate extrinsic results.
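For context on the hand-eye step in (3): the classical formulation solves AX = XB, where A_i and B_i are the relative motions estimated independently by two sensors and X is the unknown extrinsic transform between them. The Python sketch below uses a common textbook solution (rotation by aligning rotation axes with an SVD, then translation by linear least squares); it is a hedged illustration of the principle, not the thesis's exact algorithm, and assumes NumPy and SciPy.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def hand_eye(As, Bs):
    """Solve AX = XB for X given motion pairs (A_i, B_i) as 4x4 matrices.
    Rotation: align the rotation vectors of A_i and B_i (Kabsch/SVD).
    Translation: stack (R_Ai - I) t = Rx t_Bi - t_Ai, solve by least squares."""
    alphas = np.array([R.from_matrix(A[:3, :3]).as_rotvec() for A in As])
    betas = np.array([R.from_matrix(B[:3, :3]).as_rotvec() for B in Bs])
    H = betas.T @ alphas                      # correlation of paired axes
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    Rx = Vt.T @ D @ U.T                       # closest proper rotation
    M = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    v = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t, *_ = np.linalg.lstsq(M, v, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, t
    return X

# Synthetic check: build consistent pairs A_i = X B_i X^{-1} and recover X.
rng = np.random.default_rng(1)
X_true = np.eye(4)
X_true[:3, :3] = R.from_rotvec([0.3, -0.2, 0.5]).as_matrix()
X_true[:3, 3] = [0.10, -0.20, 0.05]
As, Bs = [], []
for _ in range(10):
    B = np.eye(4)
    B[:3, :3] = R.from_rotvec(rng.normal(size=3)).as_matrix()
    B[:3, 3] = rng.normal(scale=0.5, size=3)
    Bs.append(B)
    As.append(X_true @ B @ np.linalg.inv(X_true))
print(np.allclose(hand_eye(As, Bs), X_true, atol=1e-6))  # True
```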

In summary, to meet the application requirement that construction quality-inspection robots operate intelligently and autonomously, this thesis builds a sensor system containing multiple heterogeneous sensors, adopts different time-synchronization strategies for the different sensors, and proposes structured-light camera-LiDAR and camera-IMU-LiDAR calibration methods to accomplish the spatial synchronization of the multi-sensor system.

Keywords
Language
Chinese
Training Category
Independently Trained
Year of Enrollment
2021
Year Degree Conferred
2024-07

Degree Assessment Subcommittee
Mechanics
Chinese Library Classification
TP242
Source Repository
Manual Submission
Document Type
Thesis
Item Identifier
http://sustech.caswiz.com/handle/2SGJ60CL/766219
Collection
College of Engineering_Department of Mechanical and Energy Engineering
Recommended Citation (GB/T 7714)
葛扬涛. 针对复杂建筑环境的多传感系统联合标定算法研究[D]. 深圳: 南方科技大学, 2024.
Files in This Item
12132254-葛扬涛-机械与能源工程 (10042 KB), restricted access
