Title

FUSING LIDAR AND CAMERA: A RELIABLE AND ACCURATE SLAM

Alternative Title (Chinese)
基于激光雷达和相机融合的可靠且准确的 SLAM 系统
Name
黄旭
Name (Pinyin)
HUANG Xu
Student ID
12032856
Degree Type
Master's
Major
080902 Circuits and Systems
Discipline Category / Professional Degree Category
08 Engineering
Supervisor
洪小平
Supervisor's Affiliation
School of System Design and Intelligent Manufacturing
Thesis Defense Date
2023-05-17
Thesis Submission Date
2023-06-27
Degree-Granting Institution
Southern University of Science and Technology
Degree-Granting Location
Shenzhen
Abstract

Simultaneous localization and mapping (SLAM) is crucial to autonomous driving, because autonomous vehicles rely on high-definition maps produced by SLAM to obtain prior information about the environment and make reliable decisions. To achieve accurate and reliable estimation in SLAM, many sensors and algorithms have been introduced over the decades. The study of sensor-fusion-based SLAM is of great importance: different sensors have their own strengths and weaknesses in perceiving the environment, and multi-modal sensor fusion combines their complementary strengths, enabling accurate modeling of the surrounding environment under varied conditions. Tightly coupled sensor fusion has also been a hot research topic in recent years because it achieves more accurate estimation. However, when multiple sensors are used at the same time, deciding how much to trust the data from each sensor becomes a major problem in sensor fusion research.
This thesis proposes an uncertainty-aware sensor fusion SLAM method to address this problem of sensor fusion uncertainty. The proposed method accurately evaluates the measurement noise of the sensors used in SLAM and weights the contribution of each sensor accordingly, thus providing more reliable and accurate environment modeling. At the same time, the sensor measurement noise yields an uncertainty estimate of the environment model, which serves as a reliable self-assessment tool for SLAM quality.
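As a minimal illustration of this weighting idea (the notation below is ours and not taken from the thesis): if each sensor s contributes a residual r_s(x) with measurement covariance Σ_s, an uncertainty-aware fusion weights every residual by its inverse covariance, and the inverse of the accumulated information matrix provides the uncertainty estimate used for self-assessment:

\[
\hat{x} \;=\; \arg\min_{x} \sum_{s \,\in\, \{\mathrm{cam},\,\mathrm{lidar},\,\mathrm{imu}\}} r_s(x)^{\top} \Sigma_s^{-1}\, r_s(x),
\qquad
\Sigma_{\hat{x}} \;\approx\; \Big( \sum_{s} J_s^{\top} \Sigma_s^{-1} J_s \Big)^{-1},
\qquad
J_s = \frac{\partial r_s}{\partial x}\bigg|_{\hat{x}}.
\]

A sensor whose measurements are noisier (larger Σ_s) therefore contributes less to the estimate; this is one standard way to realize the noise-based weighting described above, not necessarily the exact formulation used in the thesis.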
To this end, this thesis explores probability-based sensor fusion and uncertainty characterization, using the measurement models and noise characteristics of several sensors, such as cameras, LiDARs, and IMUs, to achieve complementary sensor strengths. We also study the form and propagation mechanism of uncertainty in SLAM, such as how landmark and robot-pose uncertainties propagate across different coordinate frames, so that sensor uncertainty is carried step by step through the SLAM pipeline. Combined with a sliding window filter, we implement uncertainty-aware nonlinear optimization that bounds the size of the optimization problem while retaining historical prior information.
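A minimal numpy sketch of one such propagation step is given below. It assumes a right-perturbation pose parameterization (rotation block first, then translation) and independence between landmark measurement noise and pose noise; the function and variable names are illustrative and not taken from the thesis:

    import numpy as np

    def skew(v):
        """Skew-symmetric matrix [v]_x such that skew(v) @ w == np.cross(v, w)."""
        return np.array([[0.0, -v[2], v[1]],
                         [v[2], 0.0, -v[0]],
                         [-v[1], v[0], 0.0]])

    def propagate_landmark_cov(R, t, p_b, cov_p_b, cov_pose):
        """First-order propagation of a landmark's covariance from the body
        (sensor) frame to the world frame through an uncertain pose.

        R, t     : world-from-body rotation (3x3) and translation (3,)
        p_b      : landmark position in the body frame (3,)
        cov_p_b  : 3x3 covariance of the landmark measurement
        cov_pose : 6x6 pose covariance ordered [rotation, translation],
                   for the right perturbation R @ exp(skew(dtheta)), t + dt
        """
        p_w = R @ p_b + t
        J_point = R                      # d p_w / d p_b
        J_theta = -R @ skew(p_b)         # d p_w / d dtheta
        J_trans = np.eye(3)              # d p_w / d dt
        J_pose = np.hstack([J_theta, J_trans])          # 3x6
        cov_p_w = (J_point @ cov_p_b @ J_point.T        # measurement part
                   + J_pose @ cov_pose @ J_pose.T)      # pose part
        return p_w, cov_p_w

    # Example: a landmark 10 m ahead, 5 cm measurement noise, small pose noise.
    R, t = np.eye(3), np.zeros(3)
    p_b = np.array([10.0, 0.0, 0.0])
    cov_p_b = (0.05 ** 2) * np.eye(3)
    cov_pose = np.diag([1e-4] * 3 + [1e-2] * 3)         # rad^2 and m^2
    p_w, cov_p_w = propagate_landmark_cov(R, t, p_b, cov_p_b, cov_pose)

In this example the 0.01 rad rotational standard deviation alone induces about 0.1 m of lateral uncertainty at 10 m range, already larger than the 5 cm measurement noise, which illustrates why carrying pose uncertainty into the map matters for distant landmarks.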
We evaluated the proposed SLAM system on a dataset collected at Southern University of Science and Technology. The experimental results show that uncertainty-aware nonlinear optimization enables the SLAM system to achieve more accurate estimation than comparable SLAM systems. The provided uncertainty estimate also shows a significant correlation with the actual error, making it a reliable self-assessment tool for SLAM quality.

Alternative Abstract (Chinese)

同步定位和建图(SLAM)在自动驾驶中至关重要,因为自动驾驶需要通过 SLAM 得到的高精地图来实现汽车对环境先验信息的获取,以便作出可靠的决策。为了在 SLAM 中实现准确和可靠的估计,几十年来,许多传感器和算法被引入 SLAM 中。研究基于传感器融合的 SLAM 有着非常重要的意义,因为不同的传感器在探测环境时都有各自的长处和短处,而基于多模态的传感器融合就可以实现多种传感器的优势互补,从而实现在不同条件下对周围环境的准确建模。而基于紧耦合的传感器融合因为可以实现更准确的估计,所以也是近些年来的研究热点。可是,当有多个传感器同时存在时,如何去相信不同传感器的数据,也成了困扰传感器融合研究的一大难题。

本论文提出了一种基于不确定性感知的传感器融合 SLAM 方法,旨在解决 SLAM 中传感器融合的不确定性问题。该方法准确评估用于 SLAM 的传感器测量噪声,并根据权重决定不同传感器的使用,从而提供了更可靠、准确的环境建模。同时,通过传感器测量噪声,我们可以得到对环境建模的不确定性估计,实现可靠的自我评估 SLAM 质量的工具。

针对此问题,本文探索了基于概率的传感器融合和不确定性表征的方法,利用多种不同传感器的不确定性和测量模型(如相机、LiDAR、IMU 等),以实现传感器优势互补。我们还研究了 SLAM 中不确定性的形式及其传播机制,如路标点和机器人的不确定性在不同坐标系下的传播,从而实现了传感器的不确定性在 SLAM 中的逐步传递。结合滑动窗口滤波器,我们实现了基于不确定性感知的非线性优化,可以在限制优化规模的同时保留历史先验信息。

我们在南方科技大学收集的数据上对本文所提出的 SLAM 系统进行了实验验证。实验结果证明,基于不确定性感知的非线性优化使得该 SLAM 系统实现了比同类其他 SLAM 系统更准确的估计,同时提供的不确定性估计也和真实的误差表现出了较大的相关性,使其可以成为可靠的自我评估 SLAM 质量的工具。

Keywords
Other Keywords
Language
English
Training Category
Independent training
Year of Enrollment
2020
Year Degree Awarded
2023-06
Reference List

[1] GAO X, ZHANG T, LIU Y, et al. Introduction to Visual SLAM: From Theory to Practice[M/OL]. Publishing House of Electronics Industry, 2021. DOI: 10.1007/978-981-16-4939-4.
[2] SEGAL A, HAEHNEL D, THRUN S. Generalized-ICP[C/OL]//Proceedings of Robotics: Science and Systems. Seattle, USA, 2009. DOI: 10.15607/RSS.2009.V.021.
[3] ZAFFAR M, EHSAN S, STOLKIN R, et al. Sensors, SLAM and Long-term Autonomy: A Review[C/OL]//2018 NASA/ESA Conference on Adaptive Hardware and Systems (AHS). 2018:285-290. DOI: 10.1109/AHS.2018.8541483.
[4] ELFES A. Sonar-based real-world mapping and navigation[J/OL]. IEEE Journal on Robotics and Automation, 1987, 3(3): 249-265. DOI: 10.1109/JRA.1987.1087096.
[5] SMITH R, SELF M, CHEESEMAN P. Estimating uncertain spatial relationships in robotics[C/OL]//Proceedings. 1987 IEEE International Conference on Robotics and Automation: volume 4. 1987: 850-850. DOI: 10.1109/ROBOT.1987.1087846.
[6] MOURIKIS A, ROUMELIOTIS S. Analysis of positioning uncertainty in simultaneous localization and mapping (SLAM)[C/OL]//2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566): volume 1. 2004: 13-20 vol.1. DOI: 10.1109/IROS.2004.1389322.
[7] HUANG S, DISSANAYAKE G. Convergence and Consistency Analysis for Extended Kalman Filter Based SLAM[J/OL]. IEEE Transactions on Robotics, 2007, 23(5): 1036-1049. DOI: 10.1109/TRO.2007.903811.
[8] DAVISON A J, REID I D, MOLTON N D, et al. MonoSLAM: Real-Time Single Camera SLAM[J/OL]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(6):1052-1067. DOI: 10.1109/TPAMI.2007.1049.
[9] KLEIN G, MURRAY D. Parallel Tracking and Mapping for Small AR Workspaces[C/OL]//2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality. 2007:225-234. DOI: 10.1109/ISMAR.2007.4538852.
[10] MUR-ARTAL R, MONTIEL J M M, TARDÓS J D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System[J/OL]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163. DOI: 10.1109/TRO.2015.2463671.
[11] MUR-ARTAL R, TARDÓS J D. ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras[J/OL]. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262. DOI: 10.1109/TRO.2017.2705103.
[12] ENGEL J, KOLTUN V, CREMERS D. Direct Sparse Odometry[J/OL]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018, 40(3): 611-625. DOI: 10.1109/TPAMI.2017.2658577.
[13] ENGEL J, SCHÖPS T, CREMERS D. LSD-SLAM: Large-Scale Direct Monocular SLAM[C/OL]//Computer Vision – ECCV 2014. Cham: Springer International Publishing, 2014: 834-849. DOI: 10.1007/978-3-319-10605-2_54.
[14] FORSTER C, PIZZOLI M, SCARAMUZZA D. SVO: Fast semi-direct monocular visual odometry[C/OL]//2014 IEEE International Conference on Robotics and Automation (ICRA). 2014:15-22. DOI: 10.1109/ICRA.2014.6906584.
[15] FORSTER C, ZHANG Z, GASSNER M, et al. SVO: Semidirect Visual Odometry for Monocular and Multicamera Systems[J/OL]. IEEE Transactions on Robotics, 2017, 33(2): 249-265. DOI: 10.1109/TRO.2016.2623335.
[16] GRISETTI G, STACHNISS C, BURGARD W. Improved Techniques for Grid Mapping With Rao-Blackwellized Particle Filters[J/OL]. IEEE Transactions on Robotics, 2007, 23(1): 34-46. DOI: 10.1109/TRO.2006.889486.
[17] KOHLBRECHER S, VON STRYK O, MEYER J, et al. A flexible and scalable SLAM system with full 3D motion estimation[C/OL]//2011 IEEE International Symposium on Safety, Security, and Rescue Robotics. 2011: 155-160. DOI: 10.1109/SSRR.2011.6106777.
[18] KartoSLAM[EB/OL]. http://www.ros.org/wiki/karto.
[19] CoreSLAM[EB/OL]. http://www.ros.org/wiki/coreslam.
[20] LagoSLAM[EB/OL]. https://github.com/rrg-polito/rrg-polito-ros-pkg.
[21] ZHANG J, SINGH S. LOAM: Lidar Odometry and Mapping in Real-time[C/OL]//Proceedings of Robotics: Science and Systems. Berkeley, USA, 2014. DOI: 10.15607/RSS.2014.X.007.
[22] SHAN T, ENGLOT B. LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain[C/OL]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 2018: 4758-4765. DOI: 10.1109/IROS.2018.8594299.
[23] LIU Z, ZHANG F. BALM: Bundle Adjustment for Lidar Mapping[J/OL]. IEEE Robotics and Automation Letters, 2021, 6(2): 3184-3191. DOI: 10.1109/LRA.2021.3062815.
[24] LIU Z, ZHANG F, HONG X. Low-Cost Retina-Like Robotic Lidars Based on Incommensurable Scanning[J/OL]. IEEE/ASME Transactions on Mechatronics, 2022, 27(1): 58-68. DOI: 10.1109/TMECH.2021.3058173.
[25] LIN J, ZHANG F. Loam livox: A fast, robust, high-precision LiDAR odometry and mapping package for LiDARs of small FoV[C/OL]//2020 IEEE International Conference on Robotics and Automation (ICRA). 2020: 3126-3131. DOI: 10.1109/ICRA40945.2020.9197440.
[26] LEUTENEGGER S, FURGALE P, RABAUD V, et al. Keyframe-Based Visual-Inertial SLAM using Nonlinear Optimization[C/OL]//Proceedings of Robotics: Science and Systems. Berlin, Germany, 2013. DOI: 10.15607/RSS.2013.IX.037.
[27] LEUTENEGGER S, LYNEN S, BOSSE M, et al. Keyframe-Based Visual-Inertial Odometry Using Nonlinear Optimization[J/OL]. The International Journal of Robotics Research, 2015, 34(3): 314-334. DOI: 10.1177/0278364914554813.
[28] FORSTER C, CARLONE L, DELLAERT F, et al. On-Manifold Preintegration for Real-Time Visual–Inertial Odometry[J/OL]. IEEE Transactions on Robotics, 2017, 33(1): 1-21. DOI: 10.1109/TRO.2016.2597321.
[29] QIN T, LI P, SHEN S. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator[J/OL]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020. DOI: 10.1109/TRO.2018.2853729.
[30] CAMPOS C, ELVIRA R, RODRÍGUEZ J J G, et al. ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual–Inertial, and Multimap SLAM[J/OL]. IEEE Transactions on Robotics, 2021, 37(6): 1874-1890. DOI: 10.1109/TRO.2021.3075644.
[31] GENTIL C L, VIDAL-CALLEJA T, HUANG S. IN2LAMA: INertial Lidar Localisation And MApping[C/OL]//2019 International Conference on Robotics and Automation (ICRA). IEEE Press, 2019: 6388–6394. DOI: 10.1109/ICRA.2019.8794429.
[32] XU W, ZHANG F. FAST-LIO: A Fast, Robust LiDAR-Inertial Odometry Package by Tightly-Coupled Iterated Kalman Filter[J/OL]. IEEE Robotics and Automation Letters, 2021, 6(2):3317-3324. DOI: 10.1109/LRA.2021.3064227.
[33] XU W, CAI Y, HE D, et al. FAST-LIO2: Fast Direct LiDAR-Inertial Odometry[J/OL]. IEEE Transactions on Robotics, 2022, 38(4): 2053-2073. DOI: 10.1109/TRO.2022.3141876.
[34] ZHANG J, SINGH S. Visual-lidar odometry and mapping: low-drift, robust, and fast[C/OL]//2015 IEEE International Conference on Robotics and Automation (ICRA). 2015: 2174-2181. DOI: 10.1109/ICRA.2015.7139486.
[35] ZHANG J, SINGH S. Laser–visual–inertial odometry and mapping with high robustness and low drift[J/OL]. Journal of Field Robotics, 2018, 35(8): 1242-1264. DOI: 10.1002/rob.21809.
[36] ZHU Y, ZHENG C, YUAN C, et al. CamVox: A Low-cost and Accurate Lidar-assisted Visual SLAM System[C/OL]//2021 IEEE International Conference on Robotics and Automation(ICRA). 2021: 5049-5055. DOI: 10.1109/ICRA48506.2021.9561149.
[37] What Is Lidar-Camera Calibration[EB/OL]. https://www.mathworks.com/help/lidar/ug/lidar-camera-calibration.html.
[38] CANNY J. A Computational Approach to Edge Detection[J/OL]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1986, PAMI-8(6): 679-698. DOI: 10.1109/TPAMI.1986.4767851.
[39] HUANG G P, MOURIKIS A I, ROUMELIOTIS S I. A First-Estimates Jacobian EKF for Improving SLAM Consistency[C/OL]//KHATIB O, KUMAR V, PAPPAS G J. Experimental Robotics. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009: 373-382. DOI: 10.1007/978-3-642-00196-3_43.
[40] ECKENHOFF K, PAULL L, HUANG G. Decoupled, consistent node removal and edge sparsification for graph-based SLAM[C/OL]//2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 2016: 3275-3282. DOI: 10.1109/IROS.2016.7759505.
[41] BARFOOT T D, FURGALE P T. Associating Uncertainty With Three-Dimensional Poses for Use in Estimation Problems[J/OL]. IEEE Transactions on Robotics, 2014, 30(3): 679-693. DOI: 10.1109/TRO.2014.2298059.
[42] BARFOOT T D. Introduction[M/OL]. Cambridge University Press, 2017: 1-6. DOI: 10.1017/9781316671528.002.
[43] MANGELSON J G, GHAFFARI M, VASUDEVAN R, et al. Characterizing the Uncertainty of Jointly Distributed Poses in the Lie Algebra[J/OL]. IEEE Transactions on Robotics, 2020, 36(5): 1371-1388. DOI: 10.1109/TRO.2020.2994457.
[44] SIBLEY G, MATTHIES L, SUKHATME G. Sliding window filter with application to planetary landing[J/OL]. Journal of Field Robotics, 2010, 27(5): 587-608. DOI: 10.1002/rob.20360.
[45] GRUPP M. evo: Python package for the evaluation of odometry and SLAM[EB/OL]. 2017. https://github.com/MichaelGrupp/evo.

Degree Assessment Subcommittee
Electronic Science and Technology
CLC Number
TP391.4
Source Repository
Manual submission
Document Type: Thesis
Identifier: http://sustech.caswiz.com/handle/2SGJ60CL/544160
Collection: College of Engineering_School of System Design and Intelligent Manufacturing
Recommended Citation
GB/T 7714
Huang X. FUSING LIDAR AND CAMERA: A RELIABLE AND ACCURATE SLAM[D]. 深圳: 南方科技大学, 2023.