Title
相机-激光雷达联合标定研究 (Research on Joint Calibration of Camera and LiDAR)
Alternative Title
RESEARCH ON JOINT CALIBRATION OF CAMERA AND LIDAR
Name
荀智
Name (Pinyin)
XUN Zhi
Student ID
12132587
Degree Type
Master
Degree Discipline
0809 Electronic Science and Technology
Discipline Category / Professional Degree Category
08 Engineering
Supervisor
李慧云 (Li Huiyun)
Supervisor's Affiliation
深圳理工大学(筹) (Shenzhen University of Advanced Technology, in preparation)
Thesis Defense Date
2024-05-11
Thesis Submission Date
2024-07-04
Degree-Granting Institution
南方科技大学 (Southern University of Science and Technology)
Degree-Granting Place
Shenzhen
Abstract

In autonomous driving systems, multi-sensor fusion is an effective way to achieve accurate environment perception and to improve the accuracy and robustness of the perception system by integrating data from different sensors. Among the many available sensors, the camera and the LiDAR have become core devices for autonomous driving because their information-acquisition capabilities are complementary. The camera provides rich texture information but lacks precise spatial information and is sensitive to illumination. The LiDAR provides accurate three-dimensional environmental information, but the sparsity of its point clouds makes it difficult to extract semantic information for distant targets. Fusing the data of the two sensors can effectively compensate for the shortcomings of either sensor alone, and obtaining the intrinsic parameters of each sensor and the extrinsic parameters between them is the prerequisite and key to accurate data fusion. Therefore, how to determine the intrinsic and extrinsic parameters of the camera and the LiDAR accurately and conveniently through calibration is the core problem to be solved, and it has significant research value.

To address the accumulated error, the strong influence of point-cloud noise, and the low degree of automation in existing camera-LiDAR joint calibration, this thesis proposes a calibration method based on a joint calibration target, aiming to reduce the influence of accumulated error and point-cloud noise on the calibration result. The main work is as follows:

1) A joint calibration board is used for camera-LiDAR calibration. The board consists of a chessboard pattern and four circular holes of identical size. The data collected with this board simultaneously contain the constraints required by both the camera and the LiDAR, so the intrinsic and extrinsic parameters are calibrated from a single data acquisition. This avoids the two separate acquisitions required by traditional calibration procedures and thereby reduces the accumulated error e1 introduced when the relative pose of the sensors changes during data collection (the camera-side chessboard step is sketched after this list).

2) A circle-center extraction and optimization algorithm is designed that accurately extracts the center coordinates of the four circular holes on the calibration board. Prior physical knowledge of the board is used to create a template point cloud that provides constraints and initial circle-center coordinates. The PnP (Perspective-n-Point) algorithm is then used to match the pose of the target point cloud extracted from the collected data against the template point cloud, and the optimal circle-center coordinates are solved iteratively. Finally, the reprojection error between the circle centers in the image and in the point cloud is minimized to obtain the optimized intrinsic and extrinsic parameters. Compared with traditional point-cloud fitting methods, this algorithm effectively reduces the influence of the measurement error e2 introduced by point-cloud sparsity and improves the accuracy of the extrinsic calibration (a sketch of the PnP and reprojection-error step also follows this list).
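
A minimal sketch of the camera-side chessboard step mentioned in 1), assuming OpenCV, an assumed board layout (7x5 inner corners, 0.08 m squares), and a hypothetical image folder; it illustrates standard chessboard-based intrinsic calibration, not the thesis implementation:

```python
# Sketch only: camera intrinsics from the chessboard pattern on the joint board.
# PATTERN, SQUARE, and the image folder are assumed values.
import glob
import cv2
import numpy as np

PATTERN = (7, 5)   # inner corners per row/column (assumed)
SQUARE = 0.08      # square edge length in metres (assumed)

# 3D chessboard corner coordinates in the board frame (Z = 0 plane).
board_pts = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
board_pts[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_points, img_points, img_size = [], [], None
for path in glob.glob("calib_images/*.png"):   # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(board_pts)
    img_points.append(corners)

# Intrinsic matrix K and distortion coefficients estimated from all detections.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, img_size, None, None)
print("intrinsic reprojection RMS (px):", rms)
```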

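For 2), the final extrinsic step can be sketched as follows: given the four circle-hole centers extracted from the LiDAR point cloud (3D) and detected in the image (2D), estimate the camera-LiDAR extrinsics with PnP and evaluate the reprojection error that the optimization minimizes. This is an illustrative sketch under assumed inputs (the function name, the intrinsics K and dist, and the center arrays are hypothetical), not the thesis code; the template matching and iterative center refinement described above are omitted.

```python
# Sketch only: extrinsics from four circle centres via PnP, plus the
# circle-centre reprojection error that the final optimisation minimises.
import cv2
import numpy as np

def extrinsics_from_circle_centers(centers_lidar, centers_image, K, dist):
    """centers_lidar: (4, 3) circle centres in the LiDAR frame;
    centers_image: (4, 2) pixel coordinates; K, dist: camera intrinsics."""
    obj = np.asarray(centers_lidar, dtype=np.float64).reshape(-1, 1, 3)
    img = np.asarray(centers_image, dtype=np.float64).reshape(-1, 1, 2)

    # Rotation/translation mapping LiDAR-frame points into the camera frame.
    ok, rvec, tvec = cv2.solvePnP(obj, img, K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP failed")

    # Mean reprojection error of the circle centres (pixels).
    proj, _ = cv2.projectPoints(obj, rvec, tvec, K, dist)
    err = np.linalg.norm(proj.reshape(-1, 2) - img.reshape(-1, 2), axis=1).mean()

    R, _ = cv2.Rodrigues(rvec)   # 3x3 rotation matrix from the rotation vector
    return R, tvec.reshape(3), err
```

In practice, circle centers from several board placements would be stacked before solving, and the resulting residual would drive the joint refinement of the intrinsic and extrinsic parameters described in 2).
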
Experimental results show that the proposed camera-LiDAR joint calibration method simplifies the data acquisition process and reduces the reprojection error of the calibration result, improving calibration accuracy; it is an accurate and robust calibration method.

Keywords
Language
Chinese
Training Category
Independent
Year of Enrollment
2021
Year of Degree Conferral
2024-06

Degree Evaluation Subcommittee
Electronic Science and Technology
Chinese Library Classification Number
TN957.5
Source
Manual submission
Document Type
Degree thesis
Identifier
http://sustech.caswiz.com/handle/2SGJ60CL/778944
Collection
Joint training program of the Chinese Academy of Sciences and Shenzhen University of Advanced Technology (in preparation)
Recommended Citation (GB/T 7714)
荀智. 相机-激光雷达联合标定研究[D]. 深圳: 南方科技大学, 2024.