[1] GAO X, ZHANG T, LIU Y, et al. Introduction to Visual SLAM: From Theory to Practice[M/OL]. Publishing House of Electronics Industry, 2021. DOI: 10.1007/978-981-16-4939-4.
[2] SEGAL A, HAEHNEL D, THRUN S. Generalized-ICP[C/OL]//Proceedings of Robotics: Science and Systems. Seattle, USA, 2009. DOI: 10.15607/RSS.2009.V.021.
[3] ZAFFAR M, EHSAN S, STOLKIN R, et al. Sensors, SLAM and Long-term Autonomy: A Review[C/OL]//2018 NASA/ESA Conference on Adaptive Hardware and Systems (AHS). 2018: 285-290. DOI: 10.1109/AHS.2018.8541483.
[4] ELFES A. Sonar-based real-world mapping and navigation[J/OL]. IEEE Journal on Robotics and Automation, 1987, 3(3): 249-265. DOI: 10.1109/JRA.1987.1087096.
[5] SMITH R, SELF M, CHEESEMAN P. Estimating uncertain spatial relationships in robotics[C/OL]//Proceedings. 1987 IEEE International Conference on Robotics and Automation: volume 4. 1987: 850-850. DOI: 10.1109/ROBOT.1987.1087846.
[6] MOURIKIS A, ROUMELIOTIS S. Analysis of positioning uncertainty in simultaneous localization and mapping (SLAM)[C/OL]//2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566): volume 1. 2004: 13-20 vol.1. DOI: 10.1109/IROS.2004.1389322.
[7] HUANG S, DISSANAYAKE G. Convergence and Consistency Analysis for Extended Kalman Filter Based SLAM[J/OL]. IEEE Transactions on Robotics, 2007, 23(5): 1036-1049. DOI: 10.1109/TRO.2007.903811.
[8] DAVISON A J, REID I D, MOLTON N D, et al. MonoSLAM: Real-Time Single Camera SLAM[J/OL]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(6): 1052-1067. DOI: 10.1109/TPAMI.2007.1049.
[9] KLEIN G, MURRAY D. Parallel Tracking and Mapping for Small AR Workspaces[C/OL]//2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality. 2007: 225-234. DOI: 10.1109/ISMAR.2007.4538852.
[10] MUR-ARTAL R, MONTIEL J M M, TARDÓS J D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System[J/OL]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163. DOI: 10.1109/TRO.2015.2463671.
[11] MUR-ARTAL R, TARDÓS J D. ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras[J/OL]. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262. DOI: 10.1109/TRO.2017.2705103.
[12] ENGEL J, KOLTUN V, CREMERS D. Direct Sparse Odometry[J/OL]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018, 40(3): 611-625. DOI: 10.1109/TPAMI.2017.2658577.
[13] ENGEL J, SCHÖPS T, CREMERS D. LSD-SLAM: Large-Scale Direct Monocular SLAM[C/OL]//Computer Vision – ECCV 2014. Cham: Springer International Publishing, 2014: 834-849. DOI: 10.1007/978-3-319-10605-2_54.
[14] FORSTER C, PIZZOLI M, SCARAMUZZA D. SVO: Fast semi-direct monocular visual odometry[C/OL]//2014 IEEE International Conference on Robotics and Automation (ICRA). 2014:15-22. DOI: 10.1109/ICRA.2014.6906584.
[15] FORSTER C, ZHANG Z, GASSNER M, et al. SVO: Semidirect Visual Odometry for Monocular and Multicamera Systems[J/OL]. IEEE Transactions on Robotics, 2017, 33(2): 249-265. DOI: 10.1109/TRO.2016.2623335.
[16] GRISETTI G, STACHNISS C, BURGARD W. Improved Techniques for Grid Mapping With Rao-Blackwellized Particle Filters[J/OL]. IEEE Transactions on Robotics, 2007, 23(1): 34-46. DOI: 10.1109/TRO.2006.889486.
[17] KOHLBRECHER S, VON STRYK O, MEYER J, et al. A flexible and scalable SLAM system with full 3D motion estimation[C/OL]//2011 IEEE International Symposium on Safety, Security, and Rescue Robotics. 2011: 155-160. DOI: 10.1109/SSRR.2011.6106777.
[18] KartoSLAM[EB/OL]. http://www.ros.org/wiki/karto.
[19] CoreSLAM[EB/OL]. http://www.ros.org/wiki/coreslam.
[20] LagoSLAM[EB/OL]. https://github.com/rrg-polito/rrg-polito-ros-pkg.
[21] ZHANG J, SINGH S. LOAM: Lidar Odometry and Mapping in Real-time[C/OL]//Proceedings of Robotics: Science and Systems. Berkeley, USA, 2014. DOI: 10.15607/RSS.2014.X.007.
[22] SHAN T, ENGLOT B. LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain[C/OL]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 2018: 4758-4765. DOI: 10.1109/IROS.2018.8594299.
[23] LIU Z, ZHANG F. BALM: Bundle Adjustment for Lidar Mapping[J/OL]. IEEE Robotics and Automation Letters, 2021, 6(2): 3184-3191. DOI: 10.1109/LRA.2021.3062815.
[24] LIU Z, ZHANG F, HONG X. Low-Cost Retina-Like Robotic Lidars Based on Incommensurable Scanning[J/OL]. IEEE/ASME Transactions on Mechatronics, 2022, 27(1): 58-68. DOI: 10.1109/TMECH.2021.3058173.
[25] LIN J, ZHANG F. Loam livox: A fast, robust, high-precision LiDAR odometry and mapping package for LiDARs of small FoV[C/OL]//2020 IEEE International Conference on Robotics and Automation (ICRA). 2020: 3126-3131. DOI: 10.1109/ICRA40945.2020.9197440.
[26] LEUTENEGGER S, FURGALE P, RABAUD V, et al. Keyframe-Based Visual-Inertial SLAM using Nonlinear Optimization[C/OL]//Proceedings of Robotics: Science and Systems. Berlin, Germany, 2013. DOI: 10.15607/RSS.2013.IX.037.
[27] LEUTENEGGER S, LYNEN S, BOSSE M, et al. Keyframe-Based Visual-Inertial Odometry Using Nonlinear Optimization[J/OL]. The International Journal of Robotics Research, 2015, 34(3): 314-334. DOI: 10.1177/0278364914554813.
[28] FORSTER C, CARLONE L, DELLAERT F, et al. On-Manifold Preintegration for Real-Time Visual–Inertial Odometry[J/OL]. IEEE Transactions on Robotics, 2017, 33(1): 1-21. DOI: 10.1109/TRO.2016.2597321.
[29] QIN T, LI P, SHEN S. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator[J/OL]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020. DOI: 10.1109/TRO.2018.2853729.
[30] CAMPOS C, ELVIRA R, RODRÍGUEZ J J G, et al. ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual–Inertial, and Multimap SLAM[J/OL]. IEEE Transactions on Robotics, 2021, 37(6): 1874-1890. DOI: 10.1109/TRO.2021.3075644.
[31] LE GENTIL C, VIDAL-CALLEJA T, HUANG S. IN2LAMA: INertial Lidar Localisation And MApping[C/OL]//2019 International Conference on Robotics and Automation (ICRA). IEEE Press, 2019: 6388-6394. DOI: 10.1109/ICRA.2019.8794429.
[32] XU W, ZHANG F. FAST-LIO: A Fast, Robust LiDAR-Inertial Odometry Package by Tightly-Coupled Iterated Kalman Filter[J/OL]. IEEE Robotics and Automation Letters, 2021, 6(2): 3317-3324. DOI: 10.1109/LRA.2021.3064227.
[33] XU W, CAI Y, HE D, et al. FAST-LIO2: Fast Direct LiDAR-Inertial Odometry[J/OL]. IEEE Transactions on Robotics, 2022, 38(4): 2053-2073. DOI: 10.1109/TRO.2022.3141876.
[34] ZHANG J, SINGH S. Visual-lidar odometry and mapping: low-drift, robust, and fast[C/OL]//2015 IEEE International Conference on Robotics and Automation (ICRA). 2015: 2174-2181. DOI: 10.1109/ICRA.2015.7139486.
[35] ZHANG J, SINGH S. Laser–visual–inertial odometry and mapping with high robustness and low drift[J/OL]. Journal of Field Robotics, 2018, 35(8): 1242-1264. DOI: 10.1002/rob.21809.
[36] ZHU Y, ZHENG C, YUAN C, et al. CamVox: A Low-cost and Accurate Lidar-assisted Visual SLAM System[C/OL]//2021 IEEE International Conference on Robotics and Automation (ICRA). 2021: 5049-5055. DOI: 10.1109/ICRA48506.2021.9561149.
[37] What Is Lidar-Camera Calibration[EB/OL]. https://www.mathworks.com/help/lidar/ug/lidar-camera-calibration.html.
[38] CANNY J. A Computational Approach to Edge Detection[J/OL]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1986, PAMI-8(6): 679-698. DOI: 10.1109/TPAMI.1986.4767851.
[39] HUANG G P, MOURIKIS A I, ROUMELIOTIS S I. A First-Estimates Jacobian EKF for Improving SLAM Consistency[C/OL]//KHATIB O, KUMAR V, PAPPAS G J. Experimental Robotics. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009: 373-382. DOI: 10.1007/978-3-642-00196-3_43.
[40] ECKENHOFF K, PAULL L, HUANG G. Decoupled, consistent node removal and edge sparsification for graph-based SLAM[C/OL]//2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 2016: 3275-3282. DOI: 10.1109/IROS.2016.7759505.
[41] BARFOOT T D, FURGALE P T. Associating Uncertainty With Three-Dimensional Poses for Use in Estimation Problems[J/OL]. IEEE Transactions on Robotics, 2014, 30(3): 679-693. DOI: 10.1109/TRO.2014.2298059.
[42] BARFOOT T D. Introduction[M/OL]. Cambridge University Press, 2017: 1-6. DOI: 10.1017/9781316671528.002.
[43] MANGELSON J G, GHAFFARI M, VASUDEVAN R, et al. Characterizing the Uncertainty of Jointly Distributed Poses in the Lie Algebra[J/OL]. IEEE Transactions on Robotics, 2020, 36(5): 1371-1388. DOI: 10.1109/TRO.2020.2994457.
[44] SIBLEY G, MATTHIES L, SUKHATME G. Sliding window filter with application to planetary landing[J/OL]. Journal of Field Robotics, 2010, 27(5): 587-608. DOI: 10.1002/rob.20360.
[45] GRUPP M. evo: Python package for the evaluation of odometry and SLAM[EB/OL]. 2017. https://github.com/MichaelGrupp/evo.