[1] 刘政鑫. 博智林: 建筑机器人解决智能建造核心难题[J]. 机器人产业, 2023: 55-57.
[2] 张顺善, 尹华辉, 张吉松. 建筑机器人研究综述[C]//第九届全国 BIM 学术会议论文集. 中国建筑工业出版社, 2023: 268-275.
[3] 于军琪, 曹建福, 雷小康. 建筑机器人研究现状与展望[J]. 自动化博览, 2016(8): 68-75.
[4] KIM D, GOYAL A, NEWELL A, et al. Semantic relation detection between construction entities to support safe human-robot collaboration in construction[C]//ASCE International Conference on Computing in Civil Engineering 2019. American Society of Civil Engineers, Reston, VA, 2019: 265-272.
[5] LIU Y, JEBELLI H. Intention estimation in physical human-robot interaction in construction: Empowering robots to gauge workers' posture[C]//Construction Research Congress 2022. 2022: 621-630.
[6] ZHOU T, ZHU Q, SHI Y, et al. Construction robot teleoperation safeguard based on real-time human hand motion prediction[J]. Journal of Construction Engineering and Management, 2022, 148(7): 04022040.
[7] NAGANO H, TAKENOUCHI H, CAO N, et al. Tactile feedback system of high-frequency vibration signals for supporting delicate teleoperation of construction robots[J]. Advanced Robotics, 2020, 34(11): 730-743.
[8] BRAUN A, TUTTAS S, BORRMANN A, et al. A concept for automated construction progress monitoring using BIM-based geometric constraints and photogrammetric point clouds[J]. Journal of Information Technology in Construction, 2015, 20(5): 68-79.
[9] RAHIMIAN F P, SEYEDZADEH S, OLIVER S, et al. On-demand monitoring of construction projects through a game-like hybrid application of BIM and machine learning[J]. Automation in Construction, 2020, 110: 103012.
[10] IKEDA T, BANDO N, YAMADA H. Semi-automatic visual support system with drone for teleoperated construction robot[J]. Journal of Robotics and Mechatronics, 2021, 33(2): 313-321.
[11] 傅博. 移动车辆的多传感器标定[D]. 浙江大学, 2022.
[12] LIU Y, FU Y, QIN M, et al. BotanicGarden: A high-quality and large-scale robot navigation dataset in challenging natural environments[A]. 2023.
[13] ZHAO S, SINGH D, SUN H, et al. SubT-MRS: A subterranean, multi-robot, multi-spectral and multi-degraded dataset for robust SLAM[A]. 2023.
[14] FENG D, QI Y, ZHONG S, et al. S3E: A large-scale multimodal dataset for collaborative SLAM[A]. 2022.
[15] TSCHOPP F, RINER M, FEHR M, et al. VersaVIS—an open versatile multi-camera visual-inertial sensor suite[J]. Sensors, 2020, 20(5): 1439.
[16] GAO L, LIANG Y, YANG J, et al. VECtor: A versatile event-centric benchmark for multi-sensor SLAM[J]. IEEE Robotics and Automation Letters, 2022, 7(3): 8217-8224.
[17] JIAO J, WEI H, HU T, et al. FusionPortable: A multi-sensor campus-scene dataset for evaluation of localization and mapping accuracy on diverse platforms[C]//2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2022: 3851-3856.
[18] FAIZULLIN M, KORNILOVA A, FERRER G. Open-source LiDAR time synchronization system by mimicking GNSS-clock[C]//2022 IEEE International Symposium on Precision Clock Synchronization for Measurement, Control, and Communication (ISPCS). IEEE, 2022: 1-5.
[19] ZHU Y, ZHENG C, YUAN C, et al. CamVox: A low-cost and accurate LiDAR-assisted visual SLAM system[C]//2021 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2021: 5049-5055.
[20] ZHANG X, ZHU S, GUO S, et al. Line-based automatic extrinsic calibration of LiDAR and camera[C]//2021 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2021: 9347-9353.
[21] ZHAO Y, WANG Y, TSAI Y. 2D-image to 3D-range registration in urban environments via scene categorization and combination of similarity measurements[C]//2016 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2016: 1866-1872.
[22] SCHNEIDER N, PIEWAK F, STILLER C, et al. RegNet: Multimodal sensor registration using deep neural networks[C]//2017 IEEE intelligent vehicles symposium (IV). IEEE, 2017: 1803-1810.
[23] IYER G, RAM R K, MURTHY J K, et al. CalibNet: Geometrically supervised extrinsic calibration using 3D spatial transformer networks[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018: 1110-1117.
[24] LV X, WANG B, DOU Z, et al. LCCNet: LiDAR and camera self-calibration using cost volume network[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2021: 2894-2901.
[25] WANG W, NOBUHARA S, NAKAMURA R, et al. Soic: Semantic online initialization and calibration for LiDAR and camera[A]. 2020.
[26] TSAI R Y, LENZ R K. A new technique for fully autonomous and efficient 3D robotics hand/eye calibration[J]. IEEE Transactions on Robotics and Automation, 1989, 5(3): 345-358.
[27] ISHIKAWA R, OISHI T, IKEUCHI K. LiDAR and camera calibration using motions estimated by sensor fusion odometry[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018: 7342-7349.
[28] GIAMOU M, MA Z, PERETROUKHIN V, et al. Certifiably globally optimal extrinsic calibration from per-sensor egomotion[J]. IEEE Robotics and Automation Letters, 2019, 4(2): 367-374.
[29] HORN M, WODTKO T, BUCHHOLZ M, et al. Online extrinsic calibration based on per-sensor ego-motion using dual quaternions[J]. IEEE Robotics and Automation Letters, 2021, 6(2): 982-989.
[30] LEPETIT V, MORENO-NOGUER F, FUA P. EPnP: An accurate O(n) solution to the PnP problem[J]. International Journal of Computer Vision, 2009, 81: 155-166.
[31] KÜMMERLE R, GRISETTI G, STRASDAT H, et al. g2o: A general framework for graph optimization[C]//2011 IEEE International Conference on Robotics and Automation. IEEE, 2011: 3607-3613.
[32] ZHOU L, LI Z, KAESS M. Automatic extrinsic calibration of a camera and a 3d LiDAR using line and plane correspondences[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018: 5562-5569.
[33] KOO G, KANG J, JANG B, et al. Analytic plane covariances construction for precise planarity based extrinsic calibration of camera and LiDAR[C]//2020 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2020: 6042-6048.
[34] ZHANG Q, PLESS R. Extrinsic calibration of a camera and laser range finder (improves camera calibration)[C]//2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)(IEEE Cat. No. 04CH37566): volume 3. IEEE, 2004: 2301-2306.
[35] UNNIKRISHNAN R, HEBERT M. Fast extrinsic calibration of a laser rangefinder to a camera[R]. Robotics Institute, Pittsburgh, PA, Tech. Rep. CMU-RI-TR-05-09, 2005.
[36] WANG W, SAKURADA K, KAWAGUCHI N. Reflectance intensity assisted automatic and accurate extrinsic calibration of 3D LiDAR and panoramic camera using a printed chessboard[J]. Remote Sensing, 2017, 9(8): 851.
[37] XIE Y, SHAO R, GULI P, et al. Infrastructure based calibration of a multi-camera and multi-LiDAR system using AprilTags[C]//2018 IEEE Intelligent Vehicles Symposium (IV). IEEE, 2018: 605-610.
[38] GEIGER A, MOOSMANN F, CAR Ö, et al. Automatic camera and range sensor calibration using a single shot[C]//2012 IEEE International Conference on Robotics and Automation. IEEE, 2012: 3936-3943.
[39] FANG C, DING S, DONG Z, et al. Single-shot is enough: Panoramic infrastructure based calibration of multiple cameras and 3D LiDARs[C]//2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2021: 8890-8897.
[40] DEBATTISTI S, MAZZEI L, PANCIROLI M. Automated extrinsic laser and camera inter calibration using triangular targets[C]//2013 IEEE Intelligent Vehicles Symposium (IV). IEEE, 2013: 696-701.
[41] PARK Y, YUN S, WON C S, et al. Calibration between color camera and 3D LiDAR instruments with a polygonal planar board[J]. Sensors, 2014, 14(3): 5333-5353.
[42] LIAO Q, CHEN Z, LIU Y, et al. Extrinsic calibration of LiDAR and camera with polygon[C]//2018 IEEE International Conference on Robotics and Biomimetics (ROBIO). IEEE, 2018: 200-205.
[43] DENG Z, XIONG L, YIN D, et al. Joint calibration of dual LiDARs and camera using a circular chessboard[R]. SAE Technical Paper, 2020.
[44] FREMONT V, RODRIGUEZ F S A, BONNIFAIT P. Circular targets for 3d alignment of video and LiDAR sensors[J]. Advanced Robotics, 2012, 26(18): 2087-2113.
[45] PEREIRA M, SILVA D, SANTOS V, et al. Self calibration of multiple LiDARs and cameras on autonomous vehicles[J]. Robotics and Autonomous Systems, 2016, 83: 326-337.
[46] KÜMMERLE J, KÜHNER T, LAUER M. Automatic calibration of multiple cameras and depth sensors with a spherical target[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018: 1-8.
[47] BELTRÁN J, GUINDEL C, DE LA ESCALERA A, et al. Automatic extrinsic calibration method for LiDAR and camera sensor setups[J]. IEEE Transactions on Intelligent Transportation Systems, 2022, 23(10): 17677-17689.
[48] TÓTH T, PUSZTAI Z, HAJDER L. Automatic LiDAR-camera calibration of extrinsic parameters using a spherical target[C]//2020 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2020: 8580-8586.
[49] DHALL A, CHELANI K, RADHAKRISHNAN V, et al. LiDAR-camera calibration using 3D-3D point correspondences[A]. 2017.
[50] MUR-ARTAL R, TARDÓS J D. ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras[J]. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262.
[51] YANG Z, SHEN S. Monocular visual-inertial state estimation with online initialization and camera-IMU extrinsic calibration[J]. IEEE Transactions on Automation Science and Engineering, 2016, 14(1): 39-51.
[52] MIRZAEI F M, ROUMELIOTIS S I. A Kalman filter-based algorithm for IMU-camera calibration: Observability analysis and performance evaluation[J]. IEEE Transactions on Robotics, 2008, 24(5): 1143-1156.
[53] FURGALE P, REHDER J, SIEGWART R. Unified temporal and spatial calibration for multi-sensor systems[C]//2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2013: 1280-1286.
[54] ECKENHOFF K, GENEVA P, BLOECKER J, et al. Multi-camera visual-inertial navigation with online intrinsic and extrinsic calibration[C]//2019 International Conference on Robotics and Automation (ICRA). IEEE, 2019: 3158-3164.
[55] ZUO X, GENEVA P, LEE W, et al. LIC-Fusion: LiDAR-inertial-camera odometry[C]//2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2019: 5848-5854.
[56] ZHI X, HOU J, LU Y, et al. Multical: Spatiotemporal calibration for multiple IMUs, cameras and LiDARs[C]//2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2022: 2446-2453.
[57] REHDER J, BEARDSLEY P, SIEGWART R, et al. Spatio-temporal laser to visual/inertial calibration with applications to hand-held, large scale scanning[C]//2014 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2014: 459-465.
[58] LI X, CHEN S, LI S, et al. Accurate and consistent spatiotemporal calibration for heterogeneous camera/IMU/LiDAR system based on continuous-time batch estimation[J]. IEEE/ASME Transactions on Mechatronics, 2023.
[59] LI S, LI X, CHEN S, et al. Two-step LiDAR/camera/IMU spatial and temporal calibration based on continuous-time trajectory estimation[J]. IEEE Transactions on Industrial Electronics, 2023(99): 1-10.
[60] BENTLEY J L. Multidimensional binary search trees used for associative searching[J]. Communications of the ACM, 1975, 18(9): 509-517.
[61] KOENIG N, HOWARD A. Design and use paradigms for Gazebo, an open-source multi-robot simulator[C]//2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)(IEEE Cat. No. 04CH37566): volume 3. IEEE, 2004: 2149-2154.
[62] JIAO J, CHEN F, WEI H, et al. LCE-Calib: Automatic LiDAR-frame/event camera extrinsic calibration with a globally optimal solution[A]. 2023. arXiv: 2303.09825.
[63] 高翔. 视觉 SLAM 十四讲: 从理论到实践[M]. 电子工业出版社, 2017.
[64] ZHANG Z. Flexible camera calibration by viewing a plane from unknown orientations[C]//Proceedings of the Seventh IEEE International Conference on Computer Vision: volume 1. IEEE, 1999: 666-673.
[65] ZHANG J, SIRITANAWAN P, YUE Y, et al. A two-step method for extrinsic calibration between a sparse 3D LiDAR and a thermal camera[C]//2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV). IEEE, 2018: 1039-1044.
[66] BARTOLI A, STURM P. Structure-from-motion using lines: Representation, triangulation, and bundle adjustment[J]. Computer vision and image understanding, 2005, 100(3): 416-441.