[1] PETTORELLI N, VIK J O, MYSTERUD A, et al. Using the satellite-derived NDVI to assess ecological responses to environmental change[J]. Trends in Ecology & Evolution, 2005, 20(9): 503-510.
[2] KOPAČKOVÁ-STRNADOVÁ V, KOUCKÁ L, JELÉNEK J, et al. Canopy top, height and photosynthetic pigment estimation using Parrot Sequoia multispectral imagery and the unmanned aerial vehicle (UAV)[J]. Remote Sensing, 2021, 13(4): 705.
[3] MA H X, CHEN C C, SONG Y Q, et al. Analysis of dynamic changes in grassland vegetation coverage and its driving factors in Qinghai Province over the past decade[J]. Research of Soil and Water Conservation, 2018, 25(6): 137-145.
[4] WALLACE A, NICHOL C, WOODHOUSE I. Recovery of forest canopy parameters by inversion of multispectral LiDAR data[J]. Remote Sensing, 2012, 4(2): 509-531.
[5] SU Y, GUO Q, JIN S, et al. The development and evaluation of a backpack LiDAR system for accurate and efficient forest inventory[J]. IEEE Geoscience and Remote Sensing Letters, 2020, 18(9): 1660-1664.
[6] ZHANG Z G, LUO X W, ZHOU Z Y, et al. Design of a GPS navigation control system for the Kubota rice transplanter[J]. Transactions of the Chinese Society for Agricultural Machinery, 2006, 37(7): 95-97.
[7] BECHAR A, VIGNEAULT C. Agricultural robots for field operations. Part 2: Operations and systems[J]. Biosystems Engineering, 2017, 153: 110-128.
[8] KAMIENSKI C, SOININEN J P, TAUMBERGER M, et al. Smart water management platform: IoT-based precision irrigation for agriculture[J]. Sensors, 2019, 19(2): 276.
[9] FAGUA J C, JANTZ P, RODRIGUEZ-BURITICA S, et al. Integrating LiDAR, multispectral and SAR data to estimate and map canopy height in tropical forests[J]. Remote Sensing, 2019, 11(22): 2697.
[10] BENDER A, WHELAN B, SUKKARIEH S. A high-resolution, multimodal data set for agricultural robotics: A Ladybird's-eye view of Brassica[J]. Journal of Field Robotics, 2020, 37(1): 73-96.
[11] TREMBLAY J F, BÉLAND M, GAGNON R, et al. Automatic three-dimensional mapping for tree diameter measurements in inventory operations[J]. Journal of Field Robotics, 2020, 37(8): 1328-1346.
[12] DONG W, ROY P, ISLER V. Semantic mapping for orchard environments by merging two-sides reconstructions of tree rows[J]. Journal of Field Robotics, 2020, 37(1): 97-121.
[13] HÄNI N, ROY P, ISLER V. A comparative study of fruit detection and counting methods for yield mapping in apple orchards[J]. Journal of Field Robotics, 2020, 37(2): 263-282.
[14] KING A. Technology: The future of agriculture[J]. Nature, 2017, 544(7651): S21-S23.
[15] DAS J, CROSS G, QU C, et al. Devices, systems, and methods for automated monitoring enabling precision agriculture[C]//2015 IEEE International Conference on Automation Science and Engineering (CASE). IEEE, 2015: 462-469.
[16] FISCHER C, KAKOULLI I. Multispectral and hyperspectral imaging technologies in conservation: current research and potential applications[J]. Studies in Conservation, 2006, 51(sup1): 3-16.
[17] BRAUERS J, AACH T. A color filter array based multispectral camera[C]//12. Workshop Farbbildverarbeitung. Ilmenau, 2006.
[18] EVERITT J, ESCOBAR D, VILLARREAL R, et al. Airborne video systems for agricultural assessment[J]. Remote Sensing of Environment, 1991, 35(2-3): 231-242.
[19] HUANG Y, LAN Y, HOFFMANN W. Use of airborne multi-spectral imagery in pest management systems[J]. Agricultural Engineering International: CIGR Journal, 2008.
[20] GREEN R O, EASTWOOD M L, SARTURE C M, et al. Imaging spectroscopy and the airborne visible/infrared imaging spectrometer (AVIRIS)[J]. Remote Sensing of Environment, 1998, 65(3): 227-248.
[21] CHEN J M, LEBLANC S G, MILLER J R, et al. Compact Airborne Spectrographic Imager (CASI) used for mapping biophysical parameters of boreal forests[J]. Journal of Geophysical Research: Atmospheres, 1999, 104(D22): 27945-27958.
[22] MÄKYNEN J, HOLMLUND C, SAARI H, et al. Unmanned aerial vehicle (UAV) operated megapixel spectral camera[C]//Electro-Optical Remote Sensing, Photonic Technologies, and Applications V: volume 8186. International Society for Optics and Photonics, 2011: 81860Y.
[23] FRANZINI M, RONCHETTI G, SONA G, et al. Geometric and radiometric consistency of Parrot Sequoia multispectral imagery for precision agriculture applications[J]. Applied Sciences, 2019, 9(24): 5314.
[24] SUN X, BAI J G, WANG Z H, et al. Optical system design of an airborne multispectral camera[J]. Acta Photonica Sinica, 2009, 38(12): 3160-3164.
[25] LUO G Y, WANG B D, CHEN Y Q, et al. Design of a UAV-borne imaging spectrometer for the visible and near-infrared bands[J]. Acta Photonica Sinica, 2017, 46(9): 59-68.
[26] CAO C F, FANG J Y, ZHAO D. Development of a UAV-borne multispectral camera based on a filter-array beam splitter[J]. Optical Technique, 2018, 44(1): 51-55.
[27] LIU Z, ZHANG F, HONG X. Low-cost retina-like robotic lidars based on incommensurable scanning[J]. IEEE/ASME Transactions on Mechatronics, 2021.
[28] AHMAD N, GHAZILLA R A R, KHAIRI N M, et al. Reviews on various inertial measurement unit (IMU) sensor applications[J]. International Journal of Signal Processing Systems, 2013, 1(2): 256-262.
[29] WIESSNER P, REZABEK F, HOLZINGER K. Intra-vehicular data sources[J]. Network, 2021, 73.
[30] LIU Y, HUANG T S, FAUGERAS O D. Determination of camera location from 2-D to 3-D line and point correspondences[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1990, 12(1): 28-37.
[31] RICOLFE-VIALA C, SANCHEZ-SALMERON A J. Lens distortion models evaluation[J]. Applied Optics, 2010, 49(30): 5914-5928.
[32] ZHANG Z. A flexible new technique for camera calibration[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-1334.
[33] VEĽAS M, ŠPANĚL M, MATERNA Z, et al. Calibration of RGB camera with Velodyne LiDAR[J]. 2014.
[34] PUSZTAI Z, HAJDER L. Accurate calibration of LiDAR-camera systems using ordinary boxes[C]//Proceedings of the IEEE International Conference on Computer Vision Workshops. 2017: 394-402.
[35] TAYLOR Z, NIETO J. Motion-based calibration of multimodal sensor extrinsics and timing offset estimation[J]. IEEE Transactions on Robotics, 2016, 32(5): 1215-1229.
[36] RUSINKIEWICZ S, LEVOY M. Efficient variants of the ICP algorithm[C]//Proceedings Third International Conference on 3-D Digital Imaging and Modeling. IEEE, 2001: 145-152.
[37] HAUG S, OSTERMANN J. A crop/weed field image dataset for the evaluation of computer vision based precision agriculture tasks[C]//European Conference on Computer Vision. Springer, 2014: 105-116.
[38] LU Y, YOUNG S. A survey of public datasets for computer vision tasks in precision agriculture[J]. Computers and Electronics in Agriculture, 2020, 178: 105760.
[39] CHEBROLU N, LOTTES P, SCHAEFER A, et al. Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields[J]. The International Journal of Robotics Research, 2017, 36(10): 1045-1052.
[40] CADENA C, CARLONE L, CARRILLO H, et al. Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age[J]. IEEE Transactions on Robotics, 2016, 32(6): 1309-1332.
[41] SCHONBERGER J L, FRAHM J M. Structure-from-motion revisited[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2016: 4104-4113.
[42] KHAIRUDDIN A R, TALIB M S, HARON H. Review on simultaneous localization and mapping (SLAM)[C]//2015 IEEE International Conference on Control System, Computing and Engineering (ICCSCE). IEEE, 2015: 85-90.
[43] TAKETOMI T, UCHIYAMA H, IKEDA S. Visual SLAM algorithms: A survey from 2010 to 2016[J]. IPSJ Transactions on Computer Vision and Applications, 2017, 9(1): 1-11.
[44] HESS W, KOHLER D, RAPP H, et al. Real-time loop closure in 2D LIDAR SLAM[C]//2016 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2016: 1271-1278.
[45] HUANG S, DISSANAYAKE G. Convergence and consistency analysis for extended Kalman filter based SLAM[J]. IEEE Transactions on Robotics, 2007, 23(5): 1036-1049.
[46] GRISETTI G, KÜMMERLE R, STACHNISS C, et al. A tutorial on graph-based SLAM[J]. IEEE Intelligent Transportation Systems Magazine, 2010, 2(4): 31-43.
[47] LATEGAHN H, GEIGER A, KITT B. Visual SLAM for autonomous ground vehicles[C]//2011 IEEE International Conference on Robotics and Automation. IEEE, 2011: 1732-1737.
[48] ZHU Y, ZHENG C, YUAN C, et al. CamVox: A low-cost and accurate LiDAR-assisted visual SLAM system[C]//2021 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2021: 5049-5055.
[49] MUR-ARTAL R, MONTIEL J M M, TARDOS J D. ORB-SLAM: A versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163.
[50] MUR-ARTAL R, TARDÓS J D. ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras[J]. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262.
[51] ZHANG J, SINGH S. LOAM: Lidar odometry and mapping in real-time[C]//Robotics: Science and Systems: volume 2. Berkeley, CA, 2014: 1-9.
[52] MENG J H, WU B F, DU X, et al. Progress and prospects of remote sensing applications in precision agriculture[J]. Remote Sensing for Land & Resources, 2011, 3(1): 7.
[53] GUO N. Vegetation indices and their research progress[J]. Arid Meteorology, 2003, 21(4): 71.
[54] DENG S B, CHEN Q J. A review of vegetation spectral characteristics and vegetation indices[C]//Proceedings of the 2010 Annual Meeting of the China Association of Remote Sensing Applications and High-level Forum on Regional Remote Sensing Development and Industry, 2010.
[55] XU W, ZHANG F. FAST-LIO: A fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter[J]. IEEE Robotics and Automation Letters, 2021, 6(2): 3317-3324.
[56] XU W, CAI Y, HE D, et al. FAST-LIO2: Fast direct LiDAR-inertial odometry[J]. arXiv preprint arXiv:2107.06829, 2021.
[57] SHENDRYK I, BROICH M, TULBURE M G, et al. Bottom-up delineation of individual trees from full-waveform airborne laser scans in a structurally complex eucalypt forest[J]. Remote Sensing of Environment, 2016, 173: 69-83.
[58] KOCH B, KATTENBORN T, STRAUB C, et al. Segmentation of forest to tree objects[M]//Forestry Applications of Airborne Laser Scanning. 2014: 89-112.
[59] SOLBERG S, NAESSET E, BOLLANDSAS O M. Single tree segmentation using airborne laser scanner data in a structurally heterogeneous spruce forest[J]. Photogrammetric Engineering & Remote Sensing, 2006, 72(12): 1369-1378.
[60] SANKARAN S, MAJA J M, BUCHANON S, et al. Huanglongbing (citrus greening) detection using visible, near infrared and thermal imaging techniques[J]. Sensors, 2013, 13(2): 2117-2130.