[1] CHOI C, DEL PRETO J, RUS D. Using vision for pre- and post-grasping object localization for soft hands[C]//2016 International Symposium on Experimental Robotics. Springer, 2017: 601-612.
[2] GÓMEZ-DE GABRIEL J M, WURDEMANN H A. Adaptive underactuated finger with active rolling surface[J]. IEEE Robotics and Automation Letters, 2021, 6(4): 8253-8260.
[3] FANG B, SUN F, WU L, et al. Multimode grasping soft gripper achieved by layer jamming structure and tendon-driven mechanism[J]. Soft Robotics, 2022, 9(2): 233-249.
[4] HUSSAIN I, AL-KETAN O, RENDA F, et al. Design and Prototyping Soft–Rigid Tendon-Driven Modular Grippers Using Interpenetrating Phase Composites Materials[J]. The International Journal of Robotics Research, 2020, 39(14): 1635-1646.
[5] LI Y, JIANG L, LIU Y, et al. Design and experiment of a rigid-soft coupled humanoid hand based on grasping stiffness enhancement[J]. Journal of Mechanical Engineering, 2023, 5. (in Chinese)
[6] KIM H I, HAN M W, SONG S H, et al. Soft morphing hand driven by SMA tendon wire[J]. Composites Part B: Engineering, 2016, 105: 138-148.
[7] RODRIGUE H, WANG W, KIM D R, et al. Curved shape memory alloy-based soft actuators and application to soft gripper[J]. Composite Structures, 2017, 176: 398-406.
[8] WANG W, AHN S H. Shape memory alloy-based soft gripper with variable stiffness for compliant and effective grasping[J]. Soft Robotics, 2017, 4(4): 379-389.
[9] HU Q, DONG E, SUN D. Soft gripper design based on the integration of flat dry adhesive, soft actuator, and microspine[J]. IEEE Transactions on Robotics, 2021, 37(4): 1065-1080.
[10] ZHANG Y, LIU T, LAN X, et al. A compliant robotic grip structure based on shape memory polymer composite[J]. Composites Communications, 2022, 36: 101383.
[11] YANG Y, CHEN Y, WEI Y, et al. Novel design and three-dimensional printing of variable stiffness robotic grippers[J]. Journal of Mechanisms and Robotics, 2016, 8(6): 061010.
[12] YANG Y, ZHU H, LIU J, et al. A novel variable stiffness and tunable bending shape soft robotic finger based on thermoresponsive polymers[J]. IEEE Transactions on Instrumentation and Measurement, 2023.
[13] BEHL M, KRATZ K, ZOTZMANN J, et al. Reversible bidirectional shape-memory polymers[J]. Advanced Materials, 2013, 25(32): 4466-4469.
[14] ILIEVSKI F, MAZZEO A D, SHEPHERD R F, et al. Soft robotics for chemists[J]. Angewandte Chemie International Edition, 2011.
[15] WU Y, ZENG G, XU J, et al. A bioinspired multi-knuckle dexterous pneumatic soft finger[J]. Sensors and Actuators A: Physical, 2023, 350: 114105.
[16] SUN T, CHEN Y, HAN T, et al. A soft gripper with variable stiffness inspired by pangolin scales, toothed pneumatic actuator and autonomous controller[J]. Robotics and Computer-Integrated Manufacturing, 2020, 61: 101848.
[17] YAN J, XU Z, SHI P, et al. A human-inspired soft finger with dual-mode morphing enabled by variable stiffness mechanism[J]. Soft Robotics, 2022, 9(2): 399-411.
[18] DINAKARAN V P, BALASUBRAMANIYAN M P, LE Q H, et al. A novel multi objective constraints based industrial gripper design with optimized stiffness for object grasping[J]. Robotics and Autonomous Systems, 2023, 160: 104303.
[19] TAWK C, GAO Y, MUTLU R, et al. Fully 3D printed monolithic soft gripper with high conformal grasping capability[C]//2019 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM). IEEE, 2019: 1139-1144.
[20] LAU G K, HENG K R, AHMED A S, et al. Dielectric elastomer fingers for versatile grasping and nimble pinching[J]. Applied Physics Letters, 2017, 110(18).
[21] LU C, ZHANG X. Ionic Polymer–Metal Composites: From Material Engineering to Flexible Applications[J]. Accounts of Chemical Research, 2023, 57(1): 131-139.
[22] HU X, CHEN H, TANG C, et al. A Variable Stiffness Finger Based on Low Melting Point Alloys[C/OL]//2023 IEEE 18th Conference on Industrial Electronics and Applications (ICIEA). 2023: 641-645. DOI: 10.1109/ICIEA58696.2023.10241380.
[23] PFAFF O, SIMEONOV S, CIROVIC I, et al. Application of fin ray effect approach for production process automation[J]. Annals of DAAAM & Proceedings, 2011, 22(1): 1247-1249.
[24] CROOKS W, ROZEN-LEVY S, TRIMMER B, et al. Passive gripper inspired by Manduca sexta and the Fin Ray® Effect[J]. International Journal of Advanced Robotic Systems, 2017, 14(4): 1729881417721155.
[25] FU J, LIN H, PRATHYUSH I, et al. A novel discrete variable stiffness gripper based on the fin ray effect[C]//International Conference on Intelligent Robotics and Applications. Springer, 2022: 791-802.
[26] DENG Z, LI M. Learning optimal fin-ray finger design for soft grasping[J]. Frontiers in Robotics and AI, 2021, 7: 590076.
[27] ELGENEIDY K, LIGHTBODY P, PEARSON S, et al. Characterising 3D-printed soft fin ray robotic fingers with layer jamming capability for delicate grasping[C]//2019 2nd IEEE International Conference on Soft Robotics (RoboSoft). IEEE, 2019: 143-148.
[28] ELGENEIDY K, FANSA A, HUSSAIN I, et al. Structural optimization of adaptive soft fin ray fingers with variable stiffening capability[C]//2020 3rd IEEE International Conference on Soft Robotics (RoboSoft). IEEE, 2020: 779-784.
[29] CHEN R, SONG R, ZHANG Z, et al. Bio-inspired shape-adaptive soft robotic grippers augmented with electroadhesion functionality[J]. Soft Robotics, 2019, 6(6): 701-712.
[30] LUO C, YANG S, ZHANG W, et al. MPJ hand: a self-adaptive underactuated hand with flexible fingers of multiple passive joints[C]//2016 international conference on advanced robotics and mechatronics (ICARM). IEEE, 2016: 184-189.
[31] PETKOVIĆ D, PAVLOVIĆ N D, SHAMSHIRBAND S, et al. Development of a new type of passively adaptive compliant gripper[J]. Industrial Robot: An International Journal, 2013, 40(6): 610-623.
[32] LIU C H, HUANG G F, CHIU C H, et al. Topology synthesis and optimal design of an adaptive compliant gripper to maximize output displacement[J]. Journal of Intelligent & Robotic Systems, 2018, 90: 287-304.
[33] WAN F, WANG H, WU J, et al. A reconfigurable design for omni-adaptive grasp learning[J]. IEEE Robotics and Automation Letters, 2020, 5(3): 4210-4217.
[34] CHU A H, CHENG T, MURALT A, et al. A passively conforming soft robotic gripper with three-dimensional negative bending stiffness fingers[J]. Soft Robotics, 2023, 10(3): 556-567.
[35] CAI Y, YUAN S. In-hand manipulation in power grasp: Design of an adaptive robot hand with active surfaces[C]//2023 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2023: 10296-10302.
[36] XU W, ZHANG H, YUAN H, et al. A Compliant Adaptive Gripper and Its Intrinsic Force Sensing Method[J]. IEEE Transactions on Robotics, 2021, 37(5): 1584-1603.
[37] SHAN X, BIRGLEN L. Modeling and Analysis of Soft Robotic Fingers Using the Fin Ray Effect[J]. The International Journal of Robotics Research, 2020, 39(14): 1686-1705.
[38] CHEN G, TANG S, XU S, et al. Intrinsic Contact Sensing and Object Perception of an Adaptive Fin-Ray Gripper Integrating Compact Deflection Sensors[J]. IEEE Transactions on Robotics, 2023.
[39] GOMES D F, LIN Z, LUO S. GelTip: A finger-shaped optical tactile sensor for robotic manipulation[C]//2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2020: 9903-9909.
[40] SUN H, KUCHENBECKER K J, MARTIUS G. A Soft Thumb-Sized Vision-Based Sensor with Accurate All-Round Force Perception[J]. Nature Machine Intelligence, 2022, 4(2): 135-145.
[41] TIPPUR M H, ADELSON E H. GelSight360: An omnidirectional camera-based tactile sensor for dexterous robotic manipulation[C]//2023 IEEE International Conference on Soft Robotics (RoboSoft). IEEE, 2023: 1-8.
[42] THURUTHEL T G, SHIH B, LASCHI C, et al. Soft Robot Perception Using Embedded Soft Sensors and Recurrent Neural Networks[J]. Science Robotics, 2019, 4(26): eaav1488.
[43] TRUBY R L, WEHNER M, GROSSKOPF A K, et al. Soft somatosensitive actuators via embedded 3D printing[J]. Advanced Materials, 2018, 30(15): 1706383.
[44] WAN F, LIU X, GUO N, et al. Visual Learning Towards Soft Robot Force Control using a 3D Metamaterial with Differential Stiffness[C]//Conference on Robot Learning. PMLR, 2022: 1269-1278.
[45] MAHSERECI Y, SALLER S, RICHTER H, et al. An ultra-thin flexible CMOS stress sensor demonstrated on an adaptive robotic gripper[J]. IEEE Journal of Solid-State Circuits, 2015, 51(1): 273-280.
[46] HOFER M, SFERRAZZA C, D’ANDREA R. A vision-based sensing approach for a spherical soft robotic arm[J]. Frontiers in Robotics and AI, 2021, 8: 630935.
[47] YUAN W, SRINIVASAN M A, ADELSON E H. Estimating object hardness with a GelSight touch sensor[C]//2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2016: 208-215.
[48] WANG Y, WU X, MEI D, et al. Flexible tactile sensor array for distributed tactile sensing and slip detection in robotic hand grasping[J]. Sensors and Actuators A: Physical, 2019, 297: 111512.
[49] LI J, DONG S, ADELSON E. Slip detection with combined tactile and visual information[C]//2018 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2018: 7772-7777.
[50] LIU H, YU Y, SUN F, et al. Visual–tactile fusion for object recognition[J]. IEEE Transactions on Automation Science and Engineering, 2016, 14(2): 996-1008.
[51] CHEN Y, GUO S, LI C, et al. Size recognition and adaptive grasping using an integration of actuating and sensing soft pneumatic gripper[J]. Robotics and Autonomous Systems, 2018, 104: 14-24.
[52] LI G, LIU S, WANG L, et al. Skin-inspired quadruple tactile sensors integrated on a robot hand enable object recognition[J]. Science Robotics, 2020, 5(49): eabc8134.
[53] KUPPUSWAMY N, ALSPACH A, UTTAMCHANDANI A, et al. Soft-bubble grippers for robust and perceptive manipulation[C]//2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2020: 9917-9924.
[54] YANG L, HAN X, GUO W, et al. Learning-based optoelectronically innervated tactile finger for rigid-soft interactive grasping[J]. IEEE Robotics and Automation Letters, 2021, 6(2): 3817-3824.
[55] DIKHALE S, PATEL K, DHINGRA D, et al. Visuotactile 6d pose estimation of an in-hand object using vision and tactile sensor data[J]. IEEE Robotics and Automation Letters, 2022, 7(2): 2148-2155.
[56] KHAZATSKY A, PERTSCH K, NAIR S, et al. DROID: A large-scale in-the-wild robot manipulation dataset[C]//Robotics: Science and Systems. 2024.
[57] FU Z, ZHAO T Z, FINN C. Mobile ALOHA: Learning Bimanual Mobile Manipulation with Low-Cost Whole-Body Teleoperation[C]//arXiv. 2024.
[58] CHI C, XU Z, PAN C, et al. Universal Manipulation Interface: In-The-Wild Robot Teaching Without In-The-Wild Robots[C]//Proceedings of Robotics: Science and Systems (RSS). 2024.
[59] HUANG Y, SUN Y. A dataset of daily interactive manipulation[J]. The International Journal of Robotics Research, 2019, 38(8): 879-886.
[60] LECUN Y, BOTTOU L, BENGIO Y, et al. Gradient-based learning applied to document recognition[J]. Proceedings of the IEEE, 1998, 86(11): 2278-2324.
[61] KRIZHEVSKY A, SUTSKEVER I, HINTON G E. Imagenet classification with deep convolutional neural networks[J]. Advances in neural information processing systems, 2012, 25.
[62] SIMONYAN K, ZISSERMAN A. Very deep convolutional networks for large-scale image recognition[C]//3rd International Conference on Learning Representations (ICLR 2015). Computational and Biological Learning Society, 2015.
[63] RONNEBERGER O, FISCHER P, BROX T. U-net: Convolutional networks for biomedical image segmentation[C]//Medical Image Computing and Computer-Assisted Intervention–MICCAI 2015: 18th International Conference, Munich, Germany, October 5-9, 2015, Proceedings, Part III 18. Springer, 2015: 234-241.
[64] SZEGEDY C, LIU W, JIA Y, et al. Going deeper with convolutions[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2015: 1-9.
[65] REDMON J, DIVVALA S, GIRSHICK R, et al. You only look once: Unified, real-time object detection[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2016: 779-788.
[66] HE K, ZHANG X, REN S, et al. Deep residual learning for image recognition[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2016: 770-778.
[67] LIU W, ANGUELOV D, ERHAN D, et al. SSD: Single shot multibox detector[C]//Computer Vision–ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, October 11–14, 2016, Proceedings, Part I 14. Springer, 2016: 21-37.
[68] GIRSHICK R. Fast R-CNN[C]//Proceedings of the IEEE international conference on computer vision. 2015: 1440-1448.
[69] REN S, HE K, GIRSHICK R, et al. Faster R-CNN: Towards real-time object detection with region proposal networks[J]. Advances in neural information processing systems, 2015, 28.
[70] HE K, GKIOXARI G, DOLLÁR P, et al. Mask R-CNN[C]//Proceedings of the IEEE international conference on computer vision. 2017: 2961-2969.
[71] QI C R, YI L, SU H, et al. PointNet++: Deep hierarchical feature learning on point sets in a metric space[J]. Advances in neural information processing systems, 2017, 30.
[72] ROMBACH R, BLATTMANN A, LORENZ D, et al. High-resolution image synthesis with latent diffusion models[C]//Proceedings of the IEEE/CVF conference on computer vision and pattern recognition. 2022: 10684-10695.
[73] DONG Q, CAO C, FU Y. Incremental transformer structure enhanced image inpainting with masking positional encoding[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2022: 11358-11368.
[74] IOFFE S, SZEGEDY C. Batch normalization: Accelerating deep network training by reducing internal covariate shift[C]//International conference on machine learning. PMLR, 2015: 448-456.
[75] KINGMA D P, BA J. Adam: A method for stochastic optimization[A]. 2014.
[76] MASON M T. Toward robotic manipulation[J]. Annual Review of Control, Robotics, and Autonomous Systems, 2018, 1: 1-28.
[77] HARRIS C, STEPHENS M, et al. A combined corner and edge detector[C]//Alvey vision conference: Vol. 15. Manchester, UK, 1988: 10-5244.
[78] LOWE D G. Object recognition from local scale-invariant features[C]//Proceedings of the seventh IEEE international conference on computer vision: Vol. 2. IEEE, 1999: 1150-1157.
[79] BAY H, ESS A, TUYTELAARS T, et al. Speeded-up robust features (SURF)[J]. Computer vision and image understanding, 2008, 110(3): 346-359.
[80] RUBLEE E, RABAUD V, KONOLIGE K, et al. ORB: An efficient alternative to SIFT or SURF[C]//2011 International conference on computer vision. IEEE, 2011: 2564-2571.
[81] RUSU R B, MARTON Z C, BLODOW N, et al. Persistent point feature histograms for 3D point clouds[C]//Proc 10th Int Conf Intel Autonomous Syst (IAS-10), Baden-Baden, Germany. 2008: 119-128.
[82] RUSU R B, BLODOW N, BEETZ M. Fast point feature histograms (FPFH) for 3D registration[C]//2009 IEEE international conference on robotics and automation. IEEE, 2009: 3212-3217.
[83] SALTI S, TOMBARI F, DI STEFANO L. SHOT: Unique signatures of histograms for surface and texture description[J]. Computer Vision and Image Understanding, 2014, 125: 251-264.
[84] GUO Y, SOHEL F, BENNAMOUN M, et al. Rotational projection statistics for 3D local surface description and object recognition[J]. International journal of computer vision, 2013, 105: 63-86.
[85] DROST B, ULRICH M, NAVAB N, et al. Model globally, match locally: Efficient and robust 3D object recognition[C]//2010 IEEE computer society conference on computer vision and pattern recognition. IEEE, 2010: 998-1005.
[86] MUJA M, LOWE D. FLANN: Fast library for approximate nearest neighbors user manual[J]. Computer Science Department, University of British Columbia, Vancouver, BC, Canada, 2009, 5: 6.
[87] LI S, XU C, XIE M. A robust O(n) solution to the perspective-n-point problem[J]. IEEE transactions on pattern analysis and machine intelligence, 2012, 34(7): 1444-1450.
[88] BESL P J, MCKAY N D. A Method for Registration of 3-D Shapes[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1992, 14(2): 239-256.
[89] WANG G, MANHARDT F, TOMBARI F, et al. GDR-Net: Geometry-guided direct regression network for monocular 6d object pose estimation[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2021: 16611-16621.
[90] KRULL A, BRACHMANN E, MICHEL F, et al. Learning analysis-by-synthesis for 6D pose estimation in RGB-D images[C]//Proceedings of the IEEE international conference on computer vision. 2015: 954-962.
[91] WANG C, XU D, ZHU Y, et al. DenseFusion: 6d object pose estimation by iterative dense fusion[C]//Proceedings of the IEEE/CVF conference on computer vision and pattern recognition. 2019: 3343-3352.
[92] AOKI Y, GOFORTH H, SRIVATSAN R A, et al. PointNetLK: Robust & efficient point cloud registration using pointnet[C]//Proceedings of the IEEE/CVF conference on computer vision and pattern recognition. 2019: 7163-7172.
[93] ZHAO H, WEI S, SHI D, et al. Learning Symmetry-Aware Geometry Correspondences for 6D Object Pose Estimation[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision. 2023: 14045-14054.
[94] VON DRIGALSKI F, TANIGUCHI S, LEE R, et al. Contact-based in-hand pose estimation using bayesian state estimation and particle filtering[C]//2020 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2020: 7294-7299.
[95] WEN B, MITASH C, SOORIAN S, et al. Robust, occlusion-aware pose estimation for objects grasped by adaptive hands[C]//2020 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2020: 6210-6217.
[96] PFANNE M, CHALON M, STULP F, et al. Fusing joint measurements and visual features for in-hand object pose estimation[J]. IEEE Robotics and Automation Letters, 2018, 3(4): 3497-3504.
[97] ÁLVAREZ D, ROA M A, MORENO L. Tactile-based in-hand object pose estimation[C]//Iberian Robotics conference. Springer, 2017: 716-728.
[98] TU Y, JIANG J, LI S, et al. PoseFusion: Robust Object-in-Hand Pose Estimation with SelectLSTM[A]. 2023.
[99] VILLALONGA M B, RODRIGUEZ A, LIM B, et al. Tactile object pose estimation from the first touch with geometric contact rendering[C]//Conference on Robot Learning. PMLR, 2021: 1015-1029.
[100] YANG S, KIM W D, PARK H, et al. In-Hand Object Classification and Pose Estimation With Sim-to-Real Tactile Transfer for Robotic Manipulation[J]. IEEE Robotics and Automation Letters, 2023, 9(1): 659-666.
[101] LIU X, HAN X, GUO N, et al. Bio-Inspired Proprioceptive Touch of a Soft Finger with Inner Finger Kinesthetic Perception[J]. Biomimetics, 2023, 8(6): 501.
[102] LIU X, HAN X, HONG W, et al. Proprioceptive learning with soft polyhedral networks[J]. The International Journal of Robotics Research, 2024: 02783649241238765.
[103] DEMAINE E D, O’ROURKE J. Geometric Folding Algorithms: Linkages, Origami, Polyhedra[M]. Cambridge University Press, 2007.
[104] YANG L, WAN F, WANG H, et al. Rigid-Soft interactive learning for robust grasping[J]. IEEE Robotics and Automation Letters, 2020, 5(2): 1720-1727.
[105] YANG Z, GE S, WAN F, et al. Scalable tactile sensing for an omni-adaptive soft robot finger[C]//2020 3rd IEEE International Conference on Soft Robotics (RoboSoft). IEEE, 2020: 572-577.
[106] YI J, CHEN X, SONG C, et al. Customizable Three-Dimensional-Printed Origami Soft Robotic Joint With Effective Behavior Shaping for Safe Interactions[J]. IEEE Transactions on Robotics, 2019, 35(1): 114-123.
[107] TEKIN B, BOGO F, POLLEFEYS M. H+O: Unified egocentric recognition of 3d hand-object poses and interactions[C/OL]//Proceedings of the IEEE/CVF conference on computer vision and pattern recognition. 2019: 4511-4520. DOI: 10.1109/CVPR.2019.00464.
[108] HASSON Y, VAROL G, TZIONAS D, et al. Learning joint reconstruction of hands and manipulated objects[C/OL]//Proceedings of the IEEE/CVF conference on computer vision and pattern recognition. 2019: 11807-11816. DOI: 10.1109/CVPR.2019.01208.
[109] HAMPALI S, SARKAR S D, RAD M, et al. Keypoint transformer: Solving joint identification in challenging hands and object interactions for accurate 3d pose estimation[C/OL]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2022: 11090-11100. DOI: 10.1109/CVPR52688.2022.01081.
[110] ZIMMERMANN C, CEYLAN D, YANG J, et al. FreiHAND: A dataset for markerless capture of hand pose and shape from single rgb images[C/OL]//Proceedings of the IEEE/CVF International Conference on Computer Vision. 2019: 813-822. DOI: 10.1109/ICCV.2019.00090.
[111] CHEN X, LIU Y, DONG Y, et al. MobRecon: Mobile-friendly hand mesh reconstruction from monocular image[C/OL]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2022: 20544-20554. DOI: 10.1109/CVPR52688.2022.01989.
[112] YUAN W, DONG S, ADELSON E H. Gelsight: High-resolution robot tactile sensors for estimating geometry and force[J]. Sensors, 2017, 17(12): 2762.
[113] LAMBETA M, CHOU P W, TIAN S, et al. DIGIT: A novel design for a low-cost compact high-resolution tactile sensor with application to in-hand manipulation[J]. IEEE Robotics and Automation Letters, 2020, 5(3): 3838-3845.
[114] YAMAGUCHI A, ATKESON C G. Combining finger vision and optical tactile sensing: Reducing and handling errors while cutting vegetables[C]//2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids). IEEE, 2016: 1045-1051.
[115] CHO K, VAN MERRIENBOER B, GULCEHRE C, et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics, 2014: 1724.
[116] SATTLER T, MADDERN W, TOFT C, et al. Benchmarking 6dof outdoor visual localization in changing conditions[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2018: 8601-8610.
[117] GAO Y, MATSUOKA S, WAN W, et al. In-Hand Pose Estimation Using Hand-Mounted RGB Cameras and Visuotactile Sensors[J]. IEEE Access, 2023, 11: 17218-17232.
[118] FISHEL J A, LOEB G E. Sensing tactile microvibrations with the BioTac—Comparison with human sensitivity[C]//2012 4th IEEE RAS & EMBS international conference on biomedical robotics and biomechatronics (BioRob). IEEE, 2012: 1122-1127.
[119] YAN Y, HU Z, YANG Z, et al. Soft magnetic skin for super-resolution tactile sensing with force self-decoupling[J]. Science Robotics, 2021, 6(51): eabc8801.
[120] ARGALL B D, CHERNOVA S, VELOSO M, et al. A survey of robot learning from demonstration[J/OL]. Robotics and Autonomous Systems, 2009, 57(5): 469-483. DOI: https://doi.org/10.1016/j.robot.2008.10.024.
[121] KROEMER O, NIEKUM S, KONIDARIS G. A Review of Robot Learning for Manipulation: Challenges, Representations, and Algorithms[A]. 2019. arXiv: 1907.03146.
[122] OSA T, PAJARINEN J, NEUMANN G, et al. An Algorithmic Perspective on Imitation Learning[Z]. 2018.
[123] KAPLAN H, HILL K, LANCASTER J B, et al. A theory of human life history evolution: Diet, intelligence, and longevity[J]. Evolutionary Anthropology, 2000, 9(4): 156-185.
[124] BROWN S, SAMMUT C. Tool Use Learning in Robots[Z]. 2011.
[125] CHRISTEN S, STEVSIC S, HILLIGES O. Demonstration-Guided Deep Reinforcement Learning of Control Policies for Dexterous Human-Robot Interaction[A]. 2019.
[126] CHU V, AKGUN B, THOMAZ A L. Learning haptic affordances from demonstration and human-guided exploration[Z]. 2016: 119-125.
[127] BILLARD A, KRAGIC D. Trends and challenges in robot manipulation[J/OL]. Science, 2019, 364(6446). DOI: 10.1126/science.aat8414.
[128] FAZELI N, OLLER M, WU J, et al. See, feel, act: Hierarchical learning for complex manipulation skills with multisensory fusion: Vol. 4[Z]. 2019.
[129] DAUTENHAHN K, NEHANIV C L. Imitation in Animals and Artifacts[Z]. 2002.
[130] GEALY D V, MCKINLEY S, YI B, et al. Quasi-Direct Drive for Low-Cost Compliant Robotic Manipulation[J/OL]. 2019 International Conference on Robotics and Automation (ICRA), 2019. DOI: 10.1109/icra.2019.8794236.
[131] CALLI B, WALSMAN A, SINGH A, et al. Benchmarking in Manipulation Research: Using the Yale-CMU-Berkeley Object and Model Set[J]. IEEE Robotics & Automation Magazine, 2015, 22(3): 36-52.
[132] HUANG O W, CHENG H N, CHAN T W. Number jigsaw puzzle: A mathematical puzzle game for facilitating players’ problem-solving strategies[C]//2007 First IEEE International Workshop on Digital Game and Intelligent Toy Enhanced Learning (DIGITEL’07). IEEE, 2007: 130-134.
[133] LIU X, WAN F, GE S, et al. Jigsaw-based Benchmarking for Learning Robotic Manipulation[C]//2023 International Conference on Advanced Robotics and Mechatronics (ICARM). IEEE, 2023: 124-130.
[134] EVERINGHAM M, VAN GOOL L, WILLIAMS C K, et al. The pascal visual object classes (voc) challenge[J]. International journal of computer vision, 2010, 88: 303-338.
[135] GUPTA A, MURALI A, GANDHI D P, et al. Robot learning in homes: Improving generalization and reducing dataset bias[J]. Advances in neural information processing systems, 2018, 31.
[136] JIANG Y, MOSESON S, SAXENA A. Efficient grasping from rgbd images: Learning using a new rectangle representation[C]//2011 IEEE International conference on robotics and automation. IEEE, 2011: 3304-3311.
[137] KEHL W, MANHARDT F, TOMBARI F, et al. SSD-6D: Making rgb-based 3d detection and 6d pose estimation great again[C]//Proceedings of the IEEE international conference on computer vision. 2017: 1521-1529.
[138] WANG H, LIU X, QIU N, et al. DeepClaw 2.0: A Data Collection Platform for Learning Human Manipulation[J]. Frontiers in Robotics and AI, 2022, 9: 787291.
[139] SUZUI K, YOSHIYASU Y, GABAS A, et al. Toward 6 DOF Object Pose Estimation with Minimum Dataset[C]//2019 IEEE/SICE International Symposium on System Integration (SII). IEEE, 2019: 462-467.
[140] MINICHINO J. Community experience distilled: Learning OpenCV 3 Computer Vision with Python: Unleash the Power of Computer Vision with Python Using OpenCV[M]. Packt Publishing, 2015.
[141] MARCHAND E, SPINDLER F, CHAUMETTE F. ViSP for visual servoing: a generic software platform with a wide class of robot control skills[J]. IEEE Robotics and Automation Magazine, 2005, 12(4): 40-52.
[142] YUAN W, LI R, SRINIVASAN M A, et al. Measurement of shear and slip with a GelSight tactile sensor[C]//2015 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2015: 304-311.
[143] WU T, DONG Y, LIU X, et al. Vision-based tactile intelligence with soft robotic metamaterial[J]. Materials & Design, 2024, 238: 112629.
[144] SHIMOGA K, GOLDENBERG A. Soft Robotic Fingertips: Part I: A Comparison of Construction Materials[J]. The International Journal of Robotics Research, 1996, 15(4): 320-334.
[145] SHIMOGA K, GOLDENBERG A. Soft Robotic Fingertips: Part II: Modeling and Impedance Regulation[J]. The International Journal of Robotics Research, 1996, 15(4): 335-350.
[146] LIU Z, SONG L, HOU Z, et al. Screw insertion method in peg-in-hole assembly for axial friction reduction[J]. IEEE Access, 2019, 7: 148313-148325.
[147] TANG T, LIN H C, ZHAO Y, et al. Autonomous alignment of peg and hole by force/torque measurement for robotic assembly[C]//2016 IEEE international conference on automation science and engineering (CASE). IEEE, 2016: 162-167.
[148] TSUJI T, OHKUMA J, SAKAINO S. Dynamic object manipulation considering contact condition of robot with tool[J]. IEEE Transactions on Industrial Electronics, 2015, 63(3): 1972-1980.
[149] MA D, RODRIGUEZ A. Friction variability in planar pushing data: Anisotropic friction and data-collection bias[J]. IEEE Robotics and Automation Letters, 2018, 3(4): 3232-3239.
[150] AGER A L, BORMS D, DESCHEPPER L, et al. Proprioception: How is it Affected by Shoulder Pain? A Systematic Review[J]. Journal of Hand Therapy, 2020, 33(4): 507-516.
[151] SCOTT M G. Measurement of kinesthesis[J]. Research Quarterly. American Association for Health, Physical Education and Recreation, 1955, 26(3): 324-341.
[152] GARRIDO-JURADO S, MUÑOZ-SALINAS R, MADRID-CUEVAS F J, et al. Automatic generation and detection of highly reliable fiducial markers under occlusion[J]. Pattern Recognition, 2014, 47(6): 2280-2292.
[153] BRADSKI G. The OpenCV Library[J]. Dr. Dobb’s Journal of Software Tools, 2000.
[154] CHEN Z Q. Numerical implementation of geometrically nonlinear finite elements for beam and bar structures[J]. Engineering Mechanics, 2014(6): 42-52. (in Chinese)
[155] CHEN Z Q, ZENG Q Y, YAN Q S. A UL formulation for internal force analysis of large-deflection spatial frame structures[J]. China Civil Engineering Journal, 1992, 25(5): 34-44. (in Chinese)
[156] CHEN Z, AGAR T. Geometric nonlinear analysis of flexible spatial beam structures[J]. Computers & Structures, 1993, 49(6): 1083-1094.
[157] GUTIERREZ-LEMINI D. Engineering Viscoelasticity[M]. Springer New York, NY, 2013.
[158] MANTI M, HASSAN T, PASSETTI G, et al. A Bioinspired Soft Robotic Gripper for Adaptable and Effective Grasping[J]. Soft Robotics, 2015, 2(3): 107-116.
[159] ZHANG Y, ZHANG W, GAO P, et al. Finger-Palm Synergistic Soft Gripper for Dynamic Capture via Energy Harvesting and Dissipation[J]. Nature Communications, 2022, 13(1): 7700.
[160] JRAD H, DION J L, RENAUD F, et al. Non-linear Generalized Maxwell Model for Dynamic Characterization of Viscoelastic Components and Parametric Identification Techniques[C]//International Design Engineering Technical Conferences and Computers and Information in Engineering Conference: Vol. 45004. American Society of Mechanical Engineers, 2012: 291-300.
[161] COX E. A method of assigning numerical and percentage values to the degree of roundness of sand grains[J]. Journal of paleontology, 1927, 1(3): 179-183.
[162] LIU Z, HOWE R D. Beyond Coulomb: Stochastic Friction Models for Practical Grasping and Manipulation[J]. IEEE Robotics and Automation Letters, 2023.
[163] TREMBLAY M R, CUTKOSKY M R. Estimating friction using incipient slip sensing during a manipulation task[C]//[1993] Proceedings IEEE International Conference on Robotics and Automation. IEEE, 1993: 429-434.
[164] SON J S, MONTEVERDE E A, HOWE R D. A tactile sensor for localizing transient events in manipulation[C]//Proceedings of the 1994 IEEE International Conference on Robotics and Automation. IEEE, 1994: 471-476.
[165] SPIERS A J, CALLI B, DOLLAR A M. Variable-friction finger surfaces to enable within-hand manipulation via gripping and sliding[J]. IEEE Robotics and Automation Letters, 2018, 3(4): 4116-4123.
[166] TEEPLE C B, AKTAŞ B, YUEN M C, et al. Controlling palm-object interactions via friction for enhanced in-hand manipulation[J]. IEEE Robotics and Automation Letters, 2022, 7(2): 2258-2265.
[167] LIN Y, CHURCH A, YANG M, et al. Bi-touch: Bimanual tactile manipulation with sim-to-real deep reinforcement learning[J]. IEEE Robotics and Automation Letters, 2023.
[168] KICKI P, BEDNAREK M, WALAS K. Robotic manipulation of elongated and elastic objects[C]//2019 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA). IEEE, 2019: 23-27.
[169] SARKAR N, YUN X, KUMAR V. Dynamic control of 3-D rolling contacts in two-arm manipulation[J]. IEEE Transactions on Robotics and Automation, 1997, 13(3): 364-376.
[170] PINTO L, DAVIDSON J, GUPTA A. Supervision via competition: Robot adversaries for learning tasks[C]//2017 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2017: 1601-1608.