Title

面向日常出行的动力假肢无监督地形识别与视觉惯性定位

Alternative Title
UNSUPERVISED TERRAIN RECOGNITION AND VISUAL-INERTIAL LOCALIZATION FOR POWERED PROSTHESES IN DAILY WALKING
Name
陈楚衡
Name (Pinyin)
CHEN Chuheng
Student ID
12032418
Degree Type
Master
Degree Discipline
0801Z1 Intelligent Manufacturing and Robotics
Subject Category / Professional Degree Category
08 Engineering
Supervisor
付成龙
Supervisor's Department
Department of Mechanical and Energy Engineering
Thesis Defense Date
2023-05-18
Thesis Submission Date
2023-06-27
Degree-Granting Institution
Southern University of Science and Technology
Degree-Granting Location
Shenzhen
Abstract

Vision sensors can improve the environmental adaptability of powered transfemoral prostheses and, combined with terrain recognition, help amputees walk over the many terrains of daily life. However, most existing work on improving the environmental adaptability of prostheses focuses on supervised environment classification and terrain parameter estimation. These methods impose a data-annotation burden and do not make full use of the information a vision system provides, so the prosthesis' understanding of its environment remains limited and it struggles to handle the challenging terrains of everyday life.

To address these problems, this thesis proposes a multi-terrain recognition method based on unsupervised domain adaptation and a visual-inertial localization method for powered prostheses, strengthening the prosthesis' environmental perception and understanding, helping it better perform walking tasks over the multiple terrains of daily life, and providing support for environment-adaptive prosthesis control.

This thesis first investigates the prosthesis' environment perception system: terrain parameters are estimated from terrain data collected with different depth cameras and the estimation accuracy is analyzed, and a suitable camera and inertial measurement unit (IMU) are selected to build the environment perception system for the laboratory's second-generation powered prosthesis.
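
As one illustrative example of such terrain-parameter estimation, the sketch below estimates a stair riser height from a single depth point cloud with RANSAC plane segmentation in Open3D. It is a minimal sketch, not the thesis' actual evaluation procedure; the thresholds, the assumption of a ground-aligned frame with z pointing up, and the assumption that the two largest planes are adjacent treads are made up here for illustration.

```python
import numpy as np
import open3d as o3d


def stair_riser_height(pcd, dist_thresh=0.01):
    """Riser height as the vertical gap between the two largest horizontal
    planes, assuming the cloud is expressed in a ground-aligned frame with
    z pointing up and covers two adjacent treads."""
    # Largest plane: treated here as the lower tread.
    _, idx1 = pcd.segment_plane(dist_thresh, ransac_n=3, num_iterations=1000)
    tread1 = pcd.select_by_index(idx1)
    rest = pcd.select_by_index(idx1, invert=True)
    # Second largest plane among the remaining points: the upper tread.
    _, idx2 = rest.segment_plane(dist_thresh, ransac_n=3, num_iterations=1000)
    tread2 = rest.select_by_index(idx2)
    # Vertical distance between the mean heights of the two treads.
    z1 = np.asarray(tread1.points)[:, 2].mean()
    z2 = np.asarray(tread2.points)[:, 2].mean()
    return abs(z2 - z1)


if __name__ == "__main__":
    # "stair.pcd" is a hypothetical file name used only for illustration.
    cloud = o3d.io.read_point_cloud("stair.pcd")
    print("estimated riser height [m]:", stair_riser_height(cloud))
```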

To address the annotation burden of supervised learning, simulated data of five daily terrains are generated and augmented with randomized dimensions and geometric noise. A network is then trained on the simulated data together with unlabeled real-world data using an unsupervised domain adaptation method based on maximum classifier discrepancy (MCD), achieving accurate classification of real-world terrains.
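
The sketch below illustrates one step of the standard three-step MCD scheme in PyTorch. The feature extractor G, the two classifiers F1 and F2, and the random batches are hypothetical stand-ins, not the thesis' network or data; the sketch shows the general adaptation procedure only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def discrepancy(p1, p2):
    # L1 distance between the two classifiers' softmax outputs.
    return (F.softmax(p1, dim=1) - F.softmax(p2, dim=1)).abs().mean()


def mcd_step(G, F1, F2, opt_g, opt_f, x_src, y_src, x_tgt, n_gen=4):
    ce = nn.CrossEntropyLoss()

    # Step A: train G, F1, F2 on the labeled simulated (source) batch.
    opt_g.zero_grad(); opt_f.zero_grad()
    feat = G(x_src)
    (ce(F1(feat), y_src) + ce(F2(feat), y_src)).backward()
    opt_g.step(); opt_f.step()

    # Step B: with G fixed, make F1 and F2 disagree on the unlabeled real
    # (target) batch while staying accurate on the source.
    opt_f.zero_grad()
    feat_src = G(x_src).detach()
    feat_tgt = G(x_tgt).detach()
    loss_b = (ce(F1(feat_src), y_src) + ce(F2(feat_src), y_src)
              - discrepancy(F1(feat_tgt), F2(feat_tgt)))
    loss_b.backward()
    opt_f.step()

    # Step C: with F1 and F2 fixed, train G to minimize their disagreement,
    # pulling target features toward the source feature distribution.
    for _ in range(n_gen):
        opt_g.zero_grad()
        feat_tgt = G(x_tgt)
        discrepancy(F1(feat_tgt), F2(feat_tgt)).backward()
        opt_g.step()


if __name__ == "__main__":
    # Tiny stand-in modules and random batches, just to exercise the step;
    # the real networks and depth-image batches are not reproduced here.
    G = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 128), nn.ReLU())
    F1, F2 = nn.Linear(128, 5), nn.Linear(128, 5)  # five terrain classes
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_f = torch.optim.Adam(list(F1.parameters()) + list(F2.parameters()), lr=1e-3)
    x_src = torch.randn(8, 1, 64, 64)              # simulated depth images
    y_src = torch.randint(0, 5, (8,))
    x_tgt = torch.randn(8, 1, 64, 64)              # unlabeled real depth images
    mcd_step(G, F1, F2, opt_g, opt_f, x_src, y_src, x_tgt)
```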

Next, to strengthen the prosthesis' environmental understanding and situational awareness, the camera position is estimated with a point-cloud registration method based on the iterative closest point (ICP) algorithm, and an error-state Kalman filter fuses the camera motion estimated by vision with that measured by the IMU, improving the robustness and accuracy of camera localization. The positions of the prosthesis joints and the toe are then computed from the camera position and the prosthesis link model, and an environment point-cloud map is built from two-dimensional environment point clouds in the ground frame. Walking localization experiments on stairs and over obstacles demonstrate the accuracy of the method.
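
As an illustration of the registration step only (a minimal sketch assuming Open3D, with illustrative voxel and correspondence-distance values; the error-state Kalman filter fusion with the IMU is not shown), the incremental camera motion between two consecutive depth frames could be estimated with point-to-point ICP as follows.

```python
import numpy as np
import open3d as o3d


def icp_camera_motion(prev_pcd, curr_pcd, voxel=0.02, max_dist=0.05):
    """Incremental camera motion between consecutive depth frames: a 4x4
    transform mapping points in the current camera frame into the previous
    one. Voxel size and correspondence distance are illustrative values."""
    # Downsample to speed up registration and suppress sensor noise.
    src = curr_pcd.voxel_down_sample(voxel)
    tgt = prev_pcd.voxel_down_sample(voxel)
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation


if __name__ == "__main__":
    # Hypothetical file names for consecutive frames from the prosthesis camera.
    prev_pcd = o3d.io.read_point_cloud("frame_000.pcd")
    curr_pcd = o3d.io.read_point_cloud("frame_001.pcd")
    print(icp_camera_motion(prev_pcd, curr_pcd))
```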

Finally, a stair-ascent experiment with the powered prosthesis is designed to evaluate how the unsupervised terrain classification and the visual-inertial localization perform in actual prosthesis-worn walking, verifying the effectiveness of the proposed methods during prosthesis walking.

Keywords
Language
Chinese
Training Category
Independent training
Year of Enrollment
2020
Year Degree Conferred
2023-06

Degree Assessment Subcommittee
Mechanics
Chinese Library Classification Number
TP242
Source Repository
Manually submitted
Item Type
Thesis
Identifier
http://sustech.caswiz.com/handle/2SGJ60CL/544140
Collection
College of Engineering_Department of Mechanical and Energy Engineering
Recommended Citation
GB/T 7714
陈楚衡. 面向日常出行的动力假肢无监督地形识别与视觉惯性定位[D]. 深圳: 南方科技大学, 2023.