[1] 洪翔. 习近平关于海洋强国重要论述研究[D]. 集美大学, 2020.
[2] 深圳市规划和自然资源局. 深圳市海洋发展规划(2022-2035年)[Z]. 2022.
[3] 左立标. 国外深海机器人技术发展现状及对我国的启示[J]. 采矿技术, 2011(5): 5.
[4] 梁凇. 水下检测与作业机器人ROV控制系统研制及动力定位研究[D]. 江苏科技大学, 2017.
[5] 杨兵. 穿戴式潜水推进系统的潜水运动建模及推进器位置优化[D]. 天津: 河北工业大学, 2019.
[6] 王帅. 水下运载器总体优化与自平衡控制技术研究[D]. 北京: 中国舰船研究院, 2012.
[7] JIMF. Farallon MK-III DPV Scooter[EB/OL]. 2015. https://scubaboard.com/community/threads/farallon-mk-iii-dpv-uw-scooter.513199/.
[8] SUBMERGE. Submerge Scooters DPV[EB/OL]. 2011. https://www.silent-submersion.com/Products/imperial.html.
[9] BAZONKA. Protei-5 Diver Propulsion Vehicle[EB/OL]. 2013. https://military-history.fandom.com/wiki/Protei-5_Russian_diver_propulsion_vehicle?oldid=163829.
[10] A22SHADY. Oceanic Mako[EB/OL]. 2011. https://scubaboard.com/community/threads/oceanic-mako-scooter.392169/.
[11] LLORENS-BONILLA B, PARIETTI F, ASADA H. Demonstration-based control of supernumerary robotic limbs[C]//2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2012: 7-12.
[12] PARIETTI F, ASADA H H. Dynamic analysis and state estimation for wearable robotic limbs subject to human-induced disturbances[C]//2013 IEEE International Conference on Robotics and Automation. IEEE, 2013: 3880-3887.
[13] PARIETTI F, ASADA H H. Supernumerary robotic limbs for aircraft fuselage assembly: body stabilization and guidance by bracing[C]//2014 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2014: 1176-1183.
[14] BONILLA B L, ASADA H H. A robot on the shoulder: Coordinated human-wearable robot control using coloured petri nets and partial least squares predictions[C]//2014 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2014: 119-125.
[15] PARIETTI F, CHAN K, ASADA H H. Bracing the human body with supernumerary robotic limbs for physical assistance and load reduction[C]//2014 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2014: 141-148.
[16] PARIETTI F, CHAN K C, HUNTER B, et al. Design and control of supernumerary robotic limbs for balance augmentation[C]//2015 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2015: 5010-5017.
[17] KHODAMBASHI R, WEINBERG G, SINGHOSE W, et al. User oriented assessment of vibration suppression by command shaping in a supernumerary wearable robotic arm[C]//2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids). IEEE, 2016: 1067-1072.
[18] KUREK D A, ASADA H H. The MantisBot: Design and impedance control of supernumerary robotic limbs for near-ground work[C]//2017 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2017: 5942-5947.
[19] GONZALEZ D J, ASADA H H. Design of extra robotic legs for augmenting human payload capabilities by exploiting singularity and torque redistribution[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018: 4348-4354.
[20] GUGGENHEIM J, HOFFMAN R, SONG H, et al. Leveraging the human operator in the design and control of supernumerary robotic limbs[J]. IEEE Robotics and Automation Letters, 2020, 5(2): 2177-2184.
[21] VATSAL V, HOFFMAN G. Design and analysis of a wearable robotic forearm[C]//2018 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2018: 5489-5496.
[22] AL-SADA M, HÖGLUND T, KHAMIS M, et al. Orochi: investigating requirements and expectations for multipurpose daily used supernumerary robotic limbs[C]//Proceedings of the 10th Augmented Human International Conference 2019. 2019: 1-9.
[23] HAO M, ZHANG J, CHEN K, et al. Design and basic control of extra robotic legs for dynamic walking assistance[C]//2019 IEEE International Conference on Advanced Robotics and its Social Impacts (ARSO). IEEE, 2019: 246-250.
[24] XU C, LIU Y, LI Z. Biomechtronic design of a supernumerary robotic limbs for industrial assembly[C]//2019 IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM). IEEE, 2019: 553-558.
[25] ZHANG Q, ZHU Y, ZHAO X, et al. Design of reconfigurable supernumerary robotic limb based on differential actuated joints[J]. International Journal of Computer and Information Engineering, 2020, 14(4): 115-122.
[26] VÉRONNEAU C, DENIS J, LEBEL L P, et al. Multifunctional remotely actuated 3-DOF supernumerary robotic arm based on magnetorheological clutches and hydrostatic transmission lines[J]. IEEE Robotics and Automation Letters, 2020, 5(2): 2546-2553.
[27] KHAZOOM C, CAILLOUETTE P, GIRARD A, et al. A supernumerary robotic leg powered by magnetorheological actuators to assist human locomotion[J]. IEEE Robotics and Automation Letters, 2020, 5(4): 5143-5150.
[28] 荆泓玮, 朱延河, 赵思恺, 等. 外肢体机器人研究现状及发展趋势[J]. 机械工程学报, 2020, 56(7): 15-23.
[29] TONG Y, LIU J. Review of Research and Development of Supernumerary Robotic Limbs[J/OL]. IEEE/CAA Journal of Automatica Sinica, 2021, 8(5): 929-952. DOI: 10.1109/JAS.2021.1003961.
[30] PRATTICHIZZO D, POZZI M, BALDI T L, et al. Human augmentation by wearable supernumerary robotic limbs: review and perspectives[J]. Progress in Biomedical Engineering, 2021, 3(4): 042005.
[31] 曾端. 水下助推机器人液压驱动系统控制方法研究[D]. 成都: 电子科技大学, 2016.
[32] 李兴勇. 水下助推机器人控制系统设计[D]. 成都: 电子科技大学, 2017.
[33] 高家葵. 水下助推机器人控制方法研究[D]. 成都: 电子科技大学, 2017.
[34] 王斌. 水下助推机器人的人体运动意图感知方法研究[D]. 成都: 电子科技大学, 2017.
[35] 秦睿. 水下外骨骼机器人动力学特性研究[D]. 成都: 电子科技大学, 2017.
[36] SATTAR J, DUDEK G. Where is your dive buddy: tracking humans underwater using spatiotemporal features[C]//2007 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2007: 3654-3659.
[37] VERZIJLENBERG B, JENKIN M. Swimming with robots: Human robot communication at depth[C]//2010 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2010: 4023-4028.
[38] CHIARELLA D, BIBULI M, BRUZZONE G, et al. Gesture-based language for diver-robot underwater interaction[C]//Oceans 2015-genova. IEEE, 2015: 1-9.
[39] CHAVEZ A G, MUELLER C A, DOERNBACH T, et al. Robust gesture-based communication for underwater human-robot interaction in the context of search and rescue diver missions[A]. 2018.
[40] CODD-DOWNEY R, JENKIN M. Human robot interaction using diver hand signals[C]//2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE, 2019: 550-551.
[41] NAĐ Đ, WALKER C, KVASIĆ I, et al. Towards advancing diver-robot interaction capabilities[J]. IFAC-PapersOnLine, 2019, 52(21): 199-204.
[42] LIU T, ZHU Y, WU K, et al. Underwater Accompanying Robot Based on SSDLite Gesture Recognition[J]. Applied Sciences, 2022, 12(18): 9131.
[43] 李晗生. 蛙人推进器的水下航行数值模拟及实验研究[D]. 哈尔滨: 哈尔滨工程大学, 2020.
[44] 蒋新松. 水下机器人[M]. 辽宁科学技术出版社, 2000.
[45] 李晓晖, 朱玉泉, 聂松林. 喷水推进器的发展研究综述[J]. 液压与气动, 2007(7): 4.
[46] 刘锟. 混合型水下自航行器的概念设计与研究[D]. 天津大学, 2009.
[47] 臧鹏飞. 水下机器人动力学建模与运动稳定性分析[D]. 南京信息工程大学, 2018.
[48] BLUEROBOTICS. T200 Thruster[EB/OL]. 2019. https://bluerobotics.com/store/thrusters/t100-t200-thrusters/t200-thruster-r2-rp/.
[49] ROBOTIS. ROBOTIS e-Manual[EB/OL]. 2020. https://emanual.robotis.com/docs/en/dxl/x/xw540-t140/.
[50] SKOGSTAD M, ERIKSEN T, Φ S, 等. 职业潜水员听力情况的12年追踪研究[J]. 转化医学杂志, 2009, 22(4): 213-217.
[51] MACHANGPA J W, CHINGTHAM T S. Head gesture controlled wheelchair for quadriplegic patients[J]. Procedia computer science, 2018, 132: 342-351.
[52] LI P, MEZIANE R, OTIS M J D, et al. A Smart Safety Helmet using IMU and EEG sensors for worker fatigue detection[C]//2014 IEEE International Symposium on Robotic and Sensors Environments (ROSE) Proceedings. IEEE, 2014: 55-60.
[53] SEVERIN I C. Head Gesture-Based on IMU Sensors: a Performance Comparison Between the Unimodal and Multimodal Approach[C]//2021 International Symposium on Signals, Circuits and Systems (ISSCS). IEEE, 2021: 1-4.
[54] KANG M S, KANG H W, LEE C, et al. The gesture recognition technology based on IMU sensor for personal active spinning[C]//2018 20th International Conference on Advanced Communication Technology (ICACT). IEEE, 2018: 546-552.
[55] SEVERIN I C, DOBREA D M. 6DOF Inertial IMU Head Gesture Detection: Performance Analysis Using Fourier Transform and Jerk-Based Feature Extraction[C]//2020 IEEE Microwave Theory and Techniques in Wireless Communications (MTTW): volume 1. IEEE, 2020:118-123.
[56] TURAN M T, ERZIN E. Source and filter estimation for throat-microphone speech enhancement[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2015, 24(2): 265-275.
[57] AHMAD N, GHAZILLA R A R, KHAIRI N M, et al. Reviews on various inertial measurement unit (IMU) sensor applications[J]. International Journal of Signal Processing Systems, 2013, 1(2): 256-262.
[58] PRAYUDI I, KIM D. Design and implementation of IMU-based human arm motion capture system[C]//2012 IEEE International conference on mechatronics and automation. IEEE, 2012: 670-675.
[59] KONG W, SESSA S, COSENTINO S, et al. Development of a real-time IMU-based motion capture system for gait rehabilitation[C]//2013 IEEE International Conference on Robotics and Biomimetics (ROBIO). IEEE, 2013: 2100-2105.
[60] FAISAL I A, PURBOYO T W, ANSORI A S R. A Review of accelerometer sensor and gyroscope sensor in IMU sensors on motion capture[J]. J. Eng. Appl. Sci, 2019, 15(3): 826-829.
[61] IVANOV A V, ZHILENKOV A A. The use of IMU MEMS-sensors for designing of motion capture system for control of robotic objects[C]//2018 IEEE Conference of Russian Young Researchers in Electrical and Electronic Engineering (EIConRus). IEEE, 2018: 890-893.
[62] WADSN. Tactical Throat Mic[EB/OL]. 2020. https://www.wadsn.com/product/z033-fg.
[63] MICROSTRAIN. 3DM-GX3-25[EB/OL]. 2012. https://www.microstrain.com/inertial/3dm-gx3-25?qt-product_quicktab=2#qt-product_quicktab.
[64] MURPHY-CHUTORIAN E, TRIVEDI M M. Head pose estimation in computer vision: A survey[J]. IEEE transactions on pattern analysis and machine intelligence, 2008, 31(4): 607-626.
[65] UNIVERSITY C. Cervical Flexion & Extension BIOMECHANICS[EB/OL]. YouTube, 2020. https://youtu.be/EEW_aM-lpQk.
[66] FRESK E, NIKOLAKOPOULOS G. Full quaternion based attitude control for a quadrotor[C/OL]//2013 European Control Conference (ECC). 2013: 3864-3869. DOI: 10.23919/ECC.2013.6669617.
[67] HAMILTON W R. II. On quaternions; or on a new system of imaginaries in algebra[J]. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 1844, 25(163): 10-13.
[68] SOLA J. Quaternion kinematics for the error-state Kalman filter[A]. 2017.
[69] CRAIG J J. 机器人学导论: 第3版[M]. 机械工业出版社, 2006.
[70] 刘玉焘. 基于可穿戴式传感器的人体动作捕获与识别研究[D]. 哈尔滨工业大学, 2020.
[71] 梅锋. 基于时序统计特征人体运动分割模型与姿态识别算法设计[D]. 华东交通大学, 2022.
[72] SAKOE H, CHIBA S. Dynamic programming algorithm optimization for spoken word recognition[J]. IEEE transactions on acoustics, speech, and signal processing, 1978, 26(1): 43-49.
[73] ITAKURA F. Minimum prediction residual principle applied to speech recognition[J]. IEEE Transactions on acoustics, speech, and signal processing, 1975, 23(1): 67-72.
[74] SENIN P. Dynamic time warping algorithm review[J]. Information and Computer Science Department University of Hawaii at Manoa Honolulu, USA, 2008, 855(1-23): 40.
[75] BENESTY J, SONDHI M M, HUANG Y, et al. Springer handbook of speech processing: volume 1[M]. Springer, 2008.
[76] MAHENDRU H C. Quick review of human speech production mechanism[J]. International Journal of Engineering Research and Development, 2014, 9(10): 48-54.
[77] YANG Q, JIN W, ZHANG Q, et al. Mixed-modality speech recognition and interaction using a wearable artificial throat[J]. Nature Machine Intelligence, 2023, 5(2): 169-180.
[78] SAINBURG T, THIELK M, GENTNER T Q. Finding, visualizing, and quantifying latent structure across diverse animal vocal repertoires[J]. PLoS computational biology, 2020, 16(10): e1008228.
[79] KIAPUCHINSKI D M, LIMA C R E, KAESTNER C A A. Spectral noise gate technique applied to birdsong preprocessing on embedded unit[C]//2012 IEEE International Symposium on Multimedia. IEEE, 2012: 24-27.
[80] 李秀. 基于DTW 和GMM 的多维特征说话人识别[D]. 南京邮电大学, 2020.
[81] PRABHU K M. Window functions and their applications in signal processing[M]. Taylor & Francis, 2014.
[82] SMITH J O. Spectral Audio Signal Processing[M/OL]. W3K Publishing, 2011. https://ccrma.stanford.edu/~jos/sasp/.
[83] BÄCKSTRÖM T, RÄSÄNEN O, ZEWOUDIE A, et al. Introduction to Speech Processing[M/OL]. 2nd ed. 2022. https://speechprocessingbook.aalto.fi. DOI: 10.5281/zenodo.6821775.
[84] 高维深. 基于HMM/ANN 混合模型的非特定人语音识别研究[D]. 电子科技大学, 2013.
[85] 李艳花. 基于特征提取的智能轮椅语音识别控制技术的研究与实现[D]. 重庆邮电大学, 2010.
[86] 谢世强. 基于神经网络非特定人语音识别与机器人控制研究[D]. 沈阳工业大学, 2013.
[87] 李永健. 基于DTW 和HMM 的语音识别算法仿真及软件设计[D]. 哈尔滨工程大学, 2009.
[88] GREENBERG S, AINSWORTH W, FAY R. Springer Handbook of Auditory Research: Speech Processing in the Auditory System[M/OL]. Springer New York, 2004. https://books.google.com/books?id=xWU2o08AxwwC.
[89] UMESH S, COHEN L, NELSON D. Fitting the mel scale[C]//1999 IEEE International Conference on Acoustics, Speech, and Signal Processing. Proceedings. ICASSP99 (Cat. No. 99CH36258): volume 1. IEEE, 1999: 217-220.
[90] ZWICKER E. Masking and psychological excitation as consequences of the ear’s frequency analysis[J]. Frequency analysis and periodicity detection in hearing, 1970: 376-394.
[91] FOOK C, HARIHARAN M, YAACOB S, et al. A review: Malay speech recognition and audio visual speech recognition[C]//2012 International Conference on Biomedical Engineering (ICoBE). IEEE, 2012: 479-484.
[92] SMAGULOVA K, JAMES A P. A survey on LSTM memristive neural network architectures and applications[J]. The European Physical Journal Special Topics, 2019, 228(10): 2313-2324.
[93] STAUDEMEYER R C, MORRIS E R. Understanding LSTM–a tutorial into long short-term memory recurrent neural networks[A]. 2019.
[94] MICROSOFT. Xbox Wireless Controller[EB/OL]. 2022. https://www.xbox.com/zh-CN/accessories/controllers/xbox-wireless-controller.
[95] MAXIMO M, RIBEIRO C H, AFONSO R J. Modeling of a position servo used in robotics applications[C]//Proceedings of the 2017 Simpósio Brasileiro de Automação Inteligente (SBAI). Porto Alegre, Brazil, 2017.
[96] AJITH A, RAJKUMAR R, SHIBU R M, et al. Design and Development of Novel System for Increasing Range of an Unmanned Underwater Vehicle[C/OL]//2019 IEEE Underwater Technology(UT). 2019: 1-8. DOI: 10.1109/UT.2019.8734399.