Author: Luu, Diu Khue
Date accessioned: 2023-01-04
Date available: 2023-01-04
Date issued: 2022-10
URI: https://hdl.handle.net/11299/250411
Description: University of Minnesota Ph.D. dissertation. 2022. Major: Biomedical Engineering. Advisors: Zhi Yang, Qi Zhao. 1 computer file (PDF); 153 pages.

Abstract:
A prosthetic hand ultimately seeks to replace the essential functions of a lost limb in activities of daily living. Yet most existing prostheses allow only limited movements and cannot provide sensory feedback to the amputee. These limitations make the user experience unnatural and unintuitive. The next-generation artificial limb must therefore be equipped with a neural interface that facilitates bidirectional communication between the user's nervous system and the prosthesis's circuitry, creating a substitute that genuinely feels and acts like a real hand. In this dissertation, we investigate diverse strategies to inform the design of such an advanced neural interface. Our approaches range from conventional statistical modeling to deep learning-based artificial intelligence (AI). First, we formulate a statistical model of the mismatch error in real-world sensors and devices. Specifically, we derive a framework for manipulating random mismatches to achieve super-resolution, in which the system gains an effective resolution 500 times finer than the conventional limit. This mechanism has been applied to design a super-resolution neurostimulator used by a human amputee in sensory restoration experiments. Second, we design a pseudo-online AI neural decoder that translates the amputee's movement intents from peripheral nerve data. Various decoding strategies, including one-step and two-step approaches and a feature-extraction-based data representation, are studied to optimize the decoder's performance. We show that feature extraction can substantially lower the decoder's complexity, making the design feasible for real-time applications.
We then demonstrate an AI neural decoder based on recurrent neural networks (RNNs), which outperforms all other classic machine learning techniques in decoding a large nerve dataset. Third, we study the AI neural decoder's real-time performance and long-term reliability with three human amputees. The decoder allows the amputees to control individual fingers and the wrist of a robotic hand in real time with 97-98% accuracy. In a gesture-matching performance test we designed, the amputee can achieve a reaction time of 0.8 seconds and an information throughput of 365 bits per minute. We also show that the decoder's predictive performance remains robust over a 16-month implant duration. Our study lays the groundwork for next-generation neuroprostheses enabled by a bidirectional, high-bandwidth, and intuitive neural interface. Our comprehensive investigation of statistical and empirical approaches can inform the design of new neuro-sensors and neural decoders. We envision that AI technology, specifically deep learning, will be at the heart of the next generation of dexterous and intuitive prosthetic hands.

Language: en
Keywords: Artificial Intelligence; Deep Learning; Motor Decoding; Neuroprosthesis; Peripheral Nerve Interface; Super Resolution
Title: Design The Next-Generation Neuroprostheses: From Statistical Modeling to Artificial Intelligence
Type: Thesis or Dissertation
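To make the RNN-based decoding idea from the abstract concrete, here is a minimal sketch, not the dissertation's actual model: a vanilla recurrent network that maps one window of multichannel nerve activity to gesture-class probabilities. Every dimension, weight, and input here is hypothetical (randomly initialized rather than trained), chosen only to illustrate the shape of the computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the dissertation): 16 nerve channels,
# 100 time steps per window, 32 hidden units, 6 gesture classes.
n_channels, n_steps, n_hidden, n_classes = 16, 100, 32, 6

# Randomly initialized weights stand in for trained parameters.
W_in = rng.normal(0, 0.1, (n_hidden, n_channels))
W_rec = rng.normal(0, 0.1, (n_hidden, n_hidden))
W_out = rng.normal(0, 0.1, (n_classes, n_hidden))
b_h = np.zeros(n_hidden)
b_o = np.zeros(n_classes)

def decode_window(x):
    """Run a vanilla RNN over one window of nerve data, shaped
    (n_steps, n_channels), and return gesture-class probabilities."""
    h = np.zeros(n_hidden)
    for t in range(x.shape[0]):
        # Recurrent update: mix the current input with the hidden state.
        h = np.tanh(W_in @ x[t] + W_rec @ h + b_h)
    logits = W_out @ h + b_o
    p = np.exp(logits - logits.max())   # numerically stable softmax
    return p / p.sum()

window = rng.normal(size=(n_steps, n_channels))  # simulated nerve signals
probs = decode_window(window)
print(probs.argmax(), probs.sum())
```

In a real-time setting, `decode_window` would be called on each incoming buffer and the arg-max class sent to the prosthetic hand; the dissertation's decoder is trained on recorded nerve data rather than using random weights.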