Utilizing EEG Signal Data and Motion to Aid in Prosthetic Hand Motion
DOI: https://doi.org/10.47611/jsrhs.v12i4.5883

Keywords: Artificial Intelligence, Machine Learning, 3D printing, Engineering

Abstract
Current prosthetic arm technologies are often difficult for amputees to use intuitively, require invasive surgical procedures, and can be extremely costly, with prices ranging from $20,000 to $80,000. To address these challenges, this project aims to design a smart, low-cost, mind-controlled transhumeral prosthesis by integrating the brain-interfacing capability of electroencephalography (EEG), the economy of 3D printing, and the gesture-detection capability of an accelerometer-gyroscope. Single-channel brain signals are transmitted over Bluetooth and interpreted by a novel EEG decoding algorithm, while head gestures detected by the inertial measurement unit actuate movement of the arm. Force-sensitive resistors regulate grip force in real time to optimize the grasp type, and an LCD screen integrated into the arm displays the type of touch being exerted on an object. The arm itself is an original design printed in PLA plastic filament, making it durable and lightweight. In testing, the novel EEG decoding system achieved an accuracy of 94.7% with a user-cognition-to-machine delay of 1.64 ± 0.37 seconds, and the inertial measurement unit system recognized user gestures with a delay of 0.136 seconds. With a bill of materials of approximately $375 USD and four degrees of motion, this upper-limb prosthesis is a promising alternative to existing units on the market. The algorithms developed in this project are also applicable to other brain-computer interface projects.
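The abstract describes the control pipeline only at a high level, so the following minimal Python sketch illustrates one way the described logic could be structured; it is not the author's published decoder. The threshold constants, the grasp_command function, and the simulated Bluetooth samples are all assumptions made purely for illustration.

```python
# Hypothetical sketch (not the paper's actual decoder): combine a single-channel
# EEG "attention" value with a force-sensitive-resistor (FSR) reading to choose
# a grasp command. Thresholds and sample data are illustrative assumptions only.

EEG_TRIGGER = 0.7   # assumed normalized attention level that initiates a grasp
FSR_LIGHT   = 200   # assumed raw FSR reading corresponding to a light touch
FSR_FIRM    = 600   # assumed raw FSR reading corresponding to a firm grip

def grasp_command(attention: float, fsr_raw: int) -> str:
    """Return a grasp state from one EEG attention sample and one FSR sample."""
    if attention < EEG_TRIGGER:
        return "open"        # user is not concentrating: keep the hand open
    if fsr_raw < FSR_LIGHT:
        return "close"       # keep closing until the object is contacted
    if fsr_raw < FSR_FIRM:
        return "hold-light"  # light contact detected: hold gently
    return "hold-firm"       # firm contact detected: stop tightening

if __name__ == "__main__":
    # Simulated stream of (attention, FSR) samples standing in for Bluetooth input.
    samples = [(0.4, 0), (0.8, 50), (0.85, 300), (0.9, 700), (0.3, 10)]
    for attention, fsr in samples:
        print(f"attention={attention:.2f} fsr={fsr:4d} -> {grasp_command(attention, fsr)}")
```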
Copyright (c) 2023 Harshitha Jasti
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Copyright holder(s) granted JSR a perpetual, non-exclusive license to distribute and display this article.