Electroencephalographic Signal Classification for Robotic Arm Control
Adam N Cassedy, Arlen J D'Arcy, Alyssa Katherine Morgan, Adam Patrick Van Camp
The University of Dayton (UD) Vision Lab is improving the technology used to control a robotic prosthetic with electroencephalographic (EEG) user input. Currently, the robotic prosthetic responds too slowly and too inaccurately to be of practical value to disabled persons; the activities the robotic arm can perform are limited by delays and errors in interpreting user input. The UD Vision Lab is therefore developing an alternative way of processing and classifying EEG signals to improve the response of the robot arm, encompassing data acquisition, preprocessing, feature extraction, and classification algorithms. Using the Emotiv Insight headset, data is sampled in real time and preprocessed with noise-reduction techniques. Features extracted from the signals include the average logarithmic power within selected frequency bands, among other salient features. These features are passed to a classification system, such as an Extreme Learning Machine, to distinguish the thoughts of the user. The pipeline is first prototyped in MATLAB on raw EEG data and then rewritten in compiled (C/C++) code to reduce latency during real-time streaming from the user's thoughts, through the classification system, to the robotic arm. The machine learning algorithm speeds up classification of the raw data and allows more user-defined thoughts to be recognized by the Brain-Machine Interface system, increasing the utility of the project both as a dynamic prosthetic device and as a brainwave-analysis system.
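The feature-extraction and classification steps described above can be sketched as follows. This is a minimal illustration, not the lab's implementation: the band edges, tanh activation, hidden-layer size, and NumPy-based spectral estimate are all assumptions, since the abstract names only "average logarithmic power" features and an Extreme Learning Machine.

```python
import numpy as np

def band_log_power(signal, fs, bands):
    """Average logarithmic power per frequency band (illustrative feature;
    band edges and the FFT-based power estimate are assumptions)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        # Small epsilon guards against log(0) in empty/silent bands
        feats.append(np.log(psd[mask].mean() + 1e-12))
    return np.array(feats)

class ELM:
    """Minimal Extreme Learning Machine: a random, fixed hidden layer
    followed by closed-form least-squares output weights."""
    def __init__(self, n_hidden=64, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        T = np.eye(n_classes)[y]              # one-hot class targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)      # random nonlinear projection
        self.beta = np.linalg.pinv(H) @ T     # least-squares output weights
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)
```

Because the hidden weights are never trained, fitting an ELM reduces to a single pseudoinverse, which is part of what makes this classifier family attractive for low-latency, real-time use.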
Capstone Project - Undergraduate
Vijayan K Asari, Theus H Aspiras, Garrett C Sargent
Primary Advisor's Department
Electrical and Computer Engineering
Stander Symposium poster
"Electroencephalographic Signal Classification for Robotic Arm Control" (2017). Stander Symposium Posters. 891.