Kelly Cashion, Theus Aspiras, Carly Gross, David Fan, Yicong Gong, Nathaniel Maas, Ahmed Nasrallah
This project correlates Electroencephalography (EEG) data with specific robotic actions. The process is implemented as a three-phase system: EEG signal acquisition, data classification, and robotic action encoding. The project uses the Emotiv EPOC headset, whose 14 electrodes detect brain activity and wirelessly transmit raw data to a personal computer. Emotiv software then classifies this raw EEG signal and encodes it as a command to control a robotic arm. This Brain-Machine Interface (BMI) research has many potential applications; for example, it could help people with disabilities use robots to complete various tasks, or allow a user to control multiple devices such as Google Glass, cell phones, wheelchairs, and air conditioners with only their mind.
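The three-phase pipeline described above can be sketched as follows. This is a minimal illustration, not the project's actual implementation: the Emotiv SDK calls are replaced with stand-ins, and all function names (`acquire_sample`, `classify`, `encode_action`) and the command-to-action mapping are assumptions for illustration only.

```python
import random

# Hypothetical mapping from classified mental commands to arm actions
# (the real project defines its own command set via the Emotiv software).
COMMAND_TO_ACTION = {
    "push": "extend_arm",
    "pull": "retract_arm",
    "neutral": "hold_position",
}

def acquire_sample():
    """Phase 1 stand-in: return a fake 14-channel EEG reading
    (the EPOC headset streams 14 electrode channels)."""
    return [random.gauss(0.0, 1.0) for _ in range(14)]

def classify(sample):
    """Phase 2 stand-in: a real system would use the Emotiv
    software's trained classifier; here we simply threshold
    the mean channel value."""
    mean = sum(sample) / len(sample)
    if mean > 0.5:
        return "push"
    if mean < -0.5:
        return "pull"
    return "neutral"

def encode_action(command):
    """Phase 3: encode a classified command as a robotic-arm action."""
    return COMMAND_TO_ACTION.get(command, "hold_position")

if __name__ == "__main__":
    sample = acquire_sample()
    print(encode_action(classify(sample)))
```

In the actual system, `classify` would be replaced by the Emotiv software's trained detections and `encode_action` would send motion commands to the robotic arm's controller.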
Primary Advisor's Department
Electrical and Computer Engineering
Stander Symposium project
Arts and Humanities | Business | Education | Engineering | Life Sciences | Medicine and Health Sciences | Physical Sciences and Mathematics | Social and Behavioral Sciences
"Brain Machine Interface for Controlling a Robotic Arm" (2014). Stander Symposium Projects. 403.