Robot navigation with brain activity of facial expressions and eye movements

This application identifies facial expressions and eye movements from EEG signals. Facial expressions such as smile, sad, teeth clench, smirk, and wink are classified, and the resulting control signals are transmitted to the robot for navigation. The robot operates under shared control, which keeps navigation safe and robust. Analysis of robot navigation trials with patients showed promising results.
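As a concrete illustration, here is a minimal Python sketch of how classified expression labels could be mapped to velocity commands and blended with a safety layer under shared control. The label names, command values, and blending weight are illustrative assumptions, not the actual implementation.

```python
# Hypothetical sketch: map classified facial expressions to robot velocity
# commands and blend them with an autonomy layer (shared control).
# The label set and command mapping below are illustrative assumptions.

# (linear m/s, angular rad/s) per classified expression -- assumed mapping
EXPRESSION_COMMANDS = {
    "smile":        (0.3, 0.0),   # move forward
    "sad":          (-0.2, 0.0),  # move backward
    "teeth_clench": (0.0, 0.0),   # stop
    "smirk":        (0.0, -0.5),  # turn right
    "wink":         (0.0, 0.5),   # turn left
}

def shared_control(user_cmd, safe_cmd, alpha=0.7):
    """Blend the user's intent with an obstacle-avoidance command.

    alpha weights the user's command; the safety layer contributes the rest.
    """
    v = alpha * user_cmd[0] + (1 - alpha) * safe_cmd[0]
    w = alpha * user_cmd[1] + (1 - alpha) * safe_cmd[1]
    return v, w

# Example: classifier reports "smile"; the safety layer suggests a slight
# right turn to avoid an obstacle.
user_cmd = EXPRESSION_COMMANDS["smile"]
safe_cmd = (0.1, -0.3)
print(shared_control(user_cmd, safe_cmd))  # blended (v, w) sent to the robot
```

Blending the user's intent with an autonomous safety command is one common way to realize shared control: the user steers, while the safety layer damps motions toward obstacles.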

1.    Brain-controlled prosthetic arm

This application employs brain signals related to various cognitive states of the user. Cognitive states such as focus, stress, engagement, excitement, and interest levels are classified, and the prosthetic hand is controlled based on the level of the chosen cognitive state.
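A minimal sketch of one way this could work, assuming the classifier outputs a per-window score in [0, 1] for the chosen cognitive state: smooth the noisy score and threshold it to open or close the hand. The thresholds, smoothing factor, and command names are hypothetical.

```python
# Hypothetical sketch: drive a prosthetic hand from a single cognitive-state
# score (e.g. focus level in [0, 1]) produced by an EEG classifier.
# Thresholds, smoothing factor, and command names are assumptions.

class ProstheticHandController:
    def __init__(self, close_threshold=0.7, open_threshold=0.3, smoothing=0.8):
        self.close_threshold = close_threshold
        self.open_threshold = open_threshold
        self.smoothing = smoothing
        self.level = 0.0  # exponentially smoothed state level

    def update(self, raw_level: float) -> str:
        # Smooth the noisy per-window score before acting on it
        self.level = self.smoothing * self.level + (1 - self.smoothing) * raw_level
        if self.level > self.close_threshold:
            return "close_grip"
        if self.level < self.open_threshold:
            return "open_grip"
        return "hold"

controller = ProstheticHandController()
for focus in [0.2, 0.5, 0.8, 0.9, 0.9, 0.4, 0.1]:  # simulated focus scores
    print(controller.update(focus))
```

The hysteresis between the two thresholds prevents the hand from chattering open and closed when the score hovers near a single cutoff.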

2.    Brain-controlled quadcopter

This application uses brain signals related to imagined movements (motor imagery), which are classified and used to control the quadcopter.
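A minimal sketch of such a pipeline, using log band-power features and linear discriminant analysis on synthetic stand-in data. The class-to-command mapping is an assumption; a real system would typically add bandpass filtering and spatial filters such as CSP before classification.

```python
# Hypothetical sketch: classify motor-imagery EEG epochs (imagined left vs.
# right hand) and map predictions to quadcopter roll commands.
# The data here is synthetic; the command mapping is assumed.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_epochs, n_channels, n_samples = 100, 8, 256

# Synthetic stand-in for preprocessed EEG epochs and their labels
X_raw = rng.standard_normal((n_epochs, n_channels, n_samples))
y = rng.integers(0, 2, n_epochs)  # 0 = imagined left, 1 = imagined right

# Log band-power per channel: a simple, widely used motor-imagery feature
features = np.log(np.var(X_raw, axis=2))

clf = LinearDiscriminantAnalysis().fit(features[:80], y[:80])

COMMANDS = {0: "roll_left", 1: "roll_right"}  # assumed command mapping
for epoch in features[80:85]:
    print(COMMANDS[clf.predict(epoch[None, :])[0]])
```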

3.    Machine Learning Toolkit for BCI

We have developed a Machine Learning (ML) toolkit specifically designed for processing and classifying raw EEG data. The toolkit can also be used to evaluate various machine learning algorithms on any EEG dataset.
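The toolkit's actual API is not shown here; the following is a hypothetical sketch of the kind of workflow it supports: bandpass-filter raw epochs, extract simple features, and cross-validate several classifiers on the same dataset. The filter band, feature choice, and classifier set are assumptions for illustration.

```python
# Hypothetical sketch of an EEG classification benchmark: filter, extract
# features, and compare classifiers with cross-validation. Not the
# toolkit's real API.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

def bandpass(epochs, low=8.0, high=30.0, fs=256.0, order=4):
    """Zero-phase bandpass filter applied along the time axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, epochs, axis=-1)

def extract_features(epochs):
    """Log-variance per channel: a simple, common EEG feature."""
    return np.log(np.var(epochs, axis=-1))

# Synthetic stand-in for a raw EEG dataset: (epochs, channels, samples)
rng = np.random.default_rng(42)
X_raw = rng.standard_normal((120, 16, 512))
y = rng.integers(0, 2, 120)

X = extract_features(bandpass(X_raw))
for name, clf in [("SVM", SVC()),
                  ("Random Forest", RandomForestClassifier()),
                  ("kNN", KNeighborsClassifier())]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: {scores.mean():.2f} +/- {scores.std():.2f}")
```

Running the same feature pipeline against several classifiers is the core use case described above: swapping in a different algorithm only changes one entry in the list.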
