Classification of Upper Limb Movements Using Convolutional Neural Network with 3D Inception Block


Abstract

A brain-machine interface (BMI) based on electroencephalography (EEG) can help patients overcome movement deficits and support real-world applications for healthy users. Ideally, the BMI system detects the user's movement intentions and transforms them into control signals for a robotic arm. In this study, we made progress toward user intention decoding and successfully classified six different reaching movements of the right arm during movement execution (ME). Notably, we designed an experimental environment using robotic arm movement and proposed a convolutional neural network (CNN) architecture with a 3D inception block for robust classification of executed movements of the same limb. As a result, we confirmed that the classification accuracy over the six directions reached 0.45 for the executed session. The results show that the proposed architecture achieves an approximately 6~13% performance increase over conventional classification models. Hence, we demonstrate that the 3D inception CNN architecture can contribute to the continuous decoding of ME.
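To make the idea of a 3D inception block concrete, the following is a minimal PyTorch sketch of such a block applied to EEG arranged as a time-by-electrode-grid volume. The kernel sizes, channel counts, activation functions, and input layout (class `InceptionBlock3D`, `branch_channels`, the 5x5 electrode grid) are illustrative assumptions, not the exact architecture reported in the paper.

```python
# Illustrative sketch only: parallel 3D convolution branches with different
# receptive fields, concatenated along the channel axis (inception-style).
import torch
import torch.nn as nn

class InceptionBlock3D(nn.Module):
    def __init__(self, in_channels: int, branch_channels: int = 8):
        super().__init__()
        def branch(kernel, pad):
            return nn.Sequential(
                nn.Conv3d(in_channels, branch_channels, kernel_size=kernel, padding=pad),
                nn.BatchNorm3d(branch_channels),
                nn.ELU(),
            )
        self.branch1 = branch(1, 0)   # pointwise branch
        self.branch3 = branch(3, 1)   # small receptive field
        self.branch5 = branch(5, 2)   # larger receptive field
        self.pool = nn.Sequential(    # pooling branch with 1x1x1 projection
            nn.MaxPool3d(kernel_size=3, stride=1, padding=1),
            nn.Conv3d(in_channels, branch_channels, kernel_size=1),
            nn.BatchNorm3d(branch_channels),
            nn.ELU(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time, grid_rows, grid_cols),
        # e.g. EEG electrodes mapped onto a 2D grid over time.
        return torch.cat(
            [self.branch1(x), self.branch3(x), self.branch5(x), self.pool(x)],
            dim=1,
        )

if __name__ == "__main__":
    block = InceptionBlock3D(in_channels=1)
    dummy = torch.randn(2, 1, 250, 5, 5)          # 2 trials, 250 time samples, 5x5 grid
    feats = block(dummy)                          # -> (2, 32, 250, 5, 5)
    logits = nn.Linear(32, 6)(feats.mean(dim=(2, 3, 4)))  # global average pool + 6-class head
    print(logits.shape)                           # torch.Size([2, 6])
```

The design choice behind the inception block is to let branches with different kernel sizes capture temporal and spatial EEG features at several scales in parallel, rather than committing to a single receptive field.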
