Predicting Auditory Spatial Attention from EEG using Single- and Multi-task Convolutional Neural Networks


Abstract

Recent behavioral and electroencephalography (EEG) studies have defined ways that auditory spatial attention can be allocated over large regions of space. As in most such experiments, behavioral and EEG measures were averaged over tens of minutes, because identifying abstract spatial feature codes from raw EEG data is extremely challenging. The goal of this study is to design a deep learning model that learns from raw EEG data and predicts auditory spatial information on a trial-by-trial basis. We designed a convolutional neural network (CNN) model to predict either the attended location or the locations of other stimuli relative to the attended location. A multi-task model was also used to predict the attended and relative locations simultaneously. By visualizing our models, we investigated the features learned for each individual classification task and the joint features of the multi-task model. Our models achieved average accuracies of 72.4% for relative location prediction and 90.0% for attended location prediction when trained individually. The multi-task model improved the accuracy of attended location prediction by 3%. Our results suggest a strong correlation between the attended location and the relative location.
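The multi-task setup described above can be illustrated with a minimal sketch: a shared feature extractor feeding two classification heads whose cross-entropy losses are summed into one training objective. This is a hypothetical NumPy toy, not the authors' architecture; the channel counts, layer sizes, class counts, and labels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w):
    """Valid-mode 1-D convolution: x is (channels, time), w is (filters, channels, kernel)."""
    F, C, K = w.shape
    T = x.shape[1]
    out = np.zeros((F, T - K + 1))
    for f in range(F):
        for t in range(T - K + 1):
            out[f, t] = np.sum(w[f] * x[:, t:t + K])
    return out

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Toy EEG trial: 8 channels x 64 time samples (real trials would be larger).
x = rng.standard_normal((8, 64))

# Shared convolutional feature extractor: 4 filters, ReLU, global average pool.
w_conv = rng.standard_normal((4, 8, 5)) * 0.1
feat = np.maximum(conv1d(x, w_conv), 0).mean(axis=1)   # -> (4,)

# Two task-specific linear heads: attended location (2 toy classes)
# and relative location (4 toy classes).
w_att = rng.standard_normal((2, 4)) * 0.1
w_rel = rng.standard_normal((4, 4)) * 0.1
p_att = softmax(w_att @ feat)
p_rel = softmax(w_rel @ feat)

# Joint multi-task objective: sum of the two cross-entropy losses.
y_att, y_rel = 1, 2    # invented labels for the toy trial
loss = -np.log(p_att[y_att]) - np.log(p_rel[y_rel])
```

Because both heads share the same convolutional features, gradients from either task shape the shared representation, which is one common rationale for why joint training can improve single-task accuracy.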
