Convolution spatial-temporal attention network for EEG emotion recognition

Physiol Meas. 2024 Nov 22. doi: 10.1088/1361-6579/ad9661. Online ahead of print.

Abstract

In recent years, emotion recognition from electroencephalogram (EEG) signals has attracted significant interest owing to its non-invasive nature and high temporal resolution. We introduced a method that bypasses traditional manual feature engineering, instead emphasizing data preprocessing and exploiting the topological relationships between channels to transform EEG signals from two-dimensional time sequences into three-dimensional spatio-temporal representations. By making full use of deep learning, the approach provides a data-driven and robust way to identify emotional states. Combining a convolutional neural network (CNN) with attention mechanisms enabled automatic feature extraction and dynamic learning of inter-channel dependencies. The method achieved average accuracies of 98.62% for arousal and 98.47% for valence, surpassing the previous state-of-the-art results of 95.76% and 95.15% and confirming its effectiveness. Furthermore, we conducted a series of additional experiments that broaden the scope of emotion recognition research and explore further possibilities in the field.
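The sketch below illustrates the general idea described in the abstract, not the authors' released implementation: multi-channel EEG windows are projected onto a 2D electrode grid so that each time step becomes an image-like frame (the 2D-to-3D spatio-temporal transform), and a small CNN with a channel-attention block processes the resulting tensor. The 9x9 grid, the partial electrode placement, the squeeze-and-excitation form of attention, and all layer sizes are assumptions for illustration only.

```python
# Hedged sketch of the 2D -> 3D spatio-temporal transform and a CNN + attention model.
# Grid size, electrode placement, attention form, and layer sizes are assumptions.
import torch
import torch.nn as nn

GRID_H, GRID_W = 9, 9
# Assumed mapping from channel index to (row, col) on the scalp grid;
# only a few electrodes are shown, the rest of the grid stays zero.
CHANNEL_POS = {0: (0, 3), 1: (1, 3), 2: (2, 2), 3: (2, 0)}

def to_spatiotemporal(eeg: torch.Tensor) -> torch.Tensor:
    """eeg: (batch, channels, time) -> frames: (batch, time, GRID_H, GRID_W)."""
    b, c, t = eeg.shape
    frames = eeg.new_zeros(b, t, GRID_H, GRID_W)
    for ch, (row, col) in CHANNEL_POS.items():
        frames[:, :, row, col] = eeg[:, ch, :]
    return frames

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style gating over feature maps (an assumed form of
    the attention mechanism; the abstract only states that attention is used)."""
    def __init__(self, dim: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(dim, dim // reduction), nn.ReLU(),
            nn.Linear(dim // reduction, dim), nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.fc(x.mean(dim=(2, 3)))      # global average pool -> (batch, dim)
        return x * w[:, :, None, None]       # reweight feature maps

class SpatioTemporalCNN(nn.Module):
    def __init__(self, time_steps: int, n_classes: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(time_steps, 32, 3, padding=1), nn.ReLU(),
            ChannelAttention(32),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, n_classes),
        )

    def forward(self, eeg: torch.Tensor) -> torch.Tensor:
        return self.body(to_spatiotemporal(eeg))

# Example usage: a 32-channel window of 128 samples, binary high/low valence.
model = SpatioTemporalCNN(time_steps=128)
logits = model(torch.randn(4, 32, 128))      # -> shape (4, 2)
```

Treating time steps as input channels of the 2D convolution is one of several plausible ways to realize the spatio-temporal representation; the paper itself may stack frames differently or use 3D convolutions.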

Keywords: Attention Mechanisms; CNN; Data Preprocessing; EEG; Emotion Recognition.