GIGN: Learning Graph-in-graph Representations of EEG Signals for Continuous Emotion Recognition

Annu Int Conf IEEE Eng Med Biol Soc. 2023 Jul;2023:1-5. doi: 10.1109/EMBC40787.2023.10340644.

Abstract

Effectively learning the spatial topology of EEG channels as well as the temporal contextual information underlying emotions is crucial for EEG emotion regression tasks. In this paper, we represent EEG signals as spatial graphs nested within a temporal graph (SGTG). A graph-in-graph neural network (GIGN) is proposed to learn spatial-temporal information from the proposed SGTG for continuous EEG emotion recognition. A spatial graph convolutional network (GCN) with a learnable adjacency matrix is used to capture the dynamic relations among EEG channels. To learn the temporal contextual information, we propose to use a GCN, also with a learnable adjacency matrix, to combine the short-time emotional states encoded by the spatial graph embeddings. Experiments on a public dataset, MAHNOB-HCI, show that the proposed GIGN achieves better regression results than recently published methods for the same task. The code of GIGN is available at: https://github.com/yi-ding-cs/GIGN.
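To make the graph-in-graph idea concrete, the sketch below shows a GCN layer whose adjacency matrix is a trainable parameter, applied first over EEG channels within each short segment (spatial graph) and then over the sequence of segment embeddings (temporal graph). This is a minimal illustration of the architecture described in the abstract, not the authors' implementation; all class names, feature dimensions, and the mean-pooling/readout choices are assumptions (the official code is in the GitHub repository above).

```python
import torch
import torch.nn as nn


class LearnableGraphConv(nn.Module):
    """Graph convolution with a trainable adjacency matrix (hypothetical layer,
    sketching the 'learnable adjacency' idea from the abstract)."""

    def __init__(self, num_nodes: int, in_dim: int, out_dim: int):
        super().__init__()
        self.adj = nn.Parameter(torch.eye(num_nodes))  # learnable node-to-node weights
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, in_dim)
        a = torch.softmax(torch.relu(self.adj), dim=-1)  # normalize edge weights per node
        return torch.relu(self.proj(a @ x))              # aggregate neighbors, then project


class GraphInGraphSketch(nn.Module):
    """Spatial graphs (EEG channels) nested inside a temporal graph (segments)."""

    def __init__(self, num_channels=32, num_segments=10, feat_dim=5, hid_dim=16):
        super().__init__()
        self.spatial_gcn = LearnableGraphConv(num_channels, feat_dim, hid_dim)
        self.temporal_gcn = LearnableGraphConv(num_segments, hid_dim, hid_dim)
        self.head = nn.Linear(hid_dim, 1)  # continuous emotion (e.g. valence) per segment

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_segments, num_channels, feat_dim)
        b, t, c, f = x.shape
        h = self.spatial_gcn(x.reshape(b * t, c, f))   # spatial graph within each segment
        h = h.mean(dim=1).reshape(b, t, -1)            # pool channels -> one embedding per segment
        h = self.temporal_gcn(h)                       # temporal graph over segment embeddings
        return self.head(h).squeeze(-1)                # (batch, num_segments) predictions


if __name__ == "__main__":
    eeg = torch.randn(4, 10, 32, 5)  # dummy batch: 4 trials, 10 segments, 32 channels, 5 features
    print(GraphInGraphSketch()(eeg).shape)  # torch.Size([4, 10])
```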

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Electroencephalography
  • Emotions*
  • Learning*
  • Neural Networks, Computer
  • Recognition, Psychology