Acta Scientific Medical Sciences (ASMS) (ISSN: 2582-0931)

Research Article Volume 7 Issue 6

Emotion Recognition from EEG Signals Based on CapsNet Neural Network

Maryam Omri* and Wided Lejouad Chaari

LARIA Laboratory, National School of Computer Sciences (ENSI), University of Manouba, Tunisia

*Corresponding Author: Maryam Omri, LARIA Laboratory, National School of Computer Sciences (ENSI), University of Manouba, Tunisia.

Received: September 28, 2022; Published: May 26, 2023

Abstract

Brain-Computer Interfaces (BCI) are computer systems that capture and analyze brain signals. Among the several techniques used for capturing brain signals, the electroencephalogram (EEG) is regarded as one of the most promising. Many models have been investigated for recognizing emotions from EEG signals. However, these models ignore spatial information, such as the location of each channel, the asymmetry between electrode pairs, and other salient cues related to emotional states. In this paper, a deep learning approach based on the Fast Fourier Transform (FFT), Power Spectral Density (PSD), and a Capsule Network (CapsNet) is proposed. To better represent the EEG signals, we first apply the FFT to extract the EEG frequency bands Delta, Theta, Alpha, Beta, and Gamma. Then, the Power Spectral Density (PSD) is computed to describe the activation level of an EEG signal and improve the efficiency of emotion classification. Finally, the extracted features are fed into a CapsNet for the classification of emotional states. Experiments on the DEAP dataset show that the proposed method achieves 95% accuracy, improving the performance of capsule neural networks on this task.

Keywords: EEG Signal; Power Spectral Density; Deep Learning; CapsNet; Emotion Recognition
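The feature-extraction stage described in the abstract (FFT decomposition into the standard frequency bands, followed by a PSD-based activation measure) can be illustrated with a minimal sketch. This is not the authors' implementation: the 128 Hz sampling rate matches DEAP's preprocessed EEG but is otherwise an assumption, the band edges and function names are our own, and the CapsNet classification stage is omitted.

```python
import numpy as np

# Assumed sampling rate, matching DEAP's preprocessed EEG (128 Hz).
FS = 128

# Standard EEG frequency bands in Hz; the exact edges are assumptions.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(signal, fs=FS):
    """Return per-band power for one EEG channel.

    The PSD is estimated with a simple FFT periodogram, then
    integrated over each frequency band to give one feature per band.
    """
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Periodogram estimate of the power spectral density.
    psd = (np.abs(np.fft.rfft(signal)) ** 2) / (fs * n)
    df = freqs[1] - freqs[0]
    feats = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        # Approximate the band power as PSD summed over the band.
        feats[name] = float(psd[mask].sum() * df)
    return feats

# Example: a synthetic 10 Hz oscillation (a 4-second epoch) should
# concentrate its power in the alpha band (8-13 Hz).
t = np.arange(0, 4, 1.0 / FS)
x = np.sin(2 * np.pi * 10 * t)
feats = band_power_features(x)
assert feats["alpha"] == max(feats.values())
```

In a full pipeline, such band-power features would be computed per channel and per epoch, stacked into a feature tensor, and passed to the CapsNet classifier.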

References

  1. Nancy L Stein and Linda J Levine. “Thinking about feelings: The development and organization of emotional knowledge”. In Aptitude, Learning, and Instruction (2021): 165-198.
  2. Sharmeen M., et al. “Multimodal emotion recognition using deep learning”. Journal of Applied Science and Technology Trends 2 (2021): 52-58.
  3. Patrick D Ross., et al. “Developmental changes in emotion recognition from full-light and point-light displays of body movement”. PLoS One 7.9 (2012): e44815.
  4. Monita Chatterjee., et al. “Voice emotion recognition by cochlear-implanted children and their normally-hearing peers”. Hearing Research 322 (2015): 151-162.
  5. Ralph Adolphs., et al. “Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala”. Nature 372.6507 (1994): 669-672.
  6. Zhong Yin., et al. “Recognition of emotions using multimodal physiological signals and an ensemble deep learning model”. Computer Methods and Programs in Biomedicine 140 (2017): 93-110.
  7. Mojtaba Khomami Abadi., et al. “Decoding affect in videos employing the MEG brain signal”. In 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG) (2013): 1-6.
  8. Foteini Agrafioti., et al. “ECG pattern analysis for emotion detection”. IEEE Transactions on Affective Computing 3.1 (2011): 102-115.
  9. Sander Koelstra., et al. “DEAP: A database for emotion analysis using physiological signals”. IEEE Transactions on Affective Computing 3.1 (2011): 18-31.
  10. Amir Jalilifard., et al. “Emotion classification using single-channel scalp-EEG recording”. In 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) (2016): 845-849.
  11. Evi Septiana Pane., et al. “Identifying rules for electroencephalograph (EEG) emotion recognition and classification”. In 2017 5th International Conference on Instrumentation, Communications, Information Technology, and Biomedical Engineering (ICICI-BME) (2017): 167-172.
  12. Katerina Giannakaki., et al. “Emotional state recognition using advanced machine learning techniques on EEG data”. In 2017 IEEE 30th International Symposium on Computer-Based Medical Systems (CBMS) (2017): 337-342.
  13. Yilong Yang., et al. “Emotion recognition from multi-channel EEG through parallel convolutional recurrent neural network”. In 2018 International Joint Conference on Neural Networks (IJCNN) (2018): 1-7.
  14. Samarth Tripathi., et al. “Using deep and convolutional neural networks for accurate emotion classification on DEAP dataset”. In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (2017): 4746-4752.
  15. Zhiyuan Wen., et al. “A novel convolutional neural networks for emotion recognition based on EEG signal”. In 2017 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC) (2017): 672-677.
  16. Gopal Chandra Jana., et al. “Capsule neural networks on spatio-temporal EEG frames for cross-subject emotion recognition”. Biomedical Signal Processing and Control 72 (2022): 103361.
  17. Nattapong Thammasan., et al. “Continuous music-emotion recognition based on electroencephalogram”. IEICE Transactions on Information and Systems E99-D.4 (2016): 1234-1241.
  18. Sara Sabour., et al. “Dynamic routing between capsules”. In Advances in Neural Information Processing Systems (2017): 3856-3866.
  19. Jon D Morris. “Observations: SAM: the Self-Assessment Manikin; an efficient cross-cultural measurement of emotional response”. Journal of Advertising Research 35.6 (1995): 63-68.
  20. Gyanendra K Verma and Uma Shanker Tiwary. “Affect representation and recognition in 3D continuous valence-arousal-dominance space”. Multimedia Tools and Applications 76.2 (2017): 2159-2183.
  21. Robert Jenke., et al. “Feature extraction and selection for emotion recognition from EEG”. IEEE Transactions on Affective Computing 5.3 (2014): 327-339.
  22. Zhong Yin., et al. “Cross-subject EEG feature selection for emotion recognition using transfer recursive feature elimination”. Frontiers in Neurorobotics 11 (2017): 19.
  23. Wei-Long Zheng., et al. “EEG-based emotion classification using deep belief networks”. In 2014 IEEE International Conference on Multimedia and Expo (ICME) (2014): 1-6.
  24. Ghulam Muhammad., et al. “Automatic seizure detection in a mobile multimedia framework”. IEEE Access 6 (2018): 45372-45383.
  25. Jinliang Guo., et al. “EEG emotion recognition based on Granger causality and CapsNet neural network”. In 2018 5th IEEE International Conference on Cloud Computing and Intelligence Systems (CCIS) (2018): 47-52.

Citation

Citation: Maryam Omri and Wided Lejouad Chaari. “Emotion Recognition from EEG Signals Based on CapsNet Neural Network”. Acta Scientific Medical Sciences 7.6 (2023): 106-113.

Copyright

Copyright: © 2023 Maryam Omri and Wided Lejouad Chaari. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
