Acta Scientific Computer Sciences

Research Article Volume 7 Issue 3

Convolution Neural Network Approaches for Facial Emotion Recognition

Shivamma D1, Nitheesh Kumar G2*, Deepika P2, Dr. Shreedhara K S2

1Assistant Professor, Dept. of CSE(DS), School of Engineering, DSU, India
2PG Student, Dept. of CSE, UBDT College of Engineering, VTU, Davanagere, India

*Corresponding Author: Nitheesh Kumar G, PG Student, Dept. of CSE, UBDT College of Engineering, VTU, Davanagere, India.

Received: May 21, 2024; Published: June 09, 2025

Abstract

This study describes a facial emotion recognition system that offers a face detection model based on mapping behaviors to physical facial characteristics. Reconstructed geometric structures serve as the matching template for the recognition system and are linked to the physical characteristics of the human face that correspond to different expressions, including happiness, sadness, fear, anger, surprise, and disgust. Facial expression recognition has become popular because of its broad range of applications, and as the science of learning has advanced, emotion detection has become increasingly important in business. Emotion recognition makes it possible to perceive a person's feelings, and several methods have been developed to detect emotional states in images.

Keywords: CNN (Convolutional Neural Network); LBP (Local Binary Patterns); Emotion Detection; Facial Expression; SVM (Support Vector Machine); KNN (K-Nearest Neighbor)
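To make the CNN-based approach concrete, the sketch below shows a minimal convolutional classifier of the kind discussed above. It assumes 48 x 48 grayscale face crops and seven emotion classes (the six expressions listed in the abstract plus neutral); the layer sizes, the Keras API choice, and the build_emotion_cnn helper are illustrative assumptions, not the exact architecture reported in this study.

    # Minimal CNN sketch for facial emotion recognition (illustrative only).
    # Assumes 48x48 grayscale face crops and seven emotion classes; the
    # layer sizes below are placeholders, not the authors' reported model.
    from tensorflow.keras import layers, models

    def build_emotion_cnn(input_shape=(48, 48, 1), num_classes=7):
        model = models.Sequential([
            layers.Input(shape=input_shape),
            layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
            layers.MaxPooling2D((2, 2)),
            layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
            layers.MaxPooling2D((2, 2)),
            layers.Conv2D(128, (3, 3), activation="relu", padding="same"),
            layers.MaxPooling2D((2, 2)),
            layers.Flatten(),
            layers.Dense(256, activation="relu"),
            layers.Dropout(0.5),                      # regularization
            layers.Dense(num_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    model = build_emotion_cnn()
    model.summary()

In practice, such a model would be trained on labeled face images (for example, a FER-2013-style dataset) after face detection and cropping; classical baselines using LBP features with SVM or KNN classifiers, as noted in the keywords, follow the same detect-extract-classify pipeline.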


Citation

Citation: Shreedhara KS., et al. "Convolution Neural Network Approaches for Facial Emotion Recognition". Acta Scientific Computer Sciences 7.3 (2025): 14-19.

Copyright

Copyright: © 2025 Shreedhara KS., et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.



