Facial Emotion Recognition Using Convolutional Brain Emotional Learning (CBEL) Model

Document Type: Original Article

Authors

1 Assistant Professor, Department of Computer, Fouman & Shaft Branch, Islamic Azad University, Fouman, Iran

2 Department of Computer Engineering, Science and Research Branch, Islamic Azad University, Tehran, Iran

Abstract

Facial expression is one of the most important channels of human communication and response to the environment. Recognition of facial emotional expressions is used in many research fields, such as psychological studies, robotics, identity recognition, and disease diagnosis. Given the importance of this task, this paper presents a new and efficient method for learning and recognizing facial emotional expressions that combines a model of the limbic system of the human brain with a convolutional neural network. In the proposed model, the facial expression images are first normalized; after the dimensionality of the implicit features is reduced, the informative features are classified using the convolutional brain emotional learning (CBEL) model, and the facial emotional expressions are recognized. The performance of the proposed model is compared with BEL, CNN, SVM, MLP, and KNN models; the results show that the CBEL model achieves the highest recognition accuracy.
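As a rough illustration of the baseline comparison stage described above (not the CBEL model itself, whose brain emotional learning component is not part of standard libraries), the conventional pipeline of normalization, dimensionality reduction, and classification might be sketched with scikit-learn. The image shapes, class count, and synthetic stand-in data below are assumptions for illustration only:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 200 flattened 48x48 "face images", 7 emotion classes.
# A real experiment would use a facial expression dataset instead.
rng = np.random.default_rng(0)
X = rng.random((200, 48 * 48))
y = rng.integers(0, 7, 200)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

baselines = {
    "SVM": SVC(),
    "MLP": MLPClassifier(max_iter=500),
    "KNN": KNeighborsClassifier(),
}
for name, clf in baselines.items():
    # Normalize features, reduce dimensionality, then classify.
    pipe = make_pipeline(StandardScaler(), PCA(n_components=20), clf)
    pipe.fit(X_train, y_train)
    acc = pipe.score(X_test, y_test)
    print(f"{name}: accuracy = {acc:.2f}")
```

On random data these accuracies hover near chance; the point is only the structure of the normalize/reduce/classify pipeline against which CBEL is compared.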

