A system for classification of human facial and body emotions based on deep learning neural networks
Department of Informatics, University of Chemical Technology and Metallurgy, Sofia, Bulgaria
Abstract
This paper presents the development of a system intended to classify human facial and body emotions. It is based on two deep learning neural networks (DNNs): the first is used for facial emotion recognition (FER) and the second for body gesture emotion recognition (BER). Combining the results obtained from the two modalities (facial expression data and body gesture data) yields more accurate results than those obtained using a single modality. After a brief analysis of the available pre-trained DNNs and datasets for facial and body emotion recognition, and building on the authors' previous work, two DNN models were selected. They are used in the development and verification of the present system.
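To illustrate how the outputs of the two modalities could be combined, the following minimal Python sketch performs a weighted late fusion of the per-class probabilities produced by the FER and BER networks. The label set, function name, and fusion weight are illustrative assumptions and not the authors' implementation.

```python
# Minimal sketch of late fusion of the two modalities described above:
# a facial emotion recognition (FER) model and a body gesture emotion
# recognition (BER) model. Labels, names, and weights are assumptions.
import numpy as np

# Assumed emotion label set; the paper does not fix the classes here.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def fuse_predictions(fer_probs: np.ndarray,
                     ber_probs: np.ndarray,
                     w_face: float = 0.6) -> str:
    """Combine per-class softmax outputs of the FER and BER networks.

    fer_probs, ber_probs: arrays of shape (len(EMOTIONS),).
    w_face: weight given to the facial modality; the body modality
            receives (1 - w_face).
    """
    combined = w_face * fer_probs + (1.0 - w_face) * ber_probs
    return EMOTIONS[int(np.argmax(combined))]

# Example usage with dummy softmax outputs for a single frame.
fer = np.array([0.05, 0.02, 0.03, 0.70, 0.05, 0.10, 0.05])
ber = np.array([0.10, 0.05, 0.05, 0.40, 0.10, 0.20, 0.10])
print(fuse_predictions(fer, ber))  # -> "happy"
```

A simple weighted average is only one possible fusion rule; the same structure accommodates learned fusion layers or decision-level voting.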
Keywords