This is the source code for the paper:

Hanwei Liu, Huiling Cai, Qingcheng Lin, Xiwen Zhang, Xuefeng Li, Hui Xiao, "FEDA: Fine-grained Emotion Difference Analysis for Facial Expression Recognition," Biomedical Signal Processing and Control, vol. 79, 2023, 104209. doi:10.1016/j.bspc.2022.104209
Facial expression recognition (FER) plays an important role in intelligent human-computer interaction. The complexity and ambiguity of target emotions, together with the subjectivity of observers, make the definition of emotion categories controversial, and low accuracy has become a bottleneck in FER analysis. To establish an emotion representation model that supports efficient FER, this study presents FEDA, a correlation-based fine-grained emotion difference analysis, and explores emotion categories with appropriate intra-class correlation and inter-class difference. First, a clustering algorithm is used to obtain variable fine-grained emotion representations; the correlations among them are then analyzed objectively through recognition performance and facial action units; finally, an emotion representation model that supports high-efficiency FER is obtained. On the public FERPlus dataset, the recognition accuracy reaches 91.5% for the first time, verifying the rationality of our emotion representation model. Our results can also support the effective establishment of emotion representation models for facial expression recognition, with potential applications in the diagnosis and treatment of mental illness as well as in human-computer interaction, security, and service robotics.
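As a rough illustration of the clustering step described above, here is a minimal k-means sketch on toy 2-D feature vectors. This is illustrative only: the paper does not specify this implementation, and the data, function names, and parameters below are assumptions, not the actual FEDA pipeline.

```python
# Hypothetical sketch of clustering expression features into
# fine-grained emotion groups (toy data; not the paper's code).
import numpy as np

def kmeans(features, k, iters=50, seed=0):
    """Plain k-means: returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    # initialize centroids from k distinct random samples
    centroids = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # assign each feature vector to its nearest centroid
        dists = np.linalg.norm(features[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        # recompute each centroid; keep the old one if its cluster is empty
        for j in range(k):
            if (labels == j).any():
                centroids[j] = features[labels == j].mean(axis=0)
    return centroids, labels

# toy features: two well-separated blobs standing in for two
# fine-grained emotion sub-categories
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
               rng.normal(5.0, 0.1, (20, 2))])
centroids, labels = kmeans(X, k=2)
```

In the paper's pipeline the cluster count would be varied to produce emotion representations of different granularity, which are then compared via recognition performance and facial action units.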
@article{LIU2023104209,
  title = {FEDA: Fine-grained emotion difference analysis for facial expression recognition},
  journal = {Biomedical Signal Processing and Control},
  volume = {79},
  pages = {104209},
  year = {2023},
  issn = {1746-8094},
  doi = {10.1016/j.bspc.2022.104209},
  url = {https://www.sciencedirect.com/science/article/pii/S1746809422006632},
  author = {Hanwei Liu and Huiling Cai and Qingcheng Lin and Xiwen Zhang and Xuefeng Li and Hui Xiao}
}