Emotion Recognition Based on Multi-gait Feature Fusion
PENG Tao, TANG Jing, HE Kai, HU Xinrong, LIU Junping, HE Ruhan
Journal of Guangxi Normal University (Natural Science Edition). 2022, 40(3): 104-111.
DOI: 10.16088/j.issn.1001-6600.2021071406
Emotion recognition based on gait features has a wide range of applications in affective computing, psychotherapy, robotics, surveillance, and audience understanding. Existing methods show that combining context information such as gesture and posture can significantly improve emotion recognition performance, and that spatio-temporal information significantly improves recognition accuracy. However, the emotional information carried in gait cannot be fully expressed by skeletal spatial information alone. To make better use of gait features, this paper proposes an adaptive fusion method that combines the spatio-temporal information of the skeleton with the skeleton's joint rotation angles, improving the emotion recognition accuracy of existing models. The model uses an autoencoder to learn the joint rotation information of human walking, uses a spatio-temporal convolutional neural network to extract the spatio-temporal information of the skeleton joints, and feeds both kinds of features into an adaptive fusion network to obtain the final representation for classification. The model is evaluated on the Emotion-Gait dataset. Experimental results show that the AP values for sadness, anger, and neutral emotion increase by 5, 8, and 5 percentage points respectively over the latest HAP method, and the overall mean AP (mAP) increases by 5 percentage points.
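The adaptive fusion step can be pictured as a small gating module that learns, per channel, how much to trust the spatio-temporal skeleton features versus the rotation features from the autoencoder. The PyTorch sketch below is illustrative only: the abstract does not give layer sizes or the exact fusion form, so the feature dimension, the gate design, and the four-class output are assumptions.

```python
import torch
import torch.nn as nn


class AdaptiveFusion(nn.Module):
    """Fuses two gait feature streams with learned per-channel gates.

    Sketch under assumptions: feature_dim, the gate architecture, and
    num_classes are chosen for demonstration, not taken from the paper.
    """

    def __init__(self, feature_dim: int = 256, num_classes: int = 4):
        super().__init__()
        # Gate network: inspects both streams and outputs a weight in (0, 1)
        # per channel, deciding how much of each stream to keep.
        self.gate = nn.Sequential(
            nn.Linear(2 * feature_dim, feature_dim),
            nn.ReLU(),
            nn.Linear(feature_dim, feature_dim),
            nn.Sigmoid(),
        )
        self.classifier = nn.Linear(feature_dim, num_classes)

    def forward(self, st_feat: torch.Tensor, rot_feat: torch.Tensor) -> torch.Tensor:
        # st_feat:  pooled spatio-temporal skeleton features, shape (B, feature_dim)
        # rot_feat: joint-rotation features from the autoencoder, same shape
        alpha = self.gate(torch.cat([st_feat, rot_feat], dim=-1))
        fused = alpha * st_feat + (1.0 - alpha) * rot_feat
        return self.classifier(fused)


if __name__ == "__main__":
    model = AdaptiveFusion()
    st = torch.randn(8, 256)    # e.g. one pooled feature vector per gait clip
    rot = torch.randn(8, 256)   # e.g. autoencoder bottleneck per gait clip
    logits = model(st, rot)     # (8, 4) emotion scores
    print(logits.shape)
```

The gating formulation is one common way to realize "adaptive" fusion; the published model may instead concatenate the streams or use attention, which the abstract does not specify.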