Outlier-Suppressed Triplet Loss With Adaptive Class-Aware Margins For Facial Expression Recognition

2019 IEEE International Conference on Image Processing (ICIP), 2019

Abstract
Triplet loss has been proposed to increase inter-class distances and decrease intra-class distances in various image recognition tasks. However, for the facial expression recognition (FER) problem, a fixed margin parameter does not fit the diverse scales of separation between different expressions. Meanwhile, the strategy of selecting the hardest triplets can introduce noisy guidance, since different persons may present the same expression in significantly different ways. In this work, we propose a new triplet loss for FER based on class-aware margins and outlier-suppressed triplets, where each pair of expressions, e.g. 'happy' and 'fear', is assigned an adaptive margin parameter, and abnormally hard triplets are discarded according to the feature distance distribution. Experimental results on the FER2013 and CK+ expression databases show that the proposed loss achieves much better accuracy than the original triplet loss and the same network trained without the proposed strategies, and competitive performance compared with state-of-the-art algorithms.
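
The following is a minimal PyTorch sketch of the idea described in the abstract: a triplet loss whose margin depends on the (anchor, negative) class pair, and which discards abnormally hard triplets whose anchor-positive distance lies far out in the distance distribution. The function name, the margin lookup table, and the 2-sigma outlier rule are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F


def class_aware_triplet_loss(embeddings, labels, margins, outlier_std=2.0):
    """
    embeddings : (N, D) feature vectors (e.g. L2-normalized)
    labels     : (N,) integer expression labels
    margins    : (C, C) adaptive margin for each pair of expression classes
    outlier_std: positives farther than mean + outlier_std * std from the
                 anchor are treated as outliers and skipped (assumed rule)
    """
    dist = torch.cdist(embeddings, embeddings)  # (N, N) pairwise distances
    losses = []
    for a in range(embeddings.size(0)):
        same = labels == labels[a]
        same[a] = False                      # exclude the anchor itself
        diff = labels != labels[a]
        if not same.any() or not diff.any():
            continue

        d_ap = dist[a][same]                 # anchor-positive distances
        # Outlier suppression: drop abnormally hard positives.
        if d_ap.numel() > 1:
            keep = d_ap <= d_ap.mean() + outlier_std * d_ap.std()
            if keep.any():
                d_ap = d_ap[keep]
        hardest_pos = d_ap.max()             # hardest remaining positive

        d_an = dist[a][diff]                 # anchor-negative distances
        neg_labels = labels[diff]
        # Class-aware margin: look up the margin for (anchor class, negative class).
        m = margins[labels[a], neg_labels]
        # Hinge loss with per-pair margins; keep the hardest negative per anchor.
        hinge = torch.clamp(hardest_pos - d_an + m, min=0)
        losses.append(hinge.max())

    if not losses:
        return embeddings.new_zeros(())
    return torch.stack(losses).mean()


# Toy usage with random features and 7 expression classes:
if __name__ == "__main__":
    torch.manual_seed(0)
    feats = F.normalize(torch.randn(32, 128), dim=1)
    labs = torch.randint(0, 7, (32,))
    margins = torch.full((7, 7), 0.2) + 0.1 * torch.rand(7, 7)  # toy adaptive margins
    print(class_aware_triplet_loss(feats, labs, margins))
```

In this sketch the per-class-pair margins are a fixed table; in the paper they are adapted to the data, so the table would be estimated or learned rather than hand-set.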
Key words
triplet loss, class-aware margin, outlier suppression, facial expression recognition