Lincos-Softmax: Learning Angle-Discriminative Face Representations With Linearity-Enhanced Cosine Logits

IEEE Access (2020)

Abstract
In recent years, angle-based softmax losses have significantly improved the performance of face recognition, yet these loss functions are all built on the cosine logit. A potential weakness is that the nonlinearity of the cosine function may undesirably saturate the angular optimization between features and their corresponding weight vectors, preventing the network from fully learning to maximize the angular discriminability of features; as a result, the generalization of the learned features may be compromised. To tackle this issue, we propose a Linear-Cosine Softmax Loss (LinCos-Softmax) to learn angle-discriminative facial features more effectively. Its main characteristic is the use of an approximately linear logit: compared with the conventional cosine logit, the logit derived through Taylor expansion has a stronger linear relationship with the angle, which enhances angular discrimination. We also propose an automatic scale-parameter selection scheme that conveniently provides an appropriate scale for different logits without an exhaustive parameter search. In addition, we propose a margin-enhanced Linear-Cosine Softmax Loss (m-LinCos-Softmax) to further enlarge inter-class distances and reduce intra-class variations. Experimental results on several face recognition benchmarks (LFW, AgeDB-30, CFP-FP, MegaFace Challenge 1) demonstrate the effectiveness of the proposed method and its superiority over existing angular softmax loss variants.
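The contrast between a cosine logit and an angle-linear logit can be sketched in a few lines. This is an illustration of the general idea only, not the paper's exact formulation: here the linear logit is taken as the first-order Taylor expansion of cos(θ) around θ = π/2, which yields (π/2 − θ), exactly linear in the angle; the scale parameter `s` is a free argument rather than the paper's automatically selected value.

```python
import math

def cosine_logit(x, w):
    # Conventional cosine logit: cos(theta) between feature x and class weight w.
    dot = sum(a * b for a, b in zip(x, w))
    nx = math.sqrt(sum(a * a for a in x))
    nw = math.sqrt(sum(b * b for b in w))
    return dot / (nx * nw)

def linear_logit(x, w):
    # Illustrative angle-linear logit: the first-order Taylor expansion of
    # cos(theta) around theta = pi/2 is (pi/2 - theta), exactly linear in the
    # angle. A sketch of the idea, not the paper's exact formula.
    theta = math.acos(max(-1.0, min(1.0, cosine_logit(x, w))))
    return math.pi / 2 - theta

def scaled_softmax(logits, s):
    # The scale s controls how sharply the softmax separates classes; the
    # paper selects it automatically, here it is simply a free parameter.
    m = max(logits)
    z = [math.exp(s * (v - m)) for v in logits]
    total = sum(z)
    return [v / total for v in z]
```

Unlike cos(θ), whose gradient with respect to θ vanishes near 0 and π, the linearized logit has a constant slope over the whole angular range, which is the saturation issue the abstract describes.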
Keywords
Face, Training, Face recognition, Feature extraction, Taylor series, Measurement, Optimization, loss function, feature representations, cosine logits, softmax