Disappeared Face: A Physical Adversarial Attack Method on Black-Box Face Detection Models

Information and Communications Security (ICICS 2021), Part I (2021)

Abstract
Face detection is a classical problem in computer vision, widely used in face recognition and related applications such as face-scan payment and identity authentication. The emergence of adversarial attacks on face detection poses a substantial threat to the security of face recognition. Existing adversarial attacks on face detection require full knowledge of the target model's structure and parameters, so their transferability, i.e., the ability of an attack crafted on one model to succeed against other models, is low. Moreover, commercial face detection models deployed in real-world applications cannot be accessed for reasons of commercial confidentiality, so white-box adversarial attacks cannot be launched against them directly. To address these problems, we propose a black-box physical attack method on face detection. Through ensemble learning, we extract the common weaknesses of face detection models; an attack targeting these shared weaknesses transfers well across models, making it possible to evade black-box face detectors. Our method successfully evades both white-box and black-box face detection models on both PC and mobile platforms, including camera modules, mobile payment modules, selfie beautification modules, and official face detection models.
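
The paper's implementation is not reproduced on this page, but the ensemble idea described in the abstract can be illustrated. Below is a minimal, hypothetical sketch (not the authors' method): a PGD-style perturbation is optimized to minimize the average face-confidence score across an ensemble of surrogate detectors, so it targets weaknesses shared by all models rather than one model's quirks. The DummyDetector class, the ensemble_attack function, and the hyperparameters (steps, alpha, eps) are illustrative assumptions; the physical-world components of the attack (e.g., printable patches and robustness to camera transformations) are omitted here.

import torch

# Hypothetical stand-in for a surrogate face detector: maps an image batch
# to per-image face-confidence scores in [0, 1]. In practice these would be
# pretrained open-source detectors used as the ensemble.
class DummyDetector(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 1, kernel_size=3, padding=1)

    def forward(self, x):
        # Globally pooled score as a proxy for face confidence.
        return torch.sigmoid(self.conv(x).mean(dim=(1, 2, 3)))

def ensemble_attack(image, detectors, steps=40, alpha=2 / 255, eps=16 / 255):
    """PGD-style attack minimizing the *average* face confidence over an
    ensemble of surrogate detectors, i.e., their shared weakness."""
    delta = torch.zeros_like(image, requires_grad=True)
    for _ in range(steps):
        adv = (image + delta).clamp(0, 1)
        # Ensemble loss: mean confidence across all surrogate models.
        loss = torch.stack([det(adv) for det in detectors]).mean()
        loss.backward()
        with torch.no_grad():
            delta -= alpha * delta.grad.sign()  # descend: suppress detection
            delta.clamp_(-eps, eps)             # keep perturbation bounded
        delta.grad.zero_()
    return (image + delta).detach().clamp(0, 1)

detectors = [DummyDetector() for _ in range(3)]
image = torch.rand(1, 3, 64, 64)
adv_image = ensemble_attack(image, detectors)

Averaging the loss over several surrogates is a standard way to approximate a transferable attack: a perturbation that fools only one model's idiosyncrasies gets little gradient support from the others, so the optimization is pushed toward directions all detectors are sensitive to.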
Keywords
Adversarial attack, Face detection, Black-box attack, Real-world attack