Effective, Efficient and Robust Neural Architecture Search

2022 International Joint Conference on Neural Networks (IJCNN)

Abstract
Designing neural network architectures for embedded devices is practical but challenging because the models are expected to be not only accurate but also sufficiently lightweight and robust. Balancing these trade-offs manually is difficult because of the large search space. To solve this problem, we propose an Effective, Efficient, and Robust Neural Architecture Search (E2RNAS) method that automatically searches for a neural network architecture balancing performance, robustness, and resource consumption. Unlike previous studies, the objective of the proposed E2RNAS method is formulated as a multi-objective bi-level optimization problem, whose upper-level subproblem is a multi-objective optimization problem over performance, robustness, and resource consumption. To solve this objective, we integrate the multiple-gradient descent algorithm, a widely studied gradient-based multi-objective optimization algorithm, with bi-level optimization. Experiments on benchmark datasets show that the proposed E2RNAS method can find robust architectures with low resource consumption and comparable classification accuracy.
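
The abstract's key algorithmic ingredient is the multiple-gradient descent algorithm (MGDA), applied to the upper-level multi-objective subproblem. As a rough illustration only (not the authors' implementation; the function name and example gradients below are hypothetical), the sketch shows the standard two-objective MGDA step: it finds the minimum-norm convex combination of the two objective gradients, which serves as a common descent direction for both objectives.

```python
import numpy as np

def mgda_two_objectives(g1: np.ndarray, g2: np.ndarray) -> np.ndarray:
    """Min-norm convex combination alpha*g1 + (1-alpha)*g2 (two-objective MGDA).

    Illustrative sketch, not the paper's code: g1 and g2 stand for gradients of
    two upper-level objectives (e.g., an accuracy loss and a robustness loss)
    with respect to the architecture parameters.
    """
    diff = g1 - g2
    denom = float(np.dot(diff, diff))
    if denom < 1e-12:
        # Gradients are (nearly) identical; any convex weight gives the same direction.
        alpha = 0.5
    else:
        # Closed-form minimizer of ||alpha*g1 + (1-alpha)*g2||^2, clipped to [0, 1].
        alpha = float(np.clip(np.dot(g2 - g1, g2) / denom, 0.0, 1.0))
    return alpha * g1 + (1.0 - alpha) * g2

# Hypothetical usage with toy gradients of two upper-level losses.
g_acc = np.array([0.3, -0.1, 0.8])
g_rob = np.array([-0.2, 0.4, 0.5])
direction = mgda_two_objectives(g_acc, g_rob)  # common descent direction for the update
print(direction)
```

In the bi-level setting described in the abstract, such a combined direction would be used for the upper-level (architecture) update, while the lower-level subproblem optimizes the network weights.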
Keywords
neural architecture search, adversarial robustness, out-of-distribution