Agnostic Sharpness-Aware Minimization
arXiv (2024)
Abstract
Sharpness-aware minimization (SAM) has been instrumental in improving deep
neural network training by minimizing both the training loss and the sharpness
of the loss landscape, steering the model toward flatter minima that are
associated with better generalization. Separately, Model-Agnostic
Meta-Learning (MAML) is a framework designed to improve the adaptability of
models: it optimizes a set of meta-models tailored for quick adaptation to
multiple tasks with minimal fine-tuning steps, and these models generalize
well from limited data. In this work, we explore the connection between SAM
and MAML, particularly in terms of enhancing model generalization. We
introduce Agnostic-SAM, a novel approach that combines the principles of both
SAM and MAML. Agnostic-SAM adapts the core idea of SAM by optimizing the model
toward wider local minima using training data while concurrently maintaining
low loss values on validation data. In doing so, it seeks flatter minima that
are not only robust to small weight perturbations but also less vulnerable to
distributional shifts in the data. Our experimental results demonstrate that
Agnostic-SAM significantly improves generalization over baselines across a
range of datasets and under challenging conditions such as noisy labels and
limited data.
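
Conceptually, the update the abstract describes can be sketched in PyTorch as
follows: take SAM's ascent step on a training batch to reach the adversarially
perturbed weights, then compute the descent gradient there from a combined
training-plus-validation loss. This is only a minimal sketch of the idea, not
the authors' implementation; the helper name `agnostic_sam_step` and the
validation-loss weight `beta` are assumptions for illustration.

```python
# Minimal sketch of one Agnostic-SAM-style step (assumed form, not the
# paper's exact algorithm): perturb weights toward higher training loss
# (the SAM step), then descend on a loss that also includes a validation
# batch so the update keeps validation loss low.

import torch


def agnostic_sam_step(model, loss_fn, optimizer,
                      train_batch, val_batch, rho=0.05, beta=1.0):
    x_tr, y_tr = train_batch
    x_val, y_val = val_batch

    # 1) SAM ascent: gradient of the training loss at the current weights.
    loss_tr = loss_fn(model(x_tr), y_tr)
    loss_tr.backward()

    # Global gradient norm, then perturb each parameter by rho * g / ||g||.
    grad_norm = torch.norm(torch.stack(
        [p.grad.norm() for p in model.parameters() if p.grad is not None]))
    eps = {}
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                continue
            e = rho * p.grad / (grad_norm + 1e-12)
            p.add_(e)          # move to the perturbed point w + eps
            eps[p] = e
    optimizer.zero_grad()

    # 2) Descent: combine the perturbed training loss with a validation
    #    loss (the MAML-flavored ingredient; `beta` is a hypothetical weight).
    loss = loss_fn(model(x_tr), y_tr) + beta * loss_fn(model(x_val), y_val)
    loss.backward()

    # Restore the original weights, then apply the update computed at the
    # perturbed point (the standard SAM mechanic).
    with torch.no_grad():
        for p, e in eps.items():
            p.sub_(e)
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()
```

Setting `beta = 0` recovers a plain SAM step, which makes the role of the
validation term explicit: it is the part that penalizes minima whose low
training loss does not transfer to held-out data.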