Simulated Overparameterization
CoRR (2024)
Abstract
In this work, we introduce a novel paradigm called Simulated
Overparameterization (SOP). SOP merges the computational efficiency of compact
models with the advanced learning capabilities of overparameterized models.
SOP proposes a unique approach to model training and inference: a model
with a significantly larger number of parameters is trained in such a way that
a smaller, efficient subset of these parameters is used for the actual
computation at inference. Building upon this framework, we present a novel,
architecture-agnostic algorithm called "majority kernels", which integrates
seamlessly with predominant architectures, including Transformer models.
Majority kernels enable the simulated training of overparameterized models,
yielding performance gains across architectures and tasks. Furthermore, our
approach adds minimal wall-clock overhead at training time. The proposed
approach shows strong performance on a wide variety of datasets and models,
even outperforming strong baselines such as combinatorial optimization
methods based on submodular optimization.
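
To make the SOP idea concrete, the sketch below shows one minimal way a layer could hold k candidate kernels during training while collapsing to a single kernel for inference. This is only an illustration of the general paradigm as described in the abstract, not the authors' actual majority kernels algorithm, whose details are not given here; the class name SOPLinear, the parameter k, and the mean-based collapse are all hypothetical choices.

```python
# Hypothetical sketch of the SOP idea: overparameterize each layer with k
# candidate kernels at training time, but deploy only one collapsed kernel.
# NOT the paper's exact algorithm; names and the mean-collapse are assumptions.
import torch
import torch.nn as nn

class SOPLinear(nn.Module):
    def __init__(self, d_in: int, d_out: int, k: int = 4):
        super().__init__()
        # k candidate kernels form the "overparameterized" weight bank.
        self.kernels = nn.Parameter(torch.randn(k, d_out, d_in) * d_in ** -0.5)
        self.bias = nn.Parameter(torch.zeros(d_out))

    def effective_kernel(self) -> torch.Tensor:
        # Gradients flow to all k kernels through this combination, so
        # training "simulates" the bigger model; the result is a single
        # d_out x d_in kernel with the compact model's inference cost.
        return self.kernels.mean(dim=0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x @ self.effective_kernel().T + self.bias

# After training, the combination can be precomputed once, so the deployed
# network is an ordinary compact model with no extra inference overhead.
layer = SOPLinear(512, 512, k=4)
with torch.no_grad():
    deployed = nn.Linear(512, 512)
    deployed.weight.copy_(layer.effective_kernel())
    deployed.bias.copy_(layer.bias)
```

The key property this sketch tries to capture is that the extra parameters exist only at training time: once the kernels are collapsed, inference pays for a single kernel per layer, matching the abstract's claim of combining overparameterized training dynamics with compact-model efficiency.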