In-Context Learning with Transformers: Softmax Attention Adapts to Function Lipschitzness
CoRR (2024)
Abstract
A striking property of transformers is their ability to perform in-context
learning (ICL), a machine learning framework in which the learner is presented
with a novel context during inference implicitly through some data, and tasked
with making a prediction in that context. As such, the learner must adapt to
the context without additional training. We explore the role of softmax
attention in an ICL setting where each context encodes a regression task. We
show that an attention unit learns a window that it uses to implement a
nearest-neighbors predictor adapted to the landscape of the pretraining tasks.
Specifically, we show that this window widens with decreasing Lipschitzness and
increasing label noise in the pretraining tasks. We also show that on low-rank,
linear problems, the attention unit learns to project onto the appropriate
subspace before inference. Further, we show that this adaptivity relies
crucially on the softmax activation and thus cannot be replicated by the linear
activation often studied in prior theoretical analyses.
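The nearest-neighbor reading of softmax attention described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's construction: it treats one softmax attention unit as a Nadaraya-Watson predictor over in-context examples, with a hypothetical parameter `beta` standing in for the learned inverse window width (for unit-norm inputs, dot-product scores are an affine function of squared distance, so the distance form is used directly). A narrow window (large `beta`) tracks a high-Lipschitz target closely; a wide window (small `beta`) averages over the whole context.

```python
import numpy as np

def softmax_attention_predict(X_ctx, y_ctx, x_query, beta):
    """One softmax attention unit read as a Nadaraya-Watson
    (nearest-neighbor) predictor over the in-context examples.

    `beta` is a stand-in for the learned inverse window width
    (our parameterization for illustration, not the paper's).
    """
    sq_dists = np.sum((X_ctx - x_query) ** 2, axis=1)
    scores = -beta * sq_dists
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ y_ctx                    # convex combination of context labels

# A context encoding one regression task: y = sin(4x) on a grid (noise-free here)
X_ctx = np.linspace(-1.0, 1.0, 33).reshape(-1, 1)
y_ctx = np.sin(4.0 * X_ctx[:, 0])
x_q = np.array([0.3])

narrow = softmax_attention_predict(X_ctx, y_ctx, x_q, beta=50.0)  # narrow window
wide = softmax_attention_predict(X_ctx, y_ctx, x_q, beta=0.5)     # wide window
```

Under this toy parameterization, the narrow-window prediction stays close to the target value `sin(1.2)`, while the wide-window prediction is pulled toward the average of the context labels, matching the abstract's point that the learned window should shrink as the task's Lipschitz constant grows and widen as label noise grows.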