Optimal tuning of feature-based attention warps the perception of visual features

Crossref (2022)

Abstract
Selective attention improves sensory processing of relevant information but can also impact the quality of perception. For example, attention increases visual discrimination performance and at the same time boosts the apparent contrast of attended relative to unattended stimuli. Can attention also lead to perceptual distortions of visual feature representations? Optimal tuning accounts of attention suggest that processing is biased towards "off-tuned" features to maximize the signal-to-noise ratio in favor of the target, especially when targets and distractors are confusable. Here, we tested whether such optimal tuning gives rise to phenomenological changes of visual features. We instructed participants to select a color among other colors in a visual search display and subsequently asked them to judge the appearance of the target color in a 2-alternative forced choice (2-AFC) task. Participants consistently judged the target color to appear more dissimilar from the distractor color in feature space. The magnitude of these biases varied systematically with the similarity between target and distractor colors during search, indicating that attention is tuned flexibly according to current task demands. In control experiments, we ruled out possible non-attentional explanations such as color contrast or memory effects. Overall, our results demonstrate that feature-based attention warps the perception of visual features across large swaths of feature space, indicating that efficient attentional selection can come at a perceptual cost by distorting our sensory experience.