Perceptual priors add sensory detail to contextual feedback processing in V1

bioRxiv (Cold Spring Harbor Laboratory), 2023

Abstract
How do we develop models of the world? Contextualising ambiguous information with previous experience allows us to form an enriched perception. Contextual information and prior knowledge facilitate perceptual processing, improving our recognition of even distorted or obstructed visual inputs. As a result, the neuronal processing elicited by identical sensory inputs varies with the context in which we encounter those inputs. This modulation is in line with predictive processing accounts of vision, which suggest that the brain uses internal models of the world to predict sensory inputs, with cortical feedback processing in sensory areas encoding beliefs about those inputs. As such, acquiring knowledge should enhance the internal models we use to resolve sensory ambiguities, and feedback signals should encode more accurate estimates of sensory inputs. We used partially occluded Mooney images, ambiguous two-tone images that are difficult to recognise without prior knowledge of the image content, in behavioural and 3T fMRI experiments to measure whether contextual feedback signals in early visual areas are modulated by learning. We show that perceptual priors add sensory detail to contextual feedback processing in early visual areas in response to subsequent presentations of previously ambiguous images.

Competing Interest Statement
The authors have declared no competing interest.
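For readers unfamiliar with the stimulus class: a Mooney (two-tone) image is conventionally produced by smoothing a grayscale photograph and binarising it at a single intensity threshold, discarding the mid-level shading that normally disambiguates the scene. The sketch below illustrates this with Pillow and NumPy; the file names, blur radius, and median-threshold rule are illustrative assumptions, not the stimulus parameters reported in the paper.

```python
# Minimal sketch of two-tone (Mooney) image generation:
# Gaussian-smooth a grayscale photograph, then binarise it.
# Blur radius and median threshold are illustrative choices,
# not values taken from the paper.
import numpy as np
from PIL import Image, ImageFilter

def make_mooney(path: str, blur_radius: float = 4.0) -> Image.Image:
    """Return a black-and-white two-tone version of the image at `path`."""
    gray = Image.open(path).convert("L")                    # grayscale
    smooth = gray.filter(ImageFilter.GaussianBlur(blur_radius))
    arr = np.asarray(smooth)
    two_tone = arr >= np.median(arr)                        # binarise at median
    return Image.fromarray(two_tone.astype(np.uint8) * 255)

if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    make_mooney("face.jpg").save("face_mooney.png")
```

Thresholding at the median simply splits the image into roughly equal black and white regions; any fixed cut-off would do for demonstration purposes.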
Key words
perceptual priors, sensory detail, contextual feedback processing