Tracking transitional probabilities and segmenting auditory sequences are dissociable processes in adults and neonates.

Developmental Science (2022)

Abstract
Since speech is a continuous stream with no systematic boundaries between words, how do pre-verbal infants manage to discover words? A proposed solution is that they might use the transitional probability between adjacent syllables, which drops at word boundaries. Here, we tested the limits of this mechanism by increasing the size of the word unit to four syllables, and its automaticity by testing asleep neonates. Using markers of statistical learning in neonates' EEG, compared with adults' behavioral performance on the same task, we confirmed that statistical learning is automatic enough to operate even in sleeping neonates. We also revealed that: (1) successfully tracking transitional probabilities (TPs) in a sequence is not sufficient to segment it; (2) prosodic cues as subtle as subliminal pauses restore word-segmentation capacities; and (3) adults' and neonates' capacities to segment streams are remarkably similar despite their differences in maturation and expertise. Finally, we observed that learning increased the overall similarity of neural responses across infants during exposure to the stream, providing a novel neural marker with which to monitor learning. Thus, from birth, infants are equipped with adult-like tools that allow them to extract small coherent word-like units from auditory streams by combining statistical analyses with auditory parsing cues.

RESEARCH HIGHLIGHTS:
- Successfully tracking transitional probabilities in a sequence is not always sufficient to segment it.
- Word segmentation based solely on transitional probabilities is limited to bi- or tri-syllabic elements.
- Prosodic cues as subtle as subliminal pauses restore chunking capacities for quadriplets (four-syllable units) in sleeping neonates and awake adults.
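As a rough illustration of the mechanism the abstract relies on (not the paper's actual stimuli or analysis), the sketch below computes forward transitional probabilities over a toy syllable stream built from made-up four-syllable "words"; within-word transitions stay high while transitions across word boundaries drop, which is the cue statistical learning is assumed to exploit.

```python
import random
from collections import Counter

def transitional_probabilities(syllables):
    """Forward transitional probability TP(B|A) = count(A followed by B) / count(A)."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

# Hypothetical quadrisyllabic "words"; the paper's real stimuli differ.
words = [["tu", "pi", "ro", "la"], ["bi", "da", "ku", "me"], ["go", "fe", "nu", "si"]]

random.seed(0)
stream, prev = [], None
for _ in range(300):
    w = random.choice([x for x in words if x is not prev])  # avoid immediate repetition
    stream.extend(w)
    prev = w

tps = transitional_probabilities(stream)
print(tps[("tu", "pi")])  # within-word transition: 1.0 by construction
print(tps[("la", "bi")])  # across a word boundary: ~0.5, i.e. the TP "dip"
```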
Keywords
EEG, language learning, neonates, prosody, sequence learning, statistical learning