
Automated and objective analysis of speech in premanifest and early-stage Huntington’s disease

medRxiv (2022)

Abstract
Background: Clinical markers that show change in performance in people with Huntington's disease (HD) during the presymptomatic and prodromal stages remain a target of investigation in clinical medicine. Alongside genetic and neuroimaging initiatives, digital speech analytics has shown promise as a sensitive clinical marker of premanifest HD.

Objective: To investigate the sensitivity of digital speech measures for detecting subtle cognitive-linguistic and fine motor features in people carrying the expanded HD gene, with and without symptoms.

Methods: Speech data were acquired from 110 participants (55 people with the expanded HD gene: 16 presymptomatic HD, 16 prodromal HD, 14 early-stage HD and 9 mid-stage HD; and 55 matched healthy controls). Objective digital speech measures were derived from speech tasks spanning a continuum of motor and cognitive complexity. Acoustic features quantified speakers' articulatory agility, voice quality and speech timing. Participants also completed tests of cognition and upper limb motor function.

Results: Some presymptomatic HD participants (those furthest from disease onset) differed from healthy controls on timing measures derived from the syllable repetition and monologue tasks. Prodromal HD presented with reduced articulatory agility, reduced speech rate, and longer and more variable pauses. Speech agility correlated with poorer performance on the upper limb motor test.

Conclusion: Tasks with a mix of cognitive and motor demands differentiated prodromal HD from matched controls. Motor speech tasks alone did not differentiate groups until participants were relatively closer to disease onset or symptomatic. The data demonstrate how ubiquitous behaviors like speech, when analyzed objectively, provide insight into disease-related change.

### Competing Interest Statement

A. P. Vogel is Chief Science Officer of Redenlab Inc. C. S. J. Chan reports no disclosures. G. W. Stuart is Director of Statistics for Redenlab Inc. Y. Lie reports no disclosures. P. Maruff is Chief Innovation Officer of Cogstate Inc. J. Stout is Director of Zindametrix Pty Ltd.

### Funding Statement

A. P. Vogel received salaried support from the National Health and Medical Research Council, Australia (#1082910) and institutional support from The University of Melbourne. J. Stout is funded by a National Health and Medical Research Council, Australia, Investigator Grant (#1173472).

### Author Declarations

I confirm all relevant ethical guidelines have been followed, and any necessary IRB and/or ethics committee approvals have been obtained. Yes

The details of the IRB/oversight body that provided approval or exemption for the research described are given below: The ethics committees/IRBs of The University of Melbourne and Calvary Healthcare Bethlehem gave ethical approval for this work.

I confirm that all necessary patient/participant consent has been obtained and the appropriate institutional forms have been archived, and that any patient/participant/sample identifiers included were not known to anyone (e.g., hospital staff, patients or participants themselves) outside the research group so cannot be used to identify individuals. Yes

I understand that all clinical trials and any other prospective interventional studies must be registered with an ICMJE-approved registry, such as ClinicalTrials.gov. I confirm that any such study reported in the manuscript has been registered and the trial registration ID is provided (note: if posting a prospective study registered retrospectively, please provide a statement in the trial ID field explaining why the study was not registered in advance). Yes

I have followed all appropriate research reporting guidelines and uploaded the relevant EQUATOR Network research reporting checklist(s) and other pertinent material as supplementary files, if applicable. Yes

### Data Availability

Data form part of consortia and ongoing initiatives.
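The abstract does not describe the authors' analysis toolchain or algorithms, so the following is only a rough, hypothetical sketch of the kind of speech-timing measures it mentions (speech proportion, pause count, mean pause duration and pause variability), computed here with energy-based silence detection from librosa. The function name, the top_db and min_pause_s thresholds, and the output fields are illustrative assumptions, not the study's method.

```python
# Hypothetical sketch: approximate speech-timing measures from a monologue
# recording using librosa's energy-based silence detection. Not the authors'
# pipeline; thresholds are arbitrary and would need calibration.
import numpy as np
import librosa

def speech_timing_features(wav_path, top_db=30, min_pause_s=0.15):
    """Return simple timing measures from a mono speech recording."""
    y, sr = librosa.load(wav_path, sr=None, mono=True)
    total_dur = len(y) / sr

    # Non-silent (speech) intervals as (start, end) sample indices
    speech_intervals = librosa.effects.split(y, top_db=top_db)
    speech_dur = sum(int(end - start) for start, end in speech_intervals) / sr

    # Pauses are gaps between consecutive speech intervals, in seconds
    gaps = [
        (speech_intervals[i + 1][0] - speech_intervals[i][1]) / sr
        for i in range(len(speech_intervals) - 1)
    ]
    pauses = [g for g in gaps if g >= min_pause_s]

    return {
        "total_duration_s": total_dur,
        "speech_proportion": speech_dur / total_dur if total_dur else float("nan"),
        "n_pauses": len(pauses),
        "mean_pause_s": float(np.mean(pauses)) if pauses else 0.0,
        "pause_sd_s": float(np.std(pauses)) if pauses else 0.0,  # pause variability
    }

# Example usage (path is a placeholder):
# print(speech_timing_features("monologue.wav"))
```

In practice, the silence threshold and minimum pause length would be calibrated against manually annotated recordings before group comparisons of the kind reported above.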