
Identifying Major Depressive Disorder From Clinical Notes Using Neural Language Models with Distant Supervision.

AMIA (2023)

Abstract
Major Depressive Disorder (MDD) is the leading cause of disability in the world, and its prevalence continues to increase. Phenotyping and relevance classification of prevalent but underdiagnosed conditions from clinical notes using Natural Language Processing (NLP) methods has proven effective both for extracting valuable clinical insights and for improving performance on multiple clinical domain-specific NLP tasks. Recent NLP advances in self-supervision with transformer architectures, such as Bidirectional Encoder Representations from Transformers (BERT), established a new standard of contextual representation for fine-tuning models toward domain-specific tasks. However, supervised fine-tuning of neural models typically requires more labeled samples than can feasibly be annotated manually, restricting most clinical research to shared, general-purpose datasets. In this work, we propose using distant supervision to generate sufficient training data for transformer-based language models on MDD-relatedness sentence classification. We evaluate two transformer instances - BERT and Bio-Clinical BERT - for classifying sentences into one of four classes representing different MDD concept contexts, then compare them against three traditional machine learning baseline models. The Bio-Clinical BERT model, pre-trained on biomedical abstracts and further on clinical domain corpora, achieved the highest F1-score for all class labels. Our results show that neural language models, especially those with relevant domain-based pretraining, are superior at discerning semantic contexts, and that distant supervision is an effective strategy for improving the accuracy of pretrained transformer models.
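The distant-supervision step described above can be illustrated with a minimal rule-based labeler: heuristic cues assign each clinical-note sentence one of four MDD concept-context labels, producing weak training labels without manual annotation. The class names (`affirmed`, `negated`, `historical`, `unrelated`) and the cue lists are illustrative assumptions for this sketch, not the paper's actual labeling scheme.

```python
import re

# Hypothetical cue patterns; the paper's actual rules and classes may differ.
MDD_TERMS = re.compile(r"\b(depress\w*|mdd|major depressive)\b", re.I)
NEGATION_CUES = re.compile(r"\b(no|denies|denied|negative for|without)\b", re.I)
HISTORY_CUES = re.compile(r"\b(history of|hx of|past)\b", re.I)

def weak_label(sentence: str) -> str:
    """Assign a weak (distantly supervised) label to one sentence."""
    if not MDD_TERMS.search(sentence):
        return "unrelated"       # no MDD concept mentioned at all
    if NEGATION_CUES.search(sentence):
        return "negated"         # concept mentioned but negated
    if HISTORY_CUES.search(sentence):
        return "historical"      # concept mentioned as past history
    return "affirmed"            # concept asserted in the present

sentences = [
    "Patient denies depression or suicidal ideation.",
    "History of major depressive disorder, currently stable.",
    "She reports worsening depressed mood over two weeks.",
    "Blood pressure 120/80, heart rate 72.",
]
labels = [weak_label(s) for s in sentences]
# → ['negated', 'historical', 'affirmed', 'unrelated']
```

Labels generated this way would then serve as training targets for fine-tuning a transformer classifier, which can learn contextual distinctions the rules themselves miss.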
Keywords
depression,Major Depressive Disorder,clinical notes,natural language processing,NLP,distant supervision,weak supervision,deep learning,transformer,BERT