Do Domain-Specific Protein Language Models Outperform General Models on Immunology-Related Tasks?

Nicolas Deutchmann, Aurelien Pelissier, Anna Weber, Shuaiju Gao, Jasmin Bogojeska, Maria Rodriguez Martinez

bioRxiv (2023)

Abstract
Deciphering the antigen recognition capabilities of T cell and B cell receptors (antibodies) is essential for advancing our understanding of adaptive immune system responses. In recent years, the development of protein language models (PLMs) has enabled bioinformatic pipelines in which complex amino acid sequences are transformed into vectorized embeddings that are then applied to a range of downstream analytical tasks. Building on this success, we have witnessed the emergence of domain-specific PLMs tailored to particular protein families, such as immune receptors. Domain-specific models are often assumed to possess enhanced representation capabilities for targeted applications; however, this assumption has not been thoroughly evaluated. In this manuscript, we assess the efficacy of both generalist and domain-specific transformer-based embeddings in characterizing B and T cell receptors. Specifically, we assess the accuracy of models that leverage these embeddings to predict antigen specificity and to elucidate the evolutionary changes that B cells undergo during an immune response. We demonstrate that the prevailing notion of domain-specific models outperforming general models requires a more nuanced examination. We also observe remarkable differences between generalist and domain-specific PLMs, not only in terms of performance but also in how they encode information. Finally, we find that model size and the choice of embedding layer are essential PLM hyperparameters across different tasks. Overall, our analyses reveal the promising potential of PLMs in modeling protein function while providing insights into their information-handling capabilities. We also discuss the crucial factors that should be taken into account when selecting a PLM tailored to a particular task.

Competing Interest Statement: The authors have declared no competing interest.
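The pipeline described in the abstract, embedding receptor sequences with a PLM and training a downstream classifier on those embeddings, can be illustrated with a minimal sketch. The example below assumes a small general-purpose ESM-2 checkpoint from Hugging Face (facebook/esm2_t6_8M_UR50D) and a scikit-learn logistic regression; the CDR3 sequences and binder labels are hypothetical placeholders, not data or models from the paper.

```python
"""Minimal sketch: PLM embeddings of receptor sequences feeding a downstream
antigen-specificity classifier. Checkpoint, sequences, and labels are
illustrative assumptions, not the paper's actual setup."""
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.linear_model import LogisticRegression

# A small, general-purpose ESM-2 checkpoint (assumption; the paper compares
# several generalist and domain-specific PLMs).
MODEL_NAME = "facebook/esm2_t6_8M_UR50D"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME).eval()

def embed(sequence: str, layer: int = -1) -> torch.Tensor:
    """Mean-pool one hidden layer into a fixed-size sequence embedding.
    The layer index is one of the hyperparameters the abstract highlights."""
    inputs = tokenizer(sequence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs, output_hidden_states=True)
    hidden = outputs.hidden_states[layer]          # (1, tokens, dim)
    mask = inputs["attention_mask"].unsqueeze(-1)  # ignore padding positions
    return ((hidden * mask).sum(1) / mask.sum(1)).squeeze(0)

# Hypothetical CDR3 sequences with binder/non-binder labels (toy placeholders).
sequences = ["CASSLGTDTQYF", "CASSPGQGNYEQYF", "CASSLAPGATNEKLFF", "CASRGGSYEQYF"]
labels = [1, 0, 1, 0]

X = torch.stack([embed(s) for s in sequences]).numpy()
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clf.predict(X))
```

Swapping MODEL_NAME for a domain-specific immune-receptor PLM, or changing the pooled layer, would reproduce the kind of comparison the study performs.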
Keywords
language models, protein, domain-specific, immunology-related