Data standardization in the omics field

Judit Kumuthini, Lyndon Zass, Melek Chaouch, Zoe Gill, Verena Ras, Zahra Mungloo-Dilmohamud, Dassen Sathan, Anisah W. Ghoorah, Faisal M. Fadlelmola, Christopher J. Fields, John Van Horn, Fouzia Radouani, Melissa Konopko, Emile R. Chimusa, Shakuntala Baichoo

Elsevier eBooks (2023)

Abstract
In the past decade, the decreased cost of advanced high-throughput technologies has revolutionized the biomedical sciences in terms of data volume and diversity. To cope with the sheer volume of sequencing data, quantitative techniques such as machine learning have been employed to process these data and extract meaning from them. The need to integrate complex, multidimensional datasets poses one of the grand challenges of modern bioinformatics. Integrating data from various sources to create larger datasets enables greater knowledge transfer and reuse following publication, whether data are submitted to a public repository or shared directly, and combining data from multiple sources can expand the knowledge of a subject. Standardized procedures, data formats, and comprehensive quality management are the cornerstones of such data integration. This chapter discusses the importance of incorporating data standardization and good data governance practices in the biomedical sciences. It also describes existing standardization resources and efforts, as well as the challenges related to these practices, emphasizing the critical role of standardization in the omics era. The discussion is supplemented with practical examples from different "omics" fields.
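To illustrate the point about standardized formats as a prerequisite for integration, the following minimal Python sketch (with entirely hypothetical field names and values, not taken from the chapter) harmonizes records from two sources onto a shared schema before merging them:

```python
# Minimal sketch (hypothetical data): two "omics" result tables can only be
# merged once both are mapped onto a shared standard -- here, uppercase
# gene symbols as identifiers and a fixed set of column names.

def standardize(record, symbol_key, value_key):
    """Map a source-specific record onto the shared schema."""
    return {
        "gene_symbol": record[symbol_key].strip().upper(),
        "expression": float(record[value_key]),
    }

# Two labs reporting comparable measurements under different conventions.
lab_a = [{"Gene": "tp53", "FPKM": "12.1"}, {"Gene": "brca1", "FPKM": "3.4"}]
lab_b = [{"symbol": "EGFR ", "expr": 7.9}]

# After standardization, the datasets are directly comparable and mergeable.
merged = [standardize(r, "Gene", "FPKM") for r in lab_a] + \
         [standardize(r, "symbol", "expr") for r in lab_b]

print(sorted(rec["gene_symbol"] for rec in merged))
# ['BRCA1', 'EGFR', 'TP53']
```

The schema-mapping step stands in for the community standards (controlled vocabularies, minimal-information checklists, common file formats) the chapter discusses; in practice that mapping is defined by the standard, not ad hoc per source.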
Keywords
omics field, standardization, data