No-code machine learning in radiology: implementation and validation of a platform that allows clinicians to train their own models

Daniel C. Elton, Giridhar Dasegowda, James Y. Sato, Emiliano G. Frias, Christopher P. Bridge, Artem B. Mamonov, Mark Walters, Martynas Ziemelis, Thomas J. Schultz, Bernardo C. Bizzo, Keith J. Dreyer, Mannudeep K. Kalra

medRxiv (2024)

Abstract
Machine learning models can assist clinicians and researchers in many tasks within radiology, such as diagnosis, triage, segmentation/measurement, and quality assurance. To better leverage machine learning, we have developed a platform that allows users to label data and train models without requiring any programming knowledge. The technology stack consists of a TypeScript web application running on .NET for user interaction; Python, PyTorch, and MONAI for machine learning; DICOM WADO-RS to retrieve data from clinical systems; and Docker for model management. As a first trial of the system, researchers used it to train a model for clavicle fracture detection as part of an IRB-approved retrospective study. The researchers labeled 4,135 clavicle radiographs from 2,039 patients across 13 sites. The platform automatically split the data into training, validation, and test sets and trained a model until the validation loss plateaued. The system then returned a receiver operating characteristic curve, AUC, F1, and other metrics. The resulting model identifies clavicle fractures with 90% sensitivity, 87% specificity, and 88% accuracy, with an AUC of 0.95. This performance is equivalent to or better than that of similar models reported in the literature. More recently, our system was used to train a model to identify ultrasound frames that contain personally identifiable information (PII). After validation, the model was used to help de-identify a large dataset intended for research. This first-of-its-kind system streamlines model development and deployment and opens up an exciting new pathway for the use of AI within healthcare.

### Competing Interest Statement

Mannudeep K. Kalra reports a relationship with Siemens Healthineers that includes: funding grants.

### Funding Statement

This study did not receive any funding.

### Author Declarations

I confirm all relevant ethical guidelines have been followed, and any necessary IRB and/or ethics committee approvals have been obtained. Yes

The details of the IRB/oversight body that provided approval or exemption for the research described are given below: IRB of Mass General Brigham gave ethical approval for this work. Protocol #2023P000205.

I confirm that all necessary patient/participant consent has been obtained and the appropriate institutional forms have been archived, and that any patient/participant/sample identifiers included were not known to anyone (e.g., hospital staff, patients or participants themselves) outside the research group so cannot be used to identify individuals. Yes

I understand that all clinical trials and any other prospective interventional studies must be registered with an ICMJE-approved registry, such as ClinicalTrials.gov. I confirm that any such study reported in the manuscript has been registered and the trial registration ID is provided (note: if posting a prospective study registered retrospectively, please provide a statement in the trial ID field explaining why the study was not registered in advance). Yes

I have followed all appropriate research reporting guidelines, such as any relevant EQUATOR Network research reporting checklist(s) and other pertinent material, if applicable. Yes

### Data Availability

All data produced in the present work are contained in the manuscript. Supporting spreadsheets are available upon reasonable request. The imaging data used for training will not be available.
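The abstract lists DICOM WADO-RS as the mechanism for pulling imaging data from clinical systems. The following is a minimal sketch of what such a retrieval step could look like using the open-source `dicomweb-client` package; the paper does not specify its retrieval code, and the endpoint URL and UIDs below are placeholders rather than values from the study.

```python
# Hypothetical WADO-RS retrieval of a single DICOM instance.
# The server URL and all UIDs are placeholders, not values from the paper.
from dicomweb_client.api import DICOMwebClient

client = DICOMwebClient(url="https://pacs.example.org/dicomweb")  # placeholder endpoint

# Retrieve one SOP instance as a pydicom Dataset via WADO-RS.
dataset = client.retrieve_instance(
    study_instance_uid="1.2.840.113619.2.55.3.1",     # placeholder UID
    series_instance_uid="1.2.840.113619.2.55.3.1.1",  # placeholder UID
    sop_instance_uid="1.2.840.113619.2.55.3.1.1.1",   # placeholder UID
)

pixels = dataset.pixel_array  # decoded pixel data, ready for preprocessing
print(dataset.Modality, pixels.shape)
```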
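The platform is described as automatically splitting the labeled data into training, validation, and test sets and training until the validation loss plateaus. The sketch below illustrates one way such a loop can be written with PyTorch and a MONAI DenseNet121 classifier; the network choice, split fractions, patience value, and the random tensors standing in for radiographs are illustrative assumptions, not the platform's actual configuration.

```python
# Illustrative sketch only: random tensors stand in for labeled radiographs, and the
# DenseNet121 backbone / patience setting are assumptions, not the platform's real setup.
import torch
from torch.utils.data import TensorDataset, DataLoader, random_split
from monai.networks.nets import DenseNet121

# Placeholder dataset: single-channel 224x224 "images" with binary fracture labels.
images = torch.randn(200, 1, 224, 224)
labels = torch.randint(0, 2, (200,))
dataset = TensorDataset(images, labels)

# Automatic train/validation/test split (test set held out for final metrics).
n_train, n_val = int(0.7 * len(dataset)), int(0.15 * len(dataset))
n_test = len(dataset) - n_train - n_val
train_set, val_set, test_set = random_split(dataset, [n_train, n_val, n_test])
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)
val_loader = DataLoader(val_set, batch_size=16)

model = DenseNet121(spatial_dims=2, in_channels=1, out_channels=2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = torch.nn.CrossEntropyLoss()

best_val_loss, patience, epochs_without_improvement = float("inf"), 5, 0
for epoch in range(100):
    model.train()
    for x, y in train_loader:
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        optimizer.step()

    # Stop once the validation loss has plateaued (no improvement for `patience` epochs).
    model.eval()
    with torch.no_grad():
        val_loss = sum(loss_fn(model(x), y).item() for x, y in val_loader) / len(val_loader)
    if val_loss < best_val_loss:
        best_val_loss, epochs_without_improvement = val_loss, 0
        torch.save(model.state_dict(), "best_model.pt")  # keep the best checkpoint
    else:
        epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            break
```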
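The system reports a receiver operating characteristic curve, AUC, F1, and related metrics on the held-out test set. The snippet below shows a common way to compute those quantities with scikit-learn from predicted probabilities; the `y_true` and `y_score` arrays are placeholders, not results from the clavicle fracture study.

```python
# Illustrative metric computation with scikit-learn; the arrays are placeholders,
# not outputs of the clavicle fracture model.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score, f1_score, confusion_matrix, accuracy_score

y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])                     # ground-truth labels
y_score = np.array([0.1, 0.4, 0.8, 0.9, 0.3, 0.2, 0.7, 0.6])    # predicted probabilities
y_pred = (y_score >= 0.5).astype(int)                            # binarize at a fixed threshold

fpr, tpr, thresholds = roc_curve(y_true, y_score)  # points on the ROC curve
auc = roc_auc_score(y_true, y_score)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # recall for the positive (fracture) class
specificity = tn / (tn + fp)
accuracy = accuracy_score(y_true, y_pred)
f1 = f1_score(y_true, y_pred)

print(f"AUC={auc:.2f} F1={f1:.2f} sens={sensitivity:.2f} spec={specificity:.2f} acc={accuracy:.2f}")
```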