MaxCorrMGNN: A Multi-graph Neural Network Framework for Generalized Multimodal Fusion of Medical Data for Outcome Prediction

Machine Learning for Multimodal Healthcare Data, ML4MHD 2023 (2024)

Abstract
With the emergence of multimodal electronic health records, the evidence for an outcome may be captured across multiple modalities, ranging from clinical to imaging and genomic data. Predicting outcomes effectively requires fusion frameworks capable of modeling fine-grained and multi-faceted complex interactions between modality features within and across patients. We develop an innovative fusion approach called MaxCorr MGNN that models non-linear modality correlations within and across patients through Hirschfeld-Gebelein-Renyi maximal correlation (MaxCorr) embeddings, resulting in a multi-layered graph that preserves the identities of the modalities and patients. We then design, for the first time, a generalized multi-layered graph neural network (MGNN) for task-informed reasoning in multi-layered graphs, which learns the parameters defining patient-modality graph connectivity and message passing in an end-to-end fashion. We evaluate our model on an outcome prediction task on a Tuberculosis (TB) dataset, consistently outperforming several state-of-the-art neural, graph-based and traditional fusion techniques.
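The abstract does not specify the exact form of the MaxCorr embedding objective. As a point of reference, below is a minimal sketch of the "soft" HGR maximal-correlation surrogate commonly used to train neural encoders toward maximally correlated embeddings; the function name, tensor shapes, and the particular trace-regularized loss are assumptions for illustration, not the authors' implementation.

```python
# A minimal sketch of a soft HGR maximal-correlation objective between two
# modality embeddings. This is an assumed formulation (a common differentiable
# surrogate for the HGR constraint); the paper's exact loss may differ.
import torch


def soft_hgr_loss(f_x: torch.Tensor, g_y: torch.Tensor) -> torch.Tensor:
    """Negative soft-HGR correlation between two embedding batches.

    f_x, g_y: (batch, dim) outputs of two modality-specific encoders.
    Minimizing this loss pushes the encoders toward maximally correlated,
    decorrelated representations across the two modalities.
    """
    # Center each embedding (the zero-mean constraint of HGR).
    f_x = f_x - f_x.mean(dim=0, keepdim=True)
    g_y = g_y - g_y.mean(dim=0, keepdim=True)
    n = f_x.shape[0]

    # Inner-product term: empirical estimate of E[f(X)^T g(Y)].
    corr = (f_x * g_y).sum() / (n - 1)

    # Covariance penalty replaces the hard unit-variance constraint.
    cov_f = f_x.T @ f_x / (n - 1)
    cov_g = g_y.T @ g_y / (n - 1)
    penalty = 0.5 * (cov_f * cov_g).sum()  # equals trace(cov_f @ cov_g)

    # Negate so that minimizing the loss maximizes the soft-HGR objective.
    return -(corr - penalty)
```

In a fusion pipeline of this kind, the pairwise correlation scores derived from such embeddings would supply the cross-modality edge weights of the patient-modality multi-layered graph that the MGNN then reasons over.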
Keywords
Multimodal Fusion, Hirschfeld-Gebelein-Renyi (HGR) maximal correlation, Multi-Layered Graphs, Multi-Graph Neural Networks