GPU accelerated estimation of a shared random effect joint model for dynamic prediction

Computational Statistics & Data Analysis (2022)

Abstract
In longitudinal cohort studies, it is often of interest to predict the risk of a terminal clinical event using longitudinal predictor data among subjects still at risk at the time of prediction. The at-risk population changes over time; so do the association between predictors and the outcome and the accumulating longitudinal predictor history. The dynamic nature of this prediction problem has received increasing interest in the literature, but computation often poses a challenge. The widely used joint model of longitudinal and survival data often entails intensive computation and long model-fitting times, due to numerical optimization and the analytically intractable high-dimensional integral in the likelihood function. This problem is exacerbated when the model is fit to a large dataset or involves multiple longitudinal predictors with nonlinear trajectories. This challenge is addressed from an algorithmic perspective, through a novel two-stage estimation procedure, and from a computing perspective, through Graphics Processing Unit (GPU) programming. The latter is implemented in PyTorch, an emerging deep learning framework. Numerical studies demonstrate that the proposed algorithm and software can substantially speed up estimation of the joint model, particularly with large datasets. The numerical studies also show that accounting for nonlinearity in longitudinal predictor trajectories can improve prediction accuracy compared with joint models that ignore nonlinearity.
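The abstract refers to GPU evaluation of the analytically intractable integral over the shared random effect in the joint-model likelihood. The sketch below illustrates that general idea, not the authors' two-stage procedure or code: Gauss-Hermite quadrature over a single shared random intercept, with all subjects and quadrature nodes evaluated in parallel through PyTorch tensor operations. The simplified model (linear mixed model linked to an exponential-hazard survival model), the parameter values, and the simulated data are hypothetical choices made only for illustration.

```python
# Minimal sketch: GPU-vectorized Gauss-Hermite quadrature for a joint-model
# marginal likelihood with a shared random intercept. Illustrative only.
import numpy as np
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# Hypothetical simulated data: n subjects, m longitudinal measurements each.
n, m, Q = 1000, 5, 15
t_obs = torch.linspace(0.0, 2.0, m, device=device).expand(n, m)            # measurement times
b_true = 0.5 * torch.randn(n, 1, device=device)                            # shared random intercepts
y = 1.0 + 0.3 * t_obs + b_true + 0.2 * torch.randn(n, m, device=device)    # longitudinal outcome
event_time = torch.distributions.Exponential(
    0.1 * torch.exp(0.5 * b_true.squeeze())).sample()                      # event times
delta = (event_time < 3.0).float()                                         # event indicator
event_time = torch.clamp(event_time, max=3.0)                              # administrative censoring

# Illustrative parameter values (in practice these would be optimized).
beta0, beta1, sigma_e = 1.0, 0.3, 0.2
sigma_b, log_h0, alpha = 0.5, np.log(0.1), 0.5

# Gauss-Hermite nodes/weights, rescaled to integrate against N(0, sigma_b^2).
nodes, weights = np.polynomial.hermite.hermgauss(Q)
b = torch.tensor(np.sqrt(2.0) * sigma_b * nodes, dtype=torch.float32, device=device)  # (Q,)
w = torch.tensor(weights / np.sqrt(np.pi), dtype=torch.float32, device=device)        # (Q,)

# Longitudinal log-likelihood at every (subject, node) pair: shape (n, Q).
mu = beta0 + beta1 * t_obs                                                  # (n, m)
resid = y.unsqueeze(2) - mu.unsqueeze(2) - b.view(1, 1, Q)                  # (n, m, Q)
ll_long = (-0.5 * (resid / sigma_e) ** 2
           - np.log(sigma_e) - 0.5 * np.log(2 * np.pi)).sum(dim=1)          # (n, Q)

# Survival log-likelihood with an exponential baseline hazard: shape (n, Q).
haz = torch.exp(log_h0 + alpha * b).view(1, Q)
ll_surv = delta.unsqueeze(1) * torch.log(haz) - haz * event_time.unsqueeze(1)

# Marginal log-likelihood: quadrature over the random effect, sum over subjects.
loglik_i = torch.logsumexp(ll_long + ll_surv + torch.log(w).view(1, Q), dim=1)
loglik = loglik_i.sum()
print(loglik.item())
```

Because every step is a tensor operation, the same expression can be wrapped in PyTorch autograd and maximized with a gradient-based optimizer, which is where the GPU parallelism over subjects and quadrature nodes pays off for large datasets.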
Keywords
Electronic health records, Graphics Processing Unit (GPU) computing, Joint modeling, Longitudinal and survival data, Numerical integration, Parallel computing