
A Deep Learning Approach for Rapid and Generalizable Denoising of Photon-Counting Micro-CT Images

Tomography (Ann Arbor, Mich.) (2023)
Abstract
Photon-counting CT (PCCT) is powerful for spectral imaging and material decomposition but produces noisy weighted filtered backprojection (wFBP) reconstructions. Although iterative reconstruction effectively denoises these images, it requires extensive computation time. To overcome this limitation, we propose a deep learning (DL) model, UnetU, which quickly estimates iterative reconstruction from wFBP. Utilizing a 2D U-net convolutional neural network (CNN) with a custom loss function and transformation of wFBP, UnetU promotes accurate material decomposition across various photon-counting detector (PCD) energy threshold settings. UnetU outperformed multi-energy non-local means (ME NLM) and a conventional denoising CNN called UnetwFBP in terms of root mean square error (RMSE) in test set reconstructions and their respective matrix inversion material decompositions. Qualitative results in reconstruction and material decomposition domains revealed that UnetU is the best approximation of iterative reconstruction. In reconstructions with varying undersampling factors from a high dose ex vivo scan, UnetU consistently gave higher structural similarity (SSIM) and peak signal-to-noise ratio (PSNR) to the fully sampled iterative reconstruction than ME NLM and UnetwFBP. This research demonstrates UnetU's potential as a fast (i.e., 15 times faster than iterative reconstruction) and generalizable approach for PCCT denoising, holding promise for advancing preclinical PCCT research.
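The abstract compares denoisers by RMSE, SSIM, and PSNR against the iterative reconstruction. As a minimal illustration of two of those metrics (not the paper's actual evaluation code, and using a synthetic toy image rather than CT data), RMSE and PSNR can be computed as:

```python
import numpy as np

def rmse(x, y):
    """Root mean square error between two images."""
    return float(np.sqrt(np.mean((x - y) ** 2)))

def psnr(x, ref, data_range=None):
    """Peak signal-to-noise ratio of x against a reference image.

    data_range defaults to the dynamic range of the reference."""
    if data_range is None:
        data_range = float(ref.max() - ref.min())
    return float(20.0 * np.log10(data_range / rmse(x, ref)))

# Toy example: a reference image plus Gaussian noise (illustrative only).
rng = np.random.default_rng(0)
clean = rng.uniform(0.0, 1.0, size=(64, 64))
noisy = clean + rng.normal(0.0, 0.05, size=(64, 64))

print(rmse(noisy, clean))
print(psnr(noisy, clean))
```

A lower RMSE and a higher PSNR relative to the fully sampled iterative reconstruction are the directions in which UnetU is reported to outperform ME NLM and UnetwFBP.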
Key words
denoising, deep learning, preclinical, micro-CT, photon-counting CT, contrast agents