A Physics-based Generative Model to Synthesize Training Datasets for MRI-based Fat Quantification
| Published in: | arXiv.org (Dec 11, 2024) |
|---|---|
| Lead author: | |
| Other authors: | |
| Publisher: | Cornell University Library, arXiv.org |
| Subjects: | |
| Abstract: | Deep learning-based techniques have the potential to reduce the scan and post-processing times required for MRI-based fat quantification, but they are constrained by the lack of large training datasets. Generative models are a promising tool for data augmentation, synthesizing realistic datasets. However, no previous method has been specifically designed to generate datasets for quantitative MRI (q-MRI) tasks, where reference quantitative maps and large variability in scanning protocols are usually required. We propose a Physics-Informed Latent Diffusion Model (PI-LDM) that synthesizes quantitative parameter maps jointly with customizable MR images by incorporating the signal generation model. We assessed the quality of PI-LDM's synthesized data using metrics such as the Fréchet Inception Distance (FID), obtaining scores comparable to state-of-the-art generative methods (FID: 0.0459). We also trained a U-Net for the MRI-based fat quantification task incorporating synthetic datasets. When trained with a few real samples (10 subjects, \(\sim 200\) slices) and numerous synthetic samples (\(>3000\)), the fat fraction at specific liver ROIs showed a low bias on data acquired with the same protocol as the training data (\(0.10\%\) at \(\mathrm{ROI}_1\), \(0.12\%\) at \(\mathrm{ROI}_2\)) and on data acquired with an alternative protocol (\(0.14\%\) at \(\mathrm{ROI}_1\), \(0.62\%\) at \(\mathrm{ROI}_2\)). Future work will extend PI-LDM to other q-MRI applications. |
| ISSN: | 2331-8422 |
| Source: | Engineering Database |
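
The abstract's central technical ingredient is the MR signal generation model that PI-LDM incorporates so that synthetic images are paired with consistent quantitative maps. Below is a minimal sketch, assuming a standard multi-peak chemical-shift-encoded multi-echo gradient-echo model with R2* decay and a B0 field-map term; the spectral peak values, echo times, and function names are illustrative assumptions, not taken from the paper.

```python
# Sketch (not the paper's code) of a chemical-shift-encoded multi-echo signal
# model of the kind used for MRI-based fat quantification. Peak locations,
# amplitudes, and echo times are illustrative values.
import numpy as np

GYRO_MHZ_PER_T = 42.577  # proton gyromagnetic ratio / (2*pi), MHz/T

# Approximate six-peak liver fat spectrum: ppm offsets from water and
# relative amplitudes (assumed, for illustration only).
FAT_PPM = np.array([-3.80, -3.40, -2.60, -1.94, -0.39, 0.60])
FAT_AMP = np.array([0.087, 0.693, 0.128, 0.004, 0.039, 0.048])

def multiecho_signal(water, fat, r2star, fieldmap_hz, echo_times_s, b0_tesla=3.0):
    """Complex multi-echo GRE signal for given water/fat amplitudes,
    R2* (1/s), and B0 field-map offset (Hz), at the given echo times (s)."""
    te = np.asarray(echo_times_s)
    fat_freqs_hz = FAT_PPM * GYRO_MHZ_PER_T * b0_tesla  # ppm -> Hz at B0
    # Complex fat dephasing factor at each echo time (sum over spectral peaks)
    c = (FAT_AMP[None, :] *
         np.exp(2j * np.pi * fat_freqs_hz[None, :] * te[:, None])).sum(axis=1)
    return (water + fat * c) * np.exp((2j * np.pi * fieldmap_hz - r2star) * te)

def fat_fraction(water, fat):
    """Magnitude-based proton-density fat fraction."""
    return np.abs(fat) / (np.abs(water) + np.abs(fat) + 1e-12)

if __name__ == "__main__":
    te = np.arange(1.2e-3, 1.2e-3 + 6 * 2.0e-3, 2.0e-3)  # 6 echoes (illustrative)
    s = multiecho_signal(water=0.8, fat=0.2, r2star=40.0,
                         fieldmap_hz=30.0, echo_times_s=te)
    print("signal magnitudes:", np.abs(s))
    print("ground-truth fat fraction:", fat_fraction(0.8, 0.2))
```

In a physics-informed generator of this kind, sampled parameter maps (water, fat, R2*, field map) can be pushed through such a forward model to produce multi-echo images under arbitrary echo-time protocols, which is what allows the synthesized training data to cover protocol variability.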