A dynamic reconstruction and motion estimation framework for cardiorespiratory motion-resolved real-time volumetric MR imaging (DREME-MR)
Abstract: Based on a 3D pre-treatment magnetic resonance (MR) scan, we developed DREME-MR to jointly reconstruct the reference patient anatomy and a data-driven, patient-specific cardiorespiratory motion model. Via a motion encoder simultaneously learned during the reconstruction, DREME-MR further enables real-time volumetric MR imaging and cardiorespiratory motion tracking with minimal intra-treatment k-space data. From a 3D radial-spoke-based pre-treatment MR scan, DREME-MR uses a spatiotemporal implicit neural representation (INR) to reconstruct pre-treatment dynamic volumetric MR images (learning task 1). The INR-based reconstruction takes a joint image reconstruction and deformable registration approach, yielding a reference anatomy and a corresponding cardiorespiratory motion model. The motion model adopts a low-rank, multi-resolution representation to decompose motion fields into products of motion coefficients and motion basis components (MBCs). Via a progressive, frequency-guided strategy, DREME-MR decouples cardiac MBCs from respiratory MBCs to resolve the two distinct motion modes. Simultaneously with the pre-treatment dynamic MRI reconstruction, DREME-MR also trains an INR-based motion encoder to infer cardiorespiratory motion coefficients directly from raw k-space data (learning task 2), allowing real-time, intra-treatment volumetric MR imaging and motion tracking with minimal k-space data (20-30 spokes) acquired after the pre-treatment MRI scan. Evaluated using data from a digital phantom (XCAT) and a human scan, DREME-MR resolves 3D cardiorespiratory motion in real time with a latency of < 165 ms (= 150-ms data acquisition + 15-ms inference time), fulfilling the temporal constraint of real-time imaging.
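The low-rank motion representation described above can be illustrated with a minimal sketch: the deformation field at each time point is a weighted sum of fixed motion basis components (MBCs), with separate respiratory and cardiac groups, and only the per-time coefficients change. This is not the authors' code; the array shapes, variable names, and component counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical numbers of respiratory and cardiac MBCs (assumed ranks).
K_resp, K_card = 3, 2
# Toy volume size; real MR volumes are much larger.
D, H, W = 8, 8, 8

# Motion basis components: each is a dense 3-vector displacement field.
mbc_resp = rng.standard_normal((K_resp, D, H, W, 3))
mbc_card = rng.standard_normal((K_card, D, H, W, 3))

def motion_field(c_resp, c_card):
    """Compose a deformation vector field as products of motion
    coefficients and MBCs, summing the respiratory and cardiac parts."""
    field = np.einsum("k,kdhwc->dhwc", c_resp, mbc_resp)
    field += np.einsum("k,kdhwc->dhwc", c_card, mbc_card)
    return field

# In DREME-MR, a learned motion encoder would infer these coefficients
# from a handful of k-space spokes; here we pick arbitrary values.
c_resp = np.array([0.8, -0.1, 0.05])
c_card = np.array([0.3, 0.2])
dvf = motion_field(c_resp, c_card)
print(dvf.shape)  # (8, 8, 8, 3): one displacement vector per voxel
```

Because the MBCs are fixed after the pre-treatment reconstruction, real-time inference only has to estimate the handful of coefficients, which is what keeps the per-frame latency low.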