TY - GEN
T1 - Temporal Periodic Image Registration with Implicit Neural Representations
AU - Lowes, Mathias
AU - Sørensen, Kristine Aavild
AU - Sermesant, Maxime
AU - Paulsen, Rasmus R.
PY - 2026
Y1 - 2026
N2 - Implicit Neural Representations (INRs) have recently gained popularity for their ability to model functions using simple and lightweight networks. These methods have also proven effective in deformable image registration. In this work, we extend INRs for deformable image registration to the domain of temporal image registration, focusing on periodic temporal image sequences. Our approach, Temporal-IDIR, optimizes a single INR to model deformations across all frames in a temporal image sequence simultaneously, allowing for self-regularization through its own deformation predictions. To achieve this, we introduce a temporal consistency loss that penalizes discrepancies between direct source-to-target transformations and those traversing intermediate frames. We evaluate our framework on the DIR-LAB dataset, using the target registration error (TRE) between annotated and moved landmarks as the metric. Here, we achieve a TRE of 1.03 mm, outperforming other INR-based registration methods. Additionally, our framework supports smooth interpolation between time frames by estimating deformations between the given input frames. Code is publicly available at https://github.com/MMLowes/Temporal_INR.
AB - Implicit Neural Representations (INRs) have recently gained popularity for their ability to model functions using simple and lightweight networks. These methods have also proven effective in deformable image registration. In this work, we extend INRs for deformable image registration to the domain of temporal image registration, focusing on periodic temporal image sequences. Our approach, Temporal-IDIR, optimizes a single INR to model deformations across all frames in a temporal image sequence simultaneously, allowing for self-regularization through its own deformation predictions. To achieve this, we introduce a temporal consistency loss that penalizes discrepancies between direct source-to-target transformations and those traversing intermediate frames. We evaluate our framework on the DIR-LAB dataset, using the target registration error (TRE) between annotated and moved landmarks as the metric. Here, we achieve a TRE of 1.03 mm, outperforming other INR-based registration methods. Additionally, our framework supports smooth interpolation between time frames by estimating deformations between the given input frames. Code is publicly available at https://github.com/MMLowes/Temporal_INR.
KW - 4D CT
KW - Deep Learning
KW - Deformable Image Registration
KW - Implicit Neural Representations
KW - Neural Fields
U2 - 10.1007/978-3-032-09513-8_43
DO - 10.1007/978-3-032-09513-8_43
M3 - Article in proceedings
SN - 978-3-032-09512-1
VL - 16241
T3 - Lecture Notes in Computer Science
SP - 442
EP - 451
BT - Proceedings of the 16th International Workshop on Machine Learning in Medical Imaging, MLMI 2025
PB - Springer
T2 - 16th International Workshop on Machine Learning in Medical Imaging
Y2 - 23 September 2025 through 23 September 2025
ER -