This repository contains the source code for our paper:
Learning KAN-based Implicit Neural Representations for Deformable Image Registration
Nikita Drozdov, Marat Zinovev, Dmitry Sorokin
https://arxiv.org/abs/2509.22874
Updates:
- 🔥 September 2025 — Preprint of our paper is available on arXiv!
Follow the instructions on the official PyTorch website to install PyTorch with CUDA support. We used PyTorch 2.3.0 with CUDA 12.1 and Python 3.8.10.
Run the following command:

```shell
pip install -r requirements.txt
```
To prepare the data, follow the instructions in the DIR-Lab, OASIS-1, and ACDC folders (detailed instructions are coming soon).
To reproduce the results from our paper, run the corresponding script with additional arguments:

```shell
python run_<dataset>.py --model model_name --runs N_runs
```
Where:
- `dataset` is one of: `dirlab`, `oasis`, or `acdc`
- `model_name` is one of: `kan` (for KAN-IDIR), `rand_kan` (for RandKAN-IDIR), `a_kan` (for A-KAN-IDIR), or `idir` (for IDIR)
- `N_runs` specifies the number of runs with different random seeds
To reproduce our results, we recommend 10 runs for DIR-Lab and 3-5 runs for OASIS/ACDC.
For example, to replicate the results of the RandKAN-IDIR model on the DIR-Lab dataset, run:

```shell
python run_dirlab.py --model rand_kan --runs 10
```
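The same pattern can be repeated to sweep every model variant on one dataset. A minimal sketch (it only prints the commands rather than launching them, so you can pipe the output into a scheduler or run it directly):

```shell
# Print the reproduction command for each model variant on DIR-Lab.
# Remove the `echo` to execute the runs directly.
for model in kan rand_kan a_kan idir; do
    echo "python run_dirlab.py --model $model --runs 10"
done
```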
Configurations for all models are stored in configs/config.py.
Note: By default, all metrics are computed on the GPU, and memory consumption is higher during evaluation than during training. To mitigate this, you can reduce the batch size used during validation by adjusting the `SEG_BS` and `NJD_BS` constants in `configs/config.py`.
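For orientation, the relevant part of `configs/config.py` looks roughly like the excerpt below; the numeric values here are illustrative, not the repository's defaults:

```python
# configs/config.py (excerpt, illustrative values).
# Validation-time batch sizes: lowering these reduces peak GPU memory
# during metric computation, at the cost of a slower evaluation phase.
SEG_BS = 10_000  # batch size for segmentation-based metrics
NJD_BS = 10_000  # batch size for the negative Jacobian determinant metric
```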
Our work builds upon the IDIR codebase. The ChebyKAN repository was also very useful for implementing the KAN-based models.
```bibtex
@misc{drozdov2025learningkanbasedimplicitneural,
  title={Learning KAN-based Implicit Neural Representations for Deformable Image Registration},
  author={Nikita Drozdov and Marat Zinovev and Dmitry Sorokin},
  year={2025},
  eprint={2509.22874},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2509.22874},
}
```