A Julia package for solving nonlinear inverse problems using iterative regularization methods, following the SciML interface standards.
IterativeRegularization.jl provides fast, flexible implementations of standard and cutting-edge iterative regularization methods for nonlinear ill-posed inverse problems. The package follows the familiar interface of DifferentialEquations.jl, making it easy to use for anyone familiar with the SciML ecosystem.
```julia
using Pkg
Pkg.add(url="https://github.com/JuliaDifferentialGames/IterativeRegularization.jl.git")
```

Currently Implemented:
- ✅ TODO
In Progress:
- 🚧 Package structure following SciML standards
- 🚧 Problem types and solution interface
- 🚧 Algorithm type hierarchy
- 🚧 Callback system
- 🚧 Reference implementation (Nonlinear Landweber)
- 🚧 Analysis tools (residual history, convergence rate, L-curve)
- 🚧 Full algorithm implementations
- 🚧 Automatic differentiation integration
- 🚧 Neural network regularizers
- 🚧 GPU support
- 🚧 Comprehensive documentation
Planned:
- 📋 PINNs integration
- 📋 Tensor decomposition methods
- 📋 Uncertainty quantification
- 📋 Parallel solvers
Algorithms:
- Nonlinear Landweber - Gradient descent-based method with step size control
- Derivative-free Landweber - Uses finite differences for derivative-free optimization
- Landweber-Kaczmarz - Component-wise iterative updates
- Levenberg-Marquardt - Adaptive damping for nonlinear least squares
- Iteratively Regularized Gauss-Newton (IRGN) - Decreasing regularization with inner linear solves
- Broyden's Method - Quasi-Newton approach with Jacobian approximation
- Nonlinear Multigrid - Multigrid acceleration for inverse problems
- Nonlinear Full Multigrid - Full multigrid with nested iterations
- Level Set Methods - For problems with sharp interfaces
- Adversarial Regularization - GAN-based regularization
- NETT (Neural Network Tikhonov) - Learned regularization functionals
- LISTA - Learned Iterative Soft-Thresholding Algorithm
- Learned Proximal Operators - Neural network proximal operators
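The reference method, nonlinear Landweber, updates x_{k+1} = x_k − ω F′(x_k)\*(F(x_k) − y) until the residual is small. A self-contained toy sketch of the iteration follows; the forward map, step size ω, and function names here are illustrative, not the package API:

```julia
using LinearAlgebra

# Toy forward map F(x) = x.^3 with a diagonal Jacobian, so the adjoint
# F'(x)* r is an elementwise product. Illustrative only, not the package API.
F(x) = x .^ 3
F′adj(x, r) = 3 .* x .^ 2 .* r        # F'(x)* r for this separable F

# Nonlinear Landweber: x_{k+1} = x_k - ω F'(x_k)*(F(x_k) - y)
function landweber(F, F′adj, y, x0; ω = 0.002, maxiter = 5000, tol = 1e-6)
    x = copy(x0)
    for _ in 1:maxiter
        r = F(x) .- y
        norm(r) ≤ tol && break         # stop once the residual is small
        x .-= ω .* F′adj(x, r)         # gradient step on ½‖F(x) - y‖²
    end
    return x
end

x_true = [1.0, 2.0]
y = F(x_true)                          # noise-free data for the sketch
x̂ = landweber(F, F′adj, y, [0.8, 1.5])
# x̂ ≈ [1.0, 2.0] (up to the residual tolerance)
```

The small step size ω matters: for this cubic map the gradient is steep away from the solution, and a larger ω makes the iteration diverge.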
Problem Types:
- `InverseProblem`: General nonlinear inverse problems
- `LinearInverseProblem`: Specialized type for linear problems
- Support for priors, constraints, and noise models
Callbacks and Analysis:
- `DiscrepancyPrinciple`: Morozov's discrepancy principle for stopping
- `ResidualCallback`: Monitor convergence
- `SolutionSaver`: Save intermediate solutions
- Regularization path computation and L-curve analysis
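Morozov's discrepancy principle stops the iteration as soon as the residual norm drops to τ·δ, where δ is the noise level and τ > 1 a safety factor. A standalone sketch of the rule (the struct and function names are illustrative, not the package's `DiscrepancyPrinciple` type):

```julia
using LinearAlgebra

# Morozov's discrepancy principle as a standalone stopping rule (illustrative
# sketch). Stop once ‖F(x_k) - y^δ‖ ≤ τ·δ.
struct Discrepancy
    δ::Float64   # estimated noise level ‖y^δ - y‖
    τ::Float64   # safety factor, typically τ ∈ (1, 2]
end

# Returns true when the iteration should stop.
should_stop(rule::Discrepancy, residual::AbstractVector) =
    norm(residual) ≤ rule.τ * rule.δ

rule = Discrepancy(0.01, 1.5)
should_stop(rule, [0.005, 0.005])   # true:  ‖r‖ ≈ 0.00707 ≤ 0.015
should_stop(rule, [0.05, 0.05])     # false: ‖r‖ ≈ 0.0707  > 0.015
```

Stopping early in this way is the regularization: iterating past the discrepancy level starts fitting the noise.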
- Automatic differentiation via ForwardDiff.jl and Zygote.jl (planned)
- Neural network regularizers via Lux.jl (planned)
- GPU acceleration support (planned)
- Statistical priors via Distributions.jl (planned)
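Although the integration is planned rather than implemented, ForwardDiff.jl can already supply Jacobians for user-defined forward maps; a sketch of how that might feed a Landweber-style step (the wiring shown is illustrative):

```julia
# Sketch: ForwardDiff supplies the Jacobian of a forward map, as the planned
# AD integration might use it. The wiring here is illustrative, not package code.
using ForwardDiff, LinearAlgebra

F(x) = [x[1]^2 + x[2], x[1] * x[2]]
x = [1.0, 2.0]

J = ForwardDiff.jacobian(F, x)     # 2×2 Jacobian of F at x
r = F(x) .- [3.0, 2.0]             # residual against data y = F([1.0, 2.0])
step = J' * r                      # J'r is the gradient of ½‖F(x) - y‖²
```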
The package implements methods from:
- Kaltenbacher, B., Neubauer, A., & Scherzer, O. (2008). Iterative Regularization Methods for Nonlinear Ill-Posed Problems. Walter de Gruyter.
Contributions are welcome! The package is structured to make adding new algorithms straightforward:
- Define your algorithm struct inheriting from `AbstractIterativeRegularizationAlgorithm`
- Implement a `_solve` method for your algorithm
- Add tests and documentation
- Submit a pull request
See `src/algorithms.jl` and the Nonlinear Landweber implementation in `src/solve.jl` as examples.
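The extension pattern above can be sketched in plain Julia; the miniature hierarchy below is illustrative (the real `AbstractIterativeRegularizationAlgorithm` and `_solve` live in `src/algorithms.jl` and `src/solve.jl`):

```julia
# Miniature version of the extension pattern described above (illustrative only).
abstract type AbstractIterativeRegularizationAlgorithm end

# 1. Define your algorithm struct with its hyperparameters.
struct MyAlgorithm <: AbstractIterativeRegularizationAlgorithm
    stepsize::Float64
end

# 2. Implement a _solve method that dispatches on your algorithm type.
function _solve(alg::MyAlgorithm, F, y, x0; maxiter = 100)
    x = copy(x0)
    for _ in 1:maxiter
        x .-= alg.stepsize .* (F(x) .- y)   # placeholder update rule
    end
    return x
end

# The generic entry point then reaches your method via multiple dispatch.
x̂ = _solve(MyAlgorithm(0.5), x -> x, [1.0, 2.0], [0.0, 0.0])
# x̂ ≈ [1.0, 2.0]
```

Multiple dispatch is what keeps this pattern low-friction: a new solver is just a new struct plus one method, with no registration step.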
If you use this package in your research, please cite:
```bibtex
@software{iterativeregularization_jl,
  author = {Outland, Bennet},
  title  = {IterativeRegularization.jl: Iterative Regularization Methods for Inverse Problems},
  year   = {2025},
  url    = {https://github.com/BennetOutland/IterativeRegularization.jl}
}
```

MIT License - see LICENSE file for details.
This package follows the design principles of the SciML ecosystem and draws inspiration from:
- DifferentialEquations.jl
- Optimization.jl
- RegularizationTools.jl
Generative AI (Claude) was used in the creation of this library as a programming aid, including guided code generation, assistance with performance optimization, and help writing documentation. All code and documentation in this repository, whether written by the author(s) or by generative AI, has been reviewed by the author(s) for accuracy and has completed a verification and validation process prior to release.