M. Axel Giebelhaus
CHISI Research
[email protected]
This repository contains the complete LaTeX source, experimental data, and figures for the paper "Geometric Invariants of Neural Network Training: Certifying Quasi-Symplectic Dynamics".
We characterize a quasi-symplectic operating regime in neural network training by formalizing and validating three geometric probes that certify structure at small step sizes:
- Symmetric round-trip test (ΔNEGdt): measures local reversibility of the discrete training dynamics (see the sketch after this list)
- Paired-subspace Ω-invariance error (εΩ): tests local symplecticity
- Tail-median energy drift (D): quantifies stability
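As a concrete illustration of the round-trip probe, the sketch below integrates leapfrog dynamics for H(θ, p) = L(θ) + ½|p|² forward, flips the momentum, and integrates back; the residual against the starting state measures irreversibility. This is a minimal sketch under assumed choices (leapfrog integrator, illustrative function names), not the paper's implementation:

```python
import torch

def leapfrog_step(theta, p, grad_fn, dt):
    # One leapfrog step for H(theta, p) = L(theta) + 0.5 * |p|^2.
    p = p - 0.5 * dt * grad_fn(theta)
    theta = theta + dt * p
    p = p - 0.5 * dt * grad_fn(theta)
    return theta, p

def round_trip_error(theta0, p0, grad_fn, dt, n_steps):
    # Forward n_steps, momentum flip, back n_steps. For exactly reversible
    # dynamics this residual is floating-point noise; state-dependent side
    # effects (e.g., BatchNorm running-statistics updates) inflate it.
    theta, p = theta0.clone(), p0.clone()
    for _ in range(n_steps):
        theta, p = leapfrog_step(theta, p, grad_fn, dt)
    p = -p  # time reversal
    for _ in range(n_steps):
        theta, p = leapfrog_step(theta, p, grad_fn, dt)
    p = -p
    return (torch.norm(theta - theta0) + torch.norm(p - p0)).item()

# Example with a quadratic potential L(theta) = 0.5 * |theta|^2 (grad = theta):
err = round_trip_error(torch.randn(10), torch.randn(10),
                       lambda th: th, dt=0.01, n_steps=100)
```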
Key findings:

- GroupNorm and calibrate-then-freeze BatchNorm satisfy all probes at small Δt
- Standard BatchNorm (train mode) systematically fails, breaking local reversibility and Ω-invariance
- Causal intervention: Sweeping BN running-statistics momentum produces a clean dose-response curve
- Mass scaling: SimpleCNN follows the expected dt_zero ∝ √M relationship; deeper architectures deviate (see the scaling check below)
These results provide operational probes that certify near-reversibility, local symplecticity, and low drift in neural network optimization—characterizing the anti-dissipative dynamics discovered in prior work.
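To make the √M prediction concrete: on a log-log plot of dt_zero against parameter count M, √M scaling appears as a slope near 0.5. A minimal check with hypothetical illustrative values (the measured values live in `data/computed/`):

```python
import numpy as np

# Hypothetical (M, dt_zero) pairs for illustration only; real measurements
# are in data/computed/. M is the parameter count ("mass"); dt_zero is the
# scaling quantity defined in the paper.
M = np.array([1e4, 4e4, 1.6e5, 6.4e5])
dt_zero = np.array([0.02, 0.04, 0.08, 0.16])

# dt_zero ∝ sqrt(M) implies a log-log slope of ~0.5.
slope, _ = np.polyfit(np.log(M), np.log(dt_zero), 1)
print(f"fitted log-log slope: {slope:.2f} (expect ~0.5 under sqrt-M scaling)")
```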
Repository layout:

```
.
├── paper/              # LaTeX source files
│   ├── main.tex        # Main document
│   ├── macros.tex      # Custom macros and notation
│   ├── Makefile        # Build system
│   ├── sections/       # Individual paper sections
│   ├── tables/         # Generated tables (LaTeX + CSV)
│   └── biblio/         # Bibliography (references.bib)
├── figures/            # All paper figures
│   └── figs/           # Generated plots and diagrams
├── data/               # Experimental results
│   └── computed/       # Processed data files (CSV)
└── README.md           # This file
```
Building the paper requires:

- TeX Live 2020+ or an equivalent LaTeX distribution
- `pdflatex`, `bibtex`, and `make`
To build:

```
cd paper/
make
```

This compiles `main.tex` and produces `main.pdf`.
For a clean rebuild:

```
cd paper/
make clean
make
```

All experimental results are provided in `data/computed/`:
- `summary_*.csv`: Per-architecture summary statistics
- `merged_curves_*.csv`: Training curves for different configurations
- `accept_rtvol_*.csv`: Acceptance rate analysis
- `phase3_*.csv`: BatchNorm momentum sweep results
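A minimal loading sketch for the processed CSVs, assuming pandas; the column schemas are documented by the files themselves rather than here:

```python
import glob
import pandas as pd

# Collect every per-architecture summary into one frame. Assumes the
# repository root as the working directory; inspect columns with .head().
paths = sorted(glob.glob("data/computed/summary_*.csv"))
summary = pd.concat((pd.read_csv(p) for p in paths), ignore_index=True)
print(summary.head())
```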
Two architecture families are evaluated:

- SimpleCNN: Simple convolutional baseline (with/without BN)
- ResNet-18: Standard ResNet-18 architecture with:
  - BatchNorm (training mode)
  - BatchNorm (freeze after epoch 200)
  - GroupNorm (32 groups; see the sketch after this list)
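As an illustration of the GroupNorm configuration, the sketch below recursively swaps BatchNorm2d layers for 32-group GroupNorm. It is a generic recipe, assuming every BN layer's channel count divides by 32 (true for ResNet-18), not the paper's exact model code:

```python
import torch.nn as nn
from torchvision.models import resnet18

def bn_to_gn(module: nn.Module, num_groups: int = 32) -> nn.Module:
    # Recursively replace BatchNorm2d with GroupNorm(num_groups, C).
    # Assumes each BN layer's channel count is divisible by num_groups.
    for name, child in module.named_children():
        if isinstance(child, nn.BatchNorm2d):
            setattr(module, name, nn.GroupNorm(num_groups, child.num_features))
        else:
            bn_to_gn(child, num_groups)
    return module

model = bn_to_gn(resnet18())  # ResNet-18 with GroupNorm (32 groups)
```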
All figures are provided as publication-ready PDFs/PNGs in `figures/figs/`:

- `fig_abs_drift_*.png`: Absolute energy drift traces
- `fig_mass_slope_*.png`: Mass scaling analysis
- `fig_accept_probes*.png`: Acceptance rate visualizations
- `fig_omega_per_seed.png`: Ω-invariance per random seed
- `phase3_*.png`: Phase 3 intervention results
If you use this work, please cite:
```bibtex
@misc{giebelhaus2025geometric,
  title={Geometric Invariants of Neural Network Training: Certifying Quasi-Symplectic Dynamics},
  author={Giebelhaus, M. Axel},
  year={2025},
  publisher={Zenodo},
  doi={10.5281/zenodo.XXXXXXX},
  note={DOI will be assigned upon publication}
}
```

This paper builds on prior research investigating Hamiltonian structure in neural network optimization:
- [Paper A: Characterization work on anti-dissipative dynamics]
- [Paper C: Hamiltonian Memory - symplectic approaches to inference]
This work is licensed under the MIT License - see LICENSE for details.
The LaTeX source, experimental data, and figures are provided for reproducibility and further research.
M. Axel Giebelhaus
Independent Researcher, CHISI Research
Email: [email protected]
Location: Beech Mountain, North Carolina, USA
This research was conducted independently with computational resources generously provided through academic GPU compute grants.
Special thanks to the open-source scientific computing community for developing the tools that made this work possible: PyTorch, NumPy, Matplotlib, and the entire Python scientific stack.
All experimental configurations, hyperparameters, and random seeds are documented in the paper's Methods section. The processed data in `data/computed/` allows reproduction of all figures and tables without re-running experiments.
For questions about experimental methodology or data provenance, please contact the author.