PR #3427: NNX migration prep (1/N): pure_nnx flag and init_state_fn scaffolding (#3609)
Open

copybara-service[bot] wants to merge 1 commit into main
Imported from GitHub PR #3427

# NNX Migration Route Map

1. ✅ Add NNX support functions and utils / `pure_nnx` flag=False. Won't affect the current Linen workflow.
2. ❌ NNX fully supported. `pure_nnx` flag=False, but users can do NNX runs/tests.
3. ❌ NNX unit tests and performance tests are completed and verified. `pure_nnx` flag=True.
4. ❌ Remove Linen-code-related NNX flags.

# Description

> **Note:** This is the first in a series of NNX migration PRs. Pure NNX training is **not yet implemented** — all NNX code paths currently raise `NotImplementedError`. This PR only introduces the structural scaffolding needed for subsequent patches to plug in NNX logic without modifying shared infrastructure.

This PR introduces two abstractions that enable incremental NNX migration while keeping existing Linen code fully functional.

## `pure_nnx` config flag

A boolean (`configs/base.yml`, `configs/types.py`) that will route all major code paths — training, compilation, inference, RL, and utilities — to pure-NNX logic when `True`, falling back to Linen otherwise. Defaults to `False`, so all existing behaviour is unchanged.

## `init_state_fn`

A pluggable callable for initializing the model training state, threaded through `create_checkpoint_manager`, `setup_training_state`, `setup_decode_state`, and `get_abstract_state`. This decouples state initialization from shared infrastructure so future NNX and Linen paths can provide their own implementations without forking utilities.

## Other structural changes

- `create_training_tools` is split into `create_training_optimizer` and `create_checkpoint_manager` for cleaner separation of concerns.
- `jit_train_step` gains a `mesh` parameter to accommodate NNX callers where `model` is a `GraphDef` with no `.mesh` attribute.
- `get_shaped_inputs` in `train_compile.py` adds a `pure_nnx` branch that omits `example_rng`, matching the future NNX `train_step(state, batch)` signature.
- `get_first_step` is restored to the two-argument `(model, state)` form to support both Linen `TrainState` and NNX `TrainStateNNX` step retrieval.
- All entry points (`grpo_trainer`, `maxengine`, `generate_param_only_checkpoint`, `layerwise_quantization`, `lora_utils`, `standalone_checkpointer`, integration tests) are updated to accept and pass `init_state_fn`.

## New files

- `src/maxtext/layers/train_state_nnx.py` — NNX `TrainStateNNX` container wrapping `nnx.Module` + `nnx.Optimizer` (mirrors Linen `TrainState`).
- `src/maxtext/utils/maxtext_utils_nnx.py` — NNX-specific utilities: abstract state, named sharding helpers, and sharded model creation.

## Lint / test fixes

- `maxtext_utils.py` — remove unused `ShardMode` import.
- `maxtext_utils_test.py` — fix duplicate `Any` import; restore `Transformer` alias; update `get_abstract_state` call to the new `(config, mesh, init_state_fn)` signature.
- `sharding_compare_test.py` — add `pure_nnx=False`/`enable_nnx=False`/`pure_nnx_decoder=False` to config params; update `get_abstract_state` and `get_logical_annotations` calls to the new API.
- `state_dtypes_test.py` — update `get_abstract_state` call to the new API.

# Tests

```bash
python3 -m pytest tests/unit/train_utils_test.py -v
python3 -m pytest tests/unit/train_compile_test.py -v
python3 -m pytest tests/unit/maxtext_utils_test.py -v
python3 -m pytest tests/unit/state_dtypes_test.py -v
python3 -m pytest tests/unit/sharding_compare_test.py -v
```

# Checklist

Before submitting this PR, please make sure (put X in square brackets):

- [x] I have performed a self-review of my code. For an optional AI review, add the `gemini-review` label.
- [x] I have necessary comments in my code, particularly in hard-to-understand areas.
- [x] I have run end-to-end tests and provided workload links above if applicable.
- [x] I have made or will make corresponding changes to the docs if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in [our documentation](https://maxtext.readthedocs.io/en/latest/development.html#adding-new-documentation-files).

Copybara import of the project:

--

3d17df0 by Xibin Liu <xibin@google.com>:

NNX migration preparation: pure_nnx flag and init_state_fn

- pure_nnx: a flag to choose pure NNX logic when NNX and Linen models co-exist.
- init_state_fn: a function to initialize the model state for training. It will be set to a different function for NNX and Linen.

Merging this change closes #3427

Reverts b6bdfc4

PiperOrigin-RevId: 896668323
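The routing described above — a `pure_nnx` flag selecting which `init_state_fn` gets threaded through shared setup code — can be sketched roughly as follows. This is a hypothetical, heavily simplified illustration: the `Config`, `init_linen_state`, and `select_init_state_fn` names are invented for this sketch and are not the actual MaxText APIs.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical, simplified config; the real flag lives in configs/base.yml.
@dataclass
class Config:
  pure_nnx: bool = False

def init_linen_state(config: Config) -> dict:
  # Stand-in for the existing Linen initialization path.
  return {"backend": "linen", "step": 0}

def init_nnx_state(config: Config) -> dict:
  # In this PR the NNX path is scaffolding only and raises, as noted above.
  raise NotImplementedError("Pure NNX training lands in a later PR")

def select_init_state_fn(config: Config) -> Callable[[Config], dict]:
  # Route on the flag once; downstream code never branches on pure_nnx.
  return init_nnx_state if config.pure_nnx else init_linen_state

def setup_training_state(config: Config, init_state_fn: Callable) -> dict:
  # Shared infrastructure receives init_state_fn instead of hard-coding
  # Linen logic, so NNX and Linen paths can coexist without forking it.
  return init_state_fn(config)
```

With the flag at its default of `False`, the Linen path is selected and behaviour is unchanged; flipping it selects the not-yet-implemented NNX path.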
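The `TrainStateNNX` container and the restored two-argument `get_first_step` described above can be sketched as below. This is an assumed shape, not the real `train_state_nnx.py` code: plain `Any` fields stand in for `nnx.Module` and `nnx.Optimizer`, and only the `step` attribute that `get_first_step` relies on is exercised.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class TrainStateNNX:
  # Hypothetical mirror of Linen's TrainState, holding the NNX module and
  # optimizer directly instead of a params pytree.
  step: int
  model: Any      # stands in for an nnx.Module
  optimizer: Any  # stands in for an nnx.Optimizer

def get_first_step(model: Any, state: Any) -> int:
  # Two-argument (model, state) form: both Linen TrainState and
  # TrainStateNNX expose an integer-like `step`, so one accessor serves
  # both container types.
  return int(state.step)
```

Keeping `model` in the signature even though only `state.step` is read here preserves a single call shape across the Linen and NNX entry points.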
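The `pure_nnx` branch in `get_shaped_inputs` that the description mentions — dropping `example_rng` to match the future `train_step(state, batch)` signature — might look roughly like this. A hypothetical simplification: the real helper in `train_compile.py` builds abstract `jax.ShapeDtypeStruct` inputs and a real rng key, and the shapes here are placeholders.

```python
def get_shaped_inputs(pure_nnx: bool):
  # Illustrative sketch only; real code derives shapes from the config.
  shaped_batch = {"inputs": (8, 1024)}  # placeholder batch shapes
  if pure_nnx:
    # Future NNX signature is train_step(state, batch): no example_rng.
    return (shaped_batch,)
  example_rng = object()  # stands in for a jax.random key
  return (shaped_batch, example_rng)
```

The arity of the returned tuple is what matters for ahead-of-time compilation: the compiled NNX `train_step` will take one fewer argument than the Linen one.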