6 changes: 3 additions & 3 deletions docs/reference/core_concepts/checkpoints.md
@@ -89,7 +89,7 @@ MaxText automatically saves checkpoints periodically during a training run. Thes

- `base_output_directory`: Specifies the GCS bucket directory where checkpoints will be saved.
- `enable_checkpointing`: A boolean to enable or disable checkpointing.
- `async_checkpoint`: Support training and checkpoint saving at the same time.
- `async_checkpointing`: A boolean to enable asynchronous checkpointing, allowing training to continue while a checkpoint is being saved.
- `checkpoint_period`: The interval, in training steps, at which to save a new checkpoint.

Furthermore, MaxText supports emergency checkpointing, which saves a local copy of the checkpoint that can be restored quickly after an interruption.
@@ -99,6 +99,6 @@ Furthermore, MaxText supports emergency checkpointing, which saves a local copy
- `local_checkpoint_directory`: The local path for storing emergency checkpoints.
- `local_checkpoint_period`: The interval, in training steps, for saving local checkpoints.

More configs about checkpoints can be found in [here](https://github.com/AI-Hypercomputer/maxtext/blob/fafdeaa14183a8f5ca7b9f7b7542ce1655237574/src/MaxText/configs/base.yml#L23-L65).
Additional configs related to checkpoints can be found [here](https://github.com/AI-Hypercomputer/maxtext/blob/main/src/maxtext/configs/base.yml#L32-L88).

For practical guides on checkpointing, please refer to [](checkpointing_solutions).
For more extensive information and practical guides on checkpointing, please refer to [](checkpointing_solutions).
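Taken together, the options documented in this file could appear in a run config roughly as in the following sketch. Key names come from the docs above; all values are illustrative rather than defaults, and the bucket and local paths are hypothetical:

```yaml
# Sketch of a checkpoint-related config fragment based on the options
# documented above. Values are illustrative, not MaxText defaults.
base_output_directory: "gs://my-bucket/maxtext-runs"  # hypothetical GCS bucket
enable_checkpointing: true
async_checkpointing: true   # keep training while a checkpoint is being written
checkpoint_period: 1000     # save a checkpoint every 1000 training steps

# Emergency (local) checkpointing, for fast restore after an interruption
local_checkpoint_directory: "/tmp/maxtext_local_ckpt"  # hypothetical local path
local_checkpoint_period: 100
```

Note that the local checkpoint period is typically much smaller than the regular one, since local saves are cheap and exist to minimize lost work after an interruption; the full set of checkpoint options lives in `base.yml` as linked above.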
Loading