Conversation

@hxbai (Contributor) commented Dec 25, 2025

What does this PR do?

main branch: #2811

PR Description: Optimizer State Offloading for DistributedOptimizer

Summary

This PR introduces optimizer state offloading to CPU for the DistributedOptimizer, enabling significant GPU memory savings during training by temporarily moving optimizer states (exp_avg, exp_avg_sq) and master weights to CPU memory when not in use.

Motivation

During the forward and backward passes, optimizer states occupy GPU memory but are not actively used. For large models, these states can consume a substantial portion of GPU memory. By offloading optimizer states to CPU after optimizer.step() and reloading them before the next step, we can reclaim this GPU memory for other operations like activation checkpointing or larger batch sizes.
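
To make the timing concrete, below is a minimal, hypothetical sketch of this offload/reload pattern in plain PyTorch. It is not the DistributedOptimizer implementation in this PR; the helper names (`offload_optimizer_states`, `reload_optimizer_states`) and the per-tensor pinned buffers are illustrative assumptions.

```python
import torch

def offload_optimizer_states(optimizer, cpu_buffers):
    """Copy Adam states to pinned CPU buffers after step() and free the GPU copies."""
    for group in optimizer.param_groups:
        for param in group["params"]:
            state = optimizer.state[param]
            for key in ("exp_avg", "exp_avg_sq"):
                if state.get(key) is None:
                    continue
                gpu_tensor = state[key]
                buf = cpu_buffers.get((param, key))
                if buf is None:
                    # Pinned host memory allows fast, asynchronous D2H copies.
                    buf = torch.empty(gpu_tensor.shape, dtype=gpu_tensor.dtype, pin_memory=True)
                    cpu_buffers[(param, key)] = buf
                buf.copy_(gpu_tensor, non_blocking=True)
    torch.cuda.synchronize()  # make sure D2H copies finished before dropping GPU tensors
    for (param, key) in cpu_buffers:
        optimizer.state[param][key] = None  # release GPU memory


def reload_optimizer_states(optimizer, cpu_buffers):
    """Copy states back to the GPU before the next optimizer.step()."""
    for (param, key), buf in cpu_buffers.items():
        optimizer.state[param][key] = buf.to(param.device, non_blocking=True)
    torch.cuda.synchronize()
```

A training loop following this pattern would call `reload_optimizer_states` right before `optimizer.step()` and `offload_optimizer_states` right after it, so the states live on the CPU during the forward and backward passes.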

Performance & Memory Savings

Memory Savings:

  • On DeepSeek-V3, this feature saves 15-20 GB of GPU memory (with 0.1-0.2 s/iter overhead on GB200); a rough per-GPU estimate of the offloadable state is sketched below
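
For intuition about where savings of this magnitude come from, here is a hedged back-of-the-envelope estimate. It assumes fp32 master weights plus Adam's two fp32 moments, sharded across data-parallel ranks by the DistributedOptimizer; the function name and the exact accounting are illustrative, and actual savings depend on the parallelism layout and on which buffers the feature offloads.

```python
def offloadable_state_gb(params_on_this_rank: float, data_parallel_size: int) -> float:
    """Rough size of Adam state that could be offloaded per GPU, in GB."""
    bytes_per_param = 3 * 4  # fp32 master weights + exp_avg + exp_avg_sq
    return params_on_this_rank * bytes_per_param / data_parallel_size / 1e9
```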

Comparison with Optimizer CPU Offloading:

| Aspect | CPU Offloading (ZeRO-Offload style) | State Offloading (This PR) |
| --- | --- | --- |
| Where optimizer runs | CPU | GPU |
| D2H/H2D frequency | Every step (gradients + params) | Every step (states only) |
| Compute location | Adam step on CPU | Adam step on GPU |
| Best for | Memory-constrained, bandwidth-limited systems | High-bandwidth interconnects (NVLink, GB200) |

More details:

  • On systems with higher H2D/D2H bandwidth, such as GB200, state offloading incurs significantly less overhead
  • Transfers are issued asynchronously so data movement can overlap with computation (see the sketch after this list)
  • The optimizer step still runs on the GPU, avoiding a CPU compute bottleneck
  • Pinned host memory enables maximum PCIe/NVLink bandwidth utilization
  • Currently requires the TE FusedAdam optimizer
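
The async overlap and pinned-memory points can be illustrated with a dedicated CUDA stream. This is a minimal sketch under the assumption of a single side stream for offloading; the stream management and synchronization points in the actual PR may differ, and the helper names are hypothetical.

```python
import torch

# Queue the D2H state copy on a dedicated stream so it can overlap with
# forward/backward compute running on the default stream.
offload_stream = torch.cuda.Stream()

def async_offload(gpu_tensor, pinned_cpu_buffer):
    """Start a device-to-host copy without blocking the compute stream."""
    # Make sure the copy sees the latest values produced on the compute stream.
    offload_stream.wait_stream(torch.cuda.current_stream())
    with torch.cuda.stream(offload_stream):
        pinned_cpu_buffer.copy_(gpu_tensor, non_blocking=True)

def finish_offload():
    """Make the compute stream wait for queued copies, e.g. before freeing GPU memory."""
    torch.cuda.current_stream().wait_stream(offload_stream)
```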

⚠️ For major changes (either in lines of code or in its impact), please make sure to first share and discuss a design-doc with the team.

Contribution process

flowchart LR
    A[Pre-checks] --> B[PR Tests]
    subgraph Code Review/Approval
        C1[Expert Review] --> C2[Final Review]
    end
    B --> C1
    C2 --> D[Merge]

Pre-checks

  • I want this PR in a versioned release and have added the appropriate Milestone (e.g., Core 0.8)
  • I have added relevant unit tests
  • I have added relevant functional tests
  • I have added proper typing to my code (see the Typing guidelines)
  • I have added relevant documentation
  • I have run the autoformatter.sh on my PR

Code review

The following process is enforced via the CODEOWNERS file for changes into megatron/core. For changes outside of megatron/core, it is up to the PR author whether or not to tag the Final Reviewer team.

For MRs into `main` branch

(Step 1): Add PR label Expert Review

(Step 2): Collect the expert reviewers' reviews

  1. Attach the Expert Review label when your PR is ready for review.
  2. GitHub auto-assigns expert reviewers based on your changes. They will get notified and pick up your PR soon.

⚠️ Only proceed to the next step once all reviewers have approved, merge conflicts are resolved, and the CI is passing.
Final Review might get declined if these requirements are not fulfilled.

(Step 3): Final Review

  1. Add Final Review label
  2. GitHub auto-assigns final reviewers based on your changes. They will get notified and pick up your PR soon.

(Optional Step 4): Cherry-pick into release branch

If this PR also needs to be merged into core_r* release branches, after this PR has been merged, select Cherry-pick to open a new PR into the release branch.

For MRs into `dev` branch

The proposed review process for the `dev` branch is under active discussion.

MRs are mergeable after one approval by either [email protected] or [email protected].

Merging your PR

Any member of core-adlr and core-nemo will be able to merge your PR.

@hxbai hxbai requested review from a team as code owners December 25, 2025 10:10
copy-pr-bot bot commented Dec 25, 2025

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@github-actions github-actions bot requested a review from Phlip79 December 25, 2025 10:10
@hxbai hxbai marked this pull request as draft December 25, 2025 10:11
@hxbai hxbai self-assigned this Dec 25, 2025
@hxbai hxbai added the dev branch label (Dev branch related issues and development) Dec 25, 2025
@hxbai hxbai added this to the Core 0.16 milestone Dec 25, 2025
@hxbai (Contributor, Author) commented Dec 29, 2025

/ok to test b3f0ab3

@hxbai (Contributor, Author) commented Dec 31, 2025

/ok to test a27aa49

@hxbai (Contributor, Author) commented Dec 31, 2025

/ok to test 3ec46ba

@hxbai (Contributor, Author) commented Jan 4, 2026

/ok to test d2e4773

@hxbai (Contributor, Author) commented Jan 5, 2026

/ok to test 39d5ba5

@hxbai (Contributor, Author) commented Jan 5, 2026

/ok to test cea4340

@hxbai hxbai force-pushed the opt_state_offload branch from cea4340 to cecd4fc on January 5, 2026 14:00
@hxbai hxbai marked this pull request as ready for review January 5, 2026 14:01
@hxbai (Contributor, Author) commented Jan 5, 2026

/ok to test cecd4fc

@hxbai hxbai added the Expert Review label (indicates the PR is ready for expert review) Jan 5, 2026
@hxbai hxbai requested a review from kunlunl January 6, 2026 02:19