
Enable passing initial optimizer state while creating training session #5869

Merged: 11 commits into master on Dec 9, 2020

Conversation

ashbhandare (Contributor) commented:
This change enables passing initial state for the Adam and Lamb optimizers when creating a training session. It is the first part of the changes required to load optimizer state from a checkpoint into a model-parallel run (ZeRO/Megatron).

It also moves some common test functions into a separate training_session_test_utils.h.
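To make "initial optimizer state" concrete, here is a minimal, self-contained sketch (not ONNX Runtime's actual API; the function and state keys are hypothetical) of what an Adam optimizer's state consists of and why supplying it at creation time matters: the per-parameter moment estimates and step count change the very first update, so resuming from a checkpoint is not equivalent to starting from zeros.

```python
import math

def adam_step(param, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam update for a single scalar parameter. `state` holds the
    # first/second moment estimates and step count -- the kind of data
    # this PR allows to be supplied when the session is created, instead
    # of always starting from zeros. (Hypothetical helper, for illustration.)
    state["step"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad * grad
    m_hat = state["m"] / (1 - beta1 ** state["step"])
    v_hat = state["v"] / (1 - beta2 ** state["step"])
    return param - lr * m_hat / (math.sqrt(v_hat) + eps), state

# Fresh state (the old default) vs. state restored from a checkpoint
# (what this change enables): same parameter, same gradient,
# different update.
fresh = {"m": 0.0, "v": 0.0, "step": 0}
restored = {"m": 0.05, "v": 0.002, "step": 100}

p_fresh, _ = adam_step(1.0, 0.1, fresh)
p_restored, _ = adam_step(1.0, 0.1, restored)
print(p_fresh != p_restored)
```

Lamb carries analogous per-parameter moments, so the same argument applies to it.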

@ashbhandare force-pushed the aibhanda/load_optim_state branch 4 times, most recently from d649a2a to 82376dc on December 1, 2020 00:26
baijumeswani previously approved these changes Dec 4, 2020

baijumeswani (Contributor) commented:

This looks good to me.

thiagocrepaldi
thiagocrepaldi previously approved these changes Dec 7, 2020
edgchen1
edgchen1 previously approved these changes Dec 8, 2020
@ashbhandare dismissed stale reviews from edgchen1 and thiagocrepaldi via 479a62b on December 8, 2020 22:31
@ashbhandare force-pushed the aibhanda/load_optim_state branch from b8bed92 to 479a62b on December 8, 2020 22:31
@ashbhandare merged commit b1a75d0 into master on Dec 9, 2020
@ashbhandare deleted the aibhanda/load_optim_state branch on December 9, 2020 02:20
5 participants