
Clean up optimizer code #3587

Merged
merged 8 commits into Lightning-AI:master from feature/optimizer-cleanup on Oct 21, 2020

Conversation

@carmocca (Contributor) commented Sep 21, 2020

What does this PR do?

  • Cleans up some of the OptimizerConnector logic to avoid duplicated code.
  • Checks for extra, unsupported keys in the lr_scheduler dict. Adds a test for this warning.
  • Cleans up the init_optimizers function so configure_optimizers is only called at the end.
  • Fixes a bug where, if configure_optimizers returned ([optimizer], scheduler), trainer.optimizers became [optimizer, scheduler].
  • Updates the exception message for invalid configure_optimizers return values. Adds a test for this exception.
  • Updates the default scheduler config to include strict.
  • Improves the logic that checks for a monitor when a ReduceLROnPlateau scheduler is used, both for {"optimizer": optimizer, "lr_scheduler": scheduler, "monitor": "your_loss"} and for {"optimizer": optimizer, "lr_scheduler": {"scheduler": scheduler, "monitor": "your_loss"}}. Adds tests for these exceptions.
  • Updates the docs to reflect that we no longer support automatically monitoring val_loss.
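The ([optimizer], scheduler) bug above can be illustrated with a small normalization helper. This is a hypothetical sketch based on the PR description, not the actual Lightning implementation; the function name `normalize_configure_optimizers` and the exact set of accepted return shapes are assumptions.

```python
# Hypothetical sketch of splitting a configure_optimizers() result into
# separate optimizer and scheduler lists. NOT the real Lightning code.

def normalize_configure_optimizers(output):
    """Split a configure_optimizers() result into (optimizers, schedulers)."""
    # dict form: {"optimizer": opt, "lr_scheduler": sched, ...}
    if isinstance(output, dict):
        schedulers = [output["lr_scheduler"]] if "lr_scheduler" in output else []
        return [output["optimizer"]], schedulers
    # two-element tuple: (optimizers, schedulers); each side may be a single
    # object or a list -- this is where ([optimizer], scheduler) must NOT be
    # flattened into a single optimizer list
    if isinstance(output, tuple) and len(output) == 2:
        opts, scheds = output
        opts = list(opts) if isinstance(opts, (list, tuple)) else [opts]
        scheds = list(scheds) if isinstance(scheds, (list, tuple)) else [scheds]
        return opts, scheds
    # plain list of optimizers, no schedulers
    if isinstance(output, (list, tuple)):
        return list(output), []
    # single bare optimizer
    return [output], []
```

With this split, a ([optimizer], scheduler) return yields optimizers == [optimizer] and schedulers == [scheduler], instead of the buggy combined [optimizer, scheduler].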

No issue to tag. Just some stuff I found while coding #3586
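The extra-key warning and the strict default mentioned in the description can be sketched as follows. The key names mirror the PR description ("scheduler", "monitor", "strict"); the default values, the "interval"/"frequency" keys, and the helper name are illustrative assumptions, not the actual Lightning code.

```python
import warnings

# Illustrative defaults for a scheduler config; "strict" decides whether a
# missing monitored metric should raise an error (True) or only warn (False).
# These names and values are assumptions based on the PR description.
_DEFAULT_SCHEDULER_CONFIG = {
    "interval": "epoch",
    "frequency": 1,
    "monitor": None,
    "strict": True,
}

def validate_scheduler_dict(sched_dict):
    """Warn on unsupported keys and fill in defaults (sketch only)."""
    allowed = set(_DEFAULT_SCHEDULER_CONFIG) | {"scheduler"}
    extra = set(sched_dict) - allowed
    if extra:
        warnings.warn(
            f"Found unsupported keys in the lr scheduler dict: {sorted(extra)}"
        )
    config = dict(_DEFAULT_SCHEDULER_CONFIG)
    config.update({k: v for k, v in sched_dict.items() if k in allowed})
    return config
```

For example, validate_scheduler_dict({"scheduler": sched, "foo": 1}) warns about "foo", drops it, and returns a config in which strict defaults to True.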

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typo or docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

@mergify mergify bot requested a review from a team September 21, 2020 16:55
@carmocca carmocca changed the title [blocked by #3586] Clean up code optimizer code [blocked by #3586] Clean up optimizer code Sep 23, 2020
@mergify bot commented Sep 28, 2020

This pull request is now in conflict... :(

@edenlightning (Contributor) commented

Hi @carmocca! Mind rebasing your PR?

@carmocca (Contributor, Author) commented

> Hi @carmocca! Mind rebasing your PR?

Waiting for #3586. This PR is low priority.

@carmocca carmocca marked this pull request as draft October 21, 2020 12:41
@pep8speaks commented Oct 21, 2020

Hello @carmocca! Thanks for updating this PR.

Line 116:121: E501 line too long (121 > 120 characters)

Comment last updated at 2020-10-21 18:08:24 UTC

@carmocca carmocca changed the title [blocked by #3586] Clean up optimizer code Clean up optimizer code Oct 21, 2020
@codecov bot commented Oct 21, 2020

Codecov Report

Merging #3587 into master will decrease coverage by 2%.
The diff coverage is 94%.

@@           Coverage Diff           @@
##           master   #3587    +/-   ##
=======================================
- Coverage      93%     91%    -2%     
=======================================
  Files         103     103            
  Lines        7847    8041   +194     
=======================================
+ Hits         7282    7333    +51     
- Misses        565     708   +143     

@carmocca carmocca marked this pull request as ready for review October 21, 2020 16:02
@carmocca (Contributor, Author) commented

@edenlightning @Borda rebased, unblocked and ready to review! 😄

@awaelchli (Contributor) left a comment

nice

tests/trainer/test_optimizers.py (review thread, resolved)
@mergify mergify bot requested a review from a team October 21, 2020 16:19
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Review threads (resolved):
  • tests/trainer/test_optimizers.py (×6)
  • pytorch_lightning/trainer/optimizers.py (×2)
@mergify mergify bot requested a review from a team October 21, 2020 17:52
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
@rohitgr7 (Contributor) left a comment

nice!

@mergify mergify bot requested a review from a team October 21, 2020 18:31
@rohitgr7 rohitgr7 added the "feature" (Is an improvement or enhancement) label and removed the "Blocked on ..." label Oct 21, 2020
@SkafteNicki (Member) left a comment

LGTM

@SkafteNicki SkafteNicki merged commit 2549ca4 into Lightning-AI:master Oct 21, 2020
@carmocca carmocca deleted the feature/optimizer-cleanup branch October 21, 2020 19:50
@carmocca carmocca self-assigned this Nov 1, 2023
@mergify mergify bot added the ready PRs ready to be merged label Nov 1, 2023
Labels
  • feature (Is an improvement or enhancement)
  • ready (PRs ready to be merged)
  • refactor
7 participants