ref: bug fix with logging val epoch end + monitor #3812
Conversation
Hello @williamFalcon! Thanks for updating this PR.
Comment last updated at 2020-10-03 15:44:45 UTC
There are test failures related to the order in which metrics need to be logged. Does swapping the order here override what's sent to the logger?
The order doesn't affect the logger. It affects the case where you log a metric in training and then use it during validation, e.g. for monitoring. In that case you want to make sure you have the latest value (not the value from a batch earlier), and in some cases the metrics may not even be available yet for the val loop.
m = "ReduceLROnPlateau requires returning a dict from configure_optimizers with the keyword " \ | ||
"monitor=. For example:" \ | ||
"return {'optimizer': optimizer, 'lr_scheduler': scheduler, 'monitor': 'your_loss'}" | ||
raise MisconfigurationException(m) |
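For context, here is a minimal sketch of what the error message asks for, written against a standard LightningModule. The model, optimizer settings, and the 'train_loss' metric name are illustrative choices, not taken from this PR:

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        # log the metric the scheduler will monitor
        self.log('train_loss', loss)
        return loss

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer)
        # ReduceLROnPlateau cannot infer which logged metric to watch,
        # so the returned dict must name it via the 'monitor' key
        return {'optimizer': optimizer, 'lr_scheduler': scheduler, 'monitor': 'train_loss'}
```

Omitting the 'monitor' key from that dict is exactly what triggers the MisconfigurationException shown above.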
You can do `raise MisconfigurationException(m) from e` to preserve the stack trace better.
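A self-contained, hypothetical illustration of that suggestion; the try/except context, the metric lookup, and the local stand-in exception class are invented for the example, and only the `raise ... from e` form is the point:

```python
class MisconfigurationException(Exception):
    """Local stand-in for the library's exception class, kept here so the example runs on its own."""


def lookup_monitor(metrics: dict, monitor: str):
    try:
        return metrics[monitor]
    except KeyError as e:
        m = f"Metric '{monitor}' was never logged, so it cannot be monitored."
        # 'from e' stores the original KeyError as __cause__, so both
        # tracebacks are shown instead of only the re-raised one.
        raise MisconfigurationException(m) from e
```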
@@ -94,13 +95,18 @@ def init_optimizers(
             ' a list of `torch.optim.lr_scheduler`'
             ' * multiple outputs, dictionaries as described with an optional `frequency` key (int)')

-    def configure_schedulers(self, schedulers: list):
+    def configure_schedulers(self, schedulers: list, monitor: str = None):
monitor: Optional[str] = None
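A minimal sketch of that suggestion; the enclosing class name is a hypothetical stand-in for the class that owns `configure_schedulers`:

```python
from typing import Optional


class SchedulerConfigurator:  # hypothetical stand-in for the real owner of configure_schedulers
    def configure_schedulers(self, schedulers: list, monitor: Optional[str] = None):
        # Optional[str] documents that `monitor` may legitimately be None,
        # which a bare `monitor: str = None` annotation does not convey.
        ...
```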
ref: bug fix with logging val epoch end + monitor
Fixes #3811