
[Enhance] Support passing arguments with update_params #796

Merged
merged 3 commits into from
Dec 8, 2022

Conversation

twmht
Contributor

@twmht twmht commented Dec 7, 2022

Motivation

optimizer.step() supports kwargs (https://github.com/open-mmlab/mmengine/blob/main/mmengine/optim/optimizer/optimizer_wrapper.py#L211), but update_params does not (https://github.com/open-mmlab/mmengine/blob/main/mmengine/optim/optimizer/optimizer_wrapper.py#L176).

This would be a problem if I want to write my own model wrapper.

Modification

Add kwargs to update_params
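A minimal sketch of the proposed change, using a hypothetical stand-in class (the real mmengine OptimWrapper wraps a torch optimizer and also handles gradient accumulation; this is not the actual implementation):

```python
# Hypothetical simplified stand-in for mmengine's OptimWrapper, showing the
# proposed change: update_params accepts **kwargs and forwards them to step.

class SimpleOptimWrapper:
    def __init__(self, optimizer):
        self.optimizer = optimizer

    def backward(self, loss):
        # The real OptimWrapper calls loss.backward(); elided here so the
        # sketch has no torch dependency.
        pass

    def step(self, **kwargs):
        # Forward extra kwargs (e.g. a closure) to the wrapped optimizer.
        self.optimizer.step(**kwargs)

    def zero_grad(self, **kwargs):
        self.optimizer.zero_grad(**kwargs)

    def update_params(self, loss, **kwargs):
        # Before this PR the signature was update_params(self, loss), so there
        # was no way to reach optimizer.step's kwargs from a model wrapper.
        self.backward(loss)
        self.step(**kwargs)
        self.zero_grad()
```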

BC-breaking (Optional)

Does the modification introduce changes that break the backward-compatibility of the downstream repos?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.

Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases here, and update the documentation.

Checklist

  1. Pre-commit or other linting tools are used to fix the potential lint issues.
  2. The modification is covered by complete unit tests. If not, please add more unit tests to ensure the correctness.
  3. If the modification has potential influence on downstream projects, this PR should be tested with downstream projects, like MMDet or MMCls.
  4. The documentation has been modified accordingly, like docstring or example tutorials.

@twmht twmht requested a review from HAOCHENYE as a code owner December 7, 2022 07:06
@twmht twmht force-pushed the add_kwargs_to_optim_wrapper branch from 89d99e4 to e6e4a6a Compare December 7, 2022 07:10
zhouzaida
zhouzaida previously approved these changes Dec 7, 2022
Collaborator

@HAOCHENYE HAOCHENYE left a comment


Thanks for your contribution! We should also update OptimWrapper.update_params.

@@ -161,7 +161,7 @@ def __init__(self,
         # the loss factor will always be the same as `_accumulative_counts`.
         self._remainder_counts = -1
 
-    def update_params(self, loss: torch.Tensor) -> None:
+    def update_params(self, loss: torch.Tensor, **kwargs) -> None:
Collaborator


Suggested change
-    def update_params(self, loss: torch.Tensor, **kwargs) -> None:
+    def update_params(self, loss: torch.Tensor, step_kwargs=None, zero_grad=None) -> None:

Considering that update_params calls step and zero_grad in order, it would be better to provide step_kwargs and zero_kwargs arguments.

BTW, we should also update the descriptions of the arguments in the docstring.
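The shape the reviewer suggests could look like the following sketch (a hypothetical simplified wrapper, not the actual mmengine code): each dict is forwarded wholesale to the corresponding call.

```python
# Sketch of update_params with separate kwarg dicts for step and zero_grad,
# as suggested in the review (simplified; no gradient accumulation, no torch).

class KwargsOptimWrapper:
    def __init__(self, optimizer):
        self.optimizer = optimizer

    def backward(self, loss):
        pass  # loss.backward() in the real implementation

    def step(self, **kwargs):
        self.optimizer.step(**kwargs)

    def zero_grad(self, **kwargs):
        self.optimizer.zero_grad(**kwargs)

    def update_params(self, loss, step_kwargs=None, zero_kwargs=None):
        # Mutable-default pitfall avoided: dicts default to None, not {}.
        step_kwargs = {} if step_kwargs is None else step_kwargs
        zero_kwargs = {} if zero_kwargs is None else zero_kwargs
        self.backward(loss)
        self.step(**step_kwargs)
        self.zero_grad(**zero_kwargs)
```

Keeping two dicts (rather than one flat **kwargs) avoids ambiguity about which of the two downstream calls an argument is meant for.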

@twmht twmht force-pushed the add_kwargs_to_optim_wrapper branch 6 times, most recently from 63267a7 to 97e2c43 Compare December 8, 2022 03:04
Support step arguments and zero arguments with update_params
@twmht twmht force-pushed the add_kwargs_to_optim_wrapper branch from 97e2c43 to 9ad7e87 Compare December 8, 2022 03:29
Comment on lines 172 to 173
step_kwargs (dict): arguments for optimizer.step
zero_kwargs (dict): arguments for optimizer.zero_grad
Collaborator


Suggested change
-        step_kwargs (dict): arguments for optimizer.step
-        zero_kwargs (dict): arguments for optimizer.zero_grad
+        step_kwargs (dict, optional): Arguments for optimizer.step.
+            Defaults to None.
+            New in version v0.4.0.
+        zero_kwargs (dict, optional): Arguments for optimizer.zero_grad.
+            Defaults to None.
+            New in version v0.4.0.
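With these docstrings applied, a downstream caller could pass per-call options along the following lines. This is an illustrative sketch: the function and variable names are placeholders, though `set_to_none` is a real argument of torch.optim's zero_grad and `closure` is the standard argument of optimizer.step.

```python
# Illustrative usage of the discussed API (names are placeholders; assumes an
# OptimWrapper-style object exposing the update_params signature above).

def train_step(model_forward, optim_wrapper, batch):
    loss = model_forward(batch)
    optim_wrapper.update_params(
        loss,
        step_kwargs={},                     # could carry e.g. a closure for LBFGS
        zero_kwargs={"set_to_none": True},  # torch's zero_grad option
    )
    return loss
```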

@twmht
Contributor Author

twmht commented Dec 8, 2022

@HAOCHENYE

I have updated the code as you suggested. Thank you!

@zhouzaida zhouzaida merged commit 381c5f1 into open-mmlab:main Dec 8, 2022