[Feature] Support Adafactor Optimizer #1361

Merged 3 commits on Sep 21, 2023

Conversation

@okotaku (Contributor) commented on Sep 21, 2023

Motivation

The transformers library provides the Adafactor optimizer, which is often used when training Stable Diffusion XL.

Use cases (Optional)

pip install transformers

optim_wrapper = dict(
    optimizer=dict(
        type='Adafactor',
        lr=1e-5,
        weight_decay=1e-2,
        scale_parameter=False,
        relative_step=False))
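
For reference, the config above resolves to the Adafactor class shipped in transformers. Below is a minimal sketch of the equivalent direct construction; the `model` here is a hypothetical placeholder, any torch.nn.Module works:

import torch.nn as nn
from transformers.optimization import Adafactor

# Hypothetical model, for illustration only.
model = nn.Linear(16, 4)

# Mirrors the config above: a fixed learning rate requires
# relative_step=False, and scale_parameter=False disables scaling
# the update by the parameters' root-mean-square.
optimizer = Adafactor(
    model.parameters(),
    lr=1e-5,
    weight_decay=1e-2,
    scale_parameter=False,
    relative_step=False)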

Checklist

  1. Pre-commit or other linting tools were used to fix potential lint issues.
  2. The modification is covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  3. If the modification may affect downstream projects, this PR should be tested with downstream projects such as MMDet or MMCls.
  4. The documentation has been modified accordingly, e.g. docstrings or example tutorials.

Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
@zhouzaida merged commit d617bca into open-mmlab:main on Sep 21, 2023
@fanqiNO1 mentioned this pull request on Oct 9, 2023