[Feature] Add colossalai strategy #1299

Merged (14 commits, Aug 18, 2023)
Apply suggestions from code review
zhouzaida authored Aug 18, 2023
commit 03f3bce474b753e025b5d1cdad806d927f3dde76
20 changes: 10 additions & 10 deletions mmengine/_strategy/colossalai.py
@@ -72,24 +72,24 @@ class ColossalAIOpitmWrapper(OptimWrapper):
"""OptimWrapper for ColossalAI.

The available optimizers are:
- CPUAdam
- FusedAdam
- FusedLAMB
- FusedSGD
- HybridAdam
- Lamb
- Lars
- CPUAdam
- FusedAdam
- FusedLAMB
- FusedSGD
- HybridAdam
- Lamb
- Lars

You can find more details in the `colossalai tutorial`_

.. _colossalai tutorial: https://github.com/hpcaitech/ColossalAI/tree/main/colossalai/nn/optimizer

Args:
optimizer (dict or collossal.booster.Booster): The optimizer to be
wrapped.
accumulative_counts (int): The number of iterations to accumulate
gradients. The parameters will be updated per
``accumulative_counts``.

.. _colossalai tutorial: https://github.com/hpcaitech/ColossalAI/tree/main/colossalai/nn/optimizer
""" # noqa: E501

    def __init__(self,
@@ -309,7 +309,7 @@ def prepare(
        optim_wrapper_type = OPTIM_WRAPPERS.get(optim_wrapper['type'])
        if optim_wrapper_type is None:
            raise ValueError(
                f'Failed to find {optim_wrapper["type"]} in `OPTIM_WRAPPERS`.')
        if 'clip_grad' in optim_wrapper:
            raise ValueError('Please configure `clip_grad` in `plugin`')
        if not issubclass(optim_wrapper_type, ColossalAIOpitmWrapper):
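For context, a minimal usage sketch (not part of this diff) of how the wrapper could be selected through `OPTIM_WRAPPERS` in an mmengine config. The registry key is assumed to match the class name shown in this diff, and the `HybridAdam` settings are illustrative only:

# Hedged sketch, not from this PR: a config dict that prepare() could resolve.
# The registry key and optimizer settings below are assumptions for illustration.
optim_wrapper = dict(
    type='ColossalAIOpitmWrapper',  # resolved via OPTIM_WRAPPERS.get(...)
    optimizer=dict(type='HybridAdam', lr=1e-3),
    accumulative_counts=2,  # accumulate gradients; update params every 2 iters
)

Note that, per the check above, `clip_grad` must not appear in this dict: gradient clipping is configured through the ColossalAI plugin instead, presumably because the booster owns the (possibly sharded) gradients.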