Optimizer is None when trying to finetune a pretrained model #28
Thanks for the repo!
When trying to finetune one of the provided pretrained models, I got an unintuitive error. The cause: the pretrained models were saved without optimizer state, so when loading the checkpoint, the check in line 76 of training/trainer.py wouldn't stop it from loading the optimizer, because `checkpoint['optimizer']` existed in the dict with a `None` value:
```python
optimizer = Adam(model.parameters())
if 'optimizer' in checkpoint:
    optimizer.load_state_dict(checkpoint['optimizer'])
    for g in optimizer.param_groups:
        g['lr'] = config['training']['learning_rate']
```
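For illustration, a minimal standalone reproduction of the failure mode (the checkpoint dict here is constructed by hand to mirror what the pretrained checkpoints apparently contain):

```python
import torch
from torch.optim import Adam

model = torch.nn.Linear(4, 4)
# Pretrained checkpoints store the key, but with a None value:
checkpoint = {'optimizer': None}

optimizer = Adam(model.parameters())
if 'optimizer' in checkpoint:  # True, even though the value is None
    optimizer.load_state_dict(checkpoint['optimizer'])
    # fails with an unintuitive error (a TypeError, since None is not a state dict)
```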
Changing the line to `if 'optimizer' in checkpoint and checkpoint['optimizer']:` should fix it.
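For reference, a sketch of what the patched block could look like, assuming the surrounding code is exactly as quoted above:

```python
optimizer = Adam(model.parameters())
# Guard against checkpoints that contain the key but carry no actual state:
if 'optimizer' in checkpoint and checkpoint['optimizer']:
    optimizer.load_state_dict(checkpoint['optimizer'])
    for g in optimizer.param_groups:
        g['lr'] = config['training']['learning_rate']
```

Depending on the intent, the lr override could also be moved outside the guard, so that the configured finetuning learning rate applies even when no optimizer state is restored.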
cschaefer26 commented on Feb 23, 2023
Hi, thanks for the hint. I will update this if I have time :)