Thanks for the repo!

When trying to fine-tune one of the provided pretrained models, I was getting an unintuitive error. The cause is that these models were saved without optimizer state, so when the checkpoint is loaded, the check at line 76 in `training/trainer.py` does not prevent loading the optimizer: `checkpoint['optimizer']` exists in the dict, but with a `None` value.
```python
optimizer = Adam(model.parameters())
if 'optimizer' in checkpoint:
    optimizer.load_state_dict(checkpoint['optimizer'])
    for g in optimizer.param_groups:
        g['lr'] = config['training']['learning_rate']
```
Changing the check to `if 'optimizer' in checkpoint and checkpoint['optimizer']:` should fix it.
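To make the failure mode concrete, here is a minimal standalone sketch (the `checkpoint` dict below is hypothetical, standing in for what `torch.load()` returns for these pretrained models): a key test alone passes even when the stored value is `None`, while the proposed guard correctly skips loading.

```python
# Hypothetical checkpoint dict: models saved without optimizer state
# still carry the 'optimizer' key, mapped to None.
checkpoint = {'model_state': {}, 'optimizer': None}

# Original guard: the membership test alone is True despite no state.
# Calling optimizer.load_state_dict(None) at this point raises the
# unintuitive error described above.
key_present = 'optimizer' in checkpoint

# Proposed guard: additionally require a truthy (non-None) value.
should_load = 'optimizer' in checkpoint and bool(checkpoint['optimizer'])

print(key_present, should_load)  # True False
```

Relying on truthiness also skips an empty optimizer dict, which is the desired behavior here since there is nothing to load in that case either.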