I have the opposite situation: the checkpoint's classification head is smaller than the one in the model I am building.
```
RuntimeError: Error(s) in loading state_dict for UPDETR:
	size mismatch for class_embed.weight: copying a param with shape torch.Size([3, 256]) from checkpoint, the shape in current model is torch.Size([92, 256]).
	size mismatch for class_embed.bias: copying a param with shape torch.Size([3]) from checkpoint, the shape in current model is torch.Size([92]).
```
How can I fix it?
Originally posted by @liuchengying758650786 in #13 (comment)
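
In DETR-style models the classification head is a linear layer with `num_classes + 1` outputs (the extra output is the no-object class), so a `[3, 256]` `class_embed.weight` means the checkpoint was fine-tuned with 2 classes, while the model being built expects 92 outputs (COCO's 91 classes plus no-object). Two usual ways out: rebuild the model with the same number of classes the checkpoint was trained with (if your copy of `main.py` exposes such an argument), or drop the `class_embed.*` entries from the checkpoint and load the rest with `strict=False` so the head is re-initialized for the new label set. Below is a minimal sketch of the second option, not the official UP-DETR recipe; the checkpoint path is a placeholder, `model` is assumed to be the `UPDETR` instance you already built, and the weights are assumed to live under the `"model"` key as in DETR-style checkpoints.

```python
import torch

# Load the checkpoint on CPU; fall back to the raw dict if there is no "model" key.
checkpoint = torch.load("checkpoint.pth", map_location="cpu")
state_dict = checkpoint.get("model", checkpoint)

# Drop the classification head so the 3-way head from the checkpoint is not
# forced onto the 92-way head of the current model.
for key in ("class_embed.weight", "class_embed.bias"):
    state_dict.pop(key, None)

# strict=False lets every other weight load; class_embed stays randomly
# initialized and can be trained on the new dataset.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)        # should list only class_embed.*
print("unexpected keys:", unexpected)  # should be empty
```

If you instead need the 3-way head from the checkpoint (e.g. to evaluate on the original 2-class dataset), build the model with that class count rather than the 91-class default, and the shapes will match without any key surgery.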