SwAV in VISSL

Unsupervised Learning of Visual Features by Contrasting Cluster Assignments

Mathilde Caron, Ishan Misra, Julien Mairal, Priya Goyal, Piotr Bojanowski, Armand Joulin

[SwAV] [arXiv] [BibTeX]

SwAV Illustration

In this repository, we implement SwAV in VISSL. To train a model, use the configs specified here.

Model Zoo

To use a pre-trained SwAV ResNet-50 model, simply do:

```python
import torch

# Load a SwAV-pre-trained ResNet-50 backbone from torch hub.
model = torch.hub.load('facebookresearch/swav', 'resnet50')
```

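Once the model is loaded, a minimal feature-extraction sketch might look like the following. The image path and the ImageNet preprocessing are assumptions, and replacing the final `fc` layer with an identity assumes the hub model follows the standard torchvision ResNet-50 interface (the README only states that checkpoints are in torchvision format):

```python
import torch
from PIL import Image
from torchvision import transforms

# Load the pre-trained backbone as above and switch to inference mode.
model = torch.hub.load('facebookresearch/swav', 'resnet50')
model.eval()
# Assumption: the hub model follows the torchvision ResNet-50 interface, so
# replacing the final classification layer yields 2048-d pooled features.
model.fc = torch.nn.Identity()

# Standard ImageNet preprocessing; an assumption rather than something this
# README specifies.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

img = Image.open('example.jpg').convert('RGB')  # hypothetical image path
batch = preprocess(img).unsqueeze(0)            # shape (1, 3, 224, 224)

with torch.no_grad():
    features = model(batch)                     # shape (1, 2048)
```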
We provide several baseline SwAV pre-trained models with the ResNet-50 architecture in torchvision format. We also provide models pre-trained with DeepCluster-v2 and SeLa-v2, obtained by applying improvements from the self-supervised community to DeepCluster and SeLa (see the appendix of our paper for details).

| method | epochs | batch-size | multi-crop | ImageNet top-1 acc. | url |
| --- | --- | --- | --- | --- | --- |
| SwAV | 800 | 4096 | 2x224 + 6x96 | 75.3 | model |
| SwAV | 400 | 4096 | 2x224 + 6x96 | 74.6 | model |
| SwAV | 200 | 4096 | 2x224 + 6x96 | 73.9 | model |
| SwAV | 100 | 4096 | 2x224 + 6x96 | 72.1 | model |
| SwAV | 200 | 256 | 2x224 + 6x96 | 72.7 | model |
| SwAV | 400 | 256 | 2x224 + 6x96 | 74.3 | model |
| SwAV | 400 | 4096 | 2x224 | 70.1 | model |
| DeepCluster-v2 | 800 | 4096 | 2x224 + 6x96 | 75.2 | model |
| DeepCluster-v2 | 400 | 4096 | 2x160 + 4x96 | 74.3 | model |
| DeepCluster-v2 | 400 | 4096 | 2x224 | 70.2 | model |
| SeLa-v2 | 400 | 4096 | 2x160 + 4x96 | 71.8 | model |
| SeLa-v2 | 400 | 4096 | 2x224 | 67.2 | model |
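Since the checkpoints above are provided in torchvision format, a minimal sketch for loading a downloaded checkpoint into a torchvision ResNet-50 might look like this. The file name is hypothetical, and `strict=False` is an assumption to tolerate a classification head that differs from a supervised checkpoint:

```python
import torch
import torchvision.models as models

# Hypothetical local path: substitute whichever checkpoint you downloaded
# from the table above.
state_dict = torch.load('swav_checkpoint.pth.tar', map_location='cpu')

model = models.resnet50()
# strict=False tolerates keys that differ from a supervised checkpoint
# (e.g. a missing or extra classification head); this is an assumption,
# not something the README guarantees.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print('missing keys:', missing)
print('unexpected keys:', unexpected)
```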

Larger architectures

We provide SwAV models based on ResNet-50 networks whose width is multiplied by factors of 2, 4, and 5.

| network | parameters | epochs | ImageNet top-1 acc. | url |
| --- | --- | --- | --- | --- |
| RN50-w2 | 94M | 400 | 77.3 | model |
| RN50-w4 | 375M | 400 | 77.9 | model |
| RN50-w5 | 586M | 400 | 78.5 | model |
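The wider variants can presumably also be loaded through torch hub. The entry-point names (e.g. 'resnet50w2') are an assumption here, so this sketch first lists what the repository actually exposes:

```python
import torch

# Inspect which entry points the SwAV hub repository exposes.
print(torch.hub.list('facebookresearch/swav'))

# Assuming a 'resnet50w2' entry point exists, load the 2x-wide model;
# swap in 'resnet50w4' or 'resnet50w5' for the other variants.
model_w2 = torch.hub.load('facebookresearch/swav', 'resnet50w2')
model_w2.eval()
```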

Citing SwAV

If you use SwAV, please use the following BibTeX entry.

```bibtex
@inproceedings{caron2020unsupervised,
  title={Unsupervised Learning of Visual Features by Contrasting Cluster Assignments},
  author={Caron, Mathilde and Misra, Ishan and Mairal, Julien and Goyal, Priya and Bojanowski, Piotr and Joulin, Armand},
  booktitle={Proceedings of Advances in Neural Information Processing Systems (NeurIPS)},
  year={2020}
}
```