SparK: the first successful BERT-style pre-training on any convolutional network

This is an official implementation of the paper "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling" (submitted to OpenReview for ICLR 2023 in October 2022).

What's new here?

🔥 On ResNets, generative pre-training surpasses contrastive learning for the first time.

🔥 ConvNeXt gains more from pre-training than Swin Transformer, by up to +3.5 points.

🔥 Larger models benefit more from SparK pre-training, showing a scaling behavior.

🔥 Pre-trained models can make reasonable predictions.

See our paper for more analysis, discussions, and evaluations.
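
To make the idea concrete, here is a minimal, hypothetical sketch of BERT-style masked image modeling on a plain ConvNet. It is not the SparK implementation (which uses sparse convolutions and a hierarchical decoder); it only illustrates the general recipe of masking random patches, encoding the corrupted image, reconstructing it, and computing the loss on the masked regions only. All module and function names here are illustrative.

import torch
import torch.nn as nn

def random_patch_mask(batch, height, width, patch=32, mask_ratio=0.6):
    # Return a (batch, 1, height, width) binary mask; 1 marks a masked patch.
    gh, gw = height // patch, width // patch
    k = int(mask_ratio * gh * gw)
    idx = torch.rand(batch, gh * gw).topk(k, dim=1).indices
    grid = torch.zeros(batch, gh * gw).scatter_(1, idx, 1.0).view(batch, 1, gh, gw)
    return grid.repeat_interleave(patch, 2).repeat_interleave(patch, 3)

class TinyMaskedAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(                      # stand-in for a ResNet/ConvNeXt backbone
            nn.Conv2d(3, 64, 4, stride=4), nn.GELU(),
            nn.Conv2d(64, 128, 2, stride=2), nn.GELU(),
        )
        self.decoder = nn.Sequential(                      # lightweight reconstruction head
            nn.ConvTranspose2d(128, 64, 2, stride=2), nn.GELU(),
            nn.ConvTranspose2d(64, 3, 4, stride=4),
        )

    def forward(self, images, mask):
        corrupted = images * (1 - mask)                    # zero out the masked patches
        recon = self.decoder(self.encoder(corrupted))
        # BERT-style objective: reconstruction loss on the masked pixels only
        return ((recon - images) ** 2 * mask).sum() / mask.sum().clamp(min=1)

images = torch.randn(2, 3, 224, 224)                       # fake ImageNet-sized batch
mask = random_patch_mask(2, 224, 224)
loss = TinyMaskedAutoencoder()(images, mask)
loss.backward()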

Catalog

  • Pre-training code
  • Fine-tuning code
  • Colab playground
  • Inference and visualization demo

Install

Check INSTALL.md to install all dependencies. Our implementation is based on torch==1.10.0+cu113, torchvision==0.11.1+cu113, and timm==0.5.4. An external sparse convolution framework is only an optional dependency.
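
After following INSTALL.md, a quick sanity check of the environment can look like the snippet below; the expected values in the comments simply restate the pinned versions above.

import torch, torchvision, timm

print(torch.__version__)          # expected: 1.10.0+cu113
print(torchvision.__version__)    # expected: 0.11.1+cu113
print(timm.__version__)           # expected: 0.5.4
print(torch.cuda.is_available())  # should be True for the CUDA 11.3 builds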

Pre-training

See PRETRAIN.md to pre-train models on ImageNet.

Fine-tuning

  • Models on ImageNet: after installation, check downstream_imagenet for subsequent instructions (a toy checkpoint-loading sketch follows this list).
  • ResNets on COCO: install detectron2 and see downstream_d2 for more details.
  • ConvNeXts on COCO: install mmcv and mmdetection, then see downstream_mmdet for more details.
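
As a rough, hypothetical illustration of the ImageNet fine-tuning setup (the actual loading logic and checkpoint format are documented in downstream_imagenet), a pre-trained backbone could be loaded into a timm model as below; the checkpoint filename is an assumption.

import timm
import torch

model = timm.create_model('resnet50', pretrained=False, num_classes=1000)
state = torch.load('resnet50_spark_pretrained.pth', map_location='cpu')   # hypothetical file
# Pre-training checkpoints typically contain encoder weights only, so allow
# missing keys (e.g. the new classification head) when loading.
missing, unexpected = model.load_state_dict(state, strict=False)
print(len(missing), 'missing keys;', len(unexpected), 'unexpected keys')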

Acknowledgement

We referred heavily to these useful codebases:

We also appreciate these elegant frameworks:

License

This project is under the CC-BY 4.0 license. See LICENSE for more details.

Citation

If you found this project useful, please consider giving it a star ⭐ or citing us 📖:

@Article{tian2023designing,
  author  = {Keyu Tian and Yi Jiang and Qishuai Diao and Chen Lin and Liwei Wang and Zehuan Yuan},
  title   = {Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling},
  journal = {arXiv:2301.03580},
  year    = {2023},
}