SparK🔥: "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
Introduction
This is an official implementation of the paper "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling". We will be updating this repository frequently, so consider starring ⭐ or watching 👓 it to get the latest information. Updates including downstream implementations, a Colab tutorial, and inference and visualization code will come soon!
In this work we design a BERT-style pre-training framework (a.k.a. masked image modeling) for any hierarchical (multi-scale) convnet. It gathers all unmasked patches into a sparse image and encodes it with sparse convolution; a hierarchical decoder is then applied to reconstruct all masked pixels. The method is general and powerful: it can be applied directly to any convolutional backbone, from classical ResNets to modern ConvNeXts, and brings a leap in their performance.
See our paper for more analysis, discussion, and evaluation.
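To make the idea above concrete, here is a toy sketch of the pre-training scheme. It is not the code in this repository: it hides a random set of patches, emulates sparse convolution by zeroing masked positions after every encoder stage, and trains a small decoder to reconstruct only the masked pixels. The `TinyHierarchicalEncoder`, `TinyDecoder`, patch size, and mask ratio are made-up placeholders; the actual implementation (see encoder.py, decoder.py, and spark.py) uses real sparse convolution and a hierarchical decoder over multiple feature scales.

```python
# Toy sketch of the pre-training idea (not the repository's implementation).
# Sparse convolution is emulated by re-applying the patch mask after every
# stage, so masked regions never leak into the encoder features.
import torch
import torch.nn as nn
import torch.nn.functional as F

patch = 32        # mask granularity on the input image (placeholder)
mask_ratio = 0.6  # fraction of hidden patches (placeholder)

def random_patch_mask(b, h, w, ratio, device):
    """1 = keep, 0 = masked; drawn on the patch grid, then upsampled."""
    gh, gw = h // patch, w // patch
    keep = (torch.rand(b, 1, gh, gw, device=device) > ratio).float()
    return F.interpolate(keep, size=(h, w), mode="nearest")

class TinyHierarchicalEncoder(nn.Module):
    """Two downsampling stages standing in for a real ResNet/ConvNeXt."""
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 32, 3, 2, 1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(32, 64, 3, 2, 1), nn.ReLU())

    def forward(self, x, keep):
        feats = []
        for stage in (self.stage1, self.stage2):
            x = stage(x)
            keep = F.interpolate(keep, size=x.shape[-2:], mode="nearest")
            x = x * keep  # emulate sparsity: zero out masked positions
            feats.append(x)
        return feats

class TinyDecoder(nn.Module):
    """Dense decoder; the real one fuses several feature scales."""
    def __init__(self):
        super().__init__()
        self.up = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, 2, 1),
        )

    def forward(self, feats):
        return self.up(feats[-1])

enc, dec = TinyHierarchicalEncoder(), TinyDecoder()
img = torch.randn(2, 3, 128, 128)
keep = random_patch_mask(2, 128, 128, mask_ratio, img.device)
recon = dec(enc(img * keep, keep))
loss = (((recon - img) ** 2) * (1 - keep)).mean()  # loss only on masked pixels
loss.backward()
print(loss.item())
```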
Pre-training
See PRETRAIN.md for preparation and pre-training.
ImageNet Fine-tuning
After finishing the preparation in PRETRAIN.md, check downstream_imagenet for subsequent instructions.
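For orientation only, the snippet below shows one generic way to load pre-trained backbone weights into a timm ResNet-50 before standard supervised fine-tuning. The checkpoint filename and the "module" key are assumptions, not the repository's actual format; follow downstream_imagenet for the official recipe.

```python
# Hypothetical example: loading pre-trained backbone weights into a timm
# ResNet-50 for ImageNet fine-tuning. The checkpoint path and key layout
# ("module") are assumptions; consult downstream_imagenet for the real format.
import torch
import timm

model = timm.create_model("resnet50", pretrained=False, num_classes=1000)

ckpt = torch.load("resnet50_pretrained.pth", map_location="cpu")  # placeholder file
state_dict = ckpt.get("module", ckpt)  # unwrap if saved under a "module" key

# strict=False: the classifier head was not pre-trained, so its keys are missing
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)

# ...then run a standard supervised fine-tuning loop on ImageNet.
```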
Fine-tuning ResNets on COCO
Install detectron2, then see downstream_d2 for more details.
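As a rough, hedged illustration of what this involves, a minimal detectron2-style setup might look like the sketch below. The weight file, its key format, and the batch-size/output settings are placeholders; the actual configs and weight-conversion steps are provided in downstream_d2.

```python
# Hypothetical sketch: pointing a detectron2 Mask R-CNN config at pre-trained
# ResNet-50 weights. The weight file and its format are assumptions; the real
# conversion and config files live in downstream_d2.
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultTrainer

cfg = get_cfg()
cfg.merge_from_file(
    model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
)
cfg.MODEL.WEIGHTS = "resnet50_pretrained_d2_format.pkl"  # placeholder path
cfg.SOLVER.IMS_PER_BATCH = 16
cfg.OUTPUT_DIR = "./output_coco_r50"

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```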
Fine-tuning ConvNeXts on COCO
Install mmcv and mmdetection, then see downstream_mmdet for more details.
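Similarly, a typical mmdetection-style override for initializing the backbone from pre-trained weights might look like the snippet below. The base config and checkpoint path are placeholders; the authoritative configs are in downstream_mmdet.

```python
# Hypothetical mmdetection config override (mmdet 2.x style): initialize the
# ConvNeXt backbone from a pre-trained checkpoint. Both paths are placeholders;
# see downstream_mmdet for the actual configs.
_base_ = ['./mask_rcnn_convnext_fpn_coco.py']  # hypothetical base config

model = dict(
    backbone=dict(
        init_cfg=dict(
            type='Pretrained',
            checkpoint='convnext_small_pretrained.pth',  # placeholder path
        ),
    ),
)
```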
Acknowledgement
We heavily referred to these useful codebases:
We also appreciate these elegant frameworks:
License
This project is under the CC-BY 4.0 license. See LICENSE for more details.
Citation
If you find this project useful, please consider giving it a star ⭐ or citing us 📖:
@Article{tian2023designing,
author = {Keyu Tian and Yi Jiang and Qishuai Diao and Chen Lin and Liwei Wang and Zehuan Yuan},
title = {Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling},
journal = {arXiv:2301.03580},
year = {2023},
}