README.md updates

feature/first_batch_of_model_usability_upgrades
SkalskiP 2 years ago
parent 29cc3846fd
commit 78362b1ea3
      README.md

# Grounding DINO
---
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/roboflow-ai/notebooks/blob/main/notebooks/zero-shot-object-detection-with-grounding-dino.ipynb)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/grounding-dino-marrying-dino-with-grounded/zero-shot-object-detection-on-mscoco)](https://paperswithcode.com/sota/zero-shot-object-detection-on-mscoco?p=grounding-dino-marrying-dino-with-grounded) \
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/grounding-dino-marrying-dino-with-grounded/zero-shot-object-detection-on-odinw)](https://paperswithcode.com/sota/zero-shot-object-detection-on-odinw?p=grounding-dino-marrying-dino-with-grounded) \
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/grounding-dino-marrying-dino-with-grounded/object-detection-on-coco-minival)](https://paperswithcode.com/sota/object-detection-on-coco-minival?p=grounding-dino-marrying-dino-with-grounded) \
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/grounding-dino-marrying-dino-with-grounded/object-detection-on-coco)](https://paperswithcode.com/sota/object-detection-on-coco?p=grounding-dino-marrying-dino-with-grounded)
Official PyTorch implementation of [Grounding DINO](https://arxiv.org/abs/2303.05499), a stronger open-set object detector. Code is available now!
## Highlight
- **Open-Set Detection.** Detect **everything** with language!
- **High Performance.** COCO zero-shot **52.5 AP** (trained without any COCO data!). COCO fine-tuning reaches **63.0 AP**.
- **Flexible.** Works with Stable Diffusion for image editing.
<details open>
<summary><font size="4">
Description
</font></summary>
<img src=".asset/hero_figure.png" alt="ODinW" width="100%">
</details>
## TODO
- [x] Release inference code and demo.
- [x] Release checkpoints.
- [ ] Grounding DINO with Stable Diffusion and GLIGEN demos.
## Usage
### 1. Install
If you have a CUDA environment, please make sure the environment variable `CUDA_HOME` is set.
```bash
pip install -e .
```
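After installation, a quick sanity check can confirm that the package imports and that CUDA is visible to PyTorch. This is a minimal sketch; it assumes the editable install exposes the `groundingdino` package and that PyTorch is already installed.

```python
# Sanity check after `pip install -e .` (assumes the install exposes the `groundingdino` package).
import torch
import groundingdino

print("groundingdino imported from:", groundingdino.__file__)
print("CUDA available to PyTorch:", torch.cuda.is_available())
```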
### 2. Run an inference demo
See `demo/inference_on_a_image.py` for more details. The demo takes a model config (`-c`), a checkpoint (`-p`), an input image (`-i`), an output directory (`-o`), and a text prompt (`-t`); replace the placeholder paths below with your own.
```bash
CUDA_VISIBLE_DEVICES=6 python demo/inference_on_a_image.py \
  -c /path/to/config \
  -p /path/to/checkpoint \
  -i /path/to/image \
  -o /path/to/output_dir \
  -t "cat ear."
```
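Besides the CLI demo, inference can also be scripted. The sketch below assumes that `groundingdino.util.inference` provides `load_model`, `load_image`, `predict`, and `annotate` helpers with the signatures used here; all paths and threshold values are placeholders to adjust.

```python
# A minimal scripted-inference sketch. It assumes `groundingdino.util.inference`
# provides load_model / load_image / predict / annotate with the signatures used here;
# the paths and thresholds below are placeholders.
import cv2
from groundingdino.util.inference import load_model, load_image, predict, annotate

model = load_model("/path/to/config", "/path/to/checkpoint")
image_source, image = load_image("/path/to/image.jpg")

boxes, logits, phrases = predict(
    model=model,
    image=image,
    caption="cat ear.",
    box_threshold=0.35,
    text_threshold=0.25,
)

# annotate() is assumed to return a BGR numpy array ready for OpenCV.
annotated_frame = annotate(image_source=image_source, boxes=boxes, logits=logits, phrases=phrases)
cv2.imwrite("annotated_image.jpg", annotated_frame)
```

Here `box_threshold` filters low-confidence boxes and `text_threshold` controls how phrases are matched to boxes; the values shown are common starting points, not required settings.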
## Checkpoints
<!-- insert a table -->
<table>
<thead>
</table>
## Results
<details open>
<summary><font size="4">
COCO Object Detection Results
</font></summary>
</details>
<details open>
<summary><font size="4">
Marrying Grounding DINO with <a href="https://github.com/gligen/GLIGEN">GLIGEN</a>
</font></summary>
<img src=".asset/GD_GLIGEN.png" alt="GD_GLIGEN" width="100%">
</details>
## Model
Grounding DINO consists of a text backbone, an image backbone, a feature enhancer, a language-guided query selection module, and a cross-modality decoder.
![arch](.asset/arch.png)
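As a reading aid only, the pseudo-code below restates the data flow described above; the function and variable names are illustrative and do not correspond to the repository's actual modules.

```python
# Illustrative pseudo-code for the pipeline described above (not the repository's real module API).
def grounding_dino_forward(image, caption,
                           image_backbone, text_backbone,
                           feature_enhancer, query_selection, cross_modality_decoder):
    image_features = image_backbone(image)    # multi-scale visual features
    text_features = text_backbone(caption)    # token-level language features

    # Feature enhancer: fuse the two modalities with cross-attention.
    image_features, text_features = feature_enhancer(image_features, text_features)

    # Language-guided query selection: pick decoder queries relevant to the caption.
    queries = query_selection(image_features, text_features)

    # Cross-modality decoder: refine queries into boxes and phrase-grounded scores.
    boxes, scores = cross_modality_decoder(queries, image_features, text_features)
    return boxes, scores
```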
## Acknowledgement
Our model is related to [DINO](https://github.com/IDEA-Research/DINO) and [GLIP](https://github.com/microsoft/GLIP). Thanks for their great work!
We also thank great previous work including DETR, Deformable DETR, SMCA, Conditional DETR, Anchor DETR, Dynamic DETR, DAB-DETR, DN-DETR, etc. More related work is available at [Awesome Detection Transformer](https://github.com/IDEACVR/awesome-detection-transformer). A new toolbox [detrex](https://github.com/IDEA-Research/detrex) is available as well.
Thanks [Stable Diffusion](https://github.com/Stability-AI/StableDiffusion) and [GLIGEN](https://github.com/gligen/GLIGEN) for their awesome models.
## Citation
If you find our work helpful for your research, please consider citing the following BibTeX entry.
```bibtex
@inproceedings{ShilongLiu2023GroundingDM,
  title={Grounding DINO: Marrying DINO with Grounded Pre-Training for Open-Set Object Detection},
  year={2023}
}
```