From 261baf641cb9ada07dd9746e420ada7fe8a03152 Mon Sep 17 00:00:00 2001
From: YuAng
Date: Fri, 25 Jun 2021 14:53:37 +0800
Subject: [PATCH] Update dataset set-up guide in TRAINING.md

---
 docs/TRAINING.md | 30 ++++++++++++++++++++----------
 1 file changed, 20 insertions(+), 10 deletions(-)

diff --git a/docs/TRAINING.md b/docs/TRAINING.md
index 6ceac7d..2c2735e 100644
--- a/docs/TRAINING.md
+++ b/docs/TRAINING.md
@@ -4,7 +4,22 @@
 ## Dataset setup
 Generally, two parts of data are needed for training LoFTR, the original dataset, i.e., ScanNet and MegaDepth, and the offline generated dataset indices. The dataset indices store scenes, image pairs, and other metadata within each dataset used for training/validation/testing. For the MegaDepth dataset, the relative poses between images used for training are directly cached in the indexing files. However, the relative poses of ScanNet image pairs are not stored due to the enormous resulting file size.
 
-**Download the dataset indices**
+### Download datasets
+#### MegaDepth
+We use the depth maps provided in the [original MegaDepth dataset](https://www.cs.cornell.edu/projects/megadepth/), as well as the undistorted images and corresponding camera intrinsics and extrinsics preprocessed by [D2-Net](https://github.com/mihaidusmanu/d2-net#downloading-and-preprocessing-the-megadepth-dataset). You can download them separately from the following links.
+- [MegaDepth undistorted images and processed depths](https://www.cs.cornell.edu/projects/megadepth/dataset/Megadepth_v1/MegaDepth_v1.tar.gz)
+  - Note that we only use the depth maps.
+  - The path of the downloaded data will be referred to as `/path/to/megadepth`.
+- [D2-Net preprocessed images](https://drive.google.com/drive/folders/1hxpOsqOZefdrba_BqnW490XpNX_LgXPB)
+  - The images are undistorted manually in D2-Net, since the undistorted images from MegaDepth do not come with corresponding intrinsics.
+  - The path of the downloaded data will be referred to as `/path/to/megadepth_d2net`.
+
+#### ScanNet
+Please set up the ScanNet dataset following [the official guide](https://github.com/ScanNet/ScanNet#scannet-data).
+> NOTE: We use the [python exported data](https://github.com/ScanNet/ScanNet/tree/master/SensReader/python),
+instead of the [c++ exported one](https://github.com/ScanNet/ScanNet/tree/master/SensReader/c%2B%2B).
+
+### Download the dataset indices
 
 You can download the required dataset indices from the [following link](https://drive.google.com/drive/folders/1DOcOPZb3-5cWxLqn256AhwUVjBPifhuf).
 After downloading, unzip the required files.
@@ -20,12 +35,9 @@ tar xf testdata/megadepth_test_1500.tar
 tar xf testdata/scannet_test_1500.tar
 ```
 
-**Build the dataset symlinks**
+### Build the dataset symlinks
 
-We symlink the datasets to the /data directory under the main LoFTR project directory.
-
-> NOTE: For the ScanNet dataset, we use the [python exported data](https://github.com/ScanNet/ScanNet/tree/master/SensReader/python),
-instead of the [c++ exported one](https://github.com/ScanNet/ScanNet/tree/master/SensReader/c%2B%2B).
+We symlink the datasets to the `data` directory under the main LoFTR project directory.
 ```shell
 # scannet
 ln -s /path/to/scannet/* /path/to/LoFTR/data/scannet/train
@@ -37,10 +49,8 @@ ln -s /path/to/scannet_indices/* /path/to/LoFTR/data/scannet/index
 
 # megadepth
 # -- # train and test dataset (train and test share the same dataset)
-ln -s /path/to/megadepth/Undistorted_SfM /path/to/LoFTR/data/megadepth/train
-ln -s /path/to/megadepth/phoenix /path/to/LoFTR/data/megadepth/train
-ln -s /path/to/megadepth/Undistorted_SfM /path/to/LoFTR/data/megadepth/test
-ln -s /path/to/megadepth/phoenix /path/to/LoFTR/data/megadepth/test
+ln -sv /path/to/megadepth/phoenix /path/to/megadepth_d2net/Undistorted_SfM /path/to/LoFTR/data/megadepth/train
+ln -sv /path/to/megadepth/phoenix /path/to/megadepth_d2net/Undistorted_SfM /path/to/LoFTR/data/megadepth/test
 # -- # dataset indices
 ln -s /path/to/megadepth_indices/* /path/to/LoFTR/data/megadepth/index
 ```
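
Reviewer note: the patch switches to the multi-source form of `ln -sv`, where several sources are linked into one target directory in a single invocation. A minimal sketch of that behavior, using throwaway `mktemp` directories as stand-ins for the real `/path/to/...` dataset locations:

```shell
# Stand-in directories only; the real paths come from the dataset downloads.
root=$(mktemp -d)
mkdir -p "$root/megadepth/phoenix" \
         "$root/megadepth_d2net/Undistorted_SfM" \
         "$root/LoFTR/data/megadepth/train"

# With multiple sources, ln treats the last argument as a directory
# and creates one symlink per source inside it (-v prints each link).
ln -sv "$root/megadepth/phoenix" \
       "$root/megadepth_d2net/Undistorted_SfM" \
       "$root/LoFTR/data/megadepth/train"

# Both sources now resolve through train/, as the loader expects.
test -d "$root/LoFTR/data/megadepth/train/phoenix" && echo "layout ok"
rm -rf "$root"
```

This is why the four single-source `ln -s` lines collapse into two: each `-sv` call links both `phoenix` and `Undistorted_SfM` into the target directory at once.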