SAM-DA: UAV Tracks Anything at Night with SAM-Powered Domain Adaptation

September 27, 2024

Changhong Fu*, Liangliang Yao, Haobo Zuo, Guangze Zheng, Jia Pan

  • * Corresponding author.

Vision4robotics

πŸ“’ News

  • SAM-DA has been accepted by IEEE ICARM 2024.
  • The paper β€œSAM-DA: UAV Tracks Anything at Night with SAM-Powered Domain Adaptation” is awarded the Toshio Fukuda Best Paper Award in Mechatronics of ICARM 2024!

πŸ—οΈ Framework

Framework

πŸ‘€ Visualization of SAM-DA

One-to-many generation

πŸ› οΈ Installation

This code has been tested on Ubuntu 18.04 with Python 3.8.3, PyTorch 1.13.1, and CUDA 11.6. Please install the related libraries before running the code:

Install Segment Anything:

bash install.sh

Install SAM-DA-Track:

pip install -r requirements.txt
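Before running the tracker, it can help to confirm that the local environment matches the tested versions above. A small, hypothetical sanity-check script (illustrative only, not part of the SAM-DA repository):

```python
# Hypothetical sanity check for the tested environment
# (Python 3.8, PyTorch 1.13.1, CUDA 11.6); illustrative only,
# not a script from the SAM-DA repository.
import sys

def version_tuple(version: str):
    """Parse '1.13.1' (or '1.13.1+cu116') into a comparable tuple of ints."""
    core = version.split("+")[0]
    return tuple(int(part) for part in core.split(".") if part.isdigit())

assert sys.version_info[:2] >= (3, 8), "Python 3.8+ expected"

try:
    import torch
    print("PyTorch:", torch.__version__)
    print("CUDA available:", torch.cuda.is_available())
    if version_tuple(torch.__version__) < (1, 13):
        print("Warning: this repo was tested with PyTorch 1.13.1")
except ImportError:
    print("PyTorch not installed yet; run the installation steps above.")
```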

πŸ˜€ Getting started

Test SAM-DA

cd tracker/BAN
python tools/test.py 
python tools/eval.py
  • (optional) Test with other checkpoints (e.g., sam-da-track-s):
cd tracker/BAN
python tools/test.py --snapshot sam-da-track-s
python tools/eval.py

Train SAM-DA

  • SAM-powered target domain training sample swelling on NAT2021-train.

    1. Download original nighttime dataset NAT2021-train and put it in ./tracker/BAN/train_dataset/sam_nat.
    2. SAM-powered target domain training sample swelling!
    bash swell.sh
    

    ⚠️ Warning: a large amount of disk space is required for saving the swelled data.

    The training JSON files are available here: Baidu.
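The one-to-many swelling idea can be sketched as follows: one nighttime frame yields many candidate training samples, one per SAM mask, each reduced to a bounding box. `mask_to_bbox` and `swell_frame` below are hypothetical helpers for illustration, not functions from the SAM-DA repository:

```python
# Illustrative sketch of one-to-many sample swelling: each SAM mask on a
# nighttime frame becomes one candidate training sample, represented by
# the bounding box of the mask. Hypothetical helpers, not repo code.
import numpy as np

def mask_to_bbox(mask: np.ndarray):
    """Convert a binary mask (H, W) to an (x, y, w, h) box, or None if empty."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    x0, x1 = xs.min(), xs.max()
    y0, y1 = ys.min(), ys.max()
    return (int(x0), int(y0), int(x1 - x0 + 1), int(y1 - y0 + 1))

def swell_frame(masks):
    """One frame in, many candidate samples out: one box per non-empty mask."""
    boxes = [mask_to_bbox(m) for m in masks]
    return [b for b in boxes if b is not None]

# Example: two toy masks from one frame produce two candidate samples.
frame_masks = [np.zeros((8, 8), dtype=bool) for _ in range(2)]
frame_masks[0][2:5, 3:6] = True   # a 3x3 object
frame_masks[1][6:8, 0:4] = True   # a 2x4 object
print(swell_frame(frame_masks))   # [(3, 2, 3, 3), (0, 6, 4, 2)]
```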

  • Prepare the daytime datasets VID and GOT-10K.

    1. Download VID and GOT-10K and put them in ./tracker/BAN/train_dataset/vid and ./tracker/BAN/train_dataset/got10k, respectively.
    2. Crop the data following the instructions for VID and GOT-10K.
  • Train sam-da-track-b (default) and other models.

    cd tracker/BAN
    python tools/train.py --model sam-da-track-b
    

🌈 Fewer data, better performance

SAM-DA targets training with fewer data but better performance, enabling quick deployment of nighttime tracking methods for UAVs.

  • SAM-DA enriches the training samples and attributes (ambient intensity) of the target domain.

  • SAM-DA achieves better performance with fewer raw images and quicker training.

    Method    Training data  Images  Proportion  Training  AUC (NUT-L)
    Baseline  NAT2021-train  276k    100%        12h       0.377
    SAM-DA    SAM-NAT-N      28k     10%         2.4h      0.411
    SAM-DA    SAM-NAT-T      92k     33%         4h        0.414
    SAM-DA    SAM-NAT-S      138k    50%         6h        0.419
    SAM-DA    SAM-NAT-B      276k    100%        12h       0.430

    For more details, please refer to the paper.

Training duration on a single A100 GPU.
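As a quick arithmetic check, the subset proportions in the table follow directly from the image counts (a throwaway script for illustration, not part of the repository):

```python
# Sanity check of the "fewer data, better performance" table above:
# each SAM-NAT subset size as a fraction of the full 276k-image set.
# Image counts are copied from the table; the script is only illustrative.
full = 276  # NAT2021-train / SAM-NAT-B, thousands of images

subsets = {"SAM-NAT-N": 28, "SAM-NAT-T": 92, "SAM-NAT-S": 138, "SAM-NAT-B": 276}

for name, images in subsets.items():
    print(f"{name}: {images}k images = {images / full:.0%} of the full set")
```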

License

The model is licensed under the Apache License 2.0.

Citations

Please consider citing the related paper(s) in your publications if it helps your research.

@Inproceedings{Yao2023SAMDA,
  title={{SAM-DA: UAV Tracks Anything at Night with SAM-Powered Domain Adaptation}},
  author={Fu, Changhong and Yao, Liangliang and Zuo, Haobo and Zheng, Guangze and Pan, Jia},
  booktitle={Proceedings of the IEEE International Conference on Advanced Robotics and Mechatronics (ICARM)},
  year={2024},
  pages={1-8}
}
@article{kirillov2023segment,
  title={{Segment Anything}},
  author={Kirillov, Alexander and Mintun, Eric and Ravi, Nikhila and Mao, Hanzi and Rolland, Chloe and Gustafson, Laura and Xiao, Tete and Whitehead, Spencer and Berg, Alexander C and Lo, Wan-Yen and others},
  journal={arXiv preprint arXiv:2304.02643},
  year={2023},
  pages={1-30}
}
@Inproceedings{Ye2022CVPR,
  title={{Unsupervised Domain Adaptation for Nighttime Aerial Tracking}},
  author={Ye, Junjie and Fu, Changhong and Zheng, Guangze and Paudel, Danda Pani and Chen, Guang},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2022},
  pages={1-10}
}

Acknowledgments

We sincerely thank the contributions of the following repos: SAM, SiamBAN, and UDAT.

Contact

If you have any questions, please contact Liangliang Yao at 1951018@tongji.edu.cn or Changhong Fu at changhongfu@tongji.edu.cn.