Introduction

August 2, 2025

The official implementation of the paper *Cross-modulated Attention Transformer for RGBT Tracking*.

Device

We train and test our model on two Nvidia RTX 2080 Ti GPUs.

Train

Download the SOT pretrained model from OSTrack, or download CAFormer_SOTPretrained.pth.tar directly.

cd your_proj_path
python utils/make_pretrained.py  # only needed if you downloaded the model weights from OSTrack
sh experiments/caformer/train.sh
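The conversion step above typically just renames checkpoint keys so that the OSTrack backbone weights match the CAFormer module names. A minimal, hypothetical sketch of such a remapping is shown below; the prefix names (`backbone.`, `encoder.`) are illustrative assumptions, and the repo's actual `utils/make_pretrained.py` is authoritative.

```python
# Hypothetical illustration of remapping checkpoint keys from one module
# naming scheme to another. The prefixes used here are assumptions, not
# the actual names in utils/make_pretrained.py.
def remap_state_dict(state_dict, prefix_map):
    """Return a copy of state_dict with key prefixes rewritten.

    state_dict: dict mapping parameter names to tensors/values.
    prefix_map: dict mapping old key prefixes to new ones.
    """
    out = {}
    for key, value in state_dict.items():
        new_key = key
        for old, new in prefix_map.items():
            if key.startswith(old):
                # Replace the matched prefix, keep the rest of the key.
                new_key = new + key[len(old):]
                break
        out[new_key] = value
    return out


# Example: rename "backbone.*" parameters to "encoder.*",
# leaving unmatched keys (e.g. the tracking head) untouched.
ckpt = {"backbone.blocks.0.attn.qkv.weight": 0, "head.score": 1}
print(remap_state_dict(ckpt, {"backbone.": "encoder."}))
```

In a real pipeline the `state_dict` would come from `torch.load(...)` and the remapped dict would be saved back out; the key-rewriting logic itself is framework-independent.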

Test

cd your_proj_path
# For RGBT234
sh experiments/caformer/test234.sh
# For LasHeR
sh experiments/caformer/test245.sh

Evaluation

You can use the files in eval_tracker to quickly evaluate the tracking results.
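For context, RGBT benchmarks usually report a precision rate (fraction of frames whose predicted box center is within a pixel threshold of the ground truth) and a success rate (fraction of frames whose predicted box overlaps the ground truth above an IoU threshold). The sketch below shows these two metrics under common conventions (20-pixel center threshold, single 0.5 IoU cutoff); it is an assumption-laden illustration, not the code in `eval_tracker`, and benchmark toolkits typically average success over a sweep of IoU thresholds rather than a single cutoff.

```python
import math

def iou(a, b):
    # Boxes are (x, y, w, h) in pixels.
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(a[0], b[0]))
    ih = max(0.0, min(ay2, by2) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def center_error(a, b):
    # Euclidean distance between box centers.
    return math.hypot(a[0] + a[2] / 2 - (b[0] + b[2] / 2),
                      a[1] + a[3] / 2 - (b[1] + b[3] / 2))

def precision_rate(preds, gts, thresh=20.0):
    # Fraction of frames with center error within `thresh` pixels.
    hits = sum(center_error(p, g) <= thresh for p, g in zip(preds, gts))
    return hits / len(gts)

def success_rate(preds, gts, thresh=0.5):
    # Fraction of frames with IoU at least `thresh`.
    hits = sum(iou(p, g) >= thresh for p, g in zip(preds, gts))
    return hits / len(gts)
```

Feeding per-frame predicted and ground-truth boxes for a sequence into these two functions gives the kind of PR/SR pairs reported in the results table below.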

Download

| Training Dataset | RGBT234 | LasHeR | VTUAV-ST | Model & Result |
| --- | --- | --- | --- | --- |
| LasHeR | 88.3/66.4 | 70.0/55.6 | -/- | download |
| VTUAV | -/- | -/- | 88.6/76.2 | download |

Citation

@inproceedings{CAFormer,
  title={Cross-modulated Attention Transformer for RGBT Tracking},
  author={Yun Xiao and Jiacong Zhao and Andong Lu and Chenglong Li and Bing Yin and Yin Lin and Cong Liu},
  booktitle={AAAI Conference on Artificial Intelligence},
  year={2025}
}

Acknowledgments

Thanks to OSTrack, which helped us quickly implement our ideas.