Hashing with Mutual Information

September 21, 2021

This repository contains the Matlab implementation of the following papers:

  1. "MIHash: Online Hashing with Mutual Information",

    Fatih Cakir*, Kun He*, Sarah Adel Bargal, and Stan Sclaroff. (* Equal contribution)

    International Conference on Computer Vision (ICCV), 2017 (arXiv)

  2. "Hashing with Mutual Information",

    Fatih Cakir*, Kun He*, Sarah Adel Bargal, and Stan Sclaroff. (* Equal contribution)

    IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI) 2019 (arXiv)

The repo includes:

  • Both online and batch versions of our MIHash method from the above papers. :warning: Note: the batch (deep) learning implementation of MIHash has been updated and moved to deep-mihash.
  • An experimental framework for online hashing methods
  • Implementations of several online hashing techniques

Preparation

  • Create or symlink a directory cachedir under the main directory to hold experimental results
  • Install or symlink VLFeat at ./vlfeat (for computing performance metrics)
  • Install or symlink MatConvNet at ./matconvnet (for batch hashing experiments)
  • Download datasets and pretrained models: see data/README.md.
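The preparation steps above can be sketched as a short shell session run from the repository root; the `/path/to/...` locations are placeholders for wherever your own VLFeat and MatConvNet checkouts live:

```shell
# Results cache directory (create it here, or symlink it to bulk storage).
mkdir -p cachedir

# Symlink VLFeat and MatConvNet into the locations the code expects.
# Replace the right-hand sides with your actual install paths.
ln -sfn /path/to/vlfeat ./vlfeat
ln -sfn /path/to/matconvnet ./matconvnet
```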

Usage

  • In the main folder, run startup.m.
  • For online hashing experiments: cd online-hashing, and run demo_online.m with appropriate input arguments (see online-hashing/README.md).
  • For batch hashing experiments: cd batch-hashing, and run demo_cifar.m with appropriate input arguments (see batch-hashing/README.md).
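Put together, a session might look like the following Matlab sketch. The demo scripts take input arguments that are documented in the per-directory READMEs and are omitted here for brevity:

```matlab
% From the repository root: set up paths (VLFeat, MatConvNet, etc.).
startup;

% Online hashing experiments: see online-hashing/README.md for the
% input arguments (method, dataset, number of bits, ...) of demo_online.
cd online-hashing
demo_online
cd ..

% Batch hashing experiments on CIFAR-10: see batch-hashing/README.md
% for the input arguments of demo_cifar.
cd batch-hashing
demo_cifar
```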

Batch Results

:warning: This portion of the repo is outdated and yields inferior results. Please refer to the deep-mihash repo for the latest results on deep/batch learning of MIHash!

Below we provide results of MIHash and competing methods on CIFAR-10. For reproducibility, we also provide the MIHash parameters used to obtain these results (see batch-hashing/opts_batch.m).

CIFAR-10

The standard CIFAR-10 benchmark has two distinct settings, as specified in the DTSH and MIHash papers. The results shown here use the VGG-F architecture with end-to-end learning; for non-deep methods, this corresponds to using features from the penultimate layer of VGG-F. (Note that, in contrast, the MIHash paper reports VGG-16 single-layer experiments for Setting 1.)

Please refer to the above papers for details on Settings 1 and 2.

Setting 1: Mean Average Precision

| Method     | 12-Bits | 24-Bits | 32-Bits | 48-Bits |
|------------|---------|---------|---------|---------|
| BRE        | 0.361   | 0.448   | 0.502   | 0.533   |
| MACHash    | 0.628   | 0.707   | 0.726   | 0.734   |
| FastHash   | 0.678   | 0.729   | 0.742   | 0.757   |
| StructHash | 0.664   | 0.693   | 0.691   | 0.700   |
| DPSH       | 0.720   | 0.757   | 0.757   | 0.767   |
| DTSH       | 0.725   | 0.773   | 0.781   | 0.810   |
| MIHash     | 0.687   | 0.788   | 0.7899  | 0.826   |

MIHash Parameters:

Setting 2: Mean Average Precision

| Method | 16-Bits | 24-Bits | 32-Bits | 48-Bits |
|--------|---------|---------|---------|---------|
| DPSH   | 0.908   | 0.909   | 0.917   | 0.932   |
| DTSH   | 0.916   | 0.924   | 0.927   | 0.934   |
| MIHash | 0.922   | 0.931   | 0.940   | 0.942   |

MIHash Parameters:

NOTE: These diaries are from older versions of the repo, where different parameter names may have been used. By inspection, the parameters can easily be matched to those in opts_batch.m; notably, sigscale equals sigmf(1).

Contact

Please email fcakirs@gmail.com or kunhe@fb.com if you have any questions.

License

BSD License, see LICENSE

If you use this code in your research, please cite:

@inproceedings{mihash,
  title={MIHash: Online Hashing with Mutual Information},
  author={Fatih Cakir and Kun He and Sarah A. Bargal and Stan Sclaroff},
  booktitle={IEEE International Conference on Computer Vision (ICCV)},
  year={2017}
}

References

  • BRE: Brian Kulis and Trevor Darrell. Learning to hash with binary reconstructive embeddings. In Advances in Neural Information Processing Systems (NIPS), 2009.
  • MACHash: Ramin Raziperchikolaei and Miguel Á. Carreira-Perpiñán. Optimizing affinity-based binary hashing using auxiliary coordinates. In Advances in Neural Information Processing Systems (NIPS), 2016.
  • FastHash: Guosheng Lin, Chunhua Shen, Qinfeng Shi, Anton van den Hengel, and David Suter. Fast supervised hashing with decision trees for high-dimensional data. In Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014.
  • StructHash: Guosheng Lin, Fayao Liu, Chunhua Shen, Jianxin Wu, and Heng Tao Shen. Structured learning of binary codes with column generation for optimizing ranking measures. International Journal of Computer Vision (IJCV), 2016.
  • DPSH: Wu-Jun Li, Sheng Wang, and Wang-Cheng Kang. Feature learning based deep supervised hashing with pairwise labels. In Proc. International Joint Conference on Artificial Intelligence (IJCAI), 2016.
  • DTSH: Xiaofang Wang, Yi Shi, and Kris M. Kitani. Deep supervised hashing with triplet labels. In Proc. Asian Conference on Computer Vision (ACCV), 2016.