
Structure Sensitive Hashing With Adaptive Product Quantization.

Xianglong Liu, Bowen Du, Cheng Deng, Ming Liu, Bo Lang.   

Abstract

Hashing has proven to be an attractive solution to approximate nearest neighbor search, owing to its theoretical guarantees and computational efficiency. Although most prior hashing algorithms achieve low memory and computation consumption by pursuing compact hash codes, they remain far from capable of learning discriminative hash functions from data with complex inherent structure. To address this issue, in this paper we propose a structure sensitive hashing based on cluster prototypes, which explicitly exploits both global and local structures. An alternating optimization algorithm, minimizing the quantization loss and the spectral embedding loss in turn, is presented to simultaneously discover the cluster prototypes for each hash function and optimally assign them unique binary codes that satisfy the affinity alignment between them. To obtain hash codes of a desired length, an adaptive bit assignment is further applied to the product quantization of the subspaces, approximating the Hamming distances while balancing the variance among hash functions. Experimental results on four large-scale benchmarks, CIFAR-10, NUS-WIDE, SIFT1M, and GIST1M, demonstrate that our approach significantly outperforms state-of-the-art hashing methods in semantic and metric neighbor search.
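The core idea the abstract describes can be illustrated with a minimal sketch: split each feature vector into subspaces, learn cluster prototypes per subspace, and assign each point the binary code of its nearest prototype, concatenating the per-subspace codes. This is only a simplified product-quantization hash under assumed fixed, equal bits per subspace; it omits the paper's alternating optimization (spectral embedding loss, affinity-aligned code assignment) and its adaptive bit assignment, and all function and parameter names here are hypothetical.

```python
import numpy as np

def train_pq_hash(X, n_subspaces=4, bits_per_subspace=2, n_iter=20, seed=0):
    """Learn cluster prototypes per subspace.

    A simplified product-quantization sketch, NOT the paper's full
    alternating optimization: plain k-means stands in for the joint
    quantization / spectral-embedding objective.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    sub_dim = d // n_subspaces          # assumes d divisible by n_subspaces
    k = 2 ** bits_per_subspace          # one binary code per prototype
    codebooks = []
    for s in range(n_subspaces):
        Xs = X[:, s * sub_dim:(s + 1) * sub_dim]
        # initialize prototypes from random samples, then run plain k-means
        centers = Xs[rng.choice(n, k, replace=False)]
        for _ in range(n_iter):
            dists = ((Xs[:, None, :] - centers[None]) ** 2).sum(-1)
            assign = np.argmin(dists, axis=1)
            for j in range(k):
                if np.any(assign == j):
                    centers[j] = Xs[assign == j].mean(axis=0)
        codebooks.append(centers)
    return codebooks

def encode(X, codebooks, bits_per_subspace=2):
    """Concatenate the binary code of the nearest prototype in each subspace."""
    sub_dim = X.shape[1] // len(codebooks)
    per_sub_codes = []
    for s, centers in enumerate(codebooks):
        Xs = X[:, s * sub_dim:(s + 1) * sub_dim]
        assign = np.argmin(((Xs[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        # map prototype index -> fixed-length bit string
        per_sub_codes.append([format(a, f"0{bits_per_subspace}b") for a in assign])
    return ["".join(row) for row in zip(*per_sub_codes)]
```

With 4 subspaces at 2 bits each, every point receives an 8-bit code, and nearby points tend to share prototypes in most subspaces, so their Hamming distance approximates their metric proximity.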

Year:  2015        PMID: 26441458     DOI: 10.1109/TCYB.2015.2474742

Source DB:  PubMed          Journal:  IEEE Trans Cybern        ISSN: 2168-2267            Impact factor:   11.448


  1 in total

1.  Triplet Deep Hashing with Joint Supervised Loss Based on Deep Neural Networks.

Authors:  Mingyong Li; Ziye An; Qinmin Wei; Kaiyue Xiang; Yan Ma
Journal:  Comput Intell Neurosci       Date:  2019-10-09
