Latent structure preserving hashing

Liu, Li, Yu, Mengyang and Shao, Ling (2017) Latent structure preserving hashing. International Journal of Computer Vision, 122 (3). 439–457. ISSN 0920-5691

Published manuscript (PDF), available under a Creative Commons Attribution license.

Abstract

Aiming at efficient similarity search, hash functions are designed to embed high-dimensional feature descriptors into low-dimensional binary codes such that similar descriptors map to binary codes with a small Hamming distance. It is critical for a hashing algorithm to maintain the intrinsic structure of the data and preserve its original information. In this paper, we propose a novel hashing algorithm called Latent Structure Preserving Hashing (LSPH), which aims to find a well-structured low-dimensional representation of the original high-dimensional data through a novel objective function based on Nonnegative Matrix Factorization (NMF), with the Kullback-Leibler divergence of the data distribution as a regularization term. By exploiting the joint probability distribution of the data, LSPH can automatically learn the latent information and successfully preserve the structure of high-dimensional data. To further achieve robust performance on complex, nonlinear data, we also contribute a more generalized multi-layer LSPH (ML-LSPH) framework, in which hierarchical representations are effectively learned by a multiplicative up-propagation algorithm. Once the latent representations are obtained, the hash functions can be easily acquired through multivariable logistic regression. Experimental results on three large-scale retrieval datasets, i.e., SIFT 1M, GIST 1M and 500K TinyImage, show that ML-LSPH achieves better performance than the single-layer LSPH and that both outperform existing hashing techniques on large-scale data.
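The pipeline the abstract describes — learn a low-dimensional nonnegative latent representation, binarize it, then fit hash functions via logistic regression so unseen descriptors can be coded — can be sketched roughly as follows. This is an illustrative sketch only, not the authors' method: it substitutes scikit-learn's plain NMF for the LSPH objective (omitting the KL-divergence regularizer and the multi-layer up-propagation), and the median-thresholding step, variable names, and data are assumptions made for the example.

```python
# Sketch of an NMF-based hashing pipeline (NOT the paper's LSPH objective):
# plain NMF stands in for the structure-preserving factorization, and one
# logistic regression per bit stands in for the multivariable regression stage.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = np.abs(rng.normal(size=(200, 64)))  # hypothetical nonnegative descriptors

# Step 1: low-dimensional nonnegative latent representation H (n_samples x n_bits).
n_bits = 16
H = NMF(n_components=n_bits, init="nndsvda", max_iter=500).fit_transform(X)

# Step 2: binarize the latent codes, here by thresholding each bit at its median
# (an assumption for illustration; it balances 0s and 1s per bit).
B = (H > np.median(H, axis=0)).astype(int)

# Step 3: learn one logistic hash function per bit, so new descriptors can be
# coded without re-running the factorization.
hashers = [LogisticRegression(max_iter=1000).fit(X, B[:, k])
           for k in range(n_bits)]

def hash_codes(Q):
    """Binary codes for query descriptors Q (n_queries x n_features)."""
    return np.column_stack([h.predict(Q) for h in hashers])

def hamming(a, b):
    """Hamming distance between two binary codes."""
    return int(np.sum(a != b))
```

Retrieval then reduces to ranking database codes by `hamming` distance to the query code, which is what makes binary embeddings attractive for large-scale search.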

Item Type: Article
Uncontrolled Keywords: hashing, nonnegative matrix factorization, latent structure, dimensionality reduction, multi-layer extension
Faculty \ School: Faculty of Science > School of Computing Sciences
Depositing User: Pure Connector
Date Deposited: 28 Jan 2017 02:18
Last Modified: 03 Jul 2023 10:30
URI: https://ueaeprints.uea.ac.uk/id/eprint/62218
DOI: 10.1007/s11263-016-0931-4
