Wang, Wenguan, Shen, Jianbing, Shao, Ling and Porikli, Fatih (2016) Correspondence driven saliency transfer. IEEE Transactions on Image Processing, 25 (11). pp. 5025-5034. ISSN 1057-7149
PDF (Accepted manuscript) - Accepted Version (1MB)
Abstract
In this paper, we show that large annotated data sets have great potential to provide strong priors for saliency estimation, rather than merely serving for benchmark evaluations. To this end, we present a novel image saliency detection method called saliency transfer. Given an input image, we first retrieve a support set of best matches from a large database of saliency-annotated images. We then assign transitional saliency scores by warping the support-set annotations onto the input image according to computed dense correspondences. To incorporate context, we employ two complementary correspondence strategies: a global matching scheme based on scene-level analysis and a local matching scheme based on patch-level inference. Two measures are then introduced to further refine the saliency maps, and random walk with restart is applied, exploiting the global saliency structure, to estimate the affinity between foreground and background assignments. Extensive experiments on four publicly available benchmark data sets demonstrate that the proposed saliency algorithm consistently outperforms current state-of-the-art methods.
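The abstract outlines a two-stage pipeline: transfer annotations from a retrieved support set onto the query via dense correspondences, then refine the result with random walk with restart. The following is a minimal, illustrative Python sketch of those two steps only, not the authors' implementation; image retrieval and dense correspondence (e.g., SIFT Flow) are abstracted into placeholder flow inputs, and all function names, the toy flow fields, and the example affinity matrix are assumptions made for illustration.

```python
# Illustrative sketch of the saliency-transfer idea, NOT the paper's code.
# Retrieval and dense correspondence are assumed to be computed elsewhere;
# the RWR refinement runs on a generic region-level affinity matrix.
import numpy as np

def transfer_saliency(flows, support_maps):
    """Warp each support annotation onto the query via its dense flow and
    average the warped maps into a transitional saliency map.

    flows        : list of (H, W, 2) integer offset fields (query -> support)
    support_maps : list of (H, W) saliency annotations from the support set
    """
    H, W, _ = flows[0].shape
    ys, xs = np.mgrid[0:H, 0:W]
    acc = np.zeros((H, W), dtype=np.float64)
    for flow, ann in zip(flows, support_maps):
        # For every query pixel, look up the corresponding support pixel.
        sy = np.clip(ys + flow[..., 0], 0, ann.shape[0] - 1)
        sx = np.clip(xs + flow[..., 1], 0, ann.shape[1] - 1)
        acc += ann[sy, sx]
    return acc / len(flows)

def rwr_refine(affinity, seed_saliency, restart_prob=0.15, iters=100):
    """Refine region-level saliency with random walk with restart:
    s <- (1 - c) * P^T s + c * s0, where P is the row-normalised affinity."""
    P = affinity / affinity.sum(axis=1, keepdims=True)
    s0 = seed_saliency / (seed_saliency.sum() + 1e-12)
    s = s0.copy()
    for _ in range(iters):
        s = (1.0 - restart_prob) * P.T @ s + restart_prob * s0
    return s / s.max()

if __name__ == "__main__":
    # Toy example: two identity flows transferring the same annotation.
    rng = np.random.default_rng(0)
    ann = (rng.random((32, 32)) > 0.7).astype(float)
    flow = np.zeros((32, 32, 2), dtype=int)
    transitional = transfer_saliency([flow, flow], [ann, ann])

    # Toy 4-region affinity matrix and seed saliency for RWR refinement.
    W_aff = np.array([[1.0, 0.8, 0.1, 0.1],
                      [0.8, 1.0, 0.1, 0.1],
                      [0.1, 0.1, 1.0, 0.9],
                      [0.1, 0.1, 0.9, 1.0]])
    refined = rwr_refine(W_aff, np.array([1.0, 0.5, 0.0, 0.0]))
    print(transitional.shape, refined.round(2))
```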
| Item Type: | Article |
|---|---|
| Faculty \ School: | Faculty of Science > School of Computing Sciences |
| Depositing User: | Pure Connector |
| Date Deposited: | 09 Mar 2017 01:41 |
| Last Modified: | 21 Oct 2022 09:30 |
| URI: | https://ueaeprints.uea.ac.uk/id/eprint/62909 |
| DOI: | 10.1109/TIP.2016.2601784 |