Visual tracking under motion blur

Ma, Bo, Huang, Lianghua, Shen, Jianbing, Shao, Ling, Yang, Ming-Hsuan and Porikli, Fatih (2016) Visual tracking under motion blur. IEEE Transactions on Image Processing, 25 (12). pp. 5867-5876. ISSN 1057-7149

PDF (Accepted manuscript) - Submitted Version (994kB)

Abstract

Most existing tracking algorithms do not explicitly account for the motion blur present in video sequences, which degrades their performance in real-world applications where motion blur frequently occurs. In this paper, we propose to address the motion blur problem in visual tracking within a unified framework. Specifically, a joint blur state estimation and multi-task reverse sparse learning framework is presented, in which closed-form solutions for the blur kernel and the sparse code matrix are obtained simultaneously. The reverse process treats the blurry candidates as dictionary elements and sparsely represents the blurred templates with these candidates. By exploiting the information contained in the sparse code matrix, an efficient likelihood model is further developed, which quickly excludes irrelevant candidates and reduces the number of particles. Experimental results on challenging benchmarks show that our method performs well against state-of-the-art trackers.
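The "reverse" formulation described in the abstract — using the tracking candidates as the dictionary and sparsely coding the blurred templates over them — can be illustrated with a minimal sketch. The function below is a hypothetical reconstruction using a generic ISTA (iterative soft-thresholding) solver for the L1-regularized coding step, not the paper's closed-form multi-task solution; all names, the regularization weight, and the relevance score are illustrative assumptions.

```python
import numpy as np

def reverse_sparse_code(candidates, templates, lam=0.1, n_iter=200):
    """Sketch of reverse sparse representation for tracking.

    candidates: (d, n) matrix, one feature vector per blurry candidate
                (these act as the dictionary, per the reverse formulation).
    templates:  (d, m) matrix of blurred template features to be represented.
    Returns the (n, m) sparse code matrix C minimizing
        0.5 * ||D C - T||_F^2 + lam * ||C||_1
    via plain ISTA -- a stand-in for the paper's closed-form solver.
    """
    # Normalize dictionary columns (candidates) to unit length.
    D = candidates / (np.linalg.norm(candidates, axis=0, keepdims=True) + 1e-8)
    # Step size from the Lipschitz constant of the gradient.
    lr = 1.0 / (np.linalg.norm(D, 2) ** 2 + 1e-8)
    C = np.zeros((D.shape[1], templates.shape[1]))
    for _ in range(n_iter):
        grad = D.T @ (D @ C - templates)        # gradient of the data term
        C = C - lr * grad
        # Soft-thresholding enforces sparsity in the code matrix.
        C = np.sign(C) * np.maximum(np.abs(C) - lr * lam, 0.0)
    return C

# Toy usage: 50 candidates in 64-D; two "templates" are noisy copies of
# candidates 3 and 17, so the sparse codes should concentrate there.
rng = np.random.default_rng(0)
candidates = rng.standard_normal((64, 50))
templates = candidates[:, [3, 17]] + 0.01 * rng.standard_normal((64, 2))
C = reverse_sparse_code(candidates, templates)
# A candidate's relevance can be scored by its total coefficient mass,
# which is how irrelevant candidates could be pruned cheaply.
scores = np.abs(C).sum(axis=1)
```

Scoring candidates by their rows of the code matrix is what makes the pruning step cheap: one sparse-coding solve ranks all candidates at once, rather than evaluating each candidate against the templates separately.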

Item Type: Article
Faculty \ School: Faculty of Science > School of Computing Sciences
Depositing User: Pure Connector
Date Deposited: 09 Mar 2017 01:41
Last Modified: 22 Apr 2020 02:31
URI: https://ueaeprints.uea.ac.uk/id/eprint/62908
DOI: 10.1109/TIP.2016.2615812
