Perspective model-based visual tracking scheme for robust tracking of objects in complex environs

Multimedia Tools Appl. (2017)

Abstract
Robust tracking of moving objects remains an open problem in computer vision. Its difficulty depends on many factors, including the availability of prior knowledge about the target. Occlusions and fast motion are among the major challenges: when such hindrances occur, misjudging the target leads to incorrect updates of the target template model, so preserving the stability and precision of the tracker becomes crucial. This paper addresses the handling of fast-moving objects and dynamic occlusions in visual tracking. The novelty of the proposed method lies in its use of active perspective learning and a new adaptive incremental model-update mechanism. A consistency measure, computed from the spatial viewpoint model and the perspective model, differentiates the target from its surroundings at every instant of tracking. The Discrete Hartley transform (DHT) is applied for image transformation, learning, and target prediction. To auto-adjust the bounding box during tracking, a new dynamic scale fill-in technique is used. Experimental results on a number of challenging datasets featuring occlusion, non-rigid deformation, and other major challenges highlight the better performance and robustness of the proposed method under tough conditions.
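The abstract names the Discrete Hartley transform as the tool for image transformation and target prediction but gives no implementation details. As a minimal sketch, assuming a real-valued grayscale patch and NumPy (not part of the paper itself), the 2-D DHT can be computed through its standard relation to the 2-D DFT, H(u, v) = Re F(u, v) - Im F(u, v), and is its own inverse up to a 1/(M*N) factor:

```python
import numpy as np

def dht2(x):
    """2-D Discrete Hartley transform of a real-valued array,
    using the identity H(u, v) = Re F(u, v) - Im F(u, v),
    where F is the 2-D DFT."""
    F = np.fft.fft2(np.asarray(x, dtype=np.float64))
    return F.real - F.imag

def idht2(H):
    """Inverse 2-D DHT: the DHT is self-inverse up to 1/(M*N)."""
    M, N = H.shape
    return dht2(H) / (M * N)

if __name__ == "__main__":
    # Hypothetical usage: transform a 64x64 grayscale patch, as one might
    # do to a target template before frequency-domain matching.
    patch = np.random.rand(64, 64)
    H = dht2(patch)
    assert np.allclose(idht2(H), patch)  # round-trip sanity check
```

How the DHT coefficients are used for learning and prediction within the tracker is specific to the paper and not reproduced here.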
Key words
Visual tracking, Perspective model, Occlusion handling, Adaptive learning