Tampere University of Technology

TUTCRIS Research Portal

How to Make an RGBD Tracker?

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Standard

How to Make an RGBD Tracker? / Kart, Ugur; Kämäräinen, Joni; Matas, Jiri.

European Conference on Computer Vision (ECCV) Workshop on Visual Object Tracking. Springer, 2018. p. 148-161 (Lecture Notes in Computer Science; Vol. 11129).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Harvard

Kart, U, Kämäräinen, J & Matas, J 2018, How to Make an RGBD Tracker? in European Conference on Computer Vision (ECCV) Workshop on Visual Object Tracking. Lecture Notes in Computer Science, vol. 11129, Springer, pp. 148-161, European Conference on Computer Vision, Munich, Germany, 8/09/18. https://doi.org/10.1007/978-3-030-11009-3_8

APA

Kart, U., Kämäräinen, J., & Matas, J. (2018). How to Make an RGBD Tracker? In European Conference on Computer Vision (ECCV) Workshop on Visual Object Tracking (pp. 148-161). (Lecture Notes in Computer Science; Vol. 11129). Springer. https://doi.org/10.1007/978-3-030-11009-3_8

Vancouver

Kart U, Kämäräinen J, Matas J. How to Make an RGBD Tracker? In European Conference on Computer Vision (ECCV) Workshop on Visual Object Tracking. Springer. 2018. p. 148-161. (Lecture Notes in Computer Science). https://doi.org/10.1007/978-3-030-11009-3_8

Author

Kart, Ugur ; Kämäräinen, Joni ; Matas, Jiri. / How to Make an RGBD Tracker?. European Conference on Computer Vision (ECCV) Workshop on Visual Object Tracking. Springer, 2018. pp. 148-161 (Lecture Notes in Computer Science).

BibTeX

@inproceedings{c0326ce1604b48218f3bbfdd8740576a,
title = "How to Make an RGBD Tracker?",
abstract = "We propose a generic framework for converting an arbitrary short-term RGB tracker into an RGBD tracker. The proposed framework has two mild requirements – the short-term tracker provides a bounding box and its object model update can be stopped and resumed. The core of the framework is a depth augmented foreground segmentation which is formulated as an energy minimization problem solved by graph cuts. The proposed framework offers two levels of integration. The first requires that the RGB tracker can be stopped and resumed according to the decision on target visibility. The level-two integration requires that the tracker accept an external mask (foreground region) in the target update. We integrate in the proposed framework the Discriminative Correlation Filter (DCF), and three state-of-the-art trackers – Efficient Convolution Operators for Tracking (ECOhc, ECOgpu) and Discriminative Correlation Filter with Channel and Spatial Reliability (CSR-DCF). Comprehensive experiments on Princeton Tracking Benchmark (PTB) show that level-one integration provides significant improvements for all trackers: DCF average rank improves from 18th to 17th, ECOgpu from 16th to 10th, ECOhc from 15th to 5th and CSR-DCF from 19th to 14th. CSR-DCF with level-two integration achieves the top rank by a clear margin on PTB. Our framework is particularly powerful in occlusion scenarios where it provides 13.5{\%} average improvement and 26{\%} for the best tracker (CSR-DCF).",
author = "Ugur Kart and Joni K{\"a}m{\"a}r{\"a}inen and Jiri Matas",
year = "2018",
doi = "10.1007/978-3-030-11009-3_8",
language = "English",
isbn = "978-3-030-11008-6",
series = "Lecture Notes in Computer Science",
publisher = "Springer",
pages = "148--161",
booktitle = "European Conference on Computer Vision (ECCV) Workshop on Visual Object Tracking",

}
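
Illustration: two-level integration (sketch)

The abstract above describes the framework only at a high level. The following Python sketch illustrates the two integration levels it mentions, under stated assumptions: the RGBTracker interface, the segment_foreground helper and its depth-tolerance rule are hypothetical stand-ins for illustration, not the authors' implementation (the paper performs this step with a graph-cut segmentation).

# Hypothetical sketch only: RGBTracker, segment_foreground and the
# depth-tolerance rule are illustrative, not the authors' framework.
from typing import Optional, Tuple

import numpy as np


class RGBTracker:
    """Any short-term RGB tracker that reports a bounding box (x, y, w, h)
    and whose object-model update can be paused and resumed."""

    def track(self, rgb: np.ndarray) -> Tuple[int, int, int, int]:
        raise NotImplementedError

    def update(self, rgb: np.ndarray, mask: Optional[np.ndarray] = None) -> None:
        raise NotImplementedError


def segment_foreground(depth, box, target_depth, tol=0.15):
    """Crude stand-in for the paper's depth-augmented graph-cut segmentation:
    pixels inside the box whose depth lies within a tolerance of the current
    target-depth estimate count as foreground; the target is declared visible
    if enough of them remain."""
    x, y, w, h = box
    roi = depth[y:y + h, x:x + w]
    fg = np.abs(roi - target_depth) < tol * target_depth
    mask = np.zeros(depth.shape, dtype=bool)
    mask[y:y + h, x:x + w] = fg
    return mask, fg.mean() > 0.3


class RGBDWrapper:
    """Wraps an RGB tracker into an RGBD tracker at integration level 1 or 2."""

    def __init__(self, tracker: RGBTracker, level: int = 1):
        self.tracker = tracker
        self.level = level            # 1 = pause/resume updates, 2 = masked updates
        self.target_depth = None

    def step(self, rgb: np.ndarray, depth: np.ndarray):
        box = self.tracker.track(rgb)
        x, y, w, h = box
        roi = depth[y:y + h, x:x + w]
        if self.target_depth is None:            # initialise on the first frame
            self.target_depth = float(np.median(roi))

        mask, visible = segment_foreground(depth, box, self.target_depth)

        if not visible:
            # Level-one integration: target judged occluded, freeze the model update.
            return box, False

        # Target visible: refresh the depth estimate and resume updating.
        self.target_depth = float(np.median(roi[mask[y:y + h, x:x + w]]))
        if self.level >= 2:
            # Level-two integration: hand the foreground mask to the tracker update.
            self.tracker.update(rgb, mask=mask)
        else:
            self.tracker.update(rgb)
        return box, True

In the paper the visibility decision and the foreground mask come from the depth-augmented graph-cut segmentation (sketched after the RIS record below); according to the abstract, the level-two (masked update) variant of CSR-DCF is what reaches the top rank on PTB.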

RIS (suitable for import to EndNote)

TY - GEN

T1 - How to Make an RGBD Tracker?

AU - Kart, Ugur

AU - Kämäräinen, Joni

AU - Matas, Jiri

PY - 2018

Y1 - 2018

N2 - We propose a generic framework for converting an arbitrary short-term RGB tracker into an RGBD tracker. The proposed framework has two mild requirements – the short-term tracker provides a bounding box and its object model update can be stopped and resumed. The core of the framework is a depth augmented foreground segmentation which is formulated as an energy minimization problem solved by graph cuts. The proposed framework offers two levels of integration. The first requires that the RGB tracker can be stopped and resumed according to the decision on target visibility. The level-two integration requires that the tracker accept an external mask (foreground region) in the target update. We integrate in the proposed framework the Discriminative Correlation Filter (DCF), and three state-of-the-art trackers – Efficient Convolution Operators for Tracking (ECOhc, ECOgpu) and Discriminative Correlation Filter with Channel and Spatial Reliability (CSR-DCF). Comprehensive experiments on Princeton Tracking Benchmark (PTB) show that level-one integration provides significant improvements for all trackers: DCF average rank improves from 18th to 17th, ECOgpu from 16th to 10th, ECOhc from 15th to 5th and CSR-DCF from 19th to 14th. CSR-DCF with level-two integration achieves the top rank by a clear margin on PTB. Our framework is particularly powerful in occlusion scenarios where it provides 13.5% average improvement and 26% for the best tracker (CSR-DCF).

AB - We propose a generic framework for converting an arbitrary short-term RGB tracker into an RGBD tracker. The proposed framework has two mild requirements – the short-term tracker provides a bounding box and its object model update can be stopped and resumed. The core of the framework is a depth augmented foreground segmentation which is formulated as an energy minimization problem solved by graph cuts. The proposed framework offers two levels of integration. The first requires that the RGB tracker can be stopped and resumed according to the decision on target visibility. The level-two integration requires that the tracker accept an external mask (foreground region) in the target update. We integrate in the proposed framework the Discriminative Correlation Filter (DCF), and three state-of-the-art trackers – Efficient Convolution Operators for Tracking (ECOhc, ECOgpu) and Discriminative Correlation Filter with Channel and Spatial Reliability (CSR-DCF). Comprehensive experiments on Princeton Tracking Benchmark (PTB) show that level-one integration provides significant improvements for all trackers: DCF average rank improves from 18th to 17th, ECOgpu from 16th to 10th, ECOhc from 15th to 5th and CSR-DCF from 19th to 14th. CSR-DCF with level-two integration achieves the top rank by a clear margin on PTB. Our framework is particularly powerful in occlusion scenarios where it provides 13.5% average improvement and 26% for the best tracker (CSR-DCF).

U2 - 10.1007/978-3-030-11009-3_8

DO - 10.1007/978-3-030-11009-3_8

M3 - Conference contribution

SN - 978-3-030-11008-6

T3 - Lecture Notes in Computer Science

SP - 148

EP - 161

BT - European Conference on Computer Vision (ECCV) Workshop on Visual Object Tracking

PB - Springer

ER -
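
Illustration: depth-augmented graph-cut segmentation (sketch)

The abstract states that the core of the framework is a foreground segmentation formulated as an energy minimization solved by graph cuts, but the paper's actual energy terms are not reproduced on this page. The sketch below only shows the generic structure of such a binary graph cut in Python with the PyMaxflow library; the depth-based unary term and the parameters target_depth, sigma and pairwise are illustrative assumptions.

# Generic binary graph-cut segmentation with a depth-based unary term,
# using the PyMaxflow library (pip install PyMaxflow). The energy below is
# an illustrative assumption, not the terms used in the paper.
import numpy as np
import maxflow


def depth_graph_cut(depth, target_depth, sigma=0.1, pairwise=2.0):
    """Minimise a unary + Potts energy over foreground/background labels
    with an s-t min cut and return the foreground mask.

    depth        : H x W array of depth values (same unit as target_depth)
    target_depth : expected depth of the tracked object
    sigma        : tolerance of the depth likelihood
    pairwise     : strength of the smoothness (Potts) term
    """
    depth = np.asarray(depth, dtype=np.float64)

    # Unary terms: pixels whose depth is close to the target depth are cheap
    # to label foreground and expensive to label background, and vice versa.
    fg_cost = (depth - target_depth) ** 2 / (2.0 * sigma ** 2)   # cost of "foreground"
    bg_cost = np.ones_like(fg_cost)                              # cost of "background"

    g = maxflow.Graph[float]()
    nodeids = g.add_grid_nodes(depth.shape)

    # 4-connected Potts smoothness term between neighbouring pixels.
    g.add_grid_edges(nodeids, pairwise)

    # t-links in the standard construction: the capacity toward the source is
    # the cost of the background label, toward the sink the foreground cost.
    g.add_grid_tedges(nodeids, bg_cost, fg_cost)

    g.maxflow()
    # Pixels left on the source side of the cut are labelled foreground.
    return ~g.get_grid_segments(nodeids)

A mask produced this way could replace the crude depth-threshold placeholder in the earlier sketch, and the fraction of foreground pixels inside the tracker's bounding box could drive the visibility decision used by the level-one integration.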